“The surprising thing we find is that, essentially, you can use the largest model to help you automatically design the smaller ones”
Hey, how do we get a clickbait title out of this?
The real solution is to solve the power imbalance. What percentage of creative media is controlled by the already obscenely wealthy? We don’t want “non-infringing proprietary models” to be the only legal models, because then the only ones with access to such powerful tools are the ones who can afford the Adobe art tax.
We need to hold our governments accountable so they hold the oligarchs accountable for tilting the balance of power to an unethical degree. The common people have received none of the productivity gains from technological improvement over the past 50 years, and this will only get worse until it is fixed in drastic fashion.
The common people need a GUARANTEE to benefit from productivity increases. Unions are also good, but nothing is being done about unethical anti-union campaigning from those with already imbalanced amounts of power and influence.
Yadda yadda. Going after open source models ain’t gonna help. I’m fine pushing for special forgiveness for open models, but don’t just put the ball into the hands of the people who can afford proprietary datasets.
give us a way to fix the issue without relying on the idiots at the top being decent human beings.
if you can fix that issue then we wouldn't have so much of a problem.
i'd expect AI to help through information processing for research and engineering. current AI tools are already useful to many as co-pilot tools. not everyone is creative enough to get use out of AI, but we are moving towards being able to dictate and gesture in natural language to optimize some things that may have taken a lot more time. it's also valuable for certain efforts in optimization and engineering. does everyone hate alphafold now too?
i think a lot of the AI hate right now is from the fact that it takes thought and creative use to get the most out of available tools. as we all learned, if it isn't already "AGI" it's 100% useless for everything forever.
that's definitely the best bet, although i feel AI tools being used by the people who actually want to fix the environment are going to have more success than those asking the people in power to change the system or themselves.
although if you have ideas on that front i'm all for it.
i just don't believe the anti-AI, anti-"AI bro" fad is really helping… anyone, in any way.
bitcoin never had a use other than "will become valuable?"
many (myself included) believe this technology will probably be the only one that will develop fast enough to actually help with the climate crisis.
optimizing research and academia as well as environmental issues through information processing. people are excitedly talking about automated proof-checking and context finders that can sift through hundreds of papers while you make your coffee. this stuff is good for science, and science is good for environmentalism. maybe go after the politicians and companies that are never going to be a benefit in the struggle against environmental collapse.
why do people keep relating it to bitcoin? because it uses GPUs? that's literally the only connection.
somehow people have associated it with crypto and NFTs as if they are even mildly related. perhaps because those things are easier to hate, so why not associate them.
hey, that's a better critique or commentary than in the onion article.
while i don't doubt people are trying to shove AI into a lot of places where it's not optimal yet (which is entirely fair and reasonable to point out), i don't think that's a fair reason to poo-poo any use of, or positivity about, AI in any context.
rather, it's become a really big fad to hate on AI and insult anyone who uses it. i mean, the technology is still young, but the stuff it's already doing was "impossible" and "never going to happen" a few years ago. now we are developing things like text to 3d, which makes me excited for a future environment where you can dictate design and animation for entire animated experiences/movies.
independent creatives will have a blast with it. salty onion article writer will be angrily yelling at his computer.
the sentiment being any positive opinion on AI? yes, like i said, i'd forgive it if it were funny or clever.
it is literally just "people who like this thing are bad and dumb and useless and the world hates them."
really top quality satire. they sure did show how useless AI is and how dumb the fans are.
maybe they could at least target the failure use-cases? some bad business AI ideas that are doomed to fail?
nope, just reddit comment quality insults.
Salty writer fears being made obsolete by beep boop. Insults every AI enthusiast, as well as successful engineers and scientists.
i hate how popular it's become to hate on AI amongst people who know little to nothing about it.
I'd forgive it if it were clever or funny, but this is really just obviously salty ad hominem strawmanning by someone who doesn't understand or appreciate the technology.
Guess what, fam: we are in the copilot tool phase. You can learn how to use these new tools AND learn how to be creative. Maybe then you could ask it to critique the humour in your satire article. Perhaps it would be more clever than "people who like this thing I don't like are dumb, and can't be creative or better than me in any way, because I'm cooler than AI will ever be!!! You nerds are stooooopid!!"
Because that's how it read.
… Basically the day it was created.
So you also equate French nobility with nazi victims?
Nobody is being called out for their race, their sexual preferences, or the body they were given at birth. The people being targeted here are defined only as the ones who have plundered the world to fill their pockets regardless of the cost. It is the very act of power-hungry and despotic rule that leads to this call. Remember that every family member and friend we lose to the meat grinder could have lived happily if not for the ones who deem us unworthy of human treatment. Every home burnt to the ground as the fires worsen, because they refuse to let environmental care affect their overflowing coffers.
How many more people should die and suffer for their greed?
This is not a comparable situation, you silly person.
Can we talk more about deceptive patterns? I had my computer go down recently, and I was reminded just how bad mobile is.
Pick a game at random and I'll show you a dozen direct and intentional manipulations designed to get you into the habits and environment they want you in for optimal resource extraction. I miss when games were an art form rather than a professionally designed set of habit-adjusting manipulations that can annoy you into the right mindset to spend money. Not to mention the advertisements, which range from absolute fabrication to actual scams.
I’m just glad to see some positive results from training AI on AI content. I assume there are still a lot of ways we can improve or adjust large bodies of data with AI.
Reminds me of the article saying OpenAI is doomed because it can only last about thirty years at its current level of expenditure.
And you are the only voice of reason in this thread.
“Make up shit that makes OpenAI look bad” is tech-article gold right now. The number of times I see “look what ChatGPT said!!!” as if prompter intention were completely irrelevant to model output.
Objectivity doesn’t exist anymore. It’s just really popular to talk shit about ai right now.
Like when Altman effectively said “we should only regulate models as big or bigger than ours, we should not regulate small independent or open source models and businesses” to Congress, which was followed by endless articles saying “Sam Altman wants to regulate open source and stamp out smaller competition!”
I have no love for how unopen they’ve become, but at least align criticisms with reality please.
Absolutely magical.
this is a difficult one.
for people (myself included) to understand nuance and the complicated nature of communication and interaction. our brains are good at filling in gaps of information, which is difficult for us to perceive. there is a complexity and sparsity of interpretations and perspectives that we are largely incapable of realizing. this is largely due to the excess of knowledge and experiences in the world, which can be combined or perceived in countless different ways. we are especially ignorant of what we are ignorant of.
this means we exist in a high-dimensional battlefield ball of misunderstanding, misinterpretation, and unintended inability to convey what was intended.
when we say something to someone, we expect them to understand what we mean, but their interpretation of the words we use can vary in ways we could not have predicted from our perspective. we may also fail to realize that the other party holds beliefs or knowledge we know nothing about, which shapes their interpretation of our words in ways we can't understand and wouldn't know to look for.
at the same time, many people are susceptible to statistically ensured trend-setting. this is mostly exploited by bad actors who don't mind saying whatever they know will "work" instead of trying to convince people of what is true or reasonable.
TLDR: we are more confident than we should be for almost everything. we also suck at communicating for reasons that are too complex to fully see or interpret. be patient and reasonable, as we are all missing information. a good mediator helps find gaps in perspective. try not to be controlled by your emotion or instinctual reactions to situations. be critical when interpreting new information.
I believe it will require a level and pace of information processing far beyond what humans will accomplish alone. just having a system that can efficiently sift through the excess of existing papers and find correlations or contradictions would be amazing for the development of new technology. if you are paying attention to any of the environmental sciences right now, it's terrifying in an extremely real and tangible way. we will not outpace the collapse without an intense increase in technological development.
if we bridge the gap of analogical comprehension in these systems, they could also start introducing or suggesting technologies that could help slow down or reverse the collapse. i think this is much more important than making sure sarah silverman doesn't have her work paraphrased.
Personally I find this stupid. If we have robots walking around, are they going to be sued every time they see something that's copyrighted?
Is this what will stop progress that could save us from environmental collapse? That a robot could summarize your shitty comedy?
Copyright is already a disgusting mess, and still nobody cares about models being created specifically to manipulate people en masse. "What if it learned from MY creations," asks every self-obsessed egoist in the world.
Doesn't matter how many people this tech could save after another decade of development. Somebody think of the [lucky few artists that had the connections and luck to make a lot of money despite living in our soul crushing machine of a world]
All of the children growing up abused and in pain with no escape don't matter at all. People who are sick or starving or homeless do not matter. Making progress to save the world from imminent environmental disaster doesn't matter. Let Canada burn more and more every year. As long as copyright is protected, all is well.
How about an art director using Disney/Warner money to direct a bunch of interns? The artists are being used as a tool for someone else to make their art without the effort that work should require. Does it belong more to the interns who worked on each piece, or the director who had the vision and direction? While you might not care for simple prompt direction, or want to take credit for anything you've made with these tools, even easy work made with a powerful tool can be interpreted on its own merit, and could give smaller creators an effective "team" to compete with people who have endless resources.
You can also spend time and effort in conjunction with these tools to create something specific to what you had envisioned. Does this lack value due to the medium?
I think art is a complex concept with high subjectivity, but this type of selectivity happens every time a new tool or medium is introduced. Judge each work as you will, but don't go around claiming "this thing isn't art" because of reasons that lose meaning or truth in any other medium or context.
As always, the problem is our economic system that has funneled every gain and advance to the benefit of the few. The speed of this change will make it impossible to ignore the need for a new system. If it wasn’t for AI, we would just boil the frog like always. But let’s remember the real issue.
If a free food generating machine is seen as evil for taking jobs, the free food machine wouldn’t be the issue. Stop protesting AI, start protesting affluent society. We would still be suffering under them even if we had destroyed the loom.