It’s a hype bubble. AI has been around for a while and will continue to be. The problem is specifically Large Language Models. They’ve been trained to SOUND human, but not to actually use that ability for anything more useful than small talk and bullshit. However, because it SOUNDS charismatic, and that is interesting to people, companies have started cramming it into everything they can think of to impress shareholders.
Shareholders are a collective group of people who are, on average, psychologically more similar to crows than to other humans - they like shiny things, have a mob mentality, and can only use the most basic of tools available, in their case usually money. New things presented in a flashy way by a charismatic individual are most attractive to them, and they will seldom do any research beyond superficial first impressions. Any research they actually do generally skews towards confirmation bias.
This leads to an unfortunate feature of capitalism, which is the absolute need to make the numbers go up. To impress their shareholders, companies have to jangle keys in front of their faces. So whenever The Hip New Thing comes along, it’s all buzzwords and bullshit as they try to find any feasible way to cram it into their product. If they could make Smart Corn 2.0 powered by ChatGPT they would, and sell it for three times as much in the same produce aisle as normal corn. And then your corn would tell you about this great recipe it knows for a sauce made with a battery acid base.
In most recent memory, this exact scenario played out with NFTs. When the NFT market collapsed, as was inevitable, the corporations who swore it would supercharge their sales all quietly pretended it never happened. Soon something new was jangled in front of the shareholders and everybody forgot about them.
Now that generative AI is proving itself to just be a really convincing bullshitter, it’s only a matter of time until it either dies and quietly slinks away or mutates into the next New Thing and the cycle repeats. Like a pandemic of greed and stupidity. Maybe they’ll figure out how to teach ChatGPT to check and cite verified sources, and make it actually do what they currently claim it does.
I guess it depends on if they can make it shiny enough to impress the crows.
I think we’re in an AI bubble because Nvidia is way overvalued, and I agree with you that people often flock to shiny new things. Many people are taking risks with the hope of making it big… and many will get left holding the bag.
But how do you go from NFTs, which never had widespread market support, to the market pumping a trillion dollars into Nvidia alone? This makes no sense. And to downplay this as “just a bullshitter” leads me to believe you have like zero real-world experience with this. I use Copilot for coding and it’s been a boost to productivity for me, and I’m a seasoned vet. Even the AI search results, which have left me scratching my head many times, have been a net benefit to me in time savings.
And this is all still pretty new.
While I think it is overhyped and people are being ridiculous about how much this will change things, at the very least this is going to be a huge new tool, and I think you’re setting yourself up to be left behind if you aren’t embracing this and learning how to leverage it.
The AI technology we’re using isn’t “new”; the core idea is several decades old, with only minor updates since then. We’re just using more parallel processing and bigger datasets to brute-force the “advances”. So, no, it’s not actually that new.
We need a big breakthrough in the technology for it to actually get anywhere. Without the breakthrough, we’re going to burst the bubble once the hype dies down.
The landmark paper that ushered in the current boom in generative AI (“Attention Is All You Need”, Vaswani et al., 2017) is less than a decade old (and attention itself as a mechanism is from 2014), so I’m not sure where you are getting the idea that the core idea is “decades” old. Unless you are taking the core idea to mean neural networks, or digital computing?
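For anyone curious what that core mechanism actually is: the heart of the 2017 paper is scaled dot-product attention, which is only a few lines of math. Here's a minimal pure-Python sketch (toy dimensions and values for illustration only, not from the paper):

```python
# Minimal scaled dot-product attention, the core mechanism from
# "Attention Is All You Need" (Vaswani et al., 2017).
# Q, K, V are lists of row vectors; this is a toy illustration.
import math

def softmax(xs):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Return one output row per query: a weighted average of V's rows,
    weighted by how similar each query is to each key."""
    d_k = len(K[0])  # key dimension, used for the 1/sqrt(d_k) scaling
    out = []
    for q in Q:
        # dot product of this query with every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)  # weights are non-negative and sum to 1
        # weighted sum of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Stacking this (with learned projections for Q, K, V, multiple heads, and feed-forward layers) is essentially what a transformer is; the "brute force" part is doing it at enormous scale.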