One of the stupidest things I’ve heard.
I also would like to attach the bluesky repost I found this from:
https://bsky.app/profile/leyawn.bsky.social/post/3lnldekgtik27
What kind of fluff “journalism” is this?
It sucks but they do have an audience. I have older family members who swear ChatGPT has a “personality” because it will reply when they thank it.
thanks. my first thought was, “are you fucking kidding me?”
but this is what all the money wants us to think about “AI”, which is definitely not intelligence. they want everyone to accept that pattern recognition is indistinguishable from intelligence.
edit - alcohol makes me talk in circles
I wish ads felt pain when I skipped them
Does my phone feel pain when I drop it?
Why don’t you ask it?
Can our AI fall in love with a human? Scientists laughed at me when I asked them, but I found this weird billionaire to pay me to have sex with his robot.
The pride of cancelling my 20 year subscription continues to swell.
Gemini in its current form? No, but it is a fair question to ask for the future
Yeah, twenty years from now at the very least.
A little too optimistic
Yeah, but it’s like fusion. It’s been 20 years away for the last 60 years.
Realistically, as a dev who watched AI develop from cheap parlor tricks to very expensive, ecosystem-crunching fancy parlor tricks that managers think will replace all of their expensive staff who actually know how to design and create:
Modern “AI” is fundamentally incapable of actual thought. These models are very advanced and impressive statistical engines, but the technology is incapable of thinking at a fundamental level.
Before we even get close to having this discussion, we would need an AI capable of experiencing things and developing an individual identity. And this runs completely opposite to the goals of the corporations that develop AIs, because they want something that can be mass deployed, centralised, and as predictable as possible - i.e. not individual agents capable of experience.
If we ever have a truly sentient AI it’s not going to be designed by Google, OpenAI, or Deepmind.
Yep, an AI can’t really experience anything if it never updates the weights during each interaction.
Training is simply too slow for AI to be properly intelligent. When someone cracks that problem, I believe AGI is on the horizon.
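To put it concretely, here’s a rough sketch of what a chat with one of these models actually does under the hood (using Hugging Face’s transformers library, with gpt2 picked only because it’s small - the specific model doesn’t matter). Nothing in this loop ever touches the weights:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load a small pretrained model once; "gpt2" is just an example checkpoint.
tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()  # inference mode: parameters stay frozen

def reply(prompt: str) -> str:
    with torch.no_grad():  # no gradients are computed, so no learning can happen
        ids = tok(prompt, return_tensors="pt").input_ids
        out = model.generate(ids, max_new_tokens=50, pad_token_id=tok.eos_token_id)
    return tok.decode(out[0], skip_special_tokens=True)

print(reply("Thank you!"))
# Every call starts from the exact same frozen weights; the only "memory"
# of the conversation is whatever text gets pasted back into the prompt.
```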
AGI?
Artificial General Intelligence, or basically something that can properly adapt to whatever situation it’s put into. AGI isn’t necessarily smart, but it is very flexible and can learn from experience like a person can.
I don’t see any reason why this can’t be discussed. I think people here are just extremely anti-AI. It is almost like forcing AI on people was a bad idea.
i don’t even understand why it’s worth discussing in the first place. “can autocomplete feel?” “should compilers form unions?” “should i let numpy rest on weekends?”
wake me up when what the marketers call “ai” becomes more than just matrix multiplication in a loop.
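to be fair, the “matrix multiplication in a loop” bit isn’t even an exaggeration. here’s a toy sketch with made-up numpy weights (not a real model, just the shape of the thing): multiply, softmax, sample, repeat.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, dim = 50, 16                      # toy sizes, nothing like a real model
W_embed = rng.normal(size=(vocab, dim))  # random stand-ins for the billions
W_out = rng.normal(size=(dim, vocab))    # of trained parameters

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

tokens = [3]                             # some starting token id
for _ in range(10):
    h = W_embed[tokens[-1]]              # look up the last token
    logits = h @ W_out                   # matrix multiplication...
    probs = softmax(logits)              # ...turn scores into probabilities...
    tokens.append(int(rng.choice(vocab, p=probs)))  # ...sample, then loop

print(tokens)  # a stream of token ids, no feelings involved
```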