I went to CES this year and sat on a few AI panels. This is actually not far off. Some said yeah, this is right, but multiple panels I attended said this is a dead end, and while useful, they are starting down different paths.
It's not bad; we're just finding it's not great.
I'm a software developer, and I know AI is just the shiny new toy whose buzzword everyone uses to generate investment revenue.
99% of the crap people use it for is worthless. It's just a hammer, and everything is a nail.
It’s just like “the cloud” was 10 years ago. Now everyone is back-pedaling from that because it didn’t turn out to be the panacea that was promised.
Misleading title. From the article,
Asked whether “scaling up” current AI approaches could lead to achieving artificial general intelligence (AGI), or a general purpose AI that matches or surpasses human cognition, an overwhelming 76 percent of respondents said it was “unlikely” or “very unlikely” to succeed.
In no way does this imply that the “industry is pouring billions into a dead end”. AGI isn’t even needed for industry applications, just implementing current-level agentic systems will be more than enough to have massive industrial impact.
I used to support an IVA cluster. Now the only thing I use AI for is voice controls to set timers on my phone.
I use ChatGPT daily in my business, but I use it more as a guide than a real replacement.
LLMs are good for learning, brainstorming, and mundane writing tasks.
Yes, and maybe finding information that's right in front of them, and nothing more.
Analyzing text from a different point of view than your own. I call that a "synthetic second opinion."