Just tried it out, with some questions about ceramic firing in an electric kiln. Seems to have similar accuracy to ChatGPT, maybe closer to GPT-4.
It's not clear which version you're using, so this may have been Claude 1; I'm unsure where to check.
I asked it directly. It didn't know and stated it has never had version numbers. I pointed out that news articles differentiate 1.0 and 2.0. It agreed but didn't say which it was. When I asked again directly, it said it was 2.0.
Hard to believe something that feels like it's lying to you all the time. I asked it about a topic I'm involved in and have a website about, and it told me the website was hypothetical. It got it wrong twice: even after it agreed it was wrong, it told me the wrong thing again.
Can you ask perplexity.ai your question about ceramic firing and see what you get? Perplexity offers prompts to move you along towards your answer.
Is this what they consider hallucinations?