• snooggums@lemmy.world
    2 months ago

    Why would the steps be literal when everything else is bullshit? Obviously the ‘reasoning’ steps are AI slop too.

  • paraphrand@lemmy.world
    2 months ago

    It’s bullshitting. That’s the word. Bullshitting is saying things without a care for how true they are.

      • antifuchs@awful.systems
        2 months ago

        It’s kind of a distinction without much discriminatory power: LLMs are a tool created to ease the task of bullshitting, used by bullshitters to produce bullshit.