Meta “programmed it to simply not answer questions,” but it answered anyway.

  • CileTheSane@lemmy.ca · 3 months ago

    “Hallucination” is also wildly misleading as a term. The AI does not believe something that isn’t real; it simply guessed words that turned out to be wrong.
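
    To make that concrete, here’s a minimal toy sketch (not any real model’s code; the vocabulary and logits are made up for illustration). An autoregressive LM only scores how plausible each next word is given the context; sampling from that distribution can produce a fluent but false continuation, with no “belief” involved anywhere:

    ```python
    import numpy as np

    # Hypothetical vocabulary and logits for the context
    # "The capital of France is" -- invented numbers, not a real model.
    vocab = ["Paris", "Lyon", "Sydney", "pizza"]
    logits = np.array([4.0, 2.5, 1.0, -3.0])

    def sample_next_word(logits, temperature=1.0, rng=None):
        """Sample the next token index from a softmax over logits."""
        rng = rng or np.random.default_rng()
        probs = np.exp(logits / temperature)
        probs /= probs.sum()
        return rng.choice(len(logits), p=probs), probs

    idx, probs = sample_next_word(logits)
    for word, p in zip(vocab, probs):
        print(f"{word}: {p:.3f}")
    print("sampled:", vocab[idx])
    # "Lyon" gets sampled some fraction of the time: a plausible-sounding
    # but wrong guess, i.e. a "hallucination" -- the model never believed
    # anything; it just picked a high-probability word.
    ```

    The point of the toy example: the wrong answer and the right answer come out of the exact same mechanism, so “hallucination” isn’t a malfunction or a false belief, just an unlucky guess.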