• 3 Posts
  • 7 Comments
Joined 1 year ago
Cake day: March 19th, 2024

  • AmbiguousProps@lemmy.today to Lemmy Shitpost@lemmy.world · *Permanently Deleted*
    English · 14 up / 1 down · edited 1 day ago

    It won’t be an improvement, just another way for people to fall in line instead of thinking for themselves. LLMs don’t know anything: they can (and do) confidently tell users something completely incorrect, and only correct themselves after a user points it out. What about the users who don’t push back and just trust whatever the slop machine tells them?