• taladar@sh.itjust.works
    13 hours ago

    I've experimented with quite a few local LLMs too, and granted, some perform a lot better than others, but they all share the same major issues. They don't get smarter; they just produce the same nonsense faster (or rather, it often feels like they're just more verbose about the same nonsense).