• kat_angstrom@lemmy.world
    3 months ago

    It lacks cohesion the longer it goes on; it's not so much "hallucinating" as losing the thread, losing the plot. Internal consistency goes out the window, previously made declarations are ignored, and established canon gets trampled.

    But that's cuz it's not AI, it's just an LLM all the way down.

      • kat_angstrom@lemmy.world
        3 months ago

        It depends on the complexity and the number of elements to keep track of, and it varies between models and people. Try it out for yourself and see! :)

      • CheeseNoodle@lemmy.world
        3 months ago

        It's kind of an exponential falloff: for a few lines it can follow concrete mathematical rules, for a few paragraphs it can remember basic story beats, and for a few pages it can just about remember your name.