• 0 Posts
  • 500 Comments
Joined 2 years ago
Cake day: July 14th, 2023

  • kibiz0r@midwest.social to Technology@lemmy.world: The Copilot Delusion (23 days ago)

    So if library users stop communicating with each other and with the library authors, how are library authors gonna know what to do next? Unless you want them to talk to AIs instead of people, too.

    At some point, when we’ve disconnected every human from each other, will we wonder why? Or will we be content with the answer “efficiency”?






  • First thought: Damn, that’s crazy they went ahead and did it to get the footage even though they knew how bad it was.

    Then: Well, I guess the fishermen were gonna do it no matter what, huh?

    Wait, aren’t the fishermen worried that this footage could ruin their livelihood?

    Wait… Maybe they believe that the legality of this practice already has ruined their livelihood, and they want it to stop but can’t compete unless regulation forces everyone to stop…




  • I don’t believe the common refrain that AI is only a problem because of capitalism. People already disinform, make mistakes, take irresponsible shortcuts, and spam even when there is no monetary incentive to do so.

    I also don’t believe that AI is “just a tool”, fundamentally neutral and void of any political predisposition. This has been discussed at length academically. But it’s also something we know well in our idiom: “When you have a hammer, everything looks like a nail.” When you have AI, genuine communication looks like raw material. And the ability to place generated output alongside the original… looks like a goal.

    Culture — the ability to have a very long-term ongoing conversation that continues across many generations, about how we ought to live — is by far the defining feature of our species. It’s not only the source of our abilities, but also the source of our morality.

    Despite a very long series of authors warning us, we have allowed a pocket of our society to adopt the belief that ability is morality. “The fact that we can, means we should.”

    We’re witnessing the early stages of the information equivalent of Kessler Syndrome. It’s not that some bad actors who were always present will be using a new tool. It’s that any public conversation broad enough to be culturally significant will be so full of AI debris that it will be almost impossible for humans to find each other.

    The worst part is that this will be (or is) largely invisible. We won’t know that we’re wasting hours of our lives reading and replying to bots, tugging on a steering wheel, trying to guide humanity’s future, not realizing the autopilot is discarding our inputs. It’s not a dead internet that worries me, but an undead internet. A shambling corpse that moves in vain, unaware of its own demise.



    1. Fuck AI
    2. This judge’s point is absolutely true:

    “You have companies using copyright-protected material to create a product that is capable of producing an infinite number of competing products,” Chhabria said. “You are dramatically changing, you might even say obliterating, the market for that person’s work, and you’re saying that you don’t even have to pay a license to that person.”

    3. AI apologists’ response to that will invariably be “but it’s sampling from millions of people at once, not just that one person”, which always sounds like the fractions-of-a-penny scene.
    4. Fuck copyright.
    5. A ruling against fair use for AI will almost certainly deal collateral damage to perfectly innocuous scraping projects like linguistic analysis, even despite the Copyright Office’s acknowledgement of the issue:

    To prevent both harms, the Copyright Office expects that some AI training will be deemed fair use, such as training viewed as transformative, because resulting models don’t compete with creative works. Those uses threaten no market harm but rather solve a societal need, such as language models translating texts, moderating content, or correcting grammar. Or in the case of audio models, technology that helps producers clean up unwanted distortion might be fair use, where models that generate songs in the style of popular artists might not, the office opined.

    6. We really need to regulate AI — right now — but doing it through copyright might be worse than not doing it at all.