• CriticalMiss@lemmy.world · +73/−1 · 5 months ago

    Nothing says “We’re confident in the software we’re selling” like being willing to work for exposure in the hope that somebody shells out $20 for a subscription.

    • 555@lemmy.world · +40/−4 · 5 months ago

      Exposure = They get to keep the data they get.

      Data = money

      They’ve found a way to make it work I’m sure.

          • nave@lemmy.ca · +17/−3 · edited · 5 months ago

            I mean, Apple’s the one saying it. I doubt OpenAI wants to piss them off.

            • steal_your_face@lemmy.ml · +2/−1 · 5 months ago

              Yeah, there’s definitely a contract, but OpenAI could decide it’s more profitable to break the contract and pay for lawyers and a settlement. Probably unlikely, to be fair.

              • nave@lemmy.ca · +1 · 5 months ago

                They probably have a deal similar to DuckDuckGo:

                As noted above, we call model providers on your behalf so your personal information (for example, IP address) is not exposed to them. In addition, we have agreements in place with all model providers that further limit how they can use data from these anonymous requests that includes not using Prompts and Outputs to develop or improve their models as well as deleting all information received once it is no longer necessary to provide Outputs (at most within 30 days with limited exceptions for safety and legal compliance).
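
                The quoted policy describes an anonymizing-proxy arrangement: the intermediary calls the model provider on the user’s behalf, so the prompt gets through but identifying metadata never does. A minimal sketch of that idea (field names here are hypothetical, not DuckDuckGo’s or OpenAI’s actual API):

```python
# Hypothetical sketch of the anonymizing-proxy pattern described in the
# quoted policy: the proxy forwards only the prompt to the model provider,
# stripping anything that could identify the end user. Field names are
# illustrative, not any real provider's API.

IDENTIFYING_FIELDS = {"ip_address", "user_agent", "account_id", "cookies"}

def build_provider_request(user_request: dict) -> dict:
    """Return the request the proxy would send to the model provider:
    the prompt survives, identifying metadata does not."""
    return {k: v for k, v in user_request.items() if k not in IDENTIFYING_FIELDS}

request = {
    "prompt": "What is the capital of France?",
    "ip_address": "203.0.113.7",
    "user_agent": "Mozilla/5.0",
    "account_id": "user-42",
}

forwarded = build_provider_request(request)
print(forwarded)  # only the prompt field reaches the provider
```

                The retention side of the agreement (deleting Outputs within 30 days) would live on the provider’s end and can’t be enforced by the proxy itself, which is why it has to be covered by contract.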

      • bionicjoey@lemmy.ca · +2/−2 · 5 months ago

        This seems so obvious, I’m amazed Apple isn’t charging them for the “exposure”.

        • Telodzrum@lemmy.world · +2/−2 · 5 months ago

          Apple needs them just as badly as OpenAI needs every iPhone user. They’re horribly behind in this “AI” bubble.

    • whodatdair@lemmy.blahaj.zone · +22/−1 · 5 months ago

      With the sheer amount of money that the rich are throwing at OpenAI via investment firms, they don’t need or want to charge, imo. Being built into Apple’s ecosystem and getting name-dropped to people inside iOS is exactly what their investors want.

      It’s the age-old “Walmart opens and operates at a loss for two years to force others out of business, then jacks up the price” model.

      Investors want them to cement this as The AI company & brand so that once it gets giant and starts to be profitable just by being the biggest gorilla in the room, the shares they bought are worth more.

      So what I’m trying to say is that our version of capitalism is perfect and makes lots of sense and is in no way insane and degenerate.

      • Clent@lemmy.world · +10 · 5 months ago

        That might be their internal reasoning, but Apple will very quickly move to bring these capabilities in-house. Apple has been working on machine learning for a while, but they don’t collect data, so they are unable to build these LLMs.

        For now, it makes sense for Apple to leave OpenAI with the liability of basing these LLMs on copyrighted data. If OpenAI loses those court battles, they take the hit for services rendered to Apple. None of that liability transfers to Apple.

        Meanwhile, Apple is going about this the Apple way, encouraging developers to integrate their apps into the new frameworks being added. This gives them access to user data directly from the source, allowing them to build personalized models.

        These models will likely be far more useful for the day-to-day mundanity of life than the hallucinogenic encyclopedia that is ChatGPT.