- cross-posted to:
- technology@beehaw.org
Apple wants AI to run directly on its hardware instead of in the cloud::iPhone maker wants to catch up to its rivals when it comes to AI.
Remember, this probably isn’t an either or thing. Both Apple and Google have been offloading certain AI tasks to devices to speed up response time and process certain requests offline.
Yep, though Google is happy to process your data in the cloud constantly while Apple consistently tries to find ways to achieve it locally, which is generally better for privacy and security but also cheaper for them too.
Yeah, that’s why they look through your images for “cp”
Who looks at images?
Ok, I have to correct myself: they walked this back two weeks ago due to backlash, but I doubt they won’t do it eventually, either in a similar form or hidden away, like they did when they throttled older devices to “save battery” https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
No, Google has simply been blatantly lying about this to convince you to buy new phones. It’s very easy to prove, because as soon as you disable any network connections, these functions cease to work.
Just because certain requests don’t work offline doesn’t mean that Google isn’t running models locally for many other requests.
My Pixel isn’t new enough to run Nano. What are some examples of offline processing not working?
I wouldn’t be surprised if the handshake between Pro and Nano was intermingled for certain requests. Some stuff done in the cloud, and some stuff done locally for speed - but if the internet is off, they kill the processing of the request entirely because half of the required platform isn’t available.
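To illustrate the hybrid pattern described above, here’s a minimal Python sketch. All names are hypothetical (this is not Google’s actual pipeline): it assumes a request is split into a local stage and a cloud stage, and that the whole request is rejected when the cloud half is unreachable, which makes the pipeline look entirely cloud-dependent when offline.

```python
# Hypothetical sketch of a hybrid on-device/cloud request pipeline.
# Function names and stages are illustrative assumptions, not a real API.

def run_local_stage(text: str) -> str:
    # Stands in for an on-device model pass (e.g. a fast local draft).
    return text.lower()

def run_cloud_stage(text: str) -> str:
    # Stands in for the cloud half of the pipeline.
    return text + " [refined in cloud]"

def handle_request(text: str, online: bool) -> str:
    # The request is killed up front if the cloud half is unavailable,
    # even though the local stage alone could have produced a partial result.
    if not online:
        raise RuntimeError("request killed: pipeline needs both halves")
    draft = run_local_stage(text)
    return run_cloud_stage(draft)
```

Under this design, disabling the network makes every request fail, which is consistent with the observation above without proving that no local processing exists.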
Yeah? It does.
What a thought provoking reply.
I dunno what you expect me to say. It’s not complicated.
You’re really going to say that Google isn’t doing anything locally with Tensor? That’s just silly.
No that is not what I said.