The only reason I can think of is more on-device AI. LLMs like ChatGPT are extremely greedy when it comes to RAM. There are optimizations (quantization, for example) that squeeze them into a smaller memory footprint at the expense of accuracy/capability. Even some of the best phones out there today can barely run a stripped-down generative AI model, and when they do, the output is nowhere near as good as the full-precision version running on a server.
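Back-of-the-envelope, the RAM problem is easy to see: weight memory scales with parameter count times bits per weight. A rough sketch (illustrative numbers only; real runtimes also need memory for the KV cache, activations, and the runtime itself):

```python
# Rough estimate of the memory needed just to hold an LLM's weights.
# Real-world usage is higher (KV cache, activations, runtime overhead).

def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """GB (10^9 bytes) required to store the weights alone."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"7B model at {bits}-bit: {weight_memory_gb(7, bits):.1f} GB")
# 16-bit: 14.0 GB, 8-bit: 7.0 GB, 4-bit: 3.5 GB
```

So even a "small" 7B model at 16-bit precision blows past most phones' total RAM, which is why on-device models get quantized down, and why 24 GB starts to look less absurd.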
For the user? Not at all. For the companies that want their spying/tracking apps running and harvesting your precious data 24/7? Yes. This way dozens of apps can keep tracking you even if you open a hundred more afterwards and forget about them; they can live forever deep down in those 24 GB.
Just curious, is 24GB of RAM in a smartphone useful for anything?
It will let you run more advanced local AI. I’m looking forward to running private LLMs.
Doubt it; I don’t use that much even on my gaming PC.
Mostly caching, I guess, so fewer cold starts of apps.
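The caching point can be illustrated with a toy model: the OS keeps recently used apps resident in RAM (roughly LRU) and evicts the oldest when memory runs out, so more RAM means fewer relaunches from scratch. A minimal sketch, with made-up app names and capacities standing in for actual memory sizes:

```python
# Toy model of why more RAM means fewer cold starts: the OS caches
# recently used apps and evicts the least recently used when full.
from collections import OrderedDict

class AppCache:
    def __init__(self, capacity: int):
        self.capacity = capacity      # how many apps fit in RAM at once
        self.cache = OrderedDict()    # app name -> cached state
        self.cold_starts = 0

    def open(self, app: str):
        if app in self.cache:
            self.cache.move_to_end(app)   # warm start: app still resident
        else:
            self.cold_starts += 1         # cold start: relaunch from scratch
            self.cache[app] = True
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)  # evict least recently used

usage = ["mail", "maps", "chat", "mail", "camera", "chat", "mail"]

small = AppCache(capacity=2)   # phone with little RAM
big = AppCache(capacity=8)     # phone with lots of RAM
for app in usage:
    small.open(app)
    big.open(app)
print(small.cold_starts, big.cold_starts)  # prints "7 4"
```

Same usage pattern, but the low-RAM device relaunches apps it just evicted, while the high-RAM device only cold-starts each app once.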
It will let future developers write even less optimized apps without worrying about how resources are used.