I think Microsoft’s planned Recall feature, where they screenshot everything you do so it can be analysed by AI, isn’t as bad as everyone makes it sound. It’s only bad because Windows is closed source and nobody can verify whether what they say is true.
But if Microsoft aren’t lying and none of the data ever leaves your PC (which is supported by the fact that you need a pretty beefy machine to use it), then it’s one of the more privacy-friendly things they’ve done recently. And I think they were fully aware that they could only sell “thing that records everything you do” if they could convince people that it doesn’t share that data. Guess they failed.
If it were open source I might even think about using it myself, if the hardware (and consequently power) requirements weren’t so absurdly high.
The big thing here is consent. If you run it yourself, i.e. opt into it, then it’s consensual.
Microsoft has demonstrated over a long period of time that they are happy to force “optional” anti-consumer things onto people through:
- Bad defaults
- Silent updates changing settings
- Nag screens
- More nag screens that pop up randomly hoping you misclick
- Deceitful UI (“Yes!” or “Ask me later!”, never “No”)
Oh! You have misunderstood the whole concept of privacy. I have a thought experiment for you:
Let’s assume Microsoft is not lying 🤥. The data (the screenshot) stays on-device and is passed to some AI model, e.g. image-to-text. That model generates text on-device. But nowhere does Microsoft guarantee that the generated text, or any other output from those AI models, won’t be sent to Microsoft. They only say the screenshots and AI models stay on-device; the output/metadata can still be sent to Microsoft.
That is the issue. Earlier there were many apps Microsoft couldn’t pry into because they were encrypted etc. Recall reads whatever is rendered on screen after decryption, so now they don’t need to break any encryption; they just need the metadata, and that’s easy to transfer and use.
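To make that concrete, here’s a minimal Python sketch of the data flow in the thought experiment. Every name in it is hypothetical (`capture_screenshot`, `run_local_image_to_text`, and `send_telemetry` are illustrative stand-ins, not anything Microsoft actually ships); the point is only that “screenshots and models stay on-device” is fully compatible with the derived text leaving the device.

```python
import json

def capture_screenshot() -> bytes:
    # Stand-in for real screen capture. In the thought experiment this
    # raw image genuinely stays on the device.
    return b"<raw image bytes>"

def run_local_image_to_text(image: bytes) -> str:
    # Stand-in for the on-device image-to-text model. The screenshot is
    # consumed here and never leaves the machine, so that half of the
    # privacy claim can be perfectly true.
    return "user typed their password into bank.example.com"

def send_telemetry(payload: str) -> None:
    # Stand-in for an ordinary telemetry upload. Printing instead of
    # POSTing keeps the sketch runnable; the point is that only text,
    # not pixels, would go over the wire.
    print("would upload:", payload)

def recall_tick() -> None:
    screenshot = capture_screenshot()                     # on-device
    extracted_text = run_local_image_to_text(screenshot)  # on-device

    # The gap in the guarantee: nothing in "screenshots and AI models
    # remain on-device" stops the *output* from being phoned home.
    send_telemetry(json.dumps({"event": "recall_snapshot",
                               "text": extracted_text}))

recall_tick()
```

Run it and the only thing “uploaded” is the extracted text, which is exactly the metadata problem: the content of an encrypted chat, once rendered on screen, comes out the other end as plain text.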