It is becoming nearly impossible to find relevant information through search engines. DuckDuckGo, SearXNG, Bing, Google, and many other mainstream engines have a very high noise-to-signal ratio, and it is getting worse.
Here is a collection of the best search engines I know of; please add more to the list.
- Forum Search Engine: https://crowdview.ai/
- Non-commercial Search: https://search.marginalia.nu/
- Libre Meta Search Engine: https://librey.devol.it/
- Golden Age Search Engine: https://www.wiby.org/
- Yandex: https://yandex.com/
If no more high-quality search engines exist, would it be possible to host your own?
EDIT: Some new discoveries. The uBlacklist addon, combined with filter rules, can block SEO-spam sites from appearing in search results (example rules below).
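For anyone who hasn't tried it: uBlacklist takes one rule per line, and simple match patterns are enough to hide a whole domain from results. A minimal personal list might look like this (the domains here are just examples, pick whatever SEO spam annoys you):

```
*://*.pinterest.com/*
*://*.w3schools.com/*
*://*.quora.com/*
```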
Literally the first paragraph on their website
The Patriot Act and Snowden's leaks have shown that companies will go against their privacy policies to appease governments. Search engines in particular are targeted by the Five Eyes through the PRISM program, where copies of all your data, linked to your payment details, are sent to the Five Eyes and stored. Gag orders and legal threats prevent disclosure, as has happened with tech companies that have tried to push back against this.
Be wary of trusting corporations with your data, as monetization is a powerful incentive.
Lemmy is different though right?… right?
With Lemmy you are trusting whatever instance your account is on, and really any federated instance, since they could choose to hold onto your posts and comments.
I don't know if I believe that. It's a paid service, so the only way to enforce that unpaid users cannot search is to take each search request and check whether it is coming from a paying account. Same with basic things like rate limiting requests. You literally need to associate your requests with an account to make basic functionality like this work.
If they do this but just don’t log it, then that means there is no way for their devs to ever debug issues users have or to monitor their services. I’m highly skeptical.
Also, “trust us” is something I’ve heard too many times.
That’s not the same as logging.
They just need to check the user's session on the fly during the search operation. Once the search is done, they don't need to persist any record linking the search to the user.
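As a rough illustration of that distinction (this is not Kagi's actual code, just a Python sketch with hypothetical names like `session_store` and `rate_limiter`): the handler can verify the session token and apply per-account limits at request time without ever writing a (user, query) pair anywhere.

```python
# Hypothetical sketch: authenticated, rate-limited search without query logging.
# Not Kagi's real code; it only shows that checking a session does not
# require persisting the query together with the user's identity.

def handle_search(request, session_store, rate_limiter, index):
    # 1. Authenticate: resolve the session token to an account.
    user_id = session_store.resolve(request.cookies.get("session_token"))
    if user_id is None:
        return {"status": 401, "body": "please log in"}

    # 2. Rate-limit per account: only a request counter is kept, not the query text.
    if not rate_limiter.allow(user_id):
        return {"status": 429, "body": "slow down"}

    # 3. Run the search. The query is used transiently and is never written
    #    to a log line alongside user_id.
    results = index.search(request.params["q"])
    return {"status": 200, "body": results}
```

Of course, whether the query really is dropped after step 3 is exactly the part you have to take on trust.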
Thanks, edited the comment!
And am I supposed to believe such a bold claim? The only reason they give is “trust me, bro. I pinky promise I’m not logging anything”.
You have one account, and every search query you make is associated with that account. And even if they aren't selling that ultra-sensitive data, I'm sure they are keeping logs to prevent abuse and fix bugs, logs which could be exposed when a third party gains access to their servers (malicious actors, law enforcement, etc.).
And that's assuming Kagi is not mining and/or selling any data themselves, which is a bold assumption given how little we know about their proprietary product. If they at least published the source code, that would be something, but no. I'm supposed to trust a proprietary black box which could potentially be linking every search query back to me.
I don't trust Kagi and never plan on getting it, but in their defense, “trust me bro” is a large part of most privacy services. I use Mullvad VPN and think they have a great reputation and have proved themselves. I have not, however, personally checked their servers to verify what's running, so I am trusting them. Even when running open source software, I know none of us here have actually looked into every line of code of our browsers or our phones to see what's running. It's simply unfeasible, so trust and reputation are still required at the end of the day.
That's absolutely true. The problem is that, to make use of a VPN service, you need an account or some other identifier.
But that's not true for search engines. If I wanted to, I could make completely anonymous searches using SearXNG or DDG from different IPs, and they would have no way to correlate the search queries.
That's not true with Kagi, and it's a completely unnecessary privacy risk you're taking when using it.