Just wait until Google implements Web Environment Integrity.
We should already be in the streets and we’re not.
While I can see the plus side of being able to identify bots, I don’t think WEI is the right way to do it, and Google definitely isn’t the right company to be handling it.
Plus, how do you spot the difference between a good bot and a bad bot? Web crawlers from search engines, for example, are inherently good, so they should still be able to operate. But if it’s easy to register a good bot in WEI, it’s just as easy to register a bad bot. And if it’s hard to register a good bot, then you’re effectively gatekeeping the automated part of the internet (which might actually be Google’s intention).
I was thinking the same thing about Google wanting their bots to be the only ones allowed to crawl and index the internet.
A bot that only reads your website is good; one that posts things or otherwise changes your database, less so.
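That read-vs-write distinction is something a site can already enforce without WEI. Here’s a minimal sketch of the idea; the bot user-agent list and the `classify_request` helper are made up for illustration, not any real API:

```python
# Hypothetical sketch: allow read-only requests from known crawlers,
# but block them from requests that mutate state.
KNOWN_BOT_AGENTS = ("Googlebot", "bingbot", "DuckDuckBot")

def classify_request(method: str, user_agent: str) -> str:
    """Return 'allow' or 'block' for an incoming request."""
    is_bot = any(name.lower() in user_agent.lower() for name in KNOWN_BOT_AGENTS)
    if not is_bot:
        return "allow"              # regular visitor
    if method in ("GET", "HEAD"):
        return "allow"              # read-only crawling is fine
    return "block"                  # bots shouldn't POST or otherwise write

print(classify_request("GET", "Mozilla/5.0 (compatible; Googlebot/2.1)"))   # allow
print(classify_request("POST", "Mozilla/5.0 (compatible; Googlebot/2.1)"))  # block
```

Of course, user-agent strings are trivially spoofed, which is exactly the gap WEI claims to close. The question is whether closing it is worth handing Google the keys.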
I just wish everyone would switch to Firefox.
That’s because Chrome has a monopoly, or close enough to one.