I use Firefox on the desktop and Firefox Mobile on Android, Chromium with Bromite patches on Android, and occasionally Brave on the desktop to get to sites that only work properly in Chromium (which happens more and more often - but that’s a whole separate can of worms). I always make a point of disabling google.com and gstatic.com in NoScript and uBlock Origin whenever possible.

I noticed something quite striking: when I hit sites that use those hateful captchas from Google - aka “reCAPTCHA”, which I know are Google’s because they force me to temporarily re-enable google.com and gstatic.com - Google quite consistently marks the captcha as passed with the green checkmark without asking me to identify fire hydrants or bicycles even once. Or it asks once, but the test passes even if I purposely don’t select certain images. And it almost never serves me those especially heinous “rolling” captchas that keep presenting more and more images to identify as you click, until they have apparently annoyed you enough and let you through.

When I use Firefox, however, the captchas never pass without at least one test, sometimes several in a row, and very often they are rolling captchas. And if I purposely don’t select certain images for the sake of experimentation, the captchas keep coming and coming - and if I keep it up long enough, they simply never stop and the site becomes impossible to access.

Only with Firefox. Never with Chromium-based browsers.

I’ve been experimenting with this informally for months now and it’s quite clear to me that Google has a dark pattern in place with its reCAPTCHA system to make Chrome and Chromium-based browsers the path of least resistance.

It’s really disgusting…

  • JonEFive@midwest.social

    Keep in mind that basic bots don’t render or process certain page elements - like JavaScript. So a VPN, plus NoScript/uBlock, plus obscured data, plus no preexisting cookies, and possibly a unique fingerprint from all your previous interactions (depending on your privacy settings)… it all adds up to possible bot behavior. In my mind, getting captcha’d is a good thing. It may mean Google has low confidence that it knows who I am.
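
    To illustrate the JavaScript part: a basic bot built on a plain HTTP library downloads the HTML but never executes the scripts in it, so client-side challenge and fingerprinting code simply never runs. Here’s a rough Python sketch (the URL is just a placeholder):

    ```python
    import requests

    # A basic bot: one HTTP request, no JavaScript engine, no cookie jar carried over.
    resp = requests.get("https://example.com/login", timeout=10)

    # The reCAPTCHA <script> tag may well be sitting in the downloaded HTML...
    print("recaptcha" in resp.text.lower())

    # ...but nothing ever executes it, so no g-recaptcha-response token is produced
    # and no behavioural or fingerprint signals are sent back - which by itself
    # looks very bot-like to the site.
    ```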

      • ExtremeDullard@lemmy.sdf.org (OP)

      In my mind, getting captcha’d is a good thing. It may mean Google has low confidence that it knows who I am.

      That is possibly the most unique outlook I’ve read about today.

      There’s nothing good about captchas: they’re an insult to human intelligence, they’re forced unpaid labor, and each time I get one I want to murder someone.

      In a normal world, your statement would be utterly insane. But in our dystopian surveillance economy society, it’s actually a rational and interesting point of view, and one that turns captchas into a useful indicator of how well you manage to evade said corporate surveillance.

      Interesting. Thank you for that.

      However, if you’re right and Google serves fewer captchas to those it can track better - and not just to those who run Chromium, as I suspect - it also means privacy-enhanced Chromium-based browsers don’t hold a candle to Firefox. That’s not great news, considering Chromium is the new de facto standard and some websites only work properly in Chromium.

      • droans@lemmy.world

        You’ve never operated a public-facing website, have you?

        In the past 24 hours alone, I’ve had at least 344 bot attempts on my personal site. A handful are harmless crawlers but most are hoping to hit a vulnerability.

        Captchas are necessary to prevent malicious bot activity. It’s unfortunate that this also makes them a pain for legitimate users.
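
        For what it’s worth, the server side of a captcha check is roughly a single call to Google’s siteverify endpoint. A rough Python sketch (the secret key and token values are placeholders):

        ```python
        import requests

        RECAPTCHA_SECRET = "your-secret-key"  # placeholder for the site owner's secret key
        VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

        def captcha_passed(token: str, remote_ip: str | None = None) -> bool:
            """Ask Google whether the g-recaptcha-response token from a form is valid."""
            payload = {"secret": RECAPTCHA_SECRET, "response": token}
            if remote_ip:
                payload["remoteip"] = remote_ip
            result = requests.post(VERIFY_URL, data=payload, timeout=10).json()
            # reCAPTCHA v2 returns "success"; v3 additionally returns a 0.0-1.0 "score".
            return result.get("success", False)
        ```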