Software engineer Vishnu Mohandas decided he would quit Google in more ways than one when he learned that the tech giant had briefly helped the US military develop AI to study drone footage. In 2020 he left his job working on Google Assistant and also stopped backing up all of his images to Google Photos. He feared that his content could be used to train AI systems, even if they weren’t specifically ones tied to the Pentagon project. “I don’t control any of the future outcomes that this will enable,” Mohandas thought. “So now, shouldn’t I be more responsible?”

The site (TheySeeYourPhotos) returns what Google Vision is able to discern from photos. You can test it with any image you want, or use one of the sample images provided.
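For context, here is a minimal sketch of what querying the Google Cloud Vision API directly looks like, which is presumably what the site wraps. The file name and the choice of label and face detection as the requested features are illustrative assumptions, not details confirmed by the site:

```python
# Hypothetical sketch: querying Google Cloud Vision directly.
# Requires the google-cloud-vision package and the
# GOOGLE_APPLICATION_CREDENTIALS environment variable to be set.
from google.cloud import vision

def describe_photo(path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    # Scene/object labels the model discerns, with confidence scores.
    for label in client.label_detection(image=image).label_annotations:
        print(f"{label.description}: {label.score:.2f}")

    # Face detection also reports inferred emotional likelihoods,
    # which matches the mood readings commenters describe below.
    for face in client.face_detection(image=image).face_annotations:
        print("joy:", face.joy_likelihood.name,
              "anger:", face.anger_likelihood.name)

describe_photo("example.jpg")  # hypothetical local file
```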

  • dan1101@lemm.ee · 23 days ago

    I tried various photos, including some of my personal photos with the metadata stripped, and was surprised how accurate it was.

    It seemed really oriented towards detecting people and their moods, the socioeconomic status of things, and objects and their perceived quality.

    • Hackworth@lemmy.world · 23 days ago

      It’s probably a vision model (like this) with custom instructions that direct it to focus on those factors. It’d be interesting to see the instructions (see the sketch after these comments).

    • aramis87@fedia.io · 23 days ago

      I gave it two pictures of my cat and it said that she looked annoyed in one picture and contemplative in the other, both of which were true.
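If Hackworth’s guess is right, the pattern would look something like the sketch below: a general vision model steered by a system prompt toward the factors commenters noticed. The model name, prompt wording, and OpenAI-style API are assumptions purely for illustration; the site’s actual model and instructions aren’t public.

```python
# Hypothetical sketch of the "vision model + custom instructions"
# pattern. Nothing here is confirmed to match the site's setup.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("example.jpg", "rb") as f:  # hypothetical local file
    b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "system",
            # Custom instructions steering a general model toward
            # people and moods, socioeconomic status, and object quality.
            "content": "Describe the people in the photo and their moods, "
                       "the apparent socioeconomic status of the setting, "
                       "and the objects present and their perceived quality.",
        },
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What do you see in this photo?"},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        },
    ],
)
print(response.choices[0].message.content)
```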