• Gamma@beehaw.org · +60 · 2 months ago

    Vizor explained that Ricochet uses a list of hardcoded strings of text to detect cheaters and that they then exploited this to ban innocent players by simply sending one of these strings via an in-game whisper. To test the exploit the day they found it, they sent an in-game message containing one of these strings to themselves and promptly got banned.

    Vizor elaborates, “I realized that Ricochet anti-cheat was likely scanning players’ devices for strings to determine who was a cheater or not. This is fairly normal to do but scanning this much memory space with just an ASCII string and banning off of that is extremely prone to false positives.”
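    A minimal sketch of what such a signature scan could look like (purely illustrative; the strings and function are hypothetical, not Ricochet’s actual code). Note that any buffer containing a listed string trips the check, including a whisper another player just sent you:

    ```python
    # Hypothetical illustration of a naive signature scan over memory buffers.
    # The signature list here is made up for the example.
    BANNED_SIGNATURES = [b"example cheat string", b"aimbot menu v2"]

    def naive_signature_scan(memory_regions: list[bytes]) -> bool:
        """Return True (i.e. 'ban') if any signature appears anywhere in memory."""
        return any(
            sig in region
            for region in memory_regions
            for sig in BANNED_SIGNATURES
        )

    # A received whisper lives in memory like anything else, so a player who
    # merely *receives* one of these strings gets flagged.
    chat_buffer = "whisper from attacker: aimbot menu v2".encode()
    print(naive_signature_scan([chat_buffer]))  # True -> false-positive ban
    ```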

    This is insane. They had an automated script that connected to games and banned random people on a loop, so it kept going even while they were away.

    • renegadespork@lemmy.blahaj.zone · +24 / −1 · 2 months ago

      a list of hardcoded strings

      Violating a core programming tenet right off the bat. I wonder how much money Activision paid for this software…

      • ramjambamalam@lemmy.ca · +10 · 2 months ago

        Neither we nor the hacker have any idea whether this list is config-driven or truly “hard coded”, i.e. a constant in the source code. It’s hardly an indicator of violating a core programming tenet.
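        For what it’s worth, the distinction being drawn looks roughly like this (a hypothetical sketch; the file name and structure are made up):

        ```python
        import json

        # "Truly hard coded": the list is a constant baked into the source/binary.
        BANNED_STRINGS = ["example cheat string", "aimbot menu v2"]

        # "Config driven": the same list shipped as data and loaded at runtime,
        # so it can be updated without recompiling or patching the client.
        def load_banned_strings(path: str = "signatures.json") -> list[str]:
            with open(path) as f:
                return json.load(f)
        ```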

  • I Cast Fist@programming.dev · +53 · 2 months ago

    So, simply receiving “aim bot” as a whisper (private) message was enough to get permabanned. FUCKING JEE-NIUS ANTICHEAT, GREAT JOB, GUYS!!!

      • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · +18 / −1 · 2 months ago

        Heh. GTA V used to be fun even when a cheater showed up, because you could just use a rocket launcher on them to keep them ragdolled forever so they couldn’t use their cheat menu (or any menu). They’d have to Alt-F4 to quit, since being ragdolled closes any open menus.

    • Evotech@lemmy.world · +3 · 2 months ago

      It requires the server to verify all inputs. It’s doable; Path of Exile does that, as do most ARPGs. But it leaves them very open to lag and desync issues, and most games will prioritize a smooth experience.
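      A toy example of the server-authoritative validation being described (the numbers and names are made up; real netcode also needs latency compensation, which is where the lag and desync tradeoff comes from):

      ```python
      import math
      from dataclasses import dataclass

      MAX_SPEED = 7.0  # hypothetical maximum movement speed, units per second

      @dataclass
      class PlayerState:
          x: float
          y: float

      def validate_move(state: PlayerState, new_x: float, new_y: float, dt: float) -> bool:
          """The server rejects any client-reported move faster than the game allows."""
          dist = math.hypot(new_x - state.x, new_y - state.y)
          return dist <= MAX_SPEED * dt * 1.05  # small tolerance for jitter

      # The client *requests* a move; the server decides whether it happened.
      player = PlayerState(0.0, 0.0)
      print(validate_move(player, 0.5, 0.0, dt=0.1))   # True  -> accepted
      print(validate_move(player, 50.0, 0.0, dt=0.1))  # False -> rejected (teleport/speed hack)
      ```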

      CoD is peer-to-peer, I think: clients host the game, which is very cheap for the company. But obviously you then need to give the client a lot more information.

    • TheDorkfromYork@lemm.ee · +3 · 2 months ago

      Yes, but not through standard methods. Even AI aimbots can be filtered, but the amount of R&D required is likely too much for a single studio to bear alone. I believe we’re more likely to see neural-network-trained bots largely replacing real players, built on an off-the-shelf model. Just a guess; I’m not an expert.

      • Doomsider@lemmy.world · +7 · 2 months ago

        There is already a solution using relatively simple analytics to build a profile of each player. It becomes very easy to find cheaters because it is easy to analyze how fast, and in what directions, they aim. It is obvious when someone is using macros, for instance, or an aimbot.
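        A rough sketch of that kind of profiling (the thresholds are invented; a real system would compare against a learned baseline for each player rather than fixed numbers):

        ```python
        import statistics

        def flag_suspicious_aim(snap_angles_deg, snap_times_ms):
            """Flag aim that is implausibly fast *and* implausibly consistent,
            e.g. big flicks that always complete in the same few milliseconds."""
            avg_speed = statistics.mean(a / t for a, t in zip(snap_angles_deg, snap_times_ms))
            time_jitter = statistics.pstdev(snap_times_ms)
            # Hypothetical thresholds: sustained >1 degree/ms with near-zero
            # timing variance is not how human wrists behave.
            return avg_speed > 1.0 and time_jitter < 2.0

        print(flag_suspicious_aim([30, 45, 20], [120, 180, 95]))  # False: human-ish
        print(flag_suspicious_aim([90, 85, 92], [40, 41, 40]))    # True: aimbot/macro-like
        ```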

        The problem is that this approach does not require intrusive programs that are essentially spyware for your OS. That intrusiveness is what attracts the big studios to those other solutions, not their effectiveness.

        There is a workable solution, but let’s be honest: cheaters are often whales who spend a lot of time and money on the game. Sending them away is bad for engagement.

        Big studios already recognize this. So, to be blunt, they allow a certain amount of cheating because they don’t really want to solve the problem.

        • TheDorkfromYork@lemm.ee · +3 · 2 months ago

          I was speaking to the long term, 5–10 years in the future. Analytics is a current solution and, as far as I know, it works well. I was just talking vaguely about long-term problems and solutions.

          • CleoTheWizard@lemmy.world · +4 · 2 months ago

            I think the best thing I’ve heard for long-term solutions is to fix a lot of the cheating with server-side solutions. In a game like CoD, that means the server doesn’t send you player positions unless you absolutely need to know them.
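            That idea is usually called interest management or visibility culling; a minimal distance-only sketch (real implementations also check line of sight and occlusion):

            ```python
            import math

            def positions_to_send(player, others, radius=50.0):
                """Only replicate opponents the player could plausibly see or hear;
                a wallhack on the client then has nothing extra to reveal."""
                px, py = player
                return [(ox, oy) for (ox, oy) in others
                        if math.hypot(ox - px, oy - py) <= radius]

            # The far opponent's position never leaves the server.
            print(positions_to_send((0, 0), [(10, 10), (300, 300)]))
            ```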

            The other thing, honestly, is just increasing the investment required to cheat. That could mean that, in order to play competitive game modes, you need to have signed in at least once for 4 weeks straight and played the game, or you need to be a certain level. Issue hardware bans and IP bans. Require phone number verification.
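            As a sketch, those gates boil down to a simple eligibility check before matchmaking admits an account to ranked play (the field names and thresholds here are hypothetical):

            ```python
            from dataclasses import dataclass

            @dataclass
            class Account:
                days_since_signup: int
                level: int
                phone_verified: bool
                hardware_banned: bool

            def eligible_for_ranked(acct: Account) -> bool:
                # Hypothetical thresholds: 4 weeks of account age, a minimum level,
                # phone verification, and no outstanding hardware ban.
                return (acct.days_since_signup >= 28
                        and acct.level >= 20
                        and acct.phone_verified
                        and not acct.hardware_banned)

            print(eligible_for_ranked(Account(40, 35, True, False)))  # True
            print(eligible_for_ranked(Account(3, 5, False, False)))   # False: throwaway account
            ```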

            What those barriers do is actually increase the potency of current detection methods. This should also carry over to accounts. I’m not sure why Steam’s VAC ban system isn’t more popular. As in, accounts should be flagged as a whole when cheating in just one game is found.

            There are many solutions, but as the prior person said, it’s just not a big deal for companies. Plenty could be done to at least make cheating harder and cost more time and money. But that won’t happen.

            • Jerkface (any/all)@lemmy.ca · +2 · 2 months ago

              I’m not sure why Steam’s VAC ban system isn’t more popular. As in, accounts should be flagged as a whole when cheating in just one game is found.

              Presumably because this opens players to significantly damaging abuse from server operators. Players aren’t the only ones who fuck around.

              • CleoTheWizard@lemmy.world · +1 · 2 months ago

                I don’t mean individual servers. What I meant was: let’s say a game uses a standardized anti-cheat like EasyAntiCheat, BattlEye, or similar. Whoever runs your game service (Steam, PSN, Xbox) can vet these anti-cheat programs and allow them to create a record of cheating on your account.

                And obviously these things get false flags, so you can account for that: give people strikes and allow appeals. Games would then have the option of banning you for having too many strikes total or for tripping a specific anti-cheat X times, or of ignoring this system except to place extra suspicion and resources on those who already have strikes.
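                A rough data-model sketch of that kind of account-level strike record (names and thresholds are invented for illustration):

                ```python
                from collections import defaultdict

                class StrikeRecord:
                    """Anti-cheat flags per account, across games, with an appeal path."""

                    def __init__(self, total_limit=3, per_anticheat_limit=2):
                        self.strikes = defaultdict(list)  # anti-cheat name -> games it flagged
                        self.total_limit = total_limit
                        self.per_anticheat_limit = per_anticheat_limit

                    def add_strike(self, anticheat, game):
                        self.strikes[anticheat].append(game)

                    def appeal_granted(self, anticheat, game):
                        self.strikes[anticheat].remove(game)  # a successful appeal removes the flag

                    def should_ban(self):
                        total = sum(len(games) for games in self.strikes.values())
                        worst = max((len(games) for games in self.strikes.values()), default=0)
                        return total >= self.total_limit or worst >= self.per_anticheat_limit

                record = StrikeRecord()
                record.add_strike("ExampleAC", "Game A")
                record.add_strike("ExampleAC", "Game B")
                print(record.should_ban())  # True: repeat flags from the same anti-cheat
                ```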

                Also, tying an account to hardware is a no-brainer, and I’m surprised that this doesn’t get employed more often. I know IDs can be spoofed, but it’s potentially another barrier.
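                A toy hardware fingerprint, mostly to show why it is only a barrier: everything it hashes can be spoofed or may not even be stable (the identifiers chosen here are illustrative):

                ```python
                import hashlib
                import platform
                import uuid

                def hardware_fingerprint() -> str:
                    """Hash a few machine identifiers into one token an account can be tied to.
                    uuid.getnode() may fall back to a random value, and all of these inputs
                    can be spoofed, which is exactly the weakness mentioned above."""
                    raw = f"{uuid.getnode()}|{platform.machine()}|{platform.system()}"
                    return hashlib.sha256(raw.encode()).hexdigest()

                print(hardware_fingerprint())
                ```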

              • CleoTheWizard@lemmy.world · +1 · 2 months ago

                They use a hybrid system now, falling back to peer-to-peer only when dedicated servers aren’t enough, so they could just swap to purely dedicated servers.

                However, ignoring that, even a peer-to-peer system can pull similar tricks if you don’t isolate hosting to just one machine. That can even be done by spot-checking with a company-owned server: you use the server as a verification peer and keep it as a backup host to the assigned peer. If your verification peer gets different RAM values or whatnot, you at the very least shut the session down and place that peer on a suspicion list.

                But even if they went the cheap route, they could just distribute the peer network. Say you have a game of 12 people: you could make it so that each peer is only assigned a certain part of the simulation and a subset of the players (with overlap between assignments) and cannot track the entire simulation. It’s more complicated than a single server hiding info from you, but it would at least mean you’d need multiple infected peers to take over a lobby.
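                A sketch of that sharded-peer idea: each peer simulates an overlapping slice of the players, and disagreements between peers that share a player get flagged for a verification peer to inspect (all of the structure here is hypothetical):

                ```python
                from itertools import combinations

                def assign_shards(players, hosts, overlap=2):
                    """Give each host a slice of the simulation, with every player covered
                    by `overlap` different hosts so their results can be cross-checked."""
                    assignments = {h: [] for h in hosts}
                    for i, player in enumerate(players):
                        for k in range(overlap):
                            assignments[hosts[(i + k) % len(hosts)]].append(player)
                    return assignments

                def cross_check(reports):
                    """Flag hosts whose reported state for a shared player disagrees."""
                    suspicious = set()
                    for h1, h2 in combinations(reports, 2):
                        for player in reports[h1].keys() & reports[h2].keys():
                            if reports[h1][player] != reports[h2][player]:
                                suspicious.update({h1, h2})
                    return suspicious

                print(assign_shards(["p1", "p2", "p3", "p4"], ["hostA", "hostB", "hostC"]))
                # If hostA and hostB disagree about p1's position, both get flagged for a
                # closer look (e.g. by a publisher-run verification peer).
                print(cross_check({"hostA": {"p1": (0, 0)}, "hostB": {"p1": (5, 5)}}))
                ```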

          • Doomsider@lemmy.world · +2 · 2 months ago

            I think you were spot on about training a neural network with player data. It is already happening without a doubt.

  • Defaced@lemmy.world · +4 / −2 · 2 months ago

    But you know, according to EA, Linux is worse than guys like this deliberately causing disruptions in service for legit players.