HELLDIVERS™ 2

Dusty64 Apr 1 @ 7:59pm
A New Era Of Anti Cheat
A simple idea: instead of using the current anti-cheat program, or any others that scan the RAM of a computer (which seem to fail miserably most of the time), why not implement something in the game so that, if anything is changed, it activates multiple layers of encrypted anti-cheater algorithms to ensure the people cheating are humiliated, confused, and beaten into submission?
Similar code was implemented in older games (some Nintendo 64 games, for example) that would activate unkillable monsters that would either one-shot you or just keep beating you until you die. With moderators on standby in case of reports, this can be very effective.
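Roughly, the trigger could be sketched like this (hypothetical code, not anything the game actually does): keep a checksum of constants that cheats typically rewrite, and when the live values stop matching, quietly flip a punish flag instead of issuing a ban.

```python
# Hypothetical sketch only: checksum the constants a cheat would typically edit,
# e.g. a weapon damage table, and flip a "punish" flag if the table changes.
import hashlib

def checksum(table):
    return hashlib.sha256(repr(sorted(table.items())).encode()).hexdigest()

class TamperWatch:
    def __init__(self, damage_table):
        self.expected = checksum(damage_table)
        self.punish = False

    def verify(self, damage_table):
        # These are constants; any change means something rewrote game memory.
        if checksum(damage_table) != self.expected:
            self.punish = True
        return self.punish

damage_table = {"liberator": 60, "autocannon": 260}   # made-up numbers
watch = TamperWatch(damage_table)

damage_table["liberator"] = 99999                     # a trainer edits the value
if watch.verify(damage_table):
    print("activate the unkillable patrol / scramble stratagem codes")
```

Of course, anything like this that runs on the cheater's own machine can itself be found and patched out, which is exactly what the replies below get into.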
Moderators on standby is your big answer? That takes resources and active people monitoring claims so you don't accidentally hit legit players, not to mention if it's in an open lobby. Also, with .dll files being easy to implement for modding, having admin monsters would just mean people could spawn and cheat with them as easily as they do now.

This is also probably somewhat on Sony, as they like to appease investors, so it's not all on the devs; the current solution is probably the cheapest method allowed. EAC is probably off the table too, since it's owned by Epic and they are in lawsuits against each other, rofl.
Originally posted by Dusty64:
A simple idea: instead of using the current anti-cheat program, or any others that scan the RAM of a computer (which seem to fail miserably most of the time), why not implement something in the game so that, if anything is changed, it activates multiple layers of encrypted anti-cheater algorithms to ensure the people cheating are humiliated, confused, and beaten into submission?
Similar code was implemented in older games (some Nintendo 64 games, for example) that would activate unkillable monsters that would either one-shot you or just keep beating you until you die. With moderators on standby in case of reports, this can be very effective.

N64? Hmm, had this issue a lot in Turok 2 with the cerebral drill, sometimes Hexen 64.
Good ol' times.
Originally posted by Dusty64:
A simple idea: instead of using the current anti-cheat program, or any others that scan the RAM of a computer (which seem to fail miserably most of the time), why not implement something in the game so that, if anything is changed, it activates multiple layers of encrypted anti-cheater algorithms to ensure the people cheating are humiliated, confused, and beaten into submission?
Similar code was implemented in older games (some Nintendo 64 games, for example) that would activate unkillable monsters that would either one-shot you or just keep beating you until you die. With moderators on standby in case of reports, this can be very effective.
As hilarious as it is, it's not very effective.

Your end response, I think, is on point. Detect > Ban gives out too much information. Softbans will usually go unnoticed a bit longer, but even those will eventually be figured out. This would also extend to unkillable mobs and the like. But... it could be leveraged to create unfavorable conditions in-game that are a bit more covert.

The larger issue is that once what's happening has been worked out, the next step is to locate the code and circumvent it - whether it's detection or response. In simple terms that means that nothing that happens client side can be trusted. If you have the client reporting on whether it's cheating, someone can create code that reports that it's not cheating even if it actually is.
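To make that concrete, here's a toy illustration (made-up function names, nothing from any real anti-cheat) of why a self-reporting client can't be trusted: whatever function answers "am I cheating?" runs on the cheater's machine, so they can simply swap it out.

```python
# Toy illustration with made-up names: the client's own integrity check can be
# replaced wholesale by the cheat, so the server only ever hears "all clean".
def integrity_report():
    # Imagine this genuinely scans loaded modules, hooks, memory regions, etc.
    return {"tampered": True}            # the scan really did find a cheat

def spoofed_report():
    return {"tampered": False}           # lie to the server unconditionally

# A cheat loader just rebinds (or hooks/patches) the reporter before it runs:
integrity_report = spoofed_report
print(integrity_report())                # {'tampered': False} - detection gone
```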

I like where your head is at though. In some POC anti-cheat code, I added a few things specifically to troll people trying to reverse engineer or debug it.

Now, if the checks are all running server-side... you may have a bit more luck. But you'd probably still want to avoid being too overt in your response. As soon as someone works out that their cheat has been detected, they're going to try to circumvent the detection.
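A minimal sketch of that server-side angle, with invented thresholds: the server sanity-checks what the client claims each tick and quietly accumulates evidence instead of reacting in a way the cheater can observe.

```python
# Minimal server-side sketch (thresholds are invented): validate what the
# client reports against physical limits, and log quietly rather than react.
MAX_SPEED = 12.0            # assumed hard cap on movement, units per second
MAX_FIRE_RATE = 20.0        # assumed max shots per second for any weapon

flagged = {}                 # player_id -> number of suspicious ticks

def validate_tick(player_id, speed, shots_per_second):
    suspicious = speed > MAX_SPEED or shots_per_second > MAX_FIRE_RATE
    if suspicious:
        # No kick, no message: just build up evidence for later action.
        flagged[player_id] = flagged.get(player_id, 0) + 1
    return suspicious

validate_tick("player_a", speed=45.0, shots_per_second=8.0)   # speed hacker
print(flagged)               # {'player_a': 1}
```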
Originally posted by Calv:
Originally posted by Dusty64:
why not implement something in the game that if anything is changed

This is easier said than done, and it's exactly why third-party anti-cheat exists in the first place.

Originally, all games did this internally, but then someone would come along with a way to bypass or defeat the "check if anything changed" function.
So the devs would have to add another check, and then that would get defeated, and so on and so on.
Games started off with primarily checksum-based systems, some going further than others (like examining the number of variables, functions, and overall lines of code). Few hashed entire files.

Like I said, fun times, because when you knew what it looked for you could entirely rewrite in-game code and logic.
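For anyone who never saw that era, the whole check often amounted to something like this (file names and hashes here are made up): hash the protected files, compare against a manifest shipped alongside, done. Edit the file, update the manifest entry next to it, and the game was none the wiser.

```python
# Illustrative checksum-era integrity check (file names and manifest invented):
# hash each protected file and compare against a table shipped with the game.
import hashlib, os

KNOWN_GOOD = {"scripts.bin": "0123abcd..."}     # hypothetical hash manifest

def file_hash(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_install(folder):
    for name, expected in KNOWN_GOOD.items():
        if file_hash(os.path.join(folder, name)) != expected:
            return False     # "something changed" - and that was the whole defence
    return True
```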

Third-party anti-cheat exists because developers weren't effective at catching the issues themselves, and has only persisted in the current state because gaming often lags behind other industries.

The issue is that these solutions rely on the client to accurately report information. If I were to cheat, though, I would also be looking at altering the results of the scans so my activity isn't detected. Which creates an endless cycle.

This would work better if everyone leased virtual computers (and yes, some companies were pushing the idea), but it still suffers from the same problem. It's caused by the tunnel vision on methods, rather than outcomes.

The way forward is server-side checks, validations, and, where possible, machine learning. That limits the ability for anyone else to tamper with detection or results. All of which could be paired with whatever response - bans, in-game trolling, etc.
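A hedged sketch of what that could mean in practice, with invented numbers: the server already owns every player's mission results, so it can score outliers without trusting the client at all, and feed the flagged accounts into whatever response you like.

```python
# Sketch of server-side anomaly scoring (numbers invented): flag accounts whose
# stats sit far outside the population, then hand them to bans or trolling.
import statistics

def anomaly_score(player_value, population):
    mean = statistics.mean(population)
    spread = statistics.pstdev(population) or 1.0
    return (player_value - mean) / spread          # crude z-score

kills_per_mission = [180, 220, 160, 240, 200, 210, 190]   # typical players
suspect = 2400                                             # account under review

score = anomaly_score(suspect, kills_per_mission)
if score > 5:
    print("queue for the next ban wave, score =", round(score, 1))
```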
Originally posted by Salt Engineer:
Originally posted by Dusty64:
A simple idea: instead of using the current anti-cheat program, or any others that scan the RAM of a computer (which seem to fail miserably most of the time), why not implement something in the game so that, if anything is changed, it activates multiple layers of encrypted anti-cheater algorithms to ensure the people cheating are humiliated, confused, and beaten into submission?
Similar code was implemented in older games (some Nintendo 64 games, for example) that would activate unkillable monsters that would either one-shot you or just keep beating you until you die. With moderators on standby in case of reports, this can be very effective.
As hilarious as it is, it's not very effective.

Your end response, I think, is on point. Detect > Ban gives out too much information. Softbans will usually go unnoticed a bit longer, but even those will eventually be figured out. This would also extend to unkillable mobs and the like. But... it could be leveraged to create unfavorable conditions in-game that are a bit more covert.

The larger issue is that once what's happening has been worked out, the next step is to locate the code and circumvent it - whether it's detection or response. In simple terms that means that nothing that happens client side can be trusted. If you have the client reporting on whether it's cheating, someone can create code that reports that it's not cheating even if it actually is.

I like where your head is at though. In some POC anti-cheat code, I added a few things specifically to troll people trying to reverse engineer or debug it.

Now, if the checks are all running server-side... you may have a bit more luck. But you'd probably still want to avoid being too overt in your response. As soon as someone works out that their cheat has been detected, they're going to try to circumvent the detection.

This is actually pretty important to how online games function in terms of how bans are handled, and it's why botting is such a mass issue. "Why don't devs do anything?" Because banning every 6 months is more effective and makes it take longer for cheaters to understand what caused the ban, rather than hitting them daily and having them constantly work a way around it, with no way to stem the tide. Daily bans mean daily information gathering, which is vital to bypassing.
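As a sketch (the interval here is just an example), the wave approach is basically: record detections the moment they happen, but only act on them in rare batches, so cheat developers get one data point every few months instead of one per day.

```python
# Sketch of ban-wave handling (interval is an example): detections are stored
# immediately but only acted on in batches, starving cheat devs of feedback.
import time

BAN_WAVE_INTERVAL = 180 * 24 * 3600      # roughly six months, in seconds
pending = []                              # (player_id, detected_at)
last_wave = time.time()

def record_detection(player_id):
    pending.append((player_id, time.time()))   # no visible reaction yet

def maybe_run_wave(now):
    global last_wave
    if pending and now - last_wave >= BAN_WAVE_INTERVAL:
        banned = [player for player, _ in pending]
        pending.clear()
        last_wave = now
        return banned        # everyone caught since the last wave goes at once
    return []
```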
Dusty64 Apr 2 @ 6:16pm 
Like Salt Engineer said, server-side scans can be done with redundancy to catch changes or "illegal" activity.
Also, if there is a decent anti-cheat program running, not many moderators would be required, if any. That said, with all systems moderators are always required, even if it's just one person, and even that one "human" may soon not be necessary with the development of AGI.
Furthermore, like I was saying in the original post, I imagined the "Protocol", as I'll call it, being very subtle, almost like the game is suffering from a bug, causing a wide variety of nasty, dubious things to be done to the "modder" to, in essence, beat them into submission.
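Something like this, kept entirely server-side so it can't be patched out of the client (just an example, the numbers are made up): flagged players simply get quietly unlucky, which looks like a bug to them.

```python
# Example of a subtle, server-side-only "Protocol" (numbers are made up):
# flagged players just get quietly unlucky, which reads as a bug on their end.
import random

FLAGGED = {"player_a"}       # whatever the server-side detection produced

def roll_patrol_size(player_id, base_size):
    if player_id in FLAGGED:
        # Stack the deck: bigger patrols, nastier spawns, longer cooldowns.
        return int(base_size * random.uniform(1.5, 3.0))
    return base_size

print(roll_patrol_size("player_a", base_size=6))   # 9 to 18, "just bad luck"
```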