My ultimate anti-cheat idea

In order to cheat, a cheater needs "offsets".

But if you build a tool that compiles the game code just before the game starts, creating a fresh cgame DLL, cheaters are screwed.


I'm sure there are already many apps which modify DLL offsets; this is not a hard thing to do.

This system would be more reliable than any TZAC or PunkBuster.

No lag in-game.

Cheaters will be forced back to the old and ugly aimbots/wallhacks, which are very obvious (graphics-based, no DLL hooking).
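The idea above can be sketched as a toy in Python (not real ET code; the byte blobs and padding sizes are made up for illustration): each "launch build" pads every function by a random amount, so offsets recorded from one launch are useless on the next.

```python
import os

# Hypothetical stand-ins for compiled cgame functions (made-up bytes).
FUNCS = [b"\x55\x8b\xec\xc3", b"\x31\xc0\xc3"]

def launch_build():
    """Toy per-launch 'build': random NOP padding before each function,
    so each launch yields its own layout of offsets."""
    image = b""
    offsets = []
    for body in FUNCS:
        image += b"\x90" * (1 + os.urandom(1)[0] % 32)  # 1..32 NOPs
        offsets.append(len(image))
        image += body
    return offsets

print(launch_build(), launch_build())  # layouts differ between "launches"
```

Collisions between two launches are possible but unlikely; the point is only that a preset offset no longer means anything.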

Watch out, I'm pretty good.
Comments
There are ways to get the offsets "on the fly"; usually they're used to provide mod support for every mod without getting the offsets separately for each one by hand, as I understand it. rshook, amongst others, has this functionality, I believe.

Also, recompiling the game code for every game would cause time spent in the loading screen to increase drastically; we're talking at least five times longer loads.

E: Also, mere recompiling won't change the offsets; you also need to actually change something within the code. This would lead to different problems, such as people being able to use whatever cgame DLL they want, since there would be no way to detect whether it's one created by the "anticheat" or one tailored by the player.
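The point that mere recompiling changes nothing can be illustrated with a toy "compiler" (a hypothetical deterministic stand-in, not a real toolchain): identical source in, identical bytes out, hence identical offsets on every rebuild.

```python
import hashlib

# Made-up source fragment; any unchanged input behaves the same way.
SOURCE = "void CG_DrawCrosshair(void) { /* unchanged game code */ }"

def compile_toy(source):
    # Deterministic stand-in for a compiler: same input -> same output bytes.
    return hashlib.sha256(source.encode()).digest()

# Rebuilding unchanged source reproduces the exact same image.
assert compile_toy(SOURCE) == compile_toy(SOURCE)
```

Real compilers are (mostly) deterministic too, given the same source and flags, which is why recompiling alone doesn't move anything around.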
You may well find an "on the fly" method to find offsets, but the fact is that cheat makers spend hours and days finding them manually.

So I'm not sure about the possibility of an "on the fly" method.

If they could find them that easily, they would already be doing it.


Compiling the code takes at most 5 seconds, no more.

And I'm sure there are still apps to modify DLL offsets without recompiling the sources.


Edit: "such as people being able to use whatever cgame DLLs they want to since there would be no way to detect whether it's one created by the "anticheat" or one tailored by the player. "

Nah, it won't happen. There are checksums to check whether the DLLs are the same, and when they're not, the good one is downloaded.
Parent
I'll respond to the rest of the message after this LoL game, but:
Quote: "nah, it wont happen. there are checksums to check if the dll are the same, and when they are not then the good is downloaded."

You can't tell an "anticheat"-modified DLL from a user-modified DLL judging by just the checksums.
Parent
You said that cheaters would be able to use their own DLL because of my trick (which is a totally wrong claim).

And I'm telling you there is already a system in ET that stops you from using a pk3 or a DLL if the checksums are different from the ones on the server side.
Parent
Checksums can always be altered to match another. You could always replace files inside the pak0, and also alter map files to remove walls etc...
Parent
Oh, yeah, you mean like a centralized system? That'd mean that everyone has to download the compiled cgame from the server at each match start, which would be very cumbersome.

And that still leaves open the option that the user just sends a faked checksum to the server, similar to cheats that let you use an "unpure" client (i.e. stuff that bypasses ETPro's integrity-failure message and lets you use whatever .pk3 files you want).

And still, finding the offsets "on the fly" would be possible. I can't remember the name cheat coders used for that method, but in essence you just sort of debug the client on the fly to find the proper functions to hook, instead of using preset offsets.

E: The reason you don't see this technique used in most free cheats is that it's simply more complex to do than just using preset offsets.
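That on-the-fly technique is commonly called a signature (or pattern) scan. A minimal Python sketch, with made-up byte values: instead of a preset offset, the cheat searches the loaded image for a distinctive byte sequence, wildcarding the bytes that vary between builds.

```python
def find_pattern(image, pattern):
    """Return the offset of the first match of `pattern` in `image`.
    `None` entries in the pattern are wildcards (bytes that may vary)."""
    n = len(pattern)
    for base in range(len(image) - n + 1):
        if all(p is None or image[base + i] == p
               for i, p in enumerate(pattern)):
            return base
    return -1

# Two fake builds place the same routine at different offsets, and the
# last byte (a stack-size operand here) differs between them.
image_a = b"\x00" * 5  + b"\x55\x8b\xec\x83\xec\x10"
image_b = b"\x00" * 40 + b"\x55\x8b\xec\x83\xec\x20"
sig = [0x55, 0x8b, 0xec, 0x83, 0xec, None]  # varying byte wildcarded

assert find_pattern(image_a, sig) == 5
assert find_pattern(image_b, sig) == 40  # same signature, new location
```

This is why shifting offsets alone only defeats cheats that hard-code addresses, not ones that scan.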
Parent
Everyone already has to download files from the server when the checksums aren't the same. It occurs automatically (unless you set the download cvar to 0). Nothing new here.

Sure, they will still be able to load a fake DLL, so a system to check that the good one is loaded would be needed. You are right on this.

But finding offsets on the fly isn't that easy; that's why there are still so many people cracking programs, looking for offsets manually, using a debugger.
They wouldn't bother that much if automatically finding them were that easy.
Parent
Most of the work is searching for and analyzing the method signature, not the offset itself...
Parent
method signature?
Parent
int lololol(int a, char* b, bool c) is a function signature (a "method" for you Java nerds). These are not explicitly stated in an executable, but if you are a nerd with lots of free time on your hands, you can infer it from the assembly code :)
Parent
subroutine (for you common nerds :P )
Parent
I call this a prototype.

And many functions have the same prototype (same arguments).
Parent
The declaration of the function with no definition is a prototype.
Parent
If the checksum is the same, that would mean the file is identical, and thus also the offsets must be identical, and you have accomplished exactly nothing :)
Parent
Not exactly. The same checksum doesn't mean it's the same file.
Parent
Then your checksumming algorithm doesn't serve its purpose.
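With a sound cryptographic hash, that's exactly the property you get. SHA-256 is used here purely as an illustration (ET's own checksums are different): equal sums mean, for all practical purposes, equal files, and a single changed byte produces a different sum.

```python
import hashlib

def checksum(data):
    """Hex digest of the file contents (SHA-256 as an example hash)."""
    return hashlib.sha256(data).hexdigest()

good = b"cgame dll bytes"          # stand-in for the real file contents

# Identical contents always hash to the identical sum...
assert checksum(good) == checksum(bytes(good))

# ...and flipping even one byte changes the sum.
tampered = good[:-1] + b"X"
assert checksum(good) != checksum(tampered)
```

Finding two different files with the same SHA-256 sum is believed computationally infeasible, which is why "same checksum, different file" would mean the checksum algorithm is broken.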
Parent
What about becoming an English genius?
We French make English spelling mistakes on purpose because we are proud of our own language.
Parent
no, u make mistakes, because u are stupid
Parent
omg mistakes on zeh internet!!
Parent
I'm sure some hacker fag could go past that
ET IS DEATH
go fuck urself
hey pc wizz, everything / anything can be bypassed
not when you find the one and only right code
Parent
i have no clue what ur talking about, dunno about codes etc.. if pentagon was hacked, ur saying a stupid ac cannot be?!
Parent
Offsets can be obtained by patterns and don't need hard-coded addresses. Also, compiling the code won't achieve anything; the structure will still remain the same. What you are thinking of is something called polymorphic code, which is capable of dynamically changing. This is very tricky to implement, requires a virtual machine which makes code a lot slower, can be flagged as a virus by AVs, and ultimately can be bypassed (though it would be harder). But, as ever, one clever coder can share knowledge with the more noob ones.
Why do you need a VM for polymorphic code?
Write instructions to some memory address, jump to that address. Don't need a VM for that?
Parent
wiki: "In computer terminology, polymorphic code is code that uses a polymorphic engine to mutate while keeping the original algorithm intact... When the code is executed this function reads the payload and decrypts it before executing it in turn." http://en.wikipedia.org/wiki/Polymorphic_code

Every time the code is loaded into memory it is mutated a little differently and a unique encryptor/decryptor is created (I called this a virtual machine); the code needs to be run through the decryptor in order to spit out bytes that the machine can understand. If the polymorphic code doesn't encrypt, that isn't necessary, though some kind of 'VM' is still needed to take the machine bytes and morph them (add junk bytes, mutate code). The result can still be read by the machine, so a decompiler isn't necessary, though code execution is probably no longer as fast or optimised.
Parent
Quote: "Every time the code is loaded into memory it is mutated a little differently and a unique encryptor/decryptor is created (i called this a virtual machine) - the code needs to be run through the decryptor in order to spit bytes out that the machine can understand."

The decryptor you're talking about might just as well be the x86 instruction set. Mind you, there's a lot of ways to write the same thing in assembly. Yes you need some kind of engine that generates the instructions, but calling this thing a VM is quite a stretch.
Parent
Well, it depends on whether encryption is used; a VM is something that sits between intermediary bytes that the machine can't understand and the machine itself, like the VM used in Java or .NET.
Parent
The problem being addressed here was the fact that the memory locations of certain functions/variables remain constant in memory when the program is run. You proposed polymorphic code to fix this. However, simply encrypting the code and then decrypting it again would result in the memory map looking exactly the same (with the exception of the encryption key). You need to actually move things around, add a few NOPs here and there, so that not only is no binary file the same, but the memory footprint of each running instance is also unpredictable. Encrypting/decrypting and then changing the key does not do that.
Parent
This would be true, but polymorphic code is mutated in a different way on every load (causing offsets to change), and then a function is decompiled/decrypted on every execution:

"Encryption is the most common method to hide code. With encryption, the main body of the code (also called its payload) is encrypted and will appear meaningless. For the code to function as before, a decryption function is added to the code. When the code is executed this function reads the payload and decrypts it before executing it in turn." (wiki)
Parent
You're missing the fact that after decryption, the code remains exactly the same. Meaning, once decrypted, any two instructions have the same number of bytes between them. This kind of encryption/decryption scheme is useful when you want the BINARY FILES to be different, e.g. to fool AV programs. However, the memory map when running the program will look the same.
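A tiny Python sketch of that point, using XOR as a stand-in cipher (real packers use stronger schemes, but the layout argument is the same): two differently-keyed blobs look nothing alike on disk, yet both decrypt to byte-identical code, so every in-memory offset is unchanged.

```python
import os

def xor(data, key):
    """Toy symmetric cipher: XOR each byte with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

payload = b"\x55\x8b\xec\x83\xec\x10\xc3"   # stand-in for the real code

# Two "releases" of the binary, each packed under a fresh random key.
k1, k2 = os.urandom(8), os.urandom(8)
blob1, blob2 = xor(payload, k1), xor(payload, k2)

# The on-disk blobs almost surely differ (random keys), but after
# decryption both yield the exact same bytes at the exact same offsets.
assert xor(blob1, k1) == xor(blob2, k2) == payload
```

So encrypt-with-a-new-key defeats file-pattern scanners, not memory-offset cheats.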
Parent
No it won't; the memory map is different every time, that's the whole point of polymorphic code!
Parent
Hm. If so, I must have understood it wrong. This is how I understand it, please correct me on what I'm wrong at.

You have a bunch of instructions I that you want to make unreadable to a third party (e.g. naive antivirus which looks for patterns in files). You put these instructions into a blob B, and you encrypt this blob with some key K.
Now, the entry point to your program is a decryption engine that decrypts your blob B to some location L in memory. Then, you re-encrypt your blob B from your in-memory image at L with a new key K2 and store this in the binary file again. Finally, you jump the program counter to L, which obviously causes the instructions I to be executed.

The result of this is
1. A rewritten binary file, now encrypted with the key K2
2. The instructions I that are executed post-decryption are the same
3. The memory from L to L+sizeof(B) is the same

which means that the memory map looks the same, but the BINARY FILE differs.

And before you say that the example in the article has the dummy operations on C to mess with the memory map, if you look closely this doesn't change the encrypted instructions, it just makes the decryption loop harder to detect by an AV.

My points earlier (maybe not clear) were that
1. You don't need a VM for this,
2. Encryption/decryption completely misses the goal of moving the offsets around

Now, if you instead add dummy operations (NOP, writes to dead variables, unneeded reads, etc) to random places in between the original instructions, you can change the offsets required by cheats to read/write. This would only require you to do a linear traversal of the instructions once every time you start the program, and then you jump to the start of your altered memory and you're golden. No need for anything like a VM..
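That single-pass mutation might look like this toy sketch (fixed-width fake "instructions" for simplicity; real x86 has variable-length instructions, so a disassembler would be needed to find the boundaries): junk NOPs are inserted between instructions and a remap table records where each old offset moved.

```python
import random

INSN = 4                        # pretend every instruction is 4 bytes wide
code = bytes(range(32))         # 8 fake "instructions"

def mutate(code, rng):
    """One linear pass: copy each instruction, preceded by 0-3 junk NOPs,
    and record old offset -> new offset so the program can still be entered."""
    out, remap = b"", {}
    for off in range(0, len(code), INSN):
        out += b"\x90" * rng.randint(0, 3)   # random junk padding
        remap[off] = len(out)
        out += code[off:off + INSN]
    return out, remap

out, remap = mutate(code, random.Random())

# Every instruction survives intact at its remapped location.
assert all(out[new:new + INSN] == code[old:old + INSN]
           for old, new in remap.items())
```

Each run produces a different layout, which is exactly the offset-shuffling effect the encrypt/decrypt scheme alone doesn't give you.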
Parent
You have packing and polymorphism confused. Packing is when the file is encrypted and is decrypted at the entry point by a decryption routine. Polymorphic code, on the other hand, disassembles code on the fly: the code is mutated in a unique way when it is loaded into memory, jump or call instructions are inserted into every function using the decompiler, and each function jumps to a decryption routine which has been created and loaded uniquely, based on this load's unique encryption routine.

"Technically, polymorphic means self changing. However, it can mean different things based on its context.

In an object-oriented sense, polymorphism is when a function call is decided during run time, not compile time. This makes the code self-changing because the same piece of code can call different functions (same variable, different objects).

In encryption terms, it means code that is different every time it is run, often relating to a virtual machine. This can be a result of a random seed being used to encrypt and decrypt certain pieces of code."

"...Though for hacks and viruses, what most people consider polymorphic is editing assembly at runtime."
Parent
Quote: "you have packing and polymorphism confused, packing is when the file is encrypted and is decrypted on Entry Point by a decryption routine. Polymorphic code on the other hand disassembles code on the fly, the code is mutated when it is loaded into memory in a unique way every load, jump or call instructions are inserted into every function using the decompiler, the function jumps to a decryption routine which has been created and loaded uniquely based on this load's unique encryption routine."


Wait, now you're talking about self-modifying code (http://en.wikipedia.org/wiki/Self-modifying_code) (editing the assembly currently being run, at runtime) as opposed to polymorphic code (http://en.wikipedia.org/wiki/Polymorphic_code)

Quote: "In an object-oriented sense, polymorphism is when a function call is decided during run time, not compile time. This makes the code self-changing because the same piece of code can call different functions (same variable, different objects)."

This is, of course, not our context

Quote: "In encryption terms, it means code that is different every time it is run, often relating to a virtual machine. This can be a result of a random seed being used to encrypt and decrypt certain pieces of code."

This is what I described in the previous post.

Quote: "...Though for hacks and viruses, what most people consider polymorphic is editing assembly at runtime."

Source? If this is a good definition of polymorphism in this sense, then we actually agree.
Parent
If the code is compiled before it is launched, it is implied that some kind of source code representation must be present, and it would only be a matter of time before cheat makers found a way to alter the aforementioned source code prior to launch. So no, although a nice twist, that wouldn't help prevent cheating. It would be interesting for some other reasons though, such as fully leveraging processor-specific optimizations. Not that it would make a huge difference.