Several architectures/compilers have a weird compatibility mode where they'll emit 64-bit instructions (and 64-bit ELF types, relocations, and so on) but use 32-bit sign-extended pointers. The most notable example is Intel x32 (e.g. `gcc -mx32`), but MIPS64 and AArch64 also have similar features, and I've seen all of them. Basically, the ISA is 64-bit, but longs and pointers stored in memory are 32 bits. Whenever a pointer is loaded, it's sign-extended to 64 bits. MIPS64 in system mode also uses a high memory range (> `0xffffffff80000000`, I believe) so that kernel pointers get sign-extended meaningfully (e.g. `sx.q(0x80c00000) => 0xffffffff80c00000`).

Stack analysis completely gives up when it sees a sign extension on the stack pointer; that's the biggest problem for me. The next biggest issue is that I can't use actual pointer types anywhere (because they're the wrong size), so type propagation and xreffing (e.g. a global pointer to a function) don't work out well.
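For illustration, here's a minimal C sketch of the load semantics described above (the `load_pointer` helper is hypothetical, just to name the operation): pointers are stored as 32-bit values, and every load sign-extends to 64 bits, which is why `0x80c00000` lands at `0xffffffff80c00000`:

```c
#include <stdint.h>
#include <stdio.h>

/* Sketch of the load semantics: the pointer is stored as a 32-bit value,
 * but every load sign-extends it to 64 bits (the sx.q operation from the
 * example above). */
static uint64_t load_pointer(int32_t stored) {
    /* Sign-extend: widen through the signed 64-bit type, then reinterpret. */
    return (uint64_t)(int64_t)stored;
}

int main(void) {
    /* A MIPS64 kseg0-style kernel pointer: the top bit is set, so sign
     * extension maps it into the high memory range. */
    int32_t kernel_ptr = (int32_t)0x80c00000u;
    /* A typical userland pointer: top bit clear, so sign extension and
     * zero extension agree. */
    int32_t user_ptr = 0x00400000;

    printf("sx.q(0x%08x) => 0x%016llx\n",
           (uint32_t)kernel_ptr, (unsigned long long)load_pointer(kernel_ptr));
    printf("sx.q(0x%08x) => 0x%016llx\n",
           (uint32_t)user_ptr, (unsigned long long)load_pointer(user_ptr));
    return 0;
}
```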
I'm not really sure what the best solution would look like, but I just wanted to note that I've seen this in 5 very different places now and haven't found a great workaround in binja.