One thing I found interesting: while showing us how to use GDB, my CS professor exploited the fact that there always seemed to be a 0 just outside the bounds of a dynamically allocated integer array in order to make a point about debugging a program. For instance:
int *array = new int[10];
int a = array[10];
And now, a will be equal to 0. My question to you guys is: why does this hold in every single execution of the program? It just seemed like extremely peculiar behavior that happens too often to be a coincidence...