blacklantern
March 1st, 2006, 08:31 AM
For one of my projects, I'm sending packets from a server to a client using the UDP protocol. I've also been given a troll program that all packets have to pass through. I can set the troll to drop, garble, and delay a percentage of those packets.
Right now, I'm only setting the troll to garble a percentage of my packets. To combat this, I'm implementing a cyclic redundancy check (CRC) in my code. However, the reference CRC code I've been given uses a bitwise AND (&) on every character (btw, I'm working in C). The messages I'm sending are structs, which I clear out with NUL characters ('\0') before every use. It's an all-purpose message, one I'm using for ACKs, new-transfer indicators, etc. So there are times when I don't even bother to change some members of the struct from NULs, because I know the receiver won't be reading those fields. However, with the bitwise AND (&) in my CRC code, I'm wondering: how does that operator respond to NUL characters?