I'm using an integer to track a few different things. I'm using bits 1, 2, & 4 to track the environment in which the script runs, and 128, 256, & 512 to track what was passed on the command line. If the 128 bit is set, I want to ignore 256 & greater, but NOT ignore anything below 128. I can't just stop processing when 128 gets set, because I don't know in what order the switches will be passed. I can solve the issue with If/Thens, or by using two integers, or by re-evaluating the integer after the command line has been processed and clearing the other bits.
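To make the setup concrete, here's a minimal sketch in Python. The flag names are made up for illustration; only the bit values (1, 2, 4 for environment; 128, 256, 512 for the command line) come from the description above.

```python
# Hypothetical flag names; the values mirror the ones in the question.
ENV_DEV, ENV_TEST, ENV_PROD = 1, 2, 4               # where the script runs
CLI_QUIET, CLI_VERBOSE, CLI_DEBUG = 128, 256, 512   # command-line switches

flags = 0
flags |= ENV_TEST     # environment, known up front
flags |= CLI_DEBUG    # switches arrive in arbitrary order...
flags |= CLI_QUIET    # ...so 128 may be set after 512 is already in there

print(bin(flags))  # 0b1010000010: ENV_TEST | CLI_QUIET | CLI_DEBUG
```

The problem is visible in that last line: by the time 128 shows up, 512 is already set, so something has to clear it afterward.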
I'm wondering if there isn't something more math-based I can use so that I don't have to modify a bunch of code should I add another command line switch.
I got it! Once I'm done setting bits, I do a bitwise AND of my integer with 255 (the mask covering 128 and every bit below it), which wipes out everything to the left of 128. Whew!