Good way to convert Hex to bit values

I have a module that returns a hex value. Each character in the string represents 4 distinct on/off settings, reading the bits left to right.

So for hex A (1010):
setting 1 = On
setting 2 = Off
setting 3 = On
setting 4 = Off

This is what I came up with - wondering if there’s a more efficient way of doing this. Obviously there needs to be more to the code to store/process the results, but I was more interested in this case in the “deciphering” aspect.

(I’ve never really had to deal with bit-type operations before, and (a) figured there’s a better way than this, and (b) the longer I’m in the micro .NET world, sooner or later I’ll need to know this better anyway.)

    // these can be variable length in the final product
    var retString = "A3";
    for (int i = 0; i < retString.Length; i++)
    {
        var c = Convert.ToInt32(retString.Substring(i, 1), 16);
        var x1 = c & 8;
        var x2 = c & 4;
        var x3 = c & 2;
        var x4 = c & 1;
    }

Converting the entire string at once (assuming you have a known length) and then doing the bit shifting would be a bit faster than converting character by character in a loop.
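A minimal sketch of that "convert once, then shift" idea, assuming the hex string fits in a `uint` (up to 8 hex digits); the variable names are my own, not from the thread:

```csharp
using System;

class Demo
{
    static void Main()
    {
        // Convert the whole hex string in one call, then walk the bits.
        var retString = "A3";
        uint value = Convert.ToUInt32(retString, 16);   // 0xA3
        int totalBits = retString.Length * 4;

        // Settings read left to right, i.e. most significant bit first.
        for (int bit = totalBits - 1; bit >= 0; bit--)
        {
            bool on = ((value >> bit) & 1) != 0;
            Console.WriteLine(on ? "On" : "Off");
        }
    }
}
```

For "A3" (1010 0011) this prints On, Off, On, Off, Off, Off, On, On, matching the per-setting reading described for hex A above.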

Also, don’t declare your variables inside a loop; that’s asking for garbage collection to fire far more often than needed.

The only GC issue I see is the Substring call, which creates a new string each iteration. I see no problem with declaring ints inside a loop: they live on the stack, not the heap, so it makes no difference whether you declare them inside or outside the scope of the loop.

I think something like this will perform faster, using bit magic to convert the characters to nibble values (supporting both upper- and lowercase characters):

var retString = "0123456789ABCDEFabcdef";

for (int i = 0; i < retString.Length; i++)
{
    byte c = (byte)retString[i];
    if ((c & 0x40) != 0) c += 9;   // letters need +9
    c &= 0x0F;                     // keep only the low nibble

    var x1 = c & 8;
    var x2 = c & 4;
    var x3 = c & 2;
    var x4 = c & 1;
}

Wouter, would you add some comments to this section? I’m lost and I really want to understand what you’re doing here.

0x40 is ASCII “@” and is essentially the start of the alphabetic characters: “A” is 0x41, “a” is 0x61. “0” is 0x30, “1” is 0x31, and so on.

(c & 0x40) is a bitwise AND, so anything greater than or equal to 0x40 and less than 0x80 returns a non-zero value. If that’s the case, add 9 to it (so “A” becomes 0x4A, “a” becomes 0x6A, but “0” through “9” aren’t altered).

Next, mask off everything but the low nibble with 0x0F, so “A” becomes 0xA, “a” becomes 0xA as well, and “0” becomes 0x0.

Hey presto, you have your nibble value.
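Those steps, traced for a few characters (the helper name here is my own, purely for illustration):

```csharp
using System;

class Trace
{
    // Same steps as described above: +9 for letters, then keep the low nibble.
    static byte Nibble(char ch)
    {
        byte c = (byte)ch;
        if ((c & 0x40) != 0) c += 9;   // 'A' -> 0x4A, 'a' -> 0x6A, digits untouched
        c &= 0x0F;                     // keep only the low nibble
        return c;
    }

    static void Main()
    {
        Console.WriteLine(Nibble('A'));   // 10
        Console.WriteLine(Nibble('a'));   // 10
        Console.WriteLine(Nibble('0'));   // 0
        Console.WriteLine(Nibble('f'));   // 15
    }
}
```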



Thank you Brett 🙂

It’s simple: if you mask an ASCII digit with 0x0F, you have its value (as ‘0’ == 0x30, ‘1’ == 0x31, and so on).
ASCII ‘a’ to ‘f’ and ‘A’ to ‘F’ result in values 1 to 6 after masking with 0x0F, so you need to add 9 to them.
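That ordering (mask first, then add 9 for letters) works just as well as adding 9 first; a sketch, with a helper name and letter test of my own:

```csharp
using System;

class MaskFirst
{
    // Mask first, then add 9 for letters; name is mine, not from the thread.
    static int HexNibble(char ch)
    {
        int v = ch & 0x0F;             // '0'..'9' -> 0..9, letters -> 1..6
        if ((ch & 0x40) != 0) v += 9;  // letters: 1..6 -> 10..15
        return v;
    }

    static void Main()
    {
        Console.WriteLine(HexNibble('7'));  // 7
        Console.WriteLine(HexNibble('B'));  // 11
        Console.WriteLine(HexNibble('e'));  // 14
    }
}
```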

Thanks Skew and Wouter.

Learned something new today at 9:10 AM … I can now take the rest of the day off 🙂

[& Brett.] Ditto!