I am working on a little program that needs to convert long ASCII character strings of hex (e.g. 4A2345FC5E32......) to a series of binary bytes.

I can figure out how to do this, but I was wondering if I should really be coding routines for these mundane tasks myself, or if I'm just re-inventing the wheel. Does Windows provide something already designed for this?

Posted on 2005-10-01 01:43:19 by dicky96
    Do you mean a series of binary ASCII bytes?  Such as 32 ASCII binary byte representations for each DWORD?  By the way, what happened to the previous 95 dickies?  Ratch
Posted on 2005-10-01 09:19:58 by Ratch
What I mean is I have a long string of text data like this:

4A3236.....................

which is of course a series of ASCII character codes in memory (cos it's readable on screen):

34h 41h 33h 32h 33h 36h.....................

This is clearly visible anyway if I open the file in UltraEdit and select "Hex Edit".

I need to convert this string to actual binary bytes - so 4A becomes a single byte 4Ah (or 01001010b), not two bytes 34h 41h.

I can figure out how to convert that: for 0-9 I just AND with 0Fh, for A to F I use a little lookup table, then do each nibble in turn, shift and combine to make a byte value (well, that's the way I would do it, but hey, I'm just a learner.......).

But what I was asking is: am I reinventing the wheel here, or is there some function that would just convert the whole string for me?


PS what happened to the other 95 is a long story  :P

Posted on 2005-10-01 12:43:58 by dicky96
I don't really think you are reinventing the wheel. Anyway, just watch out for the endian issue.
Posted on 2005-10-01 12:46:59 by roticv
OK thanks for that. 

I don't see how the "endian issue" will affect me here, as the data is actually the ASCII representation of a log of serial data comms between two devices, and as such each byte is just a byte, not part of a word or double word etc.

But cheers for reminding me about it - I still can't figure out why Intel did that byte order thing backwards, other than it must be something to do with Californian grass  :shock:

Posted on 2005-10-01 13:02:36 by dicky96
Easy as Pi to do.

Just use a 16-entry LUT of nibble bit patterns.

You have the ASCII string.

Take the first character, "F".
Convert it to a binary number 0-15.
- Do this with one cmp and a subtract for upper-case-only compatibility, or a few compares for both upper- and lower-case compatibility.
OK, "F" now = 15.
Use your lookup table and get 1111b back from it.
Do the same thing with the "E" and get 1110b back.
Shift the 1111b left 4 bits and OR in the 1110b, and you have yourself a binary byte from an ASCII hex string.
Posted on 2005-10-01 17:49:14 by r22