I understand how to calculate binary digits, but I need to be pointed in the right direction (books, web sites, etc.) to learn to program machines in assembly language.
Any help will definitely be appreciated...
Posted on 2011-09-03 17:06:33 by UNEE0X
Jonathan Bartlett's "Programming From The Ground Up" might be a good place to start.

http://savannah.nongnu.org/mail/?group=pgubook

Best,
Frank

Posted on 2011-09-04 01:04:06 by fbkotler
Thanks Frank.
Posted on 2011-09-09 13:41:25 by UNEE0X
No problem! Since this was originally posted in the "unix" section, I did not mention that Jonathan's book is for Linux, and uses (G)as. It will be less helpful to people using Windows (might be worth a read anyway)...

Best,
Frank

Posted on 2011-09-09 14:24:49 by fbkotler
Thanks again Frank,
For a better understanding of how computer processing and communication are carried out, I'm looking for material that explains, step by step, how and why assembly language is converted into machine code.
Programming From The Ground Up is excellent so far, but I'm only on page 20...
Thanks in advance for any help....
Posted on 2011-09-10 12:48:01 by UNEE0X
Converting assembly language to machine code is trivial:
Assembly language is just a mnemonic representation of the actual instructions in machine code (basically just the 'human-readable names').
So in a nutshell, an assembler just converts each mnemonic to its respective binary code.
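To make that concrete, here's a toy sketch in Python (purely for illustration; the opcode values happen to be real Z80 one-byte encodings, but a real assembler also has to handle operands, labels, and addressing modes, which this skips entirely):

```python
# Toy illustration of "an assembler is a mnemonic-to-opcode lookup".
# The table below uses a few real Z80 single-byte instructions.
OPCODES = {
    "NOP":    0x00,  # do nothing
    "HALT":   0x76,  # stop the CPU
    "LD A,B": 0x78,  # copy register B into A
    "INC A":  0x3C,  # A = A + 1
}

def assemble(lines):
    """Convert a list of mnemonic strings into machine-code bytes."""
    return bytes(OPCODES[line.strip().upper()] for line in lines)

program = ["LD A,B", "INC A", "HALT"]
print(assemble(program).hex())  # -> "783c76"
```

That's the whole idea: each line of source text maps to the bit pattern the CPU actually fetches and executes.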
If you want to know what mnemonics there are, and how they are encoded, just look at the instruction set reference for the CPU you're using. Sadly, x86 is completely nightmarish in its instruction encoding, so I wouldn't recommend starting with that one.
If you've seen one, you've seen 'em all... so perhaps you should look at the Zilog Z80 manual: http://www.zilog.com/docs/z80/um0080.pdf
It has a nice breakdown of the types of instructions and the different encodings, and then gives a number of examples where it breaks down the encoding of an instruction.

That covers the 'how'.
The 'why' is rather simple: a CPU can only run machine code. If you don't convert assembly language (or any other language, for that matter) to machine code, it won't run.
The reverse question might make a bit more sense: why do we use assembly to program, rather than machine code? Well, the answer to that is already more or less given above: it's quite tedious to encode instructions into machine code by hand, so someone decided to write a tool that did it automatically, and the assembler was born.
Assembly language already existed before then, but it was mostly used as a tool to write out a program on paper before actually encoding it into machine code.

Since assembly is nothing but a direct translation of machine code, there is nothing to gain by encoding it manually, unlike with most other languages. For most languages, a compiler is used to convert to machine code... and the quality of the code depends on how good that conversion is. Generally, compilers can be outperformed by a well-skilled assembly programmer (but not by just ANY programmer; writing in assembly doesn't make code faster automatically. It's only faster if you can optimize better than the compiler can).
Posted on 2011-09-10 13:53:10 by Scali