Hello asm community,

To stay backward compatible, the Intel microprocessors support the Real and Protected modes, plus the 16-bit and 32-bit modes.

My question is: how can the microprocessor be made to operate in these modes?

Regards,
leopardus.
Posted on 2007-01-15 03:32:05 by leopardus
...and 64-bit (or "long") mode.

How is it possible? Check the Intel architecture PDFs, the "System Programming" volume. Basically, protected (32-bit) mode is entered by setting the PE bit (bit 0) of the CR0 control register - but you need a fair amount more code than just that to actually do anything.
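As a rough sketch of that switch (NASM syntax, assuming a real-mode boot context; the labels, the flat GDT layout and the 0x08/0x10 selectors here are just example choices, not the only way to do it):

[bits 16]
enter_pmode:
        cli                     ; no interrupts while we are half-switched
        lgdt [gdt_desc]         ; tell the CPU where our GDT lives
        mov  eax, cr0
        or   eax, 1             ; set PE (bit 0) of CR0
        mov  cr0, eax
        jmp  0x08:pm_start      ; far jump reloads CS and flushes the prefetch queue

[bits 32]
pm_start:
        mov  ax, 0x10           ; flat 4GB data selector
        mov  ds, ax
        mov  es, ax
        mov  ss, ax
        ; ...32-bit protected-mode code from here on...

align 8
gdt:
        dq 0                    ; null descriptor
        dq 0x00CF9A000000FFFF   ; selector 0x08: flat 32-bit code, base 0, limit 4GB
        dq 0x00CF92000000FFFF   ; selector 0x10: flat 32-bit data, base 0, limit 4GB
gdt_desc:
        dw gdt_desc - gdt - 1   ; GDT limit (size - 1)
        dd gdt                  ; GDT linear base address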
Posted on 2007-01-15 03:37:25 by f0dder
A quick analysis shows that 16-bit mode is just a 'cut' of 32-bit mode. Everything works essentially the same, but the default settings 'fit' 16-bit applications. Memory addressing, for example: a limit is set in the segment descriptor (in the GDT), so 16-bit apps cannot reach the full 32-bit address space, and a General Protection Fault is raised if a 16-bit application actually tries to access memory outside the range specified in the descriptor. 32-bit mode works the same way on 64-bit CPUs (the defaults are set to 'fit' 32-bit applications): 32-bit addresses are automatically zero-extended to the 64-bit canonical format while running 32-bit code in compatibility mode.
The 'mode' itself is just a single bit: one bit turns it on globally (PE in CR0, or LME in EFER for long mode), and a single bit controls it locally, per segment (the D/B "default size" bit in the segment descriptor -- not the G bit, which is granularity). By default, segments have this bit set to 0 (16-bit).
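To make that concrete, here are two NASM-style code-segment descriptors that are identical except for that one default-size bit (base 0, 4GB limit; the values are just an example layout):

; two code descriptors, base 0, limit 0xFFFFF pages (4GB with G=1);
; the only difference is the D/B flag (bit 54 of the 8-byte descriptor):
gdt_code16:     dq 0x008F9A000000FFFF   ; D=0 -> 16-bit default operand/address size
gdt_code32:     dq 0x00CF9A000000FFFF   ; D=1 -> 32-bit default operand/address size

The same opcode stream then decodes with 16-bit or 32-bit defaults depending on which descriptor CS points at, and the 0x66/0x67 prefixes override that default per instruction.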

What I'm trying to say is that switching from 16-bit to 32-bit mode (or from 32-bit to 64-bit) is not really 'changing the CPU operational mode' (like switching from one virtual CPU to another). It's just changing the default operating parameters so that instructions start to behave as 32-bit instructions, memory references are treated as 32-bit references, etc.

This is exactly how the so-called "unreal mode" works: you change some parameters, but not all of them, so the result is that some things behave like 32-bit and some like 16-bit. This would be impossible if the processor were actually changing its operational 'mode' (because then it could only be switched to _EITHER_ 16-bit _OR_ 32-bit, and no 'mix' would be possible). It just changes its operational 'parameters' (like the default operand size, the maximum addressable memory, address translation, etc.).
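A minimal unreal-mode sketch along those lines (NASM syntax; assumes we start in real mode with interrupts safe to mask, and A20 enabled if you want to actually touch memory above 1MB -- the labels and the 0x08 selector are just example choices):

[bits 16]
enable_unreal:
        cli
        push ds
        lgdt [unreal_gdtr]      ; GDT with one flat 4GB data descriptor
        mov  eax, cr0
        or   al, 1              ; briefly enter protected mode (PE=1)
        mov  cr0, eax
        mov  bx, 0x08
        mov  ds, bx             ; DS descriptor cache now holds base 0, limit 4GB
        and  al, 0xFE           ; back to real mode (PE=0)...
        mov  cr0, eax
        pop  ds                 ; ...base follows the real-mode value again,
        sti                     ; but the cached 4GB limit stays in effect

        ; real-mode code with 32-bit offsets, no #GP:
        mov  eax, 0x100000
        mov  byte [eax], 0      ; touches DS*16 + 1MB (needs A20 enabled)
        ret

align 8
unreal_gdt:
        dq 0                    ; null descriptor
        dq 0x00CF92000000FFFF   ; selector 0x08: flat 4GB data
unreal_gdtr:
        dw unreal_gdtr - unreal_gdt - 1
        dd unreal_gdt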

I hope you get the point ^^
Posted on 2007-01-15 05:35:04 by ti_mo_n