Hi, not sure if this is the most appropriate forum for this. I was just wondering, why are there only four general purpose registers (and just 8 registers in total)? Why not twice as many, or half as many...?
:confused:
Posted on 2004-04-09 17:01:31 by adamjjackson
Because x86 is ancient technology, and rather than doing "the right thing" while they could, Intel kept expanding it in a patchwork-like way. This means we're stuck with a CPU that supports 16bit crap, whose 32bit protected mode format is "weird-a$$" because it has to support 286-style 16bit protected mode; we have a silly low number of registers, we can't specify a target register on most operations, and many operations are bound to special registers (mul/div etc).
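
To illustrate the "bound to special registers" bit, here's a rough sketch (NASM-style 32bit syntax, values just for show):

    mov  eax, 1234        ; MUL insists one factor already sits in EAX
    mov  ecx, 10
    mul  ecx              ; EDX:EAX = EAX * ECX - no way to pick another target
    mov  eax, 100         ; DIV is the same story:
    xor  edx, edx         ;   EDX:EAX is the implicit 64bit dividend,
    div  ecx              ;   quotient lands in EAX, remainder in EDX - no choice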

It also means the CPU electronics are a lot more complicated than they need to be, and it also partially means we have to support a lot of crappy legacy hardware and weirdness.

But well... it works :P
Posted on 2004-04-09 17:17:02 by f0dder
On the plus side, it makes instructions quite compact :alright: Although the usefulness of the one-byte xchg reg,(e)ax instructions is debatable.
But why do you refer to only 4 of the registers as "general purpose"? Each of the 8 registers has its own unique uses, but most instructions can operate on any register.
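
For the curious, a quick sketch of some of those unique uses (NASM-style, just illustrative fragments - ESI/EDI assumed to point at real buffers):

    mov  ecx, 16          ; ECX: the implicit counter for the REP prefixes and LOOP
    rep  movsb            ; ESI/EDI: the implicit source/destination of the string ops
    mul  ebx              ; EAX (and EDX for the high half): the implicit MUL operands
    push eax              ; ESP: implicitly updated by PUSH/POP/CALL/RET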
Posted on 2004-04-09 17:27:55 by Sephiroth3
And the 16-bit stuff is an extension of an 8-bit architecture. The 16-bit stuff was designed back when silicon was much more expensive than it is now. We're talking top of the line 1Kbit or 4Kbit static RAM. (Look carefully...that's K and bits.)
Posted on 2004-04-09 17:28:18 by tenkey
Nah, the "16-bit stuff" has always been there, right from the 8086. In fact, there are many instructions which are only defined for 16-bit operands, such as push, pop, call far, jmp far, and some instructions which have one byte encodings for 16-bit register operands only (INC, DEC, XCHG AX).
Posted on 2004-04-09 17:44:09 by Sephiroth3
Before the 8086 was the 8088 (no 16-bit address bus on this baby).
Posted on 2004-04-09 18:27:02 by Sentient
I agree with the decision to keep backward compatibility. Sure it presented problems and there were trade-offs, but does that make my life not worth living? I really couldn't give a rat's A** how hard it is for Intel to make sure that my old software still runs, or about the complications in making a new C compiler. I paid money for my software, and if they expect me to throw it all out every time they want me to upgrade my CPU, well, they'll be waiting quite a while. After all, with all the bells and whistles and cool CPU-munching features, I still just use Wordpad for letters, and everything else I use personally (outside of programming) would run just fine on a 386. So unless Intel plans on paying me to replace my software, they had better make the new stuff compatible or die. They figured that out pretty quick when AMD extended the x86 family into 64 bits.

Long and short of it, I don't care about the troubles of Intel; they make piles of cash selling those CPUs, and in the end programmers make up a very small percentage of users, and assembly programmers who actually have to deal with this stuff are an even smaller subset of that. If they can only figure out how to put in 8 GP registers, then that is all there is to work with; if they have to break compatibility to have more, then they can try, and I'll be bidding on the liquidation contract for the bankruptcy disposal.
Posted on 2004-04-09 18:59:13 by donkey
They should have toasted backwards compatibility when they introduced the 386 and written an emulator. At least they shouldn't have kept the 16bit pmode crap; hardly anybody used it.

It's not about making life easier for any language or programmer, be it C or asm (although that could have been a consequence); it's more about having more efficient chips - faster and less power-hungry. Of course it's far too late by now, because Intel decided to keep on patching and patching. It would have been nice to have a less crippled architecture, though.

As for the whole AMD thing, I think it's a lousy thing to do. We finally had a chance to break free from IA32 when making the 64bit jump, and what does AMD do? Idiots. The thing to keep in mind here is that all major and important software would be ported to native 64bit, while the processors would be more than fast enough to run 32bit legacy software (just look at the old 64bit Alpha architecture - the 32bit x86 -> 64bit Alpha JITer was so efficient that some code actually ran faster on the Alpha than on the 32bit processors available at the time).

But oh no, AMD had to hang on to IA32 since they don't have what it takes to design an architecture of their own. Sigh.
Posted on 2004-04-09 19:25:35 by f0dder
Simple: the x86 is a CISC architecture. CISC architectures need much more complex circuitry and some microcode ROM to be implemented, so there isn't much space left for registers. If you want more registers, try working with a RISC architecture - they usually have huge numbers of registers - or work with the Motorola 68K series.
Posted on 2004-04-09 19:44:21 by x86asm

Before the 8086 was the 8088 (no 16-bit address bus on this baby).

You mean 16-bit data bus :grin:
Posted on 2004-04-09 19:45:53 by x86asm
Hi f0dder,

I completely disagree. From a non-programming standpoint, that is, as a normal everyday user, I couldn't care less about the difference in architecture. For me the difference is that what used to take 20 microseconds now takes 3 microseconds - nothing there I could ever notice. In business, speed is critical, but the P4HT machines they are installing with Linux are more than fast enough for any foreseeable future need. Processing power is for corporations, web servers and gamers; corporations can afford the new software, web servers don't have to run Intel chips, and that leaves gamers, and that part of the community holds little interest for me.

What does interest me is the thousands of dollars I have spent on software over the years, all of which still runs perfectly fine on my P4, P3, 486 etc... AMD did the right thing: they responded to what the customers want, not what some conglomerate thinks they need. The same thing happened to IBM in the late 80's with the PS/2; they thought the ISA architecture was too limiting and decided they knew better than their customers, and well, I'm not typing this on a PS/2 15 years later.

Again, I don't care much about the problems of Intel or what would be better; I care about leveraging my existing investment in software. Just to make sure I don't need any more power or CPU features, I think I'll check my CPU idle time - oh, 99%, better upgrade.
Posted on 2004-04-09 20:08:38 by donkey
Well donkey, from what you described, you don't need 64bit technology (nor do a lot of people). So try to think of it this way instead:

For those people who need 64bit technology and want the "latest and greatest", wouldn't it be nice to have a machine that ran a few percent faster without paying extra? (As in, when already shelling out for a 64bit machine.) Since such people want the "latest and greatest", backwards compatibility shouldn't be much of an issue, as long as there are drivers for the most recent & high-performance stuff, and the few legacy apps they need can run with reasonable performance.

I'm sure the systems would run better without x86 legacy junk - look at some of the other CPU architectures around, they do a lot more with a lot less MHz.

Ok, back when the switch was made to 386 processors, there probably wasn't enough brute power & tech knowledge to run legacy code in a JITer, and by today there are very large amounts of legacy code. The CPUs are *fast* though.

I personally think 64bit computers for the desktop are a bit silly; you don't really *need* them unless you're doing very heavy things - heavy databases, heavy video processing, etc. However, whether we like it or not, 64bit is coming to the desktop, and it will succeed because of media hype and because it's backed by the latest MS operating systems.

In the context of that, I think it would be nice to have a clean system next time I'm forced to upgrade (which will probably take a while, since my current system is pretty okay). AMD spoiled that by their move (I bet Intel had design plans for their counter-CPU all along, but weren't going to manufacture it unless a hybrid-64 CPU was released by AMD).

...

At least other junky parts of the PC architecture are getting swapped out. Not exactly in a backwards-compatible way, but as long as you have an OS with support, it'll work (because, thankfully, most people have moved away from programming the hardware directly). I'm talking about things like replacing the ISA/VLB/PCI/AGP buses with PCI Express (or whatever it's called), using the new timers (APIC?) instead of the PIT, local APICs instead of the old interrupt controllers, DDR RAM, SATA hard drives, USB for keyboards/mice, etc. Too bad there's still a lot of legacy junk & emulation left behind.

This might not be stuff that "common" people can feel, but if you push your system even a bit (running heavy data processing, massive file editing, hell, even today's monster games), it does make a difference.

And, well, perhaps most important: I don't like the "feel" of legacy that's more than a few decades old, especially when the legacy restrictions and workarounds are silly. I'm sure a lot of OS programmers will agree on this point :)
Posted on 2004-04-09 20:45:31 by f0dder
Hi f0dder,

I will probably buy an AMD 64 bit machine; a few percentage points mean much less to me than the pile of software on my desk. Hell, my PeachTree home accounting was written for Windows 3.1 and it does everything I need and much more; it runs perfectly fine under Win2K, and I could probably run it on a 286, since that was the machine I had when I bought it. I have never upgraded it because I have never had cause to want more than what it does.

The marketing strategy of Intel is to make you think that you need or want a feature; the last useful feature added to the x86 family was MMX. After all, let's face it, SSE and SSE2 are virtually unused and probably won't be. What it boils down to is that Intel needs to make something that you will want to buy, and in reality there isn't a hell of a lot left that they can add anymore. They have reached a plateau where functionality has met consumer demands, and they need a new angle. 64 bit is the big hope for them; if it fails they have nothing left to offer. For myself, I buy new machines for fun, but I also realize why, and I am perfectly aware that I don't need the extra power. I am not easily duped into the bandwagon mentality that the chip manufacturers like to promote in order to fool unwary users into upgrading for no reason. For example, I have a friend who has a P4, 1 Gig with an 80 Gig drive; he bought it to replace his PMMX and all he does is browse the internet.

Here, we have a skewed view of the world outside of the dev cycles. In reality, outside of gamers, most people jump on the net once or twice a week, balance their checkbook and pay a few bills with their computer. I was one of them not too long ago, before I got into my current obsession, and I am increasingly aware that I now spend more and more time doing furniture than programming, so I will probably be one again. That group represents the vast majority of users, but when you only move in communities that attract power users you tend to forget that fact.

64 bit will catch on and become the standard, of that I have no doubt. But it will not be because it is better or faster, it will be because some blue guys dance around and make the working stiffs think they need it.
Posted on 2004-04-09 21:34:21 by donkey

After all, let's face it, SSE and SSE2 are virtually unused and probably won't be.

I think DirectX uses it, but I dunno how much or how efficiently. I certainly do like the SSE optimizations in DivX encoding, though, and I'm pretty sure the people who set up a "pretty big" Pentium 4 cluster at a Danish university like the instruction set too. It might not be used all that widely, but where it is, it does kick some serious behind.


What it boils down to is that Intel needs to make something that you will want to buy,

True, same goes for just about any company in this business.


and in reality there isn't a hell of a lot left that they can add anymore.

For Mrs. and Ms. Doe, indeed not - they'll just have to be slapped around with some hype. Gamers will always need more, though, and it's not only because coders are sloppy - a lot of interesting things are being added to games these days. I wish gameplay were one of them, though :rolleyes:. Brute MHz force is also interesting for media encoding, and since DV cams and DVD burners are affordable now, more and more "normal" people are messing with this. Not to mention all the people encoding DVDs into DivX for more or less legit reasons. (Legit reason: storing all your DVDs as DivX on a server on the LAN, then playing them on your KiSS DVD/DivX player in the living room. Yep, I know people doing this.)


That group represents the vast majority of users but when you only move in communities that attract power users you tend to forget that fact.

I haven't forgotten it :) - my idea was that power users would appreciate a bit faster systems, programmers would appreciate a less clumsy architecture, and the 95% regular people wouldn't be able to feel the speed hit on their legacy apps running on their wildly overdimensioned beast of a computer :)


64 bit will catch on and become the standard, of that I have no doubt. But it will not be because it is better or faster, it will be because some blue guys dance around and make the working stiffs think they need it.

Indeed, we fully agree here. Some power users and programmers will enjoy 64bit (yay, larger address space - now I can fully memory-map that 8-gig file, or those 10 1-gig files; yeah, I've wanted to do things like this myself), etc. But it's only a few percent of us, and frankly I would rather see a clean 32bit system. But that's the dreamworld ;)
Posted on 2004-04-09 21:55:33 by f0dder
To all interesteds,
A better architecture has been here for a long time now. The Motorola 68000 series does not suffer from register starvation (16 registers). Plus, it can do a memory-to-memory transfer without going through a register first. And what about the PowerPC? Ratch
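
For comparison, a rough x86 sketch of why that matters: plain MOV can't take two memory operands, so you have to bounce through a register (the string instruction MOVS being the one hard-wired exception):

    ; 68000: move.w (a0),(a1) copies memory to memory in a single instruction.
    ; x86 (NASM-style, 16bit) has to do it in two:
    mov  ax, [si]         ; load from source
    mov  [di], ax         ; store to destination
    ; (MOVSW is the exception, but it's hard-wired to DS:SI -> ES:DI)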
Posted on 2004-04-09 22:12:01 by Ratch

To all interesteds,
A better architecture has been here for a long time now. The Motorola 68000 series does not suffer from register starvation (16 registers). Plus, it can do a memory-to-memory transfer without going through a register first. And what about the PowerPC? Ratch


A good example, Ratch!

How many Mac users understand the difference in architecture? I know people who have iMacs and have bought PC software for them, not even understanding that the OS is different. I have an iMac on my desk - it's my wife's - and I use it for almost all of my graphics and DVD stuff as well as many other things. But what is the most common difference noticed when a PC user tries a Mac? Believe it or not, it's the way the mouse tracks; it appears more stable on the Mac, and people like that for drawing.
Posted on 2004-04-09 22:28:14 by donkey

The same thing happened to IBM in the late 80's with the PS/2; they thought the ISA architecture was too limiting and decided they knew better than their customers, and well, I'm not typing this on a PS/2 15 years later.
They were ahead of their time. (Plus they were trying to go to proprietary hardware, like their mainframe business.) The Pentiums I run don't have ISA, they run PCI, AGP, and some kind of SDRAM interface (PC-100, PC-133, or a DDR SDRAM interface).

I do have to agree that at a certain level, it really doesn't matter what the underlying machine is, as long as it runs your software, and runs it well. Consider the fact that in 1983 (2 or 3 years after the IBM PC was released), CP/M machines (8-bit) were still being sold. I'll mention Apple II in passing, because I don't have a clear memory of when they were hot.

Originally posted by Sephiroth3
Nah, the "16-bit stuff" has always been there, right from the 8086. In fact, there are many instructions which are only defined for 16-bit operands, such as push, pop, call far, jmp far, and some instructions which have one byte encodings for 16-bit register operands only (INC, DEC, XCHG AX).
Although they are not binary compatible, there is a remarkable similarity between the 8080/Z80 processors and the 8086.

The ability to refer to two 8-bit registers as a single 16-bit register. Or to put it another way, the ability to refer to each half of a 16-bit register by name. Keep in mind that Intel defined a mapping of 8080 registers to 8086 registers for code migration.
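
A trivial illustration of that pairing on the 8086 side:

    mov  ah, 12h          ; write the high byte of AX
    mov  al, 34h          ; write the low byte of AX
                          ; AX now reads 1234h, just as H and L together formed HL on the 8080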

The "string" instructions of the Zilog Z80 and its use of implicit address and counter registers.

The Intel/Zilog 8-bit processors also had 16-bit pushes, pops, incs, and decs in one byte.

And the "far" features aren't 16-bit features. They're "segmented memory" features.

Originally posted by Sentient
Before the 8086 was the 8088 (no 16-bit address bus on this baby).
No, the 8086 preceded the 8088. Intel designed a "16-bit processor" and then replaced the 16-bit BIU (bus interface unit) with an 8-bit version for the '88. The execution unit is the same on both processors, and it is 16-bit. There was an article or paper which mentioned Intel redoing the prefetch queue of the '88 (during development) because it was too busy when it was the same size as the '86 queue.
Posted on 2004-04-10 00:07:47 by tenkey

Hi, not sure if this is the most appropriate forum for this. I was just wondering, why are there only four general purpose registers (and just 8 registers in total)? Why not twice as many, or half as many...?
:confused:
So...to answer this question...

Actually, this thread probably belongs in Heap.

Only four registers have names for byte values. Their history...

8080:

7 registers: A, B, C, D, E, H, L

3 register pairs: B & C, D & E, H & L
Zilog syntax named them BC, DE, and HL everywhere.
Intel syntax used only HL, and only in instruction names: LHLD, SHLD, XTHL, and PCHL

Intel mapping of 8080 to 8086:

A -> AL
B -> CH
C -> CL
D -> DH
E -> DL
H -> BH
L -> BL

which means

BC -> CX
DE -> DX
HL -> BX

Round the 7 byte-wide registers up to a power of 2 (meaning 8), and reuse the register field(s) to address 16-bit registers, yielding 8 word-size registers.
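
And that register field is only 3 bits wide in the ModRM byte, which is the mechanical reason the count stops at eight. A quick illustration (the encoding shown is what a typical assembler emits):

    mov  cx, dx           ; 89 D1 = 89, 11 010 001
                          ;   mod=11 (register), reg=DX=010, r/m=CX=001
                          ; each 3-bit field can only name 2^3 = 8 registers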
Posted on 2004-04-10 01:20:42 by tenkey

But what is the most common difference noticed when a PC user tries a Mac? Believe it or not, it's the way the mouse tracks; it appears more stable on the Mac, and people like that for drawing.

A USB mouse, or a PS/2 mouse with reports per second set to 200 (which only works on Win2k+, or with special drivers/tweak utils on 9x), will fix this :)

Btw, Macs might be a better architecture, but they lack Windows ;) (not necessarily a bad thing, though - considering things like an OS upgrade being as simple as dragging the CD icon to the hard drive icon).
Posted on 2004-04-10 04:15:59 by f0dder
I'm going to sound ignorant here, but why not add in new registers anyway? How does that affect backwards compatibility? For example, a new/extra instruction set with new/extra registers wouldn't affect old software, would it? And then it might provide benefits for newer software...?
:notsure:
Posted on 2004-04-10 04:30:15 by adamjjackson