Afternoon, Steve.
That must be a fine bottle of malt you've got there :grin: .
Afternoon, Scali.
Isn't the problem with these chips including legacy x86 that we don't actually need all of that?
Couldn't they cut the legacy x86 stuff out of the 64-bit chips? :confused:
i.e. get rid of segment/offset addressing (have the chip start in flat mode).
It would mean DOS and old Windows OSes could no longer run on these machines, but so what? Most people purchasing systems with these chips in them would be installing a late-model OS and software. This means no Win9x OS, no DOS, no Win9x software.
Afternoon, Randy.
I certainly don't like the idea of having to wait 15 minutes for software/data files to load into RAM whenever I start my PC. Unless, of course, the idea becomes prevalent to never, ever turn your PC off (once the major applications/data are cached then there will no longer be a problem with startup load times). I, for one, would not like to hear the fans churning all night long :tongue: .
I also do not see the typical domestic user using their PC to edit full-length home videos :eek: .
Multimedia businesses (and semi-professionals :grin: ) might find that useful, however most people I know only ever use a home PC for email, letters, Net browsing, and MS Hearts (and we've had the capability to edit video/sound cheaply since Win95).
I think Scali is mainly talking about the normal domesticated home user - not power users or businesses.
Also:
Apart from the extended addressing capability a 64-bit CPU will bring, what else would be improved?
We've already got fpu, mmx, sse, sse2, etc for helping speed up calculations (usually for multimedia applications).
With 64-bit general-purpose registers, what would/could we possibly do with them? It's not just a matter of "wait until software is built for them to see what software will be built for them". We are designers and programmers here. I'd like to know what sort of coding we're likely to be getting into in the next few years.
Cheers,
Scronty
Isn't the problem with these chips including legacy x86 that we don't actually need all of that?
Couldn't they cut the legacy x86 stuff out of the 64-bit chips?
i.e. get rid of segment/offset addressing (have the chip start in flat mode).
It would mean DOS and old Windows OSes could no longer run on these machines, but so what? Most people purchasing systems with these chips in them would be installing a late-model OS and software. This means no Win9x OS, no DOS, no Win9x software.
Apparently people prefer to have support for 8086 code and DOS 1.0 on their new PCs.
And if you make the step to abandon legacy support anyway, you might as well go all the way, and go Itanium.
It can run legacy software, albeit with the help of a JIT. But one can argue that legacy software isn't such a big issue, since you ran it on your old system as well, which was much slower. The thing is that you have to make the move at the right time.
Intel is slightly late at the moment. Their Itaniums can get about 1.5 GHz Xeon performance on x86 code. Had they had these a while ago, when people were upgrading from < 1.5 GHz Xeons, there would not have been a problem at all.
There still isn't much of a problem if you ask me, if you can accept the performance of a 1.5 GHz Xeon for your legacy code. As long as your performance-critical stuff is native, there should be no problem (who cares if IE or Word run on 'only' a 1.5 GHz Xeon? :)).
Apple made the move at the perfect time. The PowerPC was so much faster than the 68k at the time, that even the emulation of 68k was not a big issue, it was still 'fast enough', and the gain of native code was so large that it was hard to pass up.
And of course it helped that the 68k stopped there and then; it would never get any faster. PowerPC was the only CPU with a future. Intel wanted to create the same situation for PCs, I think. But AMD spoilt that for now.
I think Scali is mainly talking about the normal domesticated home user - not power users or businesses.
Yes and no. I mean home users and the average office user (IE, Outlook, Office, etc). I think that is by far the largest group of computer users. The other groups are small specialist groups I guess (CAD/CAM, audio/video engineering, databases, etc).
They generally don't use regular PC systems anyway, but something slightly more powerful (the so-called workstations, not necessarily x86, not necessarily Windows, and not necessarily 32 bits).
Afternoon, Scali.
Yes and no. I mean home users and the average office user (IE, Outlook, Office, etc). I think that is by far the largest group of computer users. The other groups are small specialist groups I guess (CAD/CAM, audio/video engineering, databases, etc).
They generally don't use regular PC systems anyway, but something slightly more powerful (the so-called workstations, not necessarily x86, not necessarily Windows, and not necessarily 32 bits).
I concur.
Most of the database users I've met access the database on a server over a network. It ends up being the server and network speed which is the bottleneck - not their own workstation. The workstation's client only sends commands and receives results. No (not much) actual number/data crunching gets done on the workstation.
Ditto for multimedia producing professionals.
Take Weta for example. The video manipulation and adjustments get done on their servers, not the workstations, AFAIK.
Cheers,
Scronty
Using a powerful server is one way, yes.
But I was referring to the standalone type of workstations.
Basically those are very high-end PCs or other proprietary hardware.
For example, HP sells single and dual Itanium workstations, with ATi FireGL display cards (see here: http://www.hp.com/workstations/index.html). It's not quite big server/mainframe material yet, it still fits under your desk. But it's well out of the range of regular PCs. The FireGL alone is about as expensive as an entire PC fitted with a Radeon (the FireGL's mainstream cousin) :).
There are also Xeon and Opteron-based workstations of course.
But anyway, such users aren't the ones we're talking about here, I suppose. Since they are already using 64 bit PCs, or multi-CPU, or whatever. But they have a reason to (although they often use 64 bit CPUs because they happened to be available anyway, not because they actually require the extra addressing. And unlike the x86-world, they didn't have a 32 bit alternative which would offer more performance).
Scali,
When I bought that 486 in 1990, a Trident 8900 was so hot it sizzled. I think I still have it in my ancient computer junk, even though I don't think I have a board that could use it. The real big deal was the $1500 board; then I had to pay for the 486 after that, and the 300 MB full-height ESDI drive, complete with controller card, cost me about $2700.
Still, it did the job for years, and I ran it until late 1994, when the ESDI controller card started to fail. I still own the 2 ESDI disks but cannot get a WD1007a ESDI card to read them.
Trident was notorious for its dreadful performance and generally being useless in anything but mode 13h.
Absolute budget crap. I had a Paradise ISA card back in those days, and later upgraded to a VLB Diamond SpeedStar Pro.
Trident? Ugh. They couldn't even play Doom with the window set to largest, in 320x200. Even my Paradise ISA card could do that (and that one came out of my old 386sx, because I only upgraded the motherboard at first; I had to save a bit extra for a decent VLB card. Imagine how pathetic Tridents were :))
And you say you should buy decent hardware?
Some things to say...
Firstly, randall.hyde is off about how video editing works. Last time I used video editing software, basically the software batched my operations, and when I was done, it would render all operations to a new file (yes mostly linearly).
Yes, but that is called rendering. That is not editing.
The caching also doesn't involve any 'data moving' in this case, other than the first time to initialize the cache. After that, it's static.
Really?
As for the 3d avatars, I already explained how they would be implemented, and how this would not take a lot of memory. I find it rude to just ignore that, and reiterate the same point, which is already proven wrong.
Proven, eh? And what "proof" do you offer? Have you implemented this? I want to see your code!
And synthesizer samples are very much streaming data. If you do random access on them, you would simply get white noise, or something. You start at a certain point in the sample, and then play linearly, much like with video. And this means that the same caching scheme would work.
How little you know about this. Just because you can post a flippant remark doesn't make you right. I would *love* to see you stream all the instruments for a 101-piece orchestra from a disk subsystem when you're employing gigasamples. For an individual instrument, the individual samples (usually about 30 seconds' worth) are long and spread out all over the place. Guess what happens when you change notes? Random access! Guess what happens when you don't play the full 30 seconds? Random access (to the end of the sample to get the decay). And guess what happens when you're playing multiple instruments, chords, and notes all over the place? Random access like mad!
Yes, you're absolutely right. It's just a linear data stream. Not!
That's the point. I am not against 64-bit, but I am against x86-64, because it's just a hackjob, which will effectively set us back 10 years again, with all the legacy support holding it down, while there are sleek, modern 64-bit CPUs on offer as well. Most of your monologue is about moving forward, and taking advantage of new technology. Then you should actually agree with my standpoint, that we should USE new technology, and not x86 with a 64-bit extension patched on, since it is much slower and more expensive than a pure 64-bit one.
Yes, we *should* use new technology. But that ignores the economics of the situation. People aren't going to reinvest in new software. And they aren't going to pay big bucks for hardware that runs their current software *slower* than their old software.
.NET would fix the software problem.
No, because it still requires people to purchase or upgrade their software to take advantage of the new hardware. If they're willing to switch over to .NET, why not just get new software in the first place? Gee, that makes Linux and StarOffice start to look pretty good from a ROI point of view.
And I'd rather stick with 32-bit x86 for a bit longer (while we still can), if that means we will eventually move to a GOOD 64-bit system. Going x86-64 now will mean that we will remain locked-in with x86 for longer, and no more direct excuse to move away from it (after all, we already have 64-bit then). And that is why I do not appreciate AMD's move to go x86-64, especially since Intel apparently was NOT going to go down that road.
And I would therefore like to ask... WHY do you have a Macintosh? Because you wanted to choose the best technology for the job? I would like that freedom as well.
No one is taking that freedom away from you. You can go buy a 64-bit Itanium, HP, Sparc, MIPS, etc., today. The problem, though, is that the marketplace won't follow you in that direction. You're not asking for the freedom to do what you want to do, you're asking for the freedom to force your goals on everyone else. What about *their* right to freely choose where they want to go? What about AMD's? AMD stepped in and filled a market void. They are successful because that's what people want. If they were not successful at this, why did Intel follow their lead and announce 64-bit extensions to the x86? The problem is that Intel was trying to force *their* vision down people's throats (just as they tried with the iAPX432) and the people decided against that by voting with their wallets and their enthusiasm.
The good news is, just wait a few years and we'll see how it all turns out. I'm old enough to remember when 4GB was considered "infinity". I can remember technical papers at conferences trying to decide what on earth people would do with their machines when they had 256MB installed. I'm old enough to remember Macintosh users saying things like "this audio editing application is really neat, but I can't get very much audio on my 800K floppy disk."
Yes, *current* users can get by with a PII and 128MB of RAM. Think they could get by with 32MB of RAM? I happen to be old enough to remember when people were saying the same thing about 128MB, and 32MB, and even 640K, for that matter. Today's users might be happy with 128MB running today's apps, but once tomorrow's apps become available, it will be a different thing altogether.
Cheers,
Randy Hyde
Morning, Scronty,
That's not how those systems work. When you turn on the system, they begin caching up all your most recently-used data files and executables in RAM. As you modify files (or executables, for that matter), the changes are written back to disk on a scheduled basis. The disks are used as backing store, memory is used as primary store. The only trick to powering down is making sure that all pending writes are completed first (a problem that exists with modern OSes, I might add). There were a lot of papers on this type of OS optimization in the late 1990s (when I was actively engaged in OS research).
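In rough C terms, the scheme looks something like this (just a sketch; every name here is invented for illustration):

    /* Write-back file cache: RAM is the primary store, the disk is only
       backing store, and dirty data is flushed on a schedule (and once
       more before power-down). */
    #include <stdint.h>
    #include <string.h>

    #define CACHE_SLOTS 1024

    typedef struct {
        char     path[256];  /* which file this slot holds     */
        uint8_t *data;       /* file contents, resident in RAM */
        size_t   size;
        int      dirty;      /* modified since the last flush? */
    } cache_slot;

    static cache_slot cache[CACHE_SLOTS];

    /* All writes hit RAM; nothing waits on the disk. */
    void cache_write(cache_slot *s, size_t off, const void *src, size_t len)
    {
        memcpy(s->data + off, src, len);
        s->dirty = 1;   /* committed later, by the scheduled flush */
    }

    /* Run from a timer, and again before powering down. */
    void flush_pending(void)
    {
        for (int i = 0; i < CACHE_SLOTS; i++)
            if (cache[i].dirty) {
                /* write_back_to_disk(&cache[i]); -- hypothetical */
                cache[i].dirty = 0;
            }
    }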
I also do not see the typical domestic user using their PC to edit full-length home videos :eek: .
Multimedia businesses (and semi-professionals :grin: ) might find that useful, however most people I know only ever use a home PC for email, letters, Net browsing, and MS Hearts (and we've had the capability to edit video/sound cheaply since Win95).
I think Scali is mainly talking about the normal domesticated home user - not power users or businesses.
Those are power users *today*. But follow the curve: tomorrow's home users are the ones running applications that today's power users dream about. Gee, 10-15 years ago we didn't have browsers. Think of the power it takes to run today's IE reasonably well. That would have been a *very* powerful machine just five years ago. Part of this is crap coding, of course. But the functionality in modern browsers is rather impressive these days. Soon, we'll see the equivalent of Apple's iTunes for video (iVideo?). And, unlike audio, people might actually want to watch multiple video streams concurrently (obviously, this can't be done well with audio, but it's quite reasonable for video). This is an example of a "domesticated home user", not just a power user or business.
Indeed, integration of the home entertainment system with computers is one of the next big killer apps, and that's where having lots of memory will come in. Don't think home users will want a database server? Think again. They'll connect three or four TVs throughout their house to a single "video on demand" server so they can watch, interactively, any movie on their server (with each TV watching a different video stream). Complete with fast forward, rewind, nX speed, content switching, and so on.
Having done just a *little* work on a video on demand system (granted, for cable companies, not home use), I can tell you that the way to get high-quality video is to buffer up a *ton* of data in memory and feed the streams from memory. This avoids the crazy random access patterns you get trying to feed multiple video streams from the same hard drive. It is easy for me to imagine a typical home system playing back three or four video streams while recording two or more from cable simultaneously. Though the aggregate data rate of the disk drive *might* support all this (or you could limit the number of video streams to what the disk drive could support), that ability would be blown to bits with the random access patterns you'd encounter processing multiple video streams. The head movements alone would kill you. This is a perfect example of where a "home" user could make use of more than 4GB of memory.
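To make the buffering point concrete, here's a toy C sketch (invented names; ring-buffer wraparound and error handling elided). Each stream gets a big resident buffer, refilled with large sequential reads, so playback never waits on a seek:

    #include <stdint.h>
    #include <stdio.h>

    #define STREAM_BUF (64 * 1024 * 1024)  /* 64MB of read-ahead per stream  */
    #define READ_CHUNK (4 * 1024 * 1024)   /* refill in big sequential reads */

    typedef struct {
        FILE    *file;   /* the movie file on disk              */
        uint8_t *buf;    /* large resident buffer fed to the TV */
        size_t   fill;   /* bytes buffered so far               */
        size_t   play;   /* bytes the viewer has consumed       */
    } video_stream;

    /* One big sequential read per stream per cycle: with N streams that
       is N seeks per cycle, instead of a seek per frame. */
    void refill(video_stream *s)
    {
        if (s->fill - s->play < STREAM_BUF / 2      /* half drained?       */
            && s->fill + READ_CHUNK <= STREAM_BUF)  /* wraparound elided   */
            s->fill += fread(s->buf + s->fill, 1, READ_CHUNK, s->file);
    }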
Also:
Apart from the extended addressing capability a 64-bit CPU will bring, what else would be improved?
We've already got fpu, mmx, sse, sse2, etc for helping speed up calculations (usually for multimedia applications).
With 64-bit general-purpose registers, what would/could we possibly do with them? It's not just a matter of "wait until software is built for them to see what software will be built for them". We are designers and programmers here. I'd like to know what sort of coding we're likely to be getting into in the next few years.
Cheers,
Scronty
Hard to say. To me, the big advantage of the 64-bit architecture is the address bus. Being able to process 64-bit integers with one instruction is slightly useful, but I don't think it will have that great an impact on software. Yes, some algorithms will be improved. For example, block moves will probably run twice as fast :-). Some large data type operations (e.g., multimedia) might be improved. But overall, most applications are still processing bytes or words (browsers, email, Hearts). Given that most data structures are not "bit limited" today, the primary reason I see for 64-bit arithmetic is so that you can manipulate 64-bit pointers to memory. Then again, I may be suffering from the same "32-bit myopia" that I'm accusing others of :-)
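For instance, in plain C (a trivial illustration; real compilers and memcpy implementations already do this kind of widening):

    #include <stddef.h>
    #include <stdint.h>

    /* Same copy, but the 64-bit version moves 8 bytes per operation
       instead of 4 -- the "block moves run twice as fast" case. */
    void copy32(uint32_t *dst, const uint32_t *src, size_t bytes)
    {
        for (size_t i = 0; i < bytes / 4; i++)
            dst[i] = src[i];    /* 4 bytes per move on a 32-bit CPU  */
    }

    void copy64(uint64_t *dst, const uint64_t *src, size_t bytes)
    {
        for (size_t i = 0; i < bytes / 8; i++)
            dst[i] = src[i];    /* 8 bytes per move with 64-bit GPRs */
    }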
Cheers,
Randy Hyde
Yes, but that is called rendering. That is not editing.
It's the way video-editing programs work. What's your point?
Really?
Yes, instead of caching 1 thumbnail for each interval, you cache a small videoclip. Simple, no?
Proven, eh? And what "proof" do you offer? Have you implemented this? I want to see your code!
Of course I have implemented this. All you need is motion capture and bone animation. It's been done for years in the movie industry, and now also in the game industry.
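Back-of-the-envelope, with all figures assumed purely for illustration:

    /* One keyframe per bone: a rotation quaternion plus a translation. */
    typedef struct { float quat[4]; float pos[3]; } bone_key;  /* 28 bytes */

    /* Assume ~50 bones, 30 keys/sec, a 10-second clip:
       50 * 30 * 10 * 28 bytes ~= 420KB per animation clip -- and the
       mesh itself is shared by every animation and every avatar. */

So even dozens of distinct motion-captured clips fit in a few megabytes.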
Guess what happens when you change notes? Random access! Guess what happens when you don't play the full 30 seconds? Random access
Random would imply that you don't know anything about the behaviour. Here you know at least that you will start at the start of the sample, then play it linearly, until a certain point. It's quite easy to apply a decent caching scheme optimized for such cases.
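A sketch of the sort of scheme I mean (made-up names and sizes): keep the start of every sample resident, and stream the remainder with read-ahead, since playback is linear from the start:

    #include <stdint.h>
    #include <stdio.h>

    #define ATTACK_BYTES (64 * 1024)  /* always resident: covers note-on latency */

    typedef struct {
        uint8_t attack[ATTACK_BYTES]; /* preloaded when the bank is loaded   */
        FILE   *rest;                 /* remainder, streamed with read-ahead */
        long    length;
    } sample;

    /* On note-on: play from attack[] immediately, and start prefetching
       the streamed part so it arrives before attack[] runs out. A decay
       tail could be handled the same way, with its own resident head. */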
The whole argument is rather moot anyway, since specialized hardware will be so much better at this than a simple PC.
I use dedicated hardware almost exclusively for sound processing in my studio.
But that ignores the economics of the situation. People aren't going to reinvest in new software.
They do. They upgrade their Windows and Office all the time.
And if they don't, they shouldn't buy an x86-64 either, because without a new Windows, they cannot run 64 bit applications at all, and even if they do have a new Windows, they still need to buy new 64 bit versions of their applications. Might as well get the best possible 64 bit hardware and software, while you're at it.
If they're willing to switch over to .NET, why not just get new software in the first place?
Because .NET will be cheaper, since it's a 'develop once, run anywhere' system. It might be cheaper to buy, and at least you only have to buy it once, if you decide to move to another platform later.
You're not asking for the freedom to do what you want to do, you're asking for the freedom to force your goals on everyone else.
This makes no sense to me. I am not forcing anyone.
They are successful because that's what people want. If they were not successful at this, why did Intel follow their lead and announce 64-bit extensions to the x86?
I already said that people often don't know what they really want.
It's pretty obvious that Intel has to follow AMD if they don't want to lose the x86-market. And it should be obvious that Intel's main source of income is the x86-market. It has nothing to do with the technical merits of x86.
The problem is that Intel was trying to force *their* vision down people's throats (just as they tried with the iAPX432) and the people decided against that by voting with their wallets and their enthusiasm.
Yes, the sales figures of AMD64 systems are overwhelming, as is the 64 bit software.
And as said before, Intel never released an Itanium aimed at the general public, so stop repeating the same statement. They didn't force anything, and they didn't lose to the AMD64 because they have not been aiming at the same market segment.
once tomorrow's apps become available, it will be a different thing altogether.
Perhaps tomorrow's apps show that we should have gone for Itanium or G5 after all, and x86-64 turns out to be a dead-end.
The head movements alone would kill you. This is a perfect example of where a "home" user could make use of more than 4GB of memory.
Again this is a case of more memory, but not 64 bit addressing.
This sounds like a job for a caching HDD controller. The CPU doesn't need to know anything about this cache at all.
And even if you were to cache it with the CPU, not all cache would need to be in the same pool I suppose.
If you would have one process for each file that is accessed (it would have to be processes rather than threads, since threads share a single address space), then each could have its own 4 GB cache segment, and 32 bit would still be plenty. Something like the sketch below.
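A Unix-flavoured sketch of that idea (the sizes and serve_stream() are invented; given PAE or similar, the OS can back the processes with more than 4 GB of physical RAM):

    #include <stdlib.h>
    #include <unistd.h>

    #define PER_STREAM_CACHE (1UL << 30)   /* ~1GB of private cache each */

    int main(void)
    {
        for (int i = 0; i < 8; i++) {
            if (fork() == 0) {             /* child: handles one file */
                void *cache = malloc(PER_STREAM_CACHE);
                (void)cache;
                /* serve_stream(i, cache); -- hypothetical per-file loop */
                _exit(0);
            }
        }
        return 0;    /* 8 x 1GB of cache: well past 4GB in total */
    }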
More tasks that are better suited by dedicated hardware. You wouldn't even WANT to do such things on a regular PC.
I would think that such a movie stream server would be a big 64 bit server, not an x86-based machine, if it were a regular computer system at all.
The TPC-C database does show a 4P HP Itanium 2 box that delivered 11% better TPC-C performance than its Opteron counterpart.
QED
The TPC-C database does show a 4P HP Itanium 2 box that delivered 11% better TPC-C performance than its Opteron counterpart. But that was achieved at $6,000 more in cost.
Rule of demand and supply. Want to lower the price? Buy it.
...cool, I'll buy a Quad Opteron. :)
I hold you personally responsible for x86 being with us longer than it should be.
And your taste in hardware is particularly lousy. Why buy 70s technology when you can get something new and shiny which performs?
Well, if the trout are any indication of what buying into the "new and shiny" results in - I'll stick to choosing my own poison after detailed research. I appreciate your comments on Itanium as I'm certain I will be buying one in the future - it is a good design which Intel has bet the farm on.
Afternoon, Randy.
Those are power users *today*. But follow the curve, tomorrow's home users are the ones running applications that today's power users dream about. Gee, 10-15 years ago we didn't have browsers. Think of the power it takes to run today's IE reasonably well. That would have been a *very* powerful machine just five years ago.
I still don't see it.
10-15 years ago we didn't have the browser but we also didn't have much of an internet.
What I was getting at is that when Win95 came out we did have audio and video editing software. Typical home users have, over the past decade, decided to not use that software. It wasn't because the platform was limited. People just didn't need it. They did find a use for browsers, email, MS Hearts and letter-writing (via MS Word).
These are applications which don't require massive amounts of processing power. They are only limited by the medium they use (i.e. mainly internet access speed - modem vs adsl vs cable). Most people I know still only have a modem for Net access, thus limiting what they use for Net communication (i.e. they don't use voice or video communication with Windows Messenger). If they had adsl or cable connections then they'd be into voice ("free international phone calls") and video chatting in a big way. This is with current technology, and a 64-bit CPU wouldn't improve the situation or change the way they use their PC.
You say "follow the curve", but I'm saying it isn't a curve at all.
MS Hearts is the same application it was 8-9 years ago.
MS Word might have had quite a few more features added to it over this time, however most users only do basic letter-writing. They don't use tables/ etc.
Email has been used to send jokes and pictures since it first became mainstream. It's still used for sending jokes and pictures, even with all the new capabilities of the email clients.
Browsing is still done the same way it's always been done since Win95. Even with the advent of flash/ java applets/ etc most people still click-away from a site if it takes too long to display.
Unless something pretty impressive in software is built which takes the masses by storm, the computer software industry is on the brink of collapse. The current PC crop is enough for the typical user to not need any upgrading for years to come. I'm not talking about the usual "purchase a replacement PC every 3 years". I mean the typical user wouldn't find a need to replace or upgrade their PC for the next ten years.
Unless something impressive comes along.
I'm just wondering what could possibly be impressive enough to make typical home users decide they have a need to upgrade/ purchase new hardware/ software.
Editing large video files is not it - they could be doing that already.
Editing/ Playing 101-piece orchestra music is not it. Why do that when mp3/ CD/ DVD is plenty for most.
Streaming video to multiple TVs throughout the home so that occupants could watch whatever they wanted immediately and alone isn't it either. Such a system would be controlled via an Entertainment System which would be connected to the TVs/DVD players/video players/stereo surround-sound systems/fridges(?!?) etc. The PC would have no place in this.
Cheers,
Scronty
Scronty, I totally agree about the stagnation. Even video cards seem to have only a couple of generations left, and the game companies can't even keep up, it seems.
I think greater advances in user input recognition are right around the corner - as the computer generation ages, this becomes more mainstream. I'm very impressed by the written and speech recognition that exists, and I can imagine people getting tired of conforming to the machine. Keyboards could be gotten rid of, and gesture input will happen.