I also need to remind myself to add the Java code to the BHM project on SF. I think there are also some headers missing for the animation keys. I keep forgetting those things when I'm doing the actual coding on these projects.
Speaking of Java, that's the only engine that currently loads and displays the BHM file with skinning correctly :)
Perhaps because I haven't really touched that code since 2005 :)
On the other hand, I did fix a bug in it last week, one that has been in there since 2003 at least. It prevented Croissant 9 from running as an application on *nix operating systems. It did still work as an applet though.
At any rate, I'll work on the dependency walker's output tonight, and integrate it into the engine loader... a few false positives aren't that big of a deal, since you only need to check for problems if LoadLibrary fails... and the idea is just to give the developer/end-user some hints on where to start looking. It's never going to be 100% foolproof anyway. I'll just get the output as good as possible for now, as a sort of proof of concept, and then drop the priority of the dependency walker. From then on it's mostly cosmetics anyway. I'll concentrate on the other things first.
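To give an idea of what I mean, here's a rough sketch of the 'only diagnose when LoadLibrary fails' approach (TryLoadEngineDll and ReportMissingDependencies are just placeholder names, not the actual loader code):

```cpp
// Minimal sketch: only run the (potentially slow, false-positive-prone)
// dependency analysis once LoadLibrary itself has already failed.
#include <windows.h>
#include <cstdio>

// Hypothetical helper that walks the import table of the DLL on disk
// and prints which imports could not be resolved.
void ReportMissingDependencies(const char* path);

HMODULE TryLoadEngineDll(const char* path)
{
    HMODULE module = LoadLibraryA(path);
    if (module == NULL)
    {
        // Loading failed, so now it is worth spending time on diagnostics.
        printf("Failed to load %s (error %lu)\n", path, GetLastError());
        ReportMissingDependencies(path);
    }
    return module;
}
```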
Another reminder: I need to make GetModulePath() return true/false, and remove a redundant second CreateFile() call from OpenDLL().
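Roughly what I have in mind (just a sketch; the actual signatures in the repository may differ):

```cpp
#include <windows.h>

// Sketch: return success/failure instead of void, so callers can bail out early.
bool GetModulePath(HMODULE module, char* buffer, DWORD bufferSize)
{
    DWORD length = GetModuleFileNameA(module, buffer, bufferSize);
    return length > 0 && length < bufferSize;
}

// Sketch: open the DLL file once and hand back the handle,
// rather than calling CreateFile() a second time later on.
HANDLE OpenDLL(const char* path)
{
    return CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                       OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
}
```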
Okay, the Dependency walker routine is now integrated into the loader of the 3D engine.
If it finds problems loading any of the supplied DLLs, it will show a report of where it failed, and it will give some simple hints if it sees 'familiar' missing DLLs:

(In this case I ran it on 32-bit Windows XP and removed the D3D9 DLL, forcing the loader to fail and report why the D3D11 and D3D10 DLLs didn't load; otherwise it would just load the D3D9 DLL and not show any report at all.)
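The 'familiar DLL' hints are basically just a lookup from well-known DLL name patterns to a human-readable suggestion, along these lines (the patterns and messages below are only examples, not the exact ones the loader prints):

```cpp
#include <string.h>

// Sketch: map well-known missing DLL name prefixes to a hint for the user.
const char* GetHintForMissingDll(const char* dllName)
{
    if (_strnicmp(dllName, "d3dx", 4) == 0 || _strnicmp(dllName, "d3d1", 4) == 0)
        return "Install the latest DirectX redistributable.";
    if (_strnicmp(dllName, "msvc", 4) == 0)
        return "Install the Visual C++ runtime redistributable.";
    return "This DLL could not be found; check your installation.";
}
```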
Here's a downloadable package with the updated launch code: http://bohemiq.scali.eu.org/Engine20091411.rar
The dependency walker code can be found in the CPUInfo repository on SourceForge.net, in the Windows/Dependencies folder.
Disaster struck... my GeForce 8800GTS seems to have broken down. I started up a game of Shattered Horizon, and almost as soon as the game loaded, the display became garbled (I wasn't even playing yet, I was still in the menu). Windows tried to reset the video driver, but that failed and the whole system locked up.
When I rebooted, the display was still garbled, even during POST. Windows didn't even seem to detect the card anymore, because it dropped to standard VGA mode.
The only other PCI-e videocard I have is a GeForce 7600GT, so that's what I put in for now. The system works again, but the card is completely useless. It's not fast enough to play the games I regularly play, even at very low settings. And for software development it's pretty useless as well. I no longer have DX10 support, no Cuda, no PhysX, no Compute Shaders, no OpenCL... nothing.
Originally my plan was to wait for DX11 hardware. Well, technically the 8800GTS served that goal... The problem is, I haven't bought a DX11 card yet, because currently we only have AMD's offerings. I wanted to wait and see what nVidia comes up with. I did the same back when I bought the 8800GTS: I waited for the Radeon HD2900, and then picked the best option.
So I had three options:
1) Continue using the 7600GT until nVidia releases their hardware, and then get a proper card.
2) Buy an AMD DX11 card now.
3) Buy another nVidia DX10 card now.
After a few minutes of gaming, it became clear that 1) wasn't really an option. It could take months until nVidia's hardware is finally on the shelves (especially since I generally don't buy the high-end models, but the 'sweet spot' models below that, such as the 8800GTS: 80% of the performance for 50% of the price, that sort of thing).
So I had the dilemma of getting another nVidia card, keeping the Cuda, PhysX and OpenCL support but still missing out on DX11... or getting the Radeon, giving up on Cuda and PhysX, and settling for relatively poor OpenCL support for now.
I decided to go for a cheap model, still a reasonable upgrade from the 8800GTS, but not so expensive that I would regret buying it, should nVidia's next generation be a huge leap forward.
In the end, I went with a Radeon 5770... Partly for practical reasons... It's a smaller card, it uses less power, and it's less noisy. The 8800GTS barely fit in my case, so a GTX260 might not fit at all, and I might have to relocate my hard disks or whatever. Hopefully the 5770 will just fit.
In terms of Cuda, PhysX and OpenCL...
I mainly used Cuda because there weren't any alternatives. I wanted to move to OpenCL or DirectCompute once they became viable alternatives, because I didn't want to write nVidia-only code. AMD offers DirectCompute, with CS5.0 too... and their OpenCL is in beta now... hopefully it will get there. nVidia already had very nice OpenCL support... bit of a shame, but I'll live.
PhysX, well, that's a bigger shame. I've been a fan of the technology for a long time. However, I only own one game that actually uses it. Sadly I can't use accelerated PhysX for my own development on the Radeon... but it wasn't something I was really planning on short notice. If nVidia gets their next generation right, I will have an nVidia card with DX11 and PhysX soon anyway. The Radeon is only temporary, so I can continue developing my DX11 stuff, and play the odd game.
I just hope it won't take too long; the Radeons are all on backorder. Who knows, it might take so long that nVidia's DX11 cards will be available :)
A twist of irony though, it is because of my GeForce breaking down that I’m now buying a Radeon. Had the GeForce lasted until the next nVidia cards were out, I might not have bought another Radeon for years.
http://www.humus.name/index.php?page=News&ID=283
Yeah, I know... it's on the agenda already :)
I just ordered the Radeon 5770 because I'm not too confident that the card can be oven-fixed... in the event that the oven-fix works, I can still use the 8800GTS as an upgrade for some older box.
Then again, maybe not... it's a bit of a risk... if the card is short-circuited somewhere, it could damage the PC you try it in.
15 seconds in the microwave should do the trick :D :D
Plus you get a light show for free.
Always wondered if video card cooking was a myth (faked) or if it provably worked. Report back if you try it.
Heard of xbox 360 :) ? You'd then have heard of RROD. And then, the towel-trick. Literally dozens of millions of non-tech people did the oven-cooking to fix their consoles multiple times these past 4 years.
It's not exactly rocket-science to understand it, either ^^
Well, it's been about a week, but still no sign of the Radeon 5770...
@Scali
The 5000s have been on back order for a few weeks now. Apparently they aren't getting very good yields; a week or two back there were articles on news sites about the prices going up.
@Ultrano
Heard of xbox 360 :) ? You'd then have heard of RROD. And then, the towel-trick. Literally dozens of millions of non-tech people did the oven-cooking to fix their consoles multiple times these past 4 years.
It's not exactly rocket-science to understand it, either ^^
I was under the impression that the Xbox 360 failures were due to overheating.
The article you linked explains that cracked solder joints are why the graphics cards stopped working, and why cooking them can fix the problem. (Does overheating cause cracked solder joints? And then more heating repairs them?)
I don't claim to know the success and failure rate of the RROD "towel trick", but dozens of millions of successful attempts seems a bit high (considering there are only ~45 million 360s sold). Didn't the majority of people get their 360s replaced under warranty when they got the RROD?
In any case, Ultrano, your 360 anecdote and rhetorical "it's not rocket science" don't really present much evidence that cooking PCI-e graphics cards can repair them :D
If you have a link to someone empirically testing this (say, with 10 dead graphics cards and getting 3 out of 10 to work again) I'd be more than happy to change my thinking on the subject. Even Scali's first-hand account (if he tries it and if it's successful) would help dispel my doubts.
I don't want to hijack Scali's interesting thread any further.
Well, as far as I understand it, the problem is related to the new RoHS standards, where solder has to be lead-free (I work at a company that also manufactures its own hardware, and we had to go RoHS recently).
The lead in the solder caused the joints to be reasonably 'elastic'. The new lead-free solder is not quite as elastic, so soldered joints will crack sooner under stress... which could be from vibrations or from changes in temperature (different materials having different properties, so expanding and contracting at different rates).
From what I understood, the PCB of an XBox 360 'arcs' a bit when it gets hot, and the chips just 'pop loose' under the stress.
The idea of putting it in an oven, then, is that the solder will have a melting temperature of about 220-250°C, if you're lucky. So a standard oven is capable of generating enough heat to make the solder 'reflow' and restore any cracked joints.
A lot of hardware from a few years back (when manufacturers first went lead-free because of RoHS) has this problem, including the Xbox 360, various laptops, and both ATi and nVidia videocards. Lead-free solder and the manufacturing process have improved, so newer hardware is less likely to have issues.
I originally ordered a Club3d Radeon 5770... Club3d is always cheap, and I've had two of their cards before (the 8800GTS was a Club3d, and my first Radeon 9600 was also a Club3d). But because of the shortage, it didn't look like they were coming back in stock anytime soon, so I changed my order to Sapphire. It seems that most of the GPUs produced go to Sapphire now, as that's one of the few brands that's in stock (some other brands that are in stock are just way more expensive, which probably explains why those are still in stock).
I hope it will be shipped tomorrow.
Well, I've had my Radeon 5770 for a few days now... It's excellent value for money, that's for sure. It's quite a bit faster than my 8800GTS was, and the image quality is at least as good, if not better. I can now play Crysis at very high settings with 4xAA and still get 25-30 fps nearly everywhere. You can tell it's cheap though... the cooler is rather flimsy and very light, and the fan is quite noisy compared to the one on my 8800GTS.
But... I've already seen some render bugs with some software, including some of my own. Not just in OpenGL, the traditional Achilles' heel of ATi (funnily enough, the Unigine Heaven benchmark, the first DX11 benchmark, doesn't work properly in OpenGL mode, while it did work on my 8800GTS), but also in some older DX9 stuff. Ironically, some of it is stuff I wrote when I had a Radeon 9600, so I know it has worked with certain ATi hardware and drivers at one point.
And to add insult to injury, nVidia released official OpenCL drivers a few days ago. I had played a bit with those drivers just before my 8800GTS died, and they had excellent performance and supported a lot of OpenCL features, including OpenGL interop.
ATi is nowhere near that stage yet. They have beta drivers for OpenCL, which don't perform very well, and support a bare minimum of features only.
Oh well, it's better than nothing. I'll just have to wait... Either nVidia comes up with a killer card and I'm going back to nVidia... or hopefully ATi's drivers will mature, and ATi will sort out their OpenCL.
Getting an ATi card for development is like shooting yourself in the leg :D
Think so? I think it can be useful to make sure your code works as expected on crappy hardware, and it is a reasonable expectation that the majority of your end-users do NOT have a top-notch video card.
Should the market be coerced into purchasing a new card just to execute your applications?
Actually ATi is the one who's ahead this time. They are the only ones who have DX11 hardware on the market right now. So if I were to use my videocard to its full potential, then THAT would require people to purchase new videocards.
It's the software that's lousy with ATi... the hardware is just fine.
If you want to make sure your software works everywhere, no matter how crappy, I'd suggest using an Intel IGP :)
Now that's like shooting both legs, both arms, and poking one eye. :D
For the 'small challenge' I've hooked up my old Athlon XP1800+ system.
I figured I'd try to run the 3d engine on it... Got an even better example of my dependency checker now:

It tells me I need to install the latest DirectX redistributable!
The past few days I've been playing with my JPG loader again. I originally started that project about 10 years ago, but I got stuck at some point with a bug I couldn't locate, and I lost interest.
I had some spare time, so I figured I'd give it one last try, and I finally found the bug.
I'll be playing with the JPG decompression routine a bit more, making it a bit more robust and cleaning it up here and there... Maybe I'll even try my hand at optimizing some of the remaining code (mostly the iDCT), as the original point was to make a very fast implementation... and then I should return to the 3D engine stuff.
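For reference, the naive iDCT straight from the definition looks something like this; it's far too slow for real use, but it's handy to verify any optimized (fixed-point/SIMD) version against:

```cpp
#include <cmath>

// Naive 8x8 inverse DCT (floating-point reference implementation):
// f(x,y) = 1/4 * sum_u sum_v C(u) C(v) F(u,v) cos((2x+1)u*pi/16) cos((2y+1)v*pi/16)
// with C(0) = 1/sqrt(2) and C(k) = 1 otherwise.
void idct8x8(const float in[64], float out[64])
{
    const double pi = 3.14159265358979323846;
    for (int y = 0; y < 8; y++)
    {
        for (int x = 0; x < 8; x++)
        {
            double sum = 0.0;
            for (int v = 0; v < 8; v++)
            {
                for (int u = 0; u < 8; u++)
                {
                    double cu = (u == 0) ? 1.0 / sqrt(2.0) : 1.0;
                    double cv = (v == 0) ? 1.0 / sqrt(2.0) : 1.0;
                    sum += cu * cv * in[v * 8 + u]
                         * cos((2 * x + 1) * u * pi / 16.0)
                         * cos((2 * y + 1) * v * pi / 16.0);
                }
            }
            out[y * 8 + x] = (float)(sum / 4.0);
        }
    }
}
```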
I had already made a GIF decompressor before I did the JPG one (which was also a very fast implementation, even if I do say so myself)... Perhaps I should do a PNG one as well at some point, then I'd have the three major file formats covered. PNG shouldn't be that difficult.
At any rate, I suppose JPG is a good educational exercise, the format contains a few clever compression tricks, which you'll also find in various other formats.
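One example of such a trick is the zigzag scan order: the quantized 8x8 coefficients are stored in an order that groups the (mostly zero) high-frequency coefficients at the end, so they run-length encode well. The standard JPEG table, mapping scan index to position in the block, looks like this:

```cpp
// Standard JPEG zigzag order: entry i gives the position (row*8+col) in the
// 8x8 block of the i-th coefficient as it appears in the bitstream.
static const int kZigZag[64] = {
     0,  1,  8, 16,  9,  2,  3, 10,
    17, 24, 32, 25, 18, 11,  4,  5,
    12, 19, 26, 33, 40, 48, 41, 34,
    27, 20, 13,  6,  7, 14, 21, 28,
    35, 42, 49, 56, 57, 50, 43, 36,
    29, 22, 15, 23, 30, 37, 44, 51,
    58, 59, 52, 45, 38, 31, 39, 46,
    53, 60, 61, 54, 47, 55, 62, 63
};
```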
Is there any reason to write your own JPEG and PNG decompression, except for educational purposes? (D3DX accepts both these formats, and if you want a portable solution you can use open-source libraries.)
ti_mo_n: DXT-class formats generally render that useless, but a jpg_decode+dxt_compress on the GPU could be a nice feature for projects that load from slow DVD/BD (8-10 MB/s).
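For what it's worth, the D3DX route mentioned above is essentially a one-liner; a sketch for the D3D9 flavour (assuming an existing IDirect3DDevice9 and linking against d3dx9.lib):

```cpp
#include <d3dx9.h>

// Load a JPG/PNG/BMP/etc. file into a texture via D3DX (D3D9 version shown).
IDirect3DTexture9* LoadTexture(IDirect3DDevice9* device, const char* filename)
{
    IDirect3DTexture9* texture = NULL;
    if (FAILED(D3DXCreateTextureFromFileA(device, filename, &texture)))
        return NULL;
    return texture;
}
```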