Works fine on Win7 64-bit, GF9600 GT (D3D10.0 hardware). Can you explain the exact steps you do to get the display output's name? I can't seem to get it right and am forced to use Nvidia's Control panel API.
I do this:
// Resolve monitor name for a DXGI output
DISPLAY_DEVICE dd;
dd.cb = sizeof(dd);
// W2T is the ATL string-conversion macro (outputDesc.DeviceName is a wide string)
EnumDisplayDevices( W2T(outputDesc.DeviceName), 0, &dd, 0 );
In this case, outputDesc is the output description that comes from DXGI; its DeviceName member contains a string of the form "\\.\DISPLAY1".
So I pass that as input, and select index 0.
After the call, the monitor's name will be in dd.DeviceString.
In D3D9 I need an additional step to get the "\\.\DISPLAY1" name.
First I use IDirect3D9::GetAdapterMonitor() to get an HMONITOR handle for the display.
Then I use GetMonitorInfo() to get the "\\.\DISPLAY1" name (in MONITORINFOEX.szDevice).
Then I plug that string into EnumDisplayDevices() as described above.
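In code, the D3D9 path looks roughly like this (just a sketch; "d3d9" is the IDirect3D9 pointer and "adapter" the adapter ordinal, error handling omitted):

// Resolve monitor name in D3D9 (sketch)
HMONITOR hMonitor = d3d9->GetAdapterMonitor( adapter );

MONITORINFOEX mi;
mi.cbSize = sizeof(mi);
GetMonitorInfo( hMonitor, &mi );                // mi.szDevice holds "\\.\DISPLAY1" etc.

DISPLAY_DEVICE dd;
dd.cb = sizeof(dd);
EnumDisplayDevices( mi.szDevice, 0, &dd, 0 );   // dd.DeviceString holds the monitor name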
Btw, lol@ the FPS count in the screenshot I pasted above. Clearly that is my Intel X3100 IGP in action :)
I just ran it on my PC at work, with a 9800GTX+, it touched 8000 fps (WinXP 32-bit, DX9). Only a rough factor 80 difference :)
No go on my 32bit xp sp3.
Application enters an infinite loop (by all appearances), no GUI, no nothin.
I have noticed the same on a Vista PC with Intel onboard chip that I've been playing around with here.
It appears that it got stuck in the D3D11 DLL (which was not installed on the machine btw).
Could you try to delete or rename the D3D11 DLL and try again? If that fails, delete or rename the D3D10 DLL as well?
I'm currently trying to install SP2 and D3D11 on that machine, to see if it fixes anything. I'm not sure why it gets stuck in that DLL; it would have to fail at LoadLibrary(), since it cannot resolve all imports. Then again, I recently replaced LoadLibrary with AfxLoadLibrary... that may be a problem.
After installing D3D11 on the machine, it no longer went into the infinite loop on the Vista machine. However, it now reported that it could not find any display modes for the specified pixelformat. Figures, it has a cheapo Intel Q35 onboard chip.
So I think the problem may be related to DXGI and the iteration of the display modes.
What video card are you using, Homer?
Thank you for the info about display output's name. I'll try it ASAP.
For reasons unknown to me there are some D3D10-related DLLs on my XP machine. The proper way to detect D3D10, I think, is to check the Windows version: if it's not at least Vista, then the application should be forced to use D3D9, to eliminate any problems with buggy installations on XP / buggy drivers reporting D3D10 support, etc. In other words: it's safer to assume that a pre-Vista Windows supports ONLY D3D9 (which is true) than to 'manually' check for D3D10/11 on those machines and risk an infinite loop. So, GetVersionEx FTW! ;)
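Something like this minimal sketch, I suppose (Vista and later report major version 6 or higher):

// Sketch: treat anything below Vista (major version 6) as D3D9-only
OSVERSIONINFO osvi = { sizeof(osvi) };
GetVersionEx( &osvi );
BOOL atLeastVista = ( osvi.dwMajorVersion >= 6 );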
I think GetVersionEx is not a good idea. It's quite error-prone (what about non-MS solutions, e.g. Wine, ReactOS? What if someone builds a D3D10/11 wrapper for XP that actually works? etc, etc). It may give false negatives; I'd rather have false positives (I'll add a command-line switch so you can always force the API).
What I do is this:
1) I try to load my DLL. If it fails, then apparently there were some imports that couldn't be resolved, probably because that version of D3D is not supported.
2) The DLL loaded. I import my functions. One of them is a Test() function. It tries to create a device of the specific D3D version and reports back.
Don't forget, just because D3D10/11 is installed on the machine doesn't mean that the hardware actually supports it too.
3) Test() returned positive. I will start the configuration dialog, which will detect the adapters, monitors, display modes etc.
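In code, steps 1) and 2) boil down to something like this (just a sketch; the DLL name and the "Test" export are placeholders for whatever the engine actually uses):

typedef bool (*TestFunc)();

bool IsD3D11RendererUsable()
{
    // Step 1: if d3d11.dll (or any other import) is missing, the load fails here
    HMODULE hRenderer = LoadLibrary( _T("RendererD3D11.dll") );
    if ( hRenderer == NULL )
        return false;

    // Step 2: let the DLL try to actually create a device on this hardware
    TestFunc pTest = (TestFunc)GetProcAddress( hRenderer, "Test" );
    bool usable = ( pTest != NULL && pTest() );

    FreeLibrary( hRenderer );
    return usable;  // only when this is true does step 3 (the config dialog) run
}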
I'm not sure why exactly the DLL got stuck in the first place. On an XP or Vista machine with no D3D11, LoadLibrary() should already fail in step 1). Even if it DID succeed (e.g. someone copied the DLLs onto the system manually), then step 2) should fail.
I wonder if AfxLoadLibrary does something weird. I could change back to LoadLibrary and see if that fixes anything, of course.
I'll also double-check my Test() function to make sure it doesn't report false positives in D3D11.
I have done a few bits of cleanup and added some exceptions here and there: http://bohemiq.scali.eu.org/Engine20100203.zip
I haven't mentioned it explicitly, but there are the following requirements:
VC++ 2008 sp1 redistributable: http://www.microsoft.com/DOWNLOADS/details.aspx?FamilyID=a5c84275-3b97-4ab7-a40d-3802b2af5fc2&displaylang=en
DirectX August 2009 redistributable: http://www.microsoft.com/downloads/details.aspx?familyid=04AC064B-00D1-474E-B7B1-442D8712D553&displaylang=en
What I wonder though... why did the Intel Q35 chip report no video modes for my format? I use DXGI_FORMAT_R8G8B8A8_UNORM, which I assume to be the most basic format there is...
There is DXGI_FORMAT_R8G8B8A8_UNORM_SRGB, but I think that would be less common, especially on low-end hardware (I think it does gamma correction, so it's some kind of nonlinear format). And there are DXGI_FORMAT_B8G8R8A8_UNORM formats as well... but I would think those are quite rare too.
In D3D9 I use D3DFMT_X8R8G8B8. The only difference would be that I don't specifically ask for an alpha channel. But there is no equivalent format in D3D10/11. I'll have to put DXCapsViewer on that machine and see if there are any formats that it DOES support.
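The mode iteration itself boils down to IDXGIOutput::GetDisplayModeList; roughly like this (a sketch, where "output" stands for the IDXGIOutput of the monitor):

// Sketch: ask the output how many display modes it exposes for a given format
UINT numModes = 0;
output->GetDisplayModeList( DXGI_FORMAT_R8G8B8A8_UNORM, 0, &numModes, NULL );
// numModes == 0 means no video modes for this format (what the Q35 apparently reported)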
Yes, R8G8B8A8_UNORM and B8G8R8A8_UNORM are the two most "basic" formats, which should be supported by just about every piece of hardware there is. Please do the check, because now I'm curious myself.
I've hooked up an old box, to see if I could replicate any of the problems that Homer reported.
I have the following configuration:
- Athlon XP1800+
- Radeon 9600XT
- Windows XP SP3
I found that it wouldn't run the engine initially... Not because of any installation problems, but because I compiled with SSE2-optimizations by default. Since the Athlon XP doesn't support SSE2, the engine couldn't be loaded from the DLL.
I've recompiled a version without any SSE: http://bohemiq.scali.eu.org/Engine20100203a.zip
Now it worked like a charm, even on that ancient box:


And oh the irony that this ancient box still runs more than 7 times as fast as the Intel laptop... It's a shame I don't have Windows 7 or Vista on this machine. I'd love to see if this old thing can run the D3D11 engine in downlevel 9.1.
Deleting the DX11 component was enough.
Must be the same problem that the Vista machine had. My theory is that it got stuck in the AfxLoadLibrary call. Seems like it doesn't happen on all of them, because I tried it in XP 32-bit on my own PC at work, and it just happily skipped the D3D11 and D3D10 DLLs and did a fallback to D3D9 automatically.
Couldn't try it on my old box here, because it wouldn't load any DLLs anyway due to the SSE2 problem. The rebuilt DLLs loaded fine on it though, so hopefully I have fixed it.
About the SSE problem: I think today it's safe to enable SSE on 32-bit builds and SSE2 on 64-bit builds, isn't it?
You can't even disable SSE/SSE2 on 64-bit builds anymore. Windows doesn't even save the full x87 in a context switch, or so I've heard. The default is to use SSE2 for all floats anyway. SSE2 is not an extension in the case of the x64 instruction set.
As for 32-bit... it depends on how you look at it. The Athlon XP supports SSE, but its cousin, the Athlon Classic, doesn't. Theoretically I don't think a lot of people would still use an Athlon XP, let alone a Classic, especially not in combination with a reasonable video card. So I think SSE2 is an okay minimum spec. On the other hand, for a lot of stuff an Athlon XP or Classic is still fast enough, so why not? The difference between the Athlon XP and the Classic in particular is very small: the SSE implementation isn't that good, and the x87 is excellent on the Athlon, so most of the time you'll barely win anything with SSE anyway. I still have a 1400 MHz Athlon Classic around here somewhere. No doubt it would perform pretty much the same as the 1800+ (which is 1533 MHz) if it had the same video card.
The only reason why I enabled SSE2 was to have an apples-to-apples comparison between 32-bit and 64-bit mode (testing with the Marching Cubes code, quite CPU-intensive). I just never bothered to change the settings back. Wasn't a problem, until I tried my Athlon XP again :)
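If I want to guard against this properly, a runtime feature check before loading the SSE2-optimized DLL would probably do; a minimal sketch using the Win32 API:

// Sketch: check SSE/SSE2 support at runtime, so an /arch:SSE2 DLL is only
// loaded on CPUs that actually have SSE2 (the Athlon XP only has SSE)
bool hasSSE  = IsProcessorFeaturePresent( PF_XMMI_INSTRUCTIONS_AVAILABLE )   != FALSE;
bool hasSSE2 = IsProcessorFeaturePresent( PF_XMMI64_INSTRUCTIONS_AVAILABLE ) != FALSE;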
http://security.freebsd.org/advisories/FreeBSD-SA-06:14-amd.txt
SSE1 is 100% available on all PCs; it has been around since 1999.
SSE2 is required by many games; it has been available since 2001-2003, but Athlons/Semprons from before 2003 are simply enough for many people's home/office use.
Certainly in my case, SSE2 was not the issue.
And incidentally, I too have some legacy DX10 files on my XP system, as I tried out someone's DX10 backport for XP a while ago, but that wasn't the problem either.
It was most certainly an issue in the DX11 component of the demo. I didn't take the time to check where the loop was occurring, but I am willing to do so.
Up to a few months ago, my FreeBSD home server was still powered by a Pentium II :)
Not sure what the relevance of the linked article is though.
Old PCs make nice firewalls and support servers, yeah :D (maybe wattage should be accounted for, though). No heavy SSE software for them, fortunately.
The link was just about "Windows doesn't even save the full x87 in a context switch, or so I've heard"; just sharing it as a quick and interesting reference.
I've just checked with DXCapsViewer. I found the following:
- It supports ONLY the B8G8R8A8 formats, but both the regular _UNORM and the _SRGB variations. That explains why my code didn't work; I should check both ways.
- It does report D3D11 compatibility in level 9.1 mode. So if I fix the pixelformat it should work.
- In D3D9 mode, it reports pixel shader 2.0, but NO vertex shader. That would explain why, although the D3D9 engine ran, the skinned hand did not appear. I originally had a workaround in my old engine that would automatically enable the software vertex processing pipeline when no vertex shaders were available. I guess I should put that code back in there (see the sketch below).
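That workaround would basically be a caps check before CreateDevice(); something along these lines (a sketch, assuming the usual d3d9 pointer, window handle and present parameters):

// Sketch: fall back to software vertex processing when the device has no vertex shaders
D3DCAPS9 caps;
d3d9->GetDeviceCaps( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps );

DWORD vp = ( caps.VertexShaderVersion >= D3DVS_VERSION(1, 1) )
         ? D3DCREATE_HARDWARE_VERTEXPROCESSING
         : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

d3d9->CreateDevice( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd, vp, &d3dpp, &device );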
Now I wonder though, would B8G8R8A8 actually be a more common format than R8G8B8A8? My GeForce and Radeon supported both. I think my laptop's Intel X3100 also supports both. I think I'll take B8G8R8A8 as the default for now, and see if I can make a simple workaround to switch to R8G8B8A8 if that one reports no supported video modes.
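That workaround could be as simple as walking a small preference list (again just a sketch, with "output" as the IDXGIOutput):

// Sketch: prefer B8G8R8A8, fall back to R8G8B8A8 if the output has no modes for it
DXGI_FORMAT candidates[] = { DXGI_FORMAT_B8G8R8A8_UNORM, DXGI_FORMAT_R8G8B8A8_UNORM };
DXGI_FORMAT backBufferFormat = DXGI_FORMAT_UNKNOWN;

for ( int i = 0; i < 2; i++ )
{
    UINT numModes = 0;
    output->GetDisplayModeList( candidates[i], 0, &numModes, NULL );
    if ( numModes > 0 )
    {
        backBufferFormat = candidates[i];
        break;
    }
}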
Regarding the x87 context-switch remark: yeah, AMD didn't intend the x87 to be fully used in 64-bit mode. Of course an OS could still work around the limitations of the FX* instructions in 64-bit mode, but I don't think the popular OSes do. That's what I always read about 64-bit Windows in the early days anyway: "Don't use x87, use SSE2, as context switches don't preserve the full x87 state." But I haven't checked in years, so perhaps they changed the behaviour in Vista or Windows 7.
I thought the link was supposed to have something to do with how many SSE/SSE2 machines were on the market.
As for the wattage of a Pentium II... According to Wikipedia my 333 MHz model would have a TDP of 20.6W. That is actually considered 'efficient' these days, as most regular desktop CPUs seem to be in the range of 60-140W.