Heya, I'll be using this thread to let you know about whatever OA32-related stuff I'm working on at the time.
The most recent work is a new object whose working title is 'Audio'.
Although currently quite simple, it will be expanded into a Manager-style class.
It provides support for loading and playing PCM/WAV audio samples, in both 2D and 3D.
Furthermore, it supports EAX (Environmental Audio eXtensions), allowing you to take advantage of the newest features supported by current sound cards.
Unfortunately, M$ decided to rewrite the audio stack for Vista, which completely broke the EAX model even if your card supports this stuff... and to date, the only solution is a special series of cards provided only by Creative Labs (the SoundBlaster people). Even this requires special hacked drivers that subvert and spoof portions of DirectSound to get it to work - I bet M$ charged them for the privilege of certifying this as a genuine solution, and those cards as 'Vista-compatible hardware'.
Anyway, OA32 now has some 3D audio support, which is Cool.
You can expect to see this object in the next update of OA32 (coming really soon), although the name of the file may change to reflect its place in the scheme of things.
** TODO **
-Support STREAMING Buffers
-Support Conversion Streams (allowing play from MP3 and any other format you happen to have a codec installed for)
I'm very interested in the "conversion streams" - there are many formats that I can't decode with ACM, but that are known to work with DShow. It's a bit elusive how some wave editors can import just about any wave data, even render .mod/.xm/etc.
I'm very interested in the "conversion streams" (...)
Yeah, that's what I'd like to see. Do you plan it to use DirectShow or ACM? Because ACM codecs usually don't support as much as DShow ones (I mean: DShow codecs for exotic audio types are more common).
I was only planning on supporting ACM in the immediate future as that set of codecs covers most of the compressions I'm likely to work with, and I'm not really interested in processing FMV... however, like most things, I'll probably add more stuff later, either as I need it, or because I was asked nicely to support it.
In order to sink the SoundBuffer Notifications needed for Audio Streaming, I needed an Event servicing mechanism... so I wrote the EventBank and EventManager objects, which provide for user-defined event callbacks and are close to being totally Asynchronous.
EventBank manages up to 64 events, serviced via a monitor thread that uses a Wait function.
And if you need more than 64, the EventManager wraps the functionality of EventBank and essentially has no limit to the number of events you can manage, at a cost of one Thread per cluster of 64 Events... which still isn't as terrible as it sounds, since these threads are effectively Sleeping when not actually servicing an event.
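The cluster-of-64 design falls out of the Win32 limit of 64 handles per WaitForMultipleObjects call. Here's a minimal Python mock-up of the described scheme - one monitor thread per bank of up to 64 events, with an EventManager that grows by whole banks. The names and methods are illustrative only; the real objects are ObjAsm32 classes built on Win32 event handles:

```python
import threading

MAX_EVENTS = 64  # WaitForMultipleObjects can wait on at most 64 handles

class EventBank:
    """One monitor thread servicing up to 64 user-defined event callbacks
    (sketch of the described design, not the real OA32 API)."""
    def __init__(self):
        self._slots = {}                 # event id -> callback
        self._pending = []               # signalled events awaiting service
        self._lock = threading.Lock()
        self._wake = threading.Event()   # stands in for the Wait function
        self._running = True
        self._thread = threading.Thread(target=self._monitor, daemon=True)
        self._thread.start()

    def register(self, callback):
        with self._lock:
            if len(self._slots) >= MAX_EVENTS:
                return None              # bank full - caller needs another bank
            eid = len(self._slots)
            self._slots[eid] = callback
            return eid

    def signal(self, eid, *args):
        with self._lock:
            self._pending.append((eid, args))
        self._wake.set()

    def _monitor(self):
        while self._running:
            self._wake.wait()            # thread is effectively sleeping here
            self._wake.clear()
            with self._lock:
                work, self._pending = self._pending, []
            for eid, args in work:
                self._slots[eid](*args)  # run the user-defined callback

class EventManager:
    """Wraps EventBank: one extra thread per cluster of 64 events."""
    def __init__(self):
        self._banks = [EventBank()]

    def register(self, callback):
        eid = self._banks[-1].register(callback)
        if eid is None:                  # last bank full - grow by one cluster
            self._banks.append(EventBank())
            eid = self._banks[-1].register(callback)
        return (len(self._banks) - 1, eid)

    def signal(self, handle, *args):
        bank, eid = handle
        self._banks[bank].signal(eid, *args)
```

As in the description, the per-cluster monitor thread spends its life blocked in a wait, so the thread-per-64-events cost is mostly just address space, not CPU.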
These objects will be published in the next exciting edition of ObjAsm32.
Thanks to the new Event handling class, I've implemented WAV audio streaming, and it works beautifully in 3D. However, my WAV loader chokes if you hand it a stereo WAV file, and I don't handle that very gracefully... at least it won't crash.
I've just added code to toggle between the 25 preset audio environments in EAX, so now I'm listening to the Ramones in 3D Space, and in a Padded Cell environment - cool 8)
Now most of the way to supporting MP3 audio streaming into my 3D system. I'll be collapsing stereo channels in this case, as this stuff is being written explicitly for use in 3D environments.
If people whine later for true stereo support I'll consider it, but since we're talking about objects that are not yet published in the public domain, it's hardly a priority :P
Just curious - what are you guys working on lately?
So none of you are actually coding anything at all.
Well, that just makes me even more strange.
As mentioned, I'm working on streaming (assumed stereo) mp3 files into mono soundbuffers that are 'positioned' in 3D space, and have a '3D environment' distortion effect applied to them.
Just imagine a game like 'vice city', except now the cars that are passing you can play mp3 audio and you can hear their crap doof doof music coming from their cars... complete with doppler shift as they move relative to your position.
That's what I can do now.
In fact I'd like to take it further, by writing a Mixer on the Server application which lets each Player hear the crap music around them that other Players are really listening to.
I can now stream 'most' mp3 files, but 'some' are trouble.
I don't really understand it yet, but I'm on the case.
And I'd like to be able to capture the default hardware audio stream from the CD device (like we can for the mic), but that's going to sit on my TODO list for a while longer, unless someone has some input?
So, nobody is coding anything... Yeah, right!
Since you're not busy, perhaps you're interested in assisting me, noting that this is yet another donation to the cause and so not commercial in nature?... Yeah, right!
Well, who knows what it might lead to, I seem to be getting more serious lately :)
Hehe, I *am* working on something, actually...
It has to do with driver coding and the use of Intel's VMX instruction set, and will likely also feature some pubkey cryptography... but I can't say much more about the specific project than that.
I CAN, however, say that debugging the thing is a darn nightmare. We can't even use a two-machine windbg setup for most of the stuff, since we work at such a low level (basically "sucking out Windows' brain on the fly"). And even with a two-machine setup, it's cumbersome constantly rebooting the test machine before you can do another test (edits + recompile in such a session are usually very short, so you spend most of your time waiting for the test machine to boot back up).
Oh, and needless to say, VMX is nontrivial and the x86 architecture itself has so many quirks... but it's an interesting project :)
Just curious - what are you guys working on lately?
Didn't reply for a week as I got stuffed enough at work: tech support (fixing stuff, adding features), and again plunging into the programmers' hell called SymbianOS/UIQ. You've gotta type 1k lines there for a hello-world, in a "holy framework" environment, where everything is hidden and bad legacy choices are the fundamentals. Worst of all, it's the extreme opposite of learning how to ride a bicycle. Just like Blender3D - leave it for a week or more, and you forget it all.
Meanwhile while resting or waiting for the compiler+emulator (duh..), developing SM3.0 shader-stuff in OpenGL. Recently finished motion-blur and DOF. Now sometimes trying to find the optimal data-structures for the type of scenes and gameplay I want to make.
And planning to make a simple antivirus app to disinfect 8,000 .exe files (restore code entry point, remove virus body) - something no AV could do; they'd all happily just delete everything from my PC.
Today I wrapped DirectPlay's LobbyClient and LobbiedApplication interfaces, as they are rather unwieldy and the relevant code is ugly to look at.
The resulting OA32 objects are called D3D_LobbyClient and D3D_LobbyableApp ... a small demo was written to demonstrate an application that registers 'itself' so that a LobbyClient can Launch it from a Lobby, and will 'unregister' the next time you run it.
While doing this, I noticed that I had never translated the IDirectPlay8Address and IDirectPlay8AddressIP interfaces (and all the junk related to them)... which would make life rather difficult for anyone who ever tried to actually use DirectPlay - so I translated that stuff and appended it to DPlay8.inc.
Later in the day, I wrapped DirectPlay's Client and Server interfaces.
It should be possible for me to now derive a LobbyServer class and implement a protocol for LobbyServer/LobbyClient communications.
Today's work:
-D3D_LobbyableApp (needed for Lobby Clients and Lobby Servers)
-D3D_LobbyClient (needed for Lobby Clients)
-D3D_Client (needed for Pure Clients)
-D3D_Server (needed for Pure Servers and Lobby Servers)
Note that I've not bothered with peer to peer communications.
In fact, I've barely fleshed out these objects - they're far from complete... however, I intend them to be BaseClasses from which you derive protocol-specific classes.
And I'm not saying that DirectPlay is the coolest network code out there - I think my earlier IOCP work was much better than this (hey, it even had plugin protocols you could switch at runtime) - but I'll implement this because it's popular, and it deserves to be part of OA32's DirectX support.
You can expect all this stuff in the next release of OA32.
For those who don't know and do care - a Lobby Server is a server that helps you find other Players and other Servers... and a Lobby Client is the program you use to look at that list, which may be a separate exe from the main Game, and which can be used to LAUNCH the main Game exe in order to join a particular game on a particular Game Server.
Seriously considering DUMPING the DPlay support.
Simply put, it's horrible to work with - the idea of a standard framework for networking is a good one, but this implementation forces you to LEARN THIS IMPLEMENTATION, intimately, or nothing will work for you.
The more I think about it, the more I like my modular IOCP framework.
Seems to me it was more flexible and more user-friendly, even in its under-developed state.
You didn't need to learn how its internals worked to use it.
I think I might take a good long look at both of them from arm's length and make a decision.
Polishing up my iocp server framework...
Removed the support for DLL-based Protocol Plugin objects.
I never found a reason for switching protocols midway through a network session.
Worked on several of the supporting baseclasses.
Added a User-defined StateBlock to the Client object.
Implemented support for DeltaCompression of State Changes in the Client object.
The actual implementation of that would of course belong in a derived Plugin (protocol) class, but a proposed implementation has been described in the comments.
The idea is not new... whenever changes in the client state need to be sent from client to server (or vice versa), only the data which has changed in value is sent, along with a small amount of data used to map the new values to offsets in the state container.
Added some utility methods to the Client object to look after Packing and Unpacking of the changes in the User StateBlock - more support for DeltaCompressing that crucial data.
Well, it's especially crucial for GAMING, because the Client State will be constantly changing - though not all of it will be changing at any given time.
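The packing scheme described above - send only the changed bytes, plus a small map of offsets back into the state container - can be sketched like this. This is a purely illustrative Python mock-up (the real objects are OA32 assembly classes, and the actual wire format is not shown in this thread):

```python
def pack_delta(old, new):
    """Pack only the bytes of `new` that differ from `old`, as a list of
    (offset, data) runs - the delta-compression idea, not the OA32 format."""
    assert len(old) == len(new)
    runs, i, n = [], 0, len(new)
    while i < n:
        if old[i] != new[i]:
            j = i
            while j < n and old[j] != new[j]:
                j += 1                       # extend the run of changed bytes
            runs.append((i, new[i:j]))       # map changed bytes to an offset
            i = j
        else:
            i += 1
    return runs

def apply_delta(old, runs):
    """Rebuild the new state block from the old one plus the change runs."""
    buf = bytearray(old)
    for offset, data in runs:
        buf[offset:offset + len(data)] = data
    return bytes(buf)
```

Both sides keep the last acknowledged copy of the StateBlock, so each update costs only the changed runs instead of the whole block - exactly the property that matters when the Client State changes constantly but only partially.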
I've sent Biterider a demo server project containing all the current code, now I'll give him some time to absorb the files and get back to me with his thoughts before I work on this stuff again.
So now I need to work on something else. I'll turn my attention to my animated skinmesh class - in particular, I want to be able to extend the reference FrameTree on a per-instance basis, so I can attach stuff to instances that will get driven by animations without needing a whole frametree for every instance of the same skinmesh.
I don't think (m)any of the bigger games use DPlay anyway - I can't remember when I last saw any game mentioning DPlay.
Like I said (about a year ago on this forum), DPlay is deprecated and the official statement says that developers should use Windows' networking mechanisms instead of DPlay ^^ This statement has existed since DirectX 8. It's most probably because Vista doesn't support it, IIRC. And Windows 7 isn't going to support it either.
DPlay had been revised and improved for 9.0a through c - even going so far as to break backward compatibility - so I couldn't understand them deprecating something they were actively developing. I decided to take a closer look at the model anyway, particularly because of its built-in support for NAT traversal.
It wasn't a complete waste of time, as I've found myself adopting some ideas from that model, although I'd have to say that's where the similarities end. I think my model, based on James Ladd's model, has a much cleaner and more logical layout. However, some of the utility methods in the lower objects appealed to me.
We have loaded a Reference SkinMesh, including its FrameTree.
This resource must be shared across all INSTANCES of the skinmesh.
Now we wish to attach other meshes to the skinmesh.
We should be able to attach instances of static meshes, as well as instances of skinmeshes!
It would be easiest to attach them to the Reference FrameTree, but that would affect ALL INSTANCES which rely on that resource. So we can't do that.
So if the changes to the frametree must NOT persist, the next best option would be to mark some reference frames as 'possible attachment points', and then while Walking the frametree just before we render it, whenever we see such a tagged node, we can refer back to the INSTANCE to check whether anything is actually attached there.
In fact, if we limit ourselves to attaching only to the BONEFRAMES of the hierarchy (as these are the only ones we actually care about), we don't need to tag anything either, the ref frametree can be completely left alone.
My proposal is to add an array of pointers to each Instance of the skinmesh.
Each pointer represents a BONEFRAME in the reference mesh, and can contain NULL (nothing is attached here) or a pointer...
We can use BIT31 of the pointer to indicate whether the attached mesh instance is static or skinmesh.
We will only walk the attached subtrees once the current node has already had its matrix concatenated, as the current node's matrix is the root matrix of the subtree! Now, while walking our ref frametree, if we encounter an attached skinmesh, we should update the animation for the attached skinmesh instance by walking the subtree as if it were part of the main tree, and the same when we wish to render it.
The only difficulty will be trying to synchronize the animation of a skinmesh with the animation of attached skinmeshes - this will be a challenge I will burden my animators with, rather than trying to calculate nice lerp factors for the animationcontroller... simply put, if we require animations to be synchronized, we should make sure they have the same playtime / period.
Anyway, it's far more likely that we'll want to attach STATIC meshes to our ANIMATED skinmesh - I just like having choices.
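The per-instance attachment scheme above can be sketched as follows - one slot per BONEFRAME of the shared reference tree, with bit 31 of the pointer marking static vs. skinmesh attachments. This is a Python mock-up where tagged integers stand in for the 32-bit pointers; class and method names are made up for illustration, not taken from the actual OA32 code:

```python
BIT31 = 1 << 31   # high bit of the 32-bit "pointer" marks the attachment kind

class SkinMeshInstance:
    """Per-instance attachment array: the shared reference FrameTree is never
    touched, so every instance can attach different things to its bones."""
    def __init__(self, n_boneframes):
        self.slots = [0] * n_boneframes   # 0 (NULL) = nothing attached here
        self._table = {}                  # handle -> attached mesh object
        self._next = 1

    def attach(self, bone, mesh, is_static):
        handle = self._next
        self._next += 1
        self._table[handle] = mesh
        # tag static attachments by setting BIT31; skinmeshes leave it clear
        self.slots[bone] = handle | BIT31 if is_static else handle

    def attachment_at(self, bone):
        """Called while walking the tree, after this bone's matrix has been
        concatenated: return (mesh, is_static) or None if the slot is NULL."""
        ptr = self.slots[bone]
        if ptr == 0:
            return None
        return self._table[ptr & ~BIT31], bool(ptr & BIT31)
```

During the render walk, a non-NULL slot at a boneframe means "descend into this attached subtree, using the bone's concatenated matrix as the subtree's root matrix" - and the BIT31 test tells the walker whether it also needs to drive the attachment's own animation.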