I'm trying to write my own texture support, but it just doesn't seem to work. I've been debugging for quite some time now and I'm off soon, so I'm posting my code here so someone else might spot my problem :) (I do get the model, but it's plain white, with no texture on it.)

I use the following function to load a texture:
AddGLTexture proc xSpace:DWORD, xFileName:DWORD
local xFile:dword
local xSize:dword
local mMem:dword
local BytesRead:dword

  invoke CreateFile, xFileName, GENERIC_READ, FILE_SHARE_READ, NULL,OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL
  mov xFile, eax
  invoke GetFileSize, eax, NULL
  mov xSize, eax
  mov mMem, malloc(xSize)
  invoke ReadFile, xFile, mMem, xSize, addr BytesRead, NULL
  invoke CloseHandle, xFile

  invoke glGenTextures,1, xSpace
  mov eax, xSpace
  invoke glBindTexture, GL_TEXTURE_2D, dword ptr ds:[eax]
  mov ebx, mMem
  mov eax, [ebx].BitMap.dataoffset
  add eax, ebx
  invoke glTexImage2D, GL_TEXTURE_2D, 0, 3, [ebx].BitMap.imgwidth, [ebx].BitMap.imgheight, 0, GL_RGB, GL_UNSIGNED_BYTE, eax
  invoke glTexParameteri,GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR_MIPMAP_NEAREST
  invoke glTexParameteri,GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR_MIPMAP_LINEAR
  free mMem
     
  ret
AddGLTexture endp


To display the model I use the following concept:
  invoke glPushMatrix
  invoke glTranslatef, X, Y, Z
  invoke glRotatef,angle, rX, rY, rZ

GroupLoop:

  invoke glBindTexture, GL_TEXTURE_2D, TextureID
  invoke glEnable, GL_TEXTURE_2D 
     
  invoke glBegin, GL_TRIANGLES
 
TriangleLoop:
  invoke glNormal3f
  invoke glTexCoord2f
  invoke glVertex3f

  invoke glNormal3f
  invoke glTexCoord2f
  invoke glVertex3f

  invoke glNormal3f
  invoke glTexCoord2f
  invoke glVertex3f
 
; check if all triangles of this group are done, else loop back to TriangleLoop

  invoke glEnd

; check if all groups are done, else loop back to GroupLoop

  invoke glPopMatrix


I stripped the code down in the drawing part to give a clearer view of my function calls. The drawing part itself works OK, since the model does get shown; only the texture doesn't. (I'll release the code later on anyway, once I have everything working.)
PS: the parameters passed to glTexCoord2f are also correct.

Thanks, Scorpie
Posted on 2005-03-14 17:28:46 by Scorpie
What file format is the texture source image? RAW?
Rewrite your texture loader procedure for a known format like BMP, get it working, then work on supporting other file formats later.
You can find C (and possibly asm source) for this at NeHe, or I could post something.
Posted on 2005-03-14 20:38:31 by Homer
My code supports the BMP format :P I made a bitmap struct to read out the width and height of the image, as follows:

[ebx].BitMap.imgwidth
[ebx].BitMap.imgheight

and I calculate the data offset as follows:

mov ebx, mMem
mov eax, [ebx].BitMap.dataoffset
add eax, ebx

This adds the image data offset (which is relative to the start of the file) to the start of the allocated memory.
The BMP I'm using is a normal Paint-made 24-bit bitmap, and my structure is correct.
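
For reference, a byte-packed struct that matches the standard 54-byte BMP header looks roughly like this (a sketch using the field names referenced above, not necessarily the exact declaration):

; MASM STRUCTs default to 1-byte packing, so these offsets match the file layout.
BitMap struct
  filetype     dw ?          ; "BM"
  filesize     dd ?
  reserved1    dw ?
  reserved2    dw ?
  dataoffset   dd ?          ; offset 10: start of the pixel data, relative to the start of the file
  headersize   dd ?          ; offset 14: size of the info header (40)
  imgwidth     dd ?          ; offset 18
  imgheight    dd ?          ; offset 22
  planes       dw ?
  bitcount     dw ?          ; 24 for a plain Paint-made bitmap
  compression  dd ?
  sizeimage    dd ?
  xpelspermtr  dd ?
  ypelspermtr  dd ?
  clrused      dd ?
  clrimportant dd ?
BitMap ends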

Scorpie
Posted on 2005-03-15 01:59:25 by Scorpie
Try this:

    invoke glPixelStorei , GL_UNPACK_ALIGNMENT, 1  ; This sets the alignment requirements for the start of each pixel row in memory.

Do it just before you bind to the newly-generated textureid.
That's the only fundamental difference between your BMP loader and mine, other than that I use auxDIBImageLoad to load the file data.

Are you sure your BMP is 24bit color? If not, you might want to use auxDIBImageLoad.
Also, if I remember correctly, when loading a BMP by hand as you do, you need to switch around the ARGB/RGBA pixel colors even if your BMP is 24-bit.
Using DIB loading will ensure the color formats of the File and the Display are matched.

In the following example, ptexture is a pointer to a dword to receive the returned textureid.

CreateTexture proc ptexture, pstrFileName
local pBitmap:ptr AUX_RGBImageRec

    .if !pstrFileName      ; Return from the function if no file name was passed in
        return E_FAIL
    .endif
   
    invoke auxDIBImageLoad,  pstrFileName  ;Load the bitmap and store the data
    .if eax == NULL    ; If we can't load the file, quit!
        return E_FAIL
    .endif
    mov pBitmap , eax

    invoke glGenTextures,1, ptexture                              ; Generate a texture with the associated texture variable
    invoke glPixelStorei , GL_UNPACK_ALIGNMENT, 1  ; This sets the alignment requirements for the start of each pixel row in memory.

    mov eax,ptexture
  invoke glBindTexture, GL_TEXTURE_2D, dword ptr [eax]    ; Bind the texture to the texture variable passed in
 
  ; Build Mipmaps (builds different versions of the picture for distances - looks better)
      mov ebx, pBitmap
      invoke gluBuild2DMipmaps, GL_TEXTURE_2D, 3, [ebx].AUX_RGBImageRec.dwsizeX, [ebx].AUX_RGBImageRec.dwsizeY, GL_RGB, GL_UNSIGNED_BYTE, [ebx].AUX_RGBImageRec.data
 
  ; Lastly, we need to tell OpenGL the quality of our texture map.  GL_LINEAR_MIPMAP_LINEAR
  ; is the smoothest.  GL_LINEAR_MIPMAP_NEAREST is faster than GL_LINEAR_MIPMAP_LINEAR,
  ; but looks blotchy and pixelated.  Good for slower computers though. 
      invoke glTexParameteri,GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR_MIPMAP_NEAREST
      invoke glTexParameteri,GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR_MIPMAP_LINEAR

; Now we need to free the bitmap data that we loaded since openGL stored it as a texture
    mov ebx, pBitmap
    free [ebx].AUX_RGBImageRec.data
    free pBitmap
    return S_OK
CreateTexture endp
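
A call site would look something like this (the file and variable names are just placeholders):

.data
szSkin    db "skin.bmp", 0          ; hypothetical texture file
.data?
g_texture dd ?                      ; receives the generated texture id
.code
    invoke CreateTexture, addr g_texture, addr szSkin
    .if eax == S_OK
        invoke glBindTexture, GL_TEXTURE_2D, g_texture   ; bind by value when rendering
    .endif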

Posted on 2005-03-15 03:42:08 by Homer
I think that parameter is indeed the problem. GL_RGB uses floats as far as I can see on MSDN, and I think I need to use GL_BGR_EXT for 24-bit bitmaps. I'll let you know when I get home (I forgot to upload my code, so I couldn't edit it at school :().
Posted on 2005-03-15 07:05:20 by Scorpie
Yay, it's working now after some tweaking of the parameters. I now use the following:
invoke glPixelStorei , GL_UNPACK_ALIGNMENT, 4
and the GL_BGR_EXT parameter for gluBuild2DMipmaps
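
So the relevant part of my loader now looks roughly like this (same xSpace/mMem locals as in the loader I posted above):

  invoke glPixelStorei, GL_UNPACK_ALIGNMENT, 4           ; BMP rows are padded to 4-byte boundaries
  mov eax, xSpace
  invoke glBindTexture, GL_TEXTURE_2D, dword ptr [eax]
  mov ebx, mMem
  mov eax, [ebx].BitMap.dataoffset
  add eax, ebx                                           ; eax -> pixel data (BGR order)
  invoke gluBuild2DMipmaps, GL_TEXTURE_2D, 3, [ebx].BitMap.imgwidth, [ebx].BitMap.imgheight, GL_BGR_EXT, GL_UNSIGNED_BYTE, eax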

So I can now load and display a model made in Milkshape (1.6.4) with textures. Next step: animations :)
Posted on 2005-03-15 14:21:15 by Scorpie
Bitmaps and DIBs are strange formats - having the image upside-down, and that 4-byte alignment per row :). That's why I prefer .tga  :P (ugly icon isn't it)
Posted on 2005-03-16 01:33:21 by Ultrano
Homer, what include holds the auxDIBImageLoad function?
Posted on 2005-03-16 08:23:08 by Scorpie

Bitmaps and DIBs are strange formats - having the image upside-down, and that 4-byte alignment per row :). That's why I prefer .tga  :P (ugly icon isn't it)

Well, my card supports A32R32B32G32 floating-point textures, but if I want to load textures from memory, what file format should I emulate to make it possible to build a procedural texture with SSE and transfer it to VRAM?

Posted on 2005-03-16 11:42:07 by daydreamer
I have little experience with hardware-accelerated 3D, and even less with SSE, but I think you should first allocate Width*Height*16 bytes, generate the procedural texture into it with SSE, then convert it to integer A8R8G8B8/R8G8B8A8, send the resulting Width*Height*4 bytes to VRAM with a lock/unlock of the texture, and finally release the temporary buffer.
Your card may support cool formats, but not everyone has a card like yours ^_^. And I doubt floats on video cards are as fast as integers to compute. 8 bits are more than enough for a color channel anyway :)
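
The conversion step could be a loop along these lines - an untested sketch that uses SSE2 for the float-to-integer conversion, and assumes each pixel is stored as four floats R,G,B,A in the 0.0-1.0 range:

.data
scale255 real4 255.0, 255.0, 255.0, 255.0
.code
; esi -> float buffer (Width*Height*16 bytes), edi -> byte buffer (Width*Height*4 bytes),
; ecx = number of pixels. Needs the .686/.xmm directives (or an assembler that knows SSE2).
    movups   xmm7, scale255      ; 255.0 in all four lanes
ConvertLoop:
    movups   xmm0, [esi]         ; load one pixel: R,G,B,A as floats
    mulps    xmm0, xmm7          ; scale 0..1 up to 0..255
    cvtps2dq xmm0, xmm0          ; floats -> dwords (SSE2)
    packssdw xmm0, xmm0          ; dwords -> words
    packuswb xmm0, xmm0          ; words -> bytes, with saturation
    movd     eax, xmm0
    mov      [edi], eax          ; writes the bytes R,G,B,A in memory order
    add      esi, 16
    add      edi, 4
    dec      ecx
    jnz      ConvertLoop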
Posted on 2005-03-16 12:59:02 by Ultrano
He's on the right track, but still thinking in terms of 16bit color.
The following information will let you write texture loaders for ANY file format, and shows how to modify the image data sourced from the file to create OpenGL textures from it.

Most graphic file formats, including bmp and tga, store the pixel color data in either 24 or 32bit words these days.
Examples:
TGA, stored as 24bit (BGR) , or as 32bit (BGRA) words.
iirc BMP is RGB or ARGB for 24 or 32bit.

Basically you need to get your color data into RGB or RGBA format, depending on whether you want to create/preserve the alpha channel for transparent textures.
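
For a 24bpp BMP, for instance, that just means swapping the red and blue byte of each pixel. A minimal in-place swizzle (assuming the rows carry no padding bytes, i.e. width*3 is already a multiple of 4) could look like this:

; esi -> first pixel of the 24bpp image data, ecx = number of pixels.
; Swaps the B and R bytes in place, turning BGR data into the RGB order that GL_RGB expects.
SwizzleLoop:
    mov  al, [esi]        ; blue
    mov  dl, [esi+2]      ; red
    mov  [esi], dl
    mov  [esi+2], al
    add  esi, 3
    dec  ecx
    jnz  SwizzleLoop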

After you have done this, you should create the texture as normal, except that you specify the number of color channels (3 for 24bpp, 4 for 32bpp) in your call to glTexImage2D, as follows:


invoke glTexImage2D,GL_TEXTURE_2D,dwLOD,  dwChannels, dwWidth, dwHeight, dwBorderWidth, dwType, GL_UNSIGNED_BYTE, pimageData


where dwLOD = mipmap Level Of Detail (0 = the base image)
where dwChannels = 1, 2, 3 or 4 (number of color components: 1 = luminance, 2 = luminance+alpha, 3 = RGB/24bpp, 4 = RGBA/32bpp)
where dwWidth and dwHeight are some power of 2 (e.g. 128, 256, etc.)
where dwBorderWidth is the thickness of the pixel border around the image (0 or 1)
where dwType = usually GL_RGB or GL_RGBA (GL_COLOR_INDEX, GL_RED, GL_GREEN, GL_BLUE, GL_ALPHA, GL_RGB, GL_RGBA, GL_LUMINANCE, and GL_LUMINANCE_ALPHA)
where pimageData is a pointer to the image data in the same format as given in dwType
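
For example, a 256x256 image that has already been converted to 32bit RGBA bytes (with pimageData pointing at that data, as in the call above) would be uploaded with a call like:

    invoke glTexImage2D, GL_TEXTURE_2D, 0, 4, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, pimageData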

That is the only critical part of creating your own textures in OpenGL.
You should also be able to see that you can create procedural textures using this information.

Have a nice day  :P
Posted on 2005-03-16 21:02:50 by Homer