Fri 22 Aug 2003 01:05:08 AM UTC, comment #5:
OK, I've added support for picking a better texture format via the hint flag for CL_Surface.
Next up is making CL_Sprite support the hint flag as well, and letting the sprite and surface resources support it too.
I'm going to bed - good night ppl. :)
|
Fri 22 Aug 2003 12:50:13 AM UTC, comment #4:
The following code can be used to determine which internal format the OpenGL driver would actually choose for a texture:
// Probe with a proxy texture; no texel data is actually uploaded.
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
// Ask which internal format the driver settled on for mip level 0.
GLint chosen_format = GL_RGBA8;
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &chosen_format);
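As a rough sketch, that probe could be wrapped into a small helper that reports whether the driver silently downgrades a GL_RGBA8 request (the function name here is mine, not ClanLib's):

// Sketch only: returns false if the driver would downgrade a GL_RGBA8
// request (e.g. to a 16-bit format) for a texture of the given size.
bool driver_keeps_rgba8(GLsizei width, GLsizei height)
{
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, 0);
    GLint chosen_format = 0;
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_INTERNAL_FORMAT, &chosen_format);
    return chosen_format == GL_RGBA8;
}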
|
Thu 21 Aug 2003 11:58:18 PM UTC, comment #3:
OK, sphair just pointed out another possibility: maybe these old cards only support 16-bit textures. In that case, specifying GL_RGBA8 helps nothing; they will still fall back on GL_RGBA4 (4 bits per channel).
The following other GL hints may be of use:
GL_RGB5 = RGBA(5,5,5,0)
GL_RGB5_A1 = RGBA(5,5,5,1)
GL_RGBA4 = RGBA(4,4,4,4)
Unfortunately this requires ClanLib to make a more intelligent pick for the texture format. Surfaces with no alpha should pick RGB5, those with only colorkey-style transparency should pick RGB5_A1, and fully alpha'ed surfaces would have to pick RGBA4.
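A minimal sketch of that selection logic, assuming we already know how a surface uses alpha (the enum and function names here are illustrative, not actual ClanLib API):

// Sketch only: AlphaUsage and pick_internal_format are made-up names.
enum AlphaUsage { alpha_none, alpha_colorkey, alpha_full };

GLint pick_internal_format(AlphaUsage usage)
{
    switch (usage)
    {
    case alpha_none:     return GL_RGB5;    // no alpha bits needed
    case alpha_colorkey: return GL_RGB5_A1; // 1 bit covers on/off transparency
    case alpha_full:     return GL_RGBA4;   // trade color depth for full alpha
    }
    return GL_RGBA8; // not reached
}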
One solution could be to push the problem to the game developer, using the Hint parameter for surfaces.
|
Thu 21 Aug 2003 09:30:07 AM UTC, comment #2:
TGA surfaces that I load using
surface = new CL_Surface("filename.tga");
and display using
surface->draw(x,y);
come out dithered, i.e. they seem to be drawn at 8-bit color resolution instead of 24-bit.
Is there a picture format I can choose to get a true 24-bit surface display? Or is this a limitation of OpenGL with my video card? (I am using XFree86 4.3 and an ATI All-In-Wonder Rage Pro 128.)
Guillaume Pratte
http://www.dmi.usherb.ca/minvaders/en/
|