OpenGL textures and the power-of-two size restriction
Wednesday, April 8, 2009
Prior to version 2.0, the OpenGL specification required that texture dimensions be powers of two. This simplifies the implementation of texture mapping: converting floating-point texture coordinates (which range from 0 to 1) to texel coordinates is trivial, because multiplying a floating-point number by a power of two is essentially just adding to the exponent. Eventually, more capable drivers and graphics cards came along, and introduced the ability to use non-power-of-two texture dimensions. To signal this capability to GL applications, they report support for the GL_ARB_texture_non_power_of_two extension. OpenGL 2.0 implementations are required to support this extension.
In practice, the only major OpenGL implementations which don’t provide this extension are older X11 drivers, and the Microsoft Windows software renderer, which is a very bare-bones OpenGL 1.1 implementation.
There is a trick for padding textures up to a power of two on implementations which don't support this extension; however, it doesn't seem to work everywhere either. Instead of manipulating the bitmap in software before passing it to glTexImage2D(), it is permissible as of OpenGL 1.1 to pass a data pointer of NULL. This creates a texture with uninitialized contents. The glTexSubImage2D() function is then used to fill in portions of the new texture. In particular, glTexSubImage2D() places no restriction on the subimage width and height, even if GL_ARB_texture_non_power_of_two is not supported.
The above trick works with the Windows software renderer. On the other hand, previous-generation MacBooks with Intel graphics suffer from a driver bug which produces artifacts when this feature is used to render scaled textures. However, all OpenGL implementations in recent Mac OS X releases support non-power-of-two textures, so on that platform the workaround can be avoided entirely.
In Factor, the opengl.capabilities vocabulary provides some utility words to check for extensions. For example, a common operation is checking for either a specific OpenGL version, or an extension (new versions of the GL spec frequently absorb existing extensions):
"2.0" { "GL_ARB_texture_non_power_of_two" } has-gl-version-or-extensions?
The gl-extensions word outputs a sequence of all supported extensions. Here is the output from the Mesa software renderer on Linux.
I replaced my old code in opengl.textures, which padded bitmap image objects out to powers of two using sequence manipulation words, with the new approach using texture sub-images. If the extension is present, no padding is performed at all, ensuring correct behavior on Mac OS X. This means that any code using opengl.textures, such as the UI's text rendering and image support, should now spend less CPU time running Factor code.
Factor's OpenGL binding has been in development for 4 years and has seen contributions from 4 developers. For a demo of what it can do, try "spheres" run in the UI listener. You will need a video driver that supports OpenGL 2.0 or GL_ARB_shader_objects.