OpenGL is a useful collection of classes for creating OpenGL projects in Free Pascal. It wraps the most common OpenGL objects (such as RenderContext, Textures, VertexArray, FrameBuffer, ...) into simple classes. It also comes with some data types commonly used in graphics programming.
The files dglOpenGL.pas and dglOpenGLES.pas contain all necessary constants, types and functions to make calls to the OpenGL or OpenGLES library. They also take care of dynamically loading the required function pointers from the library. If you want to write pure OpenGL code, these two files are all you need. The rest of this project is just there to make your life easier.
dglOpenGL.pas and dglOpenGLES.pas are maintained by the Delphi OpenGL Community.
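If you go that route, a minimal sketch of using dglOpenGL.pas directly could look roughly like this (context creation itself is omitted here; it is covered in the next section):

```pascal
uses SysUtils, dglOpenGL;

// load the OpenGL library and resolve the core function pointers
if not InitOpenGL then
  raise Exception.Create('failed to load the OpenGL library');

{ ... create a render context and make it current (see below) ... }

// resolve extension pointers and query what the implementation supports;
// both require a current render context
ReadExtensions;
ReadImplementationProperties;

// from here on, plain gl* calls can be used
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT or GL_DEPTH_BUFFER_BIT);
```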
The render context class is used to create an OpenGL render context. Creating one is the first thing you have to do before you can write any further OpenGL code.
First we need to know which context class we have to create, because every operating system (e.g. Windows, Linux) and every UI toolkit (e.g. pure X11, Gtk2) has its own way to create a render context. Luckily this project has a simple way to do this.
```pascal
var ContextClass: TglcContextClass;
ContextClass := TglcContext.GetPlatformClass();
```
The second thing we need is a suitable pixel format descriptor that tells the context class how to set up the pixel format of our device context. This is just as simple as the last step.
```pascal
var cpfs: TglcContextPixelFormatSettings;
cpfs := ContextClass.MakePF();
```
Now we can create our context object. This will not create the render context! The context object is a simple object to manage the render context. The render context will be created later.
```pascal
var Context: TglcContext;
Context := ContextClass.Create(aWinControl, cpfs);
```
Now everything is ready to use. We can create our render context.
Note: This step can be done in a separate thread if you want to do some offscreen rendering. But be careful: the render context can only be used in the thread where it was created.
```pascal
Context.BuildContext();
```
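For illustration, a rough sketch of such an offscreen setup using a worker thread; the thread class is hypothetical, only BuildContext and CloseContext come from TglcContext:

```pascal
uses Classes;

type
  { hypothetical worker thread: builds and uses the render context entirely
    inside its own thread, as required by the note above }
  TOffscreenRenderThread = class(TThread)
  private
    fContext: TglcContext;
  protected
    procedure Execute; override;
  public
    constructor Create(const aContext: TglcContext);
  end;

constructor TOffscreenRenderThread.Create(const aContext: TglcContext);
begin
  inherited Create(false);
  fContext := aContext;
end;

procedure TOffscreenRenderThread.Execute;
begin
  fContext.BuildContext();   // the context is created in this thread ...
  { ... do some offscreen rendering }
  fContext.CloseContext();   // ... and must be closed in this thread too
end;
```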
Congratulations, you have created a valid render context and can start normal rendering now. After you are finished you should clean everything up. Don't worry if you forget to do this; TglcContext will take care of that for you.
```pascal
Context.CloseContext();
```
Last step: you have to free the context object.
```pascal
FreeAndNil(Context);
```
Array buffers, also known as vertex buffer objects, are used to store vertex data in video memory, so you can render your geometry very fast. Before you upload your data to the video memory you need to define what data you want to upload. We will define a vertex that has a three dimensional position, a two dimensional texture coordinate and a three dimensional normal vector.
```pascal
type
  TVertex = packed record
    pos: TgluVector3f; // vertex position
    tex: TgluVector2f; // texture coordinates
    nor: TgluVector3f; // normal vector
  end;
  PVertex = ^TVertex;
```
Once you have defined your data, you can create an array buffer and upload your data to the video memory. In this example we use indexed vertex rendering. Each vertex is only stored once in our vertex array and then we will define an index array that describes which vertices have to be rendered.
```pascal
var
  VertexBuffer: TglcArrayBuffer;
  IndexBuffer:  TglcArrayBuffer;
  p: Pointer;

{ create our buffer objects }
VertexBuffer := TglcArrayBuffer.Create(TglcBufferTarget.btArrayBuffer);
IndexBuffer  := TglcArrayBuffer.Create(TglcBufferTarget.btElementArrayBuffer);

{ write vertex data to vertex buffer }
VertexBuffer.Bind;
// allocate 4 * SizeOf(TVertex) bytes of video memory and set its usage to StaticDraw
VertexBuffer.BufferData(4, SizeOf(TVertex), TglcBufferUsage.buStaticDraw, nil);
// map the video memory into our address space
p := VertexBuffer.MapBuffer(TglcBufferAccess.baWriteOnly);
// write our data to the mapped video memory
try
  PVertex(p)^.pos := gluVertex3f(0.0, 0.0, 0.0);
  PVertex(p)^.tex := gluVertex2f(0.0, 0.5);
  PVertex(p)^.nor := gluVertex3f(0.0, 1.0, 0.0);
  inc(p, SizeOf(TVertex));
  { ... more vertices following }
finally
  VertexBuffer.UnmapBuffer;
  VertexBuffer.Unbind;
end;

{ write indices to index buffer }
IndexBuffer.Bind;
IndexBuffer.BufferData(4, SizeOf(GLuint), TglcBufferUsage.buStaticDraw, nil);
p := IndexBuffer.MapBuffer(TglcBufferAccess.baWriteOnly);
try
  PGLuint(p)^ := 0;
  inc(p, SizeOf(GLuint));
  { ... more indices following }
finally
  IndexBuffer.UnmapBuffer;
  IndexBuffer.Unbind;
end;
```
All data is uploaded. Now we can render our vertices using one of the glDraw methods. Before you call glDraw you must tell OpenGL where the vertex data is stored.
```pascal
// use the array buffers to draw the primitive
VertexBuffer.Bind;
IndexBuffer.Bind;
try
  // tell OpenGL where to find the vertex position:
  // 3 float values, starting at offset 0
  glEnableClientState(GL_VERTEX_ARRAY);
  glVertexPointer(3, GL_FLOAT, SizeOf(TVertex), Pointer(0));

  // tell OpenGL where to find the texture coordinates:
  // 2 float values, starting at offset 12 (= 3 * SizeOf(GLfloat))
  glEnableClientState(GL_TEXTURE_COORD_ARRAY);
  glTexCoordPointer(2, GL_FLOAT, SizeOf(TVertex), Pointer(12));

  // tell OpenGL where to find the normal vector:
  // 3 float values, starting at offset 20 (= 5 * SizeOf(GLfloat))
  glEnableClientState(GL_NORMAL_ARRAY);
  glNormalPointer(GL_FLOAT, SizeOf(TVertex), Pointer(20));

  // the bound element array buffer supplies the indices for glDrawElements
  glDrawElements(GL_QUADS, IndexBuffer.DataCount, GL_UNSIGNED_INT, nil);

  glDisableClientState(GL_VERTEX_ARRAY);
  glDisableClientState(GL_TEXTURE_COORD_ARRAY);
  glDisableClientState(GL_NORMAL_ARRAY);
finally
  IndexBuffer.Unbind;
  VertexBuffer.Unbind;
end;
```
That’s it. Don’t forget to free your buffer objects when you’re done.
```pascal
FreeAndNil(VertexBuffer);
FreeAndNil(IndexBuffer);
```
TglcBitmap objects are used to manage and render textures. To simply load and render a texture you need to create two objects: TglcBitmapData to load a texture from file and convert it to the format you want to use, and TglcBitmap2D to manage an OpenGL texture object, upload or download texture data and, of course, render your texture. Sounds simple, doesn't it?
```pascal
// create our texture object
var Texture: TglcBitmap2D;
Texture := TglcBitmap2D.Create;

// load texture data to texture object
var TextureData: TglcBitmapData;
TextureData := TglcBitmapData.Create;
try
  TextureData.LoadFromFile('./my-texture.png');
  // convert your TextureData if it is not supported by OpenGL
  if (TextureData.Format <> TextureData.FormatDescriptor.OpenGLFormat) then
    TextureData.ConvertTo(TextureData.FormatDescriptor.OpenGLFormat);
  Texture.UploadData(TextureData);
finally
  FreeAndNil(TextureData);
end;
```
As you can see, you can destroy the TextureData object once the data has been uploaded to the texture; its memory is no longer needed. You can also perform all TextureData operations in a separate thread to get nice load balancing and reduce render glitches.
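A rough sketch of such a background loader, assuming a hypothetical TThread subclass (only the TglcBitmapData calls are from this project; the final UploadData call still needs a current render context):

```pascal
uses Classes;

type
  { hypothetical loader thread: does the file and conversion work off the
    render thread, only the upload itself is left for the GL thread }
  TTextureLoadThread = class(TThread)
  protected
    procedure Execute; override;
  public
    Data: TglcBitmapData;
  end;

procedure TTextureLoadThread.Execute;
begin
  Data := TglcBitmapData.Create;
  Data.LoadFromFile('./my-texture.png');
  if (Data.Format <> Data.FormatDescriptor.OpenGLFormat) then
    Data.ConvertTo(Data.FormatDescriptor.OpenGLFormat);
  // hand Data back to the render thread, which calls Texture.UploadData(Data)
end;
```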
Frame buffer objects are mostly used for offscreen rendering. You simply create a frame buffer object, attach some color and depth buffers to it and then render some stuff. Attached buffers can be of type render buffer or texture buffer. Render buffers are simple buffers in which OpenGL stores data it needs (like a depth buffer). A texture buffer, on the other hand, can be used as a normal texture after you have rendered something to the frame buffer object. Sounds complicated, but it's easier than you think.
```pascal
var
  fbo: TglcFrameBufferObject;
  tex: TglcTextureBuffer;
  buf: TglcRenderBuffer;

fbo := TglcFrameBufferObject.Create;
try
  // set the size of the frame buffer object
  fbo.SetSize(800, 600);

  // create a texture buffer and attach it as color buffer
  tex := TglcTextureBuffer.Create(TglcFormat.fmRGBA, TglcInternalFormat.ifRGBA16F);
  fbo.AddBuffer(tex, TglcAttachment.atColor0, true);

  // create a render buffer and attach it as depth buffer
  buf := TglcRenderBuffer.Create(TglcInternalFormat.ifDepthComponent);
  fbo.AddBuffer(buf, TglcAttachment.atDepth, true);

  // render to the frame buffer object
  fbo.Bind;
  { ... do some rendering }
  fbo.Unbind;

  // use the texture buffer like a normal texture
  tex.Bind;
  { ... do some rendering }
  tex.Unbind;
finally
  FreeAndNil(fbo);
end;
```
What would an OpenGL application be without fancy shaders? TglcShader helps you to load and use shaders in your application. The simplest example of using the shader classes is to load a shader and then use it for rendering.
Shader Code:
```glsl
/* ShaderObject: GL_VERTEX_SHADER */
#version 330

layout(location = 0) in vec3 inPos;

void main(void)
{
    gl_Position = vec4(inPos, 1.0);
}

/* ShaderObject: GL_FRAGMENT_SHADER */
#version 330

out vec4 outColor;

void main(void)
{
    outColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```
Application Code:
```pascal
var Shader: TglcShaderProgram;

{ 'Log' is a callback that receives log output, so you can detect errors while compiling your shader }
Shader := TglcShaderProgram.Create(@Log);
Shader.LoadFromFile('./my-shader-file.glsl');
Shader.Compile;

Shader.Enable;
{ ... do some rendering }
Shader.Disable;
```
As mentioned before, this is a really simple example. TglcShaderProgram has some more useful features, like setting uniform variables, binding attribute locations, or building your shader program step by step by linking different TglcShaderObjects into the TglcShaderProgram.
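For illustration only, setting a uniform variable could look roughly like this; the 'ID' property and the 'uColor' uniform are assumptions for this sketch, check uglcShader.pas for the actual API:

```pascal
var
  loc: GLint;

Shader.Enable;
// 'ID' is an assumed property holding the GL program handle;
// 'uColor' assumes the fragment shader declares 'uniform vec4 uColor;'
loc := glGetUniformLocation(Shader.ID, PAnsiChar('uColor'));
if (loc >= 0) then
  glUniform4f(loc, 1.0, 0.0, 0.0, 1.0); // set 'uColor' to opaque red
{ ... do some rendering }
Shader.Disable;
```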
Vertex array objects are used to store all information about vertex buffer objects and index buffer objects. Instead of telling OpenGL all this information in every render loop, it is stored in a single object and can be activated by simply binding the vertex array object.
```pascal
var
  vbo: TglcArrayBuffer;
  vao: TglcVertexArrayObject;

{ ... create VBO and fill it with data }
vao := TglcVertexArrayObject.Create;
vao.BindArrayBuffer(vbo, true);
vao.VertexAttribPointer(LAYOUT_LOCATION_POS, 3, GL_FLOAT, False, SizeOf(TVertex), GLint(@PVertex(nil)^.pos));
vao.VertexAttribPointer(LAYOUT_LOCATION_TEX, 2, GL_FLOAT, False, SizeOf(TVertex), GLint(@PVertex(nil)^.tex));

vao.Bind;
{ ... render vertices using glDrawXXX }
vao.Unbind;
```
TglcCamera is a quite simple class. It wraps the model view matrix into a useful class and provides some methods to manipulate it, such as Tilt, Turn or Move.
```pascal
var Camera: TglcCamera;

Camera := TglcCamera.Create;
Camera.Perspective(45, 0.01, 100, 800/600); // define perspective view
Camera.Move(gluVector(2, 3, -5));           // move 2 right, 3 up and 5 back
Camera.Tilt(-25);                           // turn 25 degrees down
Camera.Turn(-10);                           // turn 10 degrees left

Camera.Activate;                            // activate camera
{ ... do normal rendering }
```
TglcLight and TglcMaterial are simple class wrappers around OpenGL lights and materials, just to manage them in an object-oriented way. TglcLight is specialized into 3 different classes: