How to do it…

Let's follow these steps to learn how to use textures in OpenGL:

  1. Open up renderwindow.h and add the variables that are highlighted in the following code block:
QOpenGLContext* openGLContext;
QOpenGLFunctions* openGLFunctions;
QOpenGLShaderProgram* shaderProgram;
QOpenGLVertexArrayObject* vao;
QOpenGLBuffer* vbo_vertices;
QOpenGLBuffer* vbo_uvs;
QOpenGLTexture* texture;
  2. We must call glEnable(GL_TEXTURE_2D) in the initializeGL() function to enable the texture mapping feature:
void RenderWindow::initializeGL()
{
openGLFunctions = openGLContext->functions();
glEnable(GL_DEPTH_TEST);
glEnable(GL_TEXTURE_2D);
  3. We need to initialize our texture variable under the QOpenGLTexture class. We will load a texture image called brick.jpg from our application folder and flip the image by calling mirrored(). QImage stores its pixels starting from the top-left corner, whereas OpenGL expects texture coordinates that start from the bottom-left, which is why we need to flip our texture before passing it to the shader. We will also set the minification and magnification filters to Nearest and Linear respectively, like so:
texture = new QOpenGLTexture(QImage(qApp->applicationDirPath() + "/brick.jpg").mirrored());
texture->setMinificationFilter(QOpenGLTexture::Nearest);
texture->setMagnificationFilter(QOpenGLTexture::Linear);

  4. Add another array called uvs. This is where we save the texture coordinates for our cube object:
GLfloat uvs[] = {
0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f,
1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 1.0f,
0.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f,
1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f,
1.0f, 0.0f, 0.0f, 0.0f, 1.0f, 1.0f,
0.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f,
0.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f,
1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f,
0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f,
0.0f, 1.0f, 1.0f, 1.0f, 1.0f, 0.0f,
1.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f,
1.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f
};
  5. We have to amend our vertex shader so that it takes in the texture coordinates used to calculate where the texture will be applied on the object's surface. Here, we simply pass the texture coordinates through to the fragment shader without modifying them:
static const char *vertexShaderSource =
"#version 330 core\n"
"layout(location = 0) in vec3 posAttr;\n"
"layout(location = 1) in vec2 uvAttr;\n"
"uniform mat4 matrix;\n"
"out vec3 fragPos;\n"
"out vec2 fragUV;\n"
"void main() {\n"
"fragPos = posAttr;\n"
"fragUV = uvAttr;\n"
"gl_Position = matrix * vec4(posAttr, 1.0); }";
  6. In the fragment shader, we sample the texture by calling the texture() function, which receives the texture coordinate information from fragUV and the image sampler from tex:
static const char *fragmentShaderSource =
"#version 330 core\n"
"in vec3 fragPos;\n"
"in vec2 fragUV;\n"
"uniform sampler2D tex;\n"
"out vec4 col;\n"
"void main() {\n"
"vec4 texCol = texture(tex, fragUV);\n"
"col = texCol; }";
  7. We have to initialize the VBO for the texture coordinates as well:
vbo_uvs = new QOpenGLBuffer(QOpenGLBuffer::VertexBuffer);
vbo_uvs->create();
vbo_uvs->setUsagePattern(QOpenGLBuffer::StaticDraw);
vbo_uvs->bind();
vbo_uvs->allocate(uvs, sizeof(uvs));
  8. In the paintEvent() function, we must send the texture coordinate information to the shader and then bind the texture before calling glDrawArrays():
vbo_uvs->bind();
shaderProgram->bindAttributeLocation("uvAttr", 1);
shaderProgram->enableAttributeArray(1);
shaderProgram->setAttributeBuffer(1, GL_FLOAT, 0, 2);

texture->bind();
glDrawArrays(GL_TRIANGLES, 0, 36);
  9. If you compile and run the program now, you should see a brick cube rotating on the screen:
