Using GLSL shaders in QML

In this tutorial, we will show how to apply QML property animations to GLSL shaders in a 3D application. Starting with a relatively simple shader program, we are going to manipulate various parameters to explain how both the shader and the QML integration work. (This is quite a lot to start a tutorial with, but we'll focus on each small piece of it in turn.)
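To keep the moving parts in view, here is a minimal sketch of how those pieces fit together, assuming the Qt3D 1.0 QML module; the mesh and texture paths are illustrative, and the tutorial's own files may differ in detail:

    import QtQuick 1.0
    import Qt3D 1.0

    Viewport {
        width: 640; height: 480

        Item3D {
            // Illustrative mesh path; the tutorial ships its own teapot model.
            mesh: Mesh { source: "meshes/teapot.bez" }
            effect: program
        }

        ShaderProgram {
            id: program
            texture: "textures/qtlogo.png"   // illustrative path
            vertexShader: "
                attribute highp vec4 qt_Vertex;
                attribute mediump vec4 qt_MultiTexCoord0;
                uniform mediump mat4 qt_ModelViewProjectionMatrix;
                varying highp vec4 texCoord;
                void main(void)
                {
                    texCoord = qt_MultiTexCoord0;
                    gl_Position = qt_ModelViewProjectionMatrix * qt_Vertex;
                }
            "
            fragmentShader: "
                varying highp vec4 texCoord;
                uniform sampler2D qt_Texture0;
                void main(void)
                {
                    gl_FragColor = texture2D(qt_Texture0, texCoord.st);
                }
            "
        }
    }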
At the highest level, shaders in Qt3D are created using the ShaderProgram element. The ShaderProgram element in this example has the id program, and applying it to the teapot is as simple as assigning it to the effect property of an Item3D derived element. The two most important properties of the ShaderProgram element are the vertexShader and fragmentShader properties, which are strings containing the code for the respective shader programs. Broadly speaking, the vertex shader is run on each vertex in a model (for example, the 8 corners of a cube) to calculate the final position of each point, while the fragment shader is run on each pixel visible onscreen to calculate its color. The other property we use in this tutorial is the texture property. This property is connected to the qt_Texture0 uniform in our shader programs, and we'll cover how it's used shortly.

Texture Coordinates and Textures (shader-tutorial-1-textures.qml)

To start with, there are a couple of obvious problems with our starting point - the Qt logo is upside down and back to front, and it's also boring (before you ask, the reasons for the image being the wrong way around are also boring). First, let's get the logo the right way up. The key here is the texture coordinates. Let's look at their declaration in the vertex shader for a moment:

    attribute mediump vec4 qt_MultiTexCoord0;

The attribute declaration indicates that this is a per-vertex value that we get from our model. Our teapot has a position, a normal, and texture coordinates for each vertex, and Qt3D automatically provides these to us through the qt_Vertex, qt_MultiTexCoord0, and qt_Normal attributes. (We don't care about normals until we get to lighting.) The mediump tag indicates that we want pretty good accuracy on this attribute, but not the highest precision available. It's a hint that lets embedded systems save bandwidth: highp is usually used for positions, lowp is generally only suitable for colors, and the rule of thumb is to use as little precision as you can get away with. vec4 indicates that the value is a 4-element vector. Our texture is 2D, so we only care about the first two elements, but there are 3D textures and cube maps out there.

To fix the image's orientation, we can simply reverse the sign when we pass the texture coordinate to our fragment shader, because the default behaviour is for texture coordinates to wrap around if they are higher than 1.0 or lower than 0.0:

    texCoord = -qt_MultiTexCoord0;

In order to fix the boring, we're going to have to add a bit more QML first. We need to add a property to our ShaderProgram element, and a matching variable to the shader program. The types don't match up exactly between QML and GLSL, but variables with the same name and compatible types are automatically hooked up, so we just add this to the ShaderProgram element:

    property real textureOffsetX : 1.0
    NumberAnimation on textureOffsetX {
        running: true; loops: Animation.Infinite
        from: 0.0; to: 1.0; duration: 1000
    }

and we add this to the vertex shader:

    uniform mediump float textureOffsetX;
The final step is to work our new variable into the program, by changing the texCoord assignment. We'll add our animated variable to our horizontal texture coordinate, which will effectively scroll our texture around our teapot:

    texCoord = vec4(-qt_MultiTexCoord0.s - textureOffsetX,
                    -qt_MultiTexCoord0.t, 0.0, 0.0);

Adding an additional texture is done by adding another uniform. The ShaderProgram element interprets string properties as URIs for texture resources, so adding a second texture is as easy as:

    property string texture2: "textures/basket.jpg"
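Before we wire the second texture into the fragment shader, let's pause and collect the vertex shader as it now stands (assembled from the snippets above; the file in the tutorial may be laid out a little differently):

    attribute highp vec4 qt_Vertex;
    attribute mediump vec4 qt_MultiTexCoord0;
    uniform mediump mat4 qt_ModelViewProjectionMatrix;
    uniform mediump float textureOffsetX;
    varying highp vec4 texCoord;

    void main(void)
    {
        // textureOffsetX animates from 0.0 to 1.0 in QML, scrolling the texture.
        texCoord = vec4(-qt_MultiTexCoord0.s - textureOffsetX,
                        -qt_MultiTexCoord0.t, 0.0, 0.0);
        gl_Position = qt_ModelViewProjectionMatrix * qt_Vertex;
    }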
In order to have a smooth transition back and forth, we'll add and animate a second property to use to blend the two textures:

    property real interpolationFactor : 1.0
    SequentialAnimation on interpolationFactor {
        running: true; loops: Animation.Infinite
        NumberAnimation { from: 0.0; to: 1.0; duration: 2000 }
        PauseAnimation { duration: 500 }
        NumberAnimation { from: 1.0; to: 0.0; duration: 2000 }
        PauseAnimation { duration: 500 }
    }

Next we need to use all of that information in the fragment shader:

    varying highp vec4 texCoord;
    uniform sampler2D qt_Texture0;
    uniform sampler2D texture2;
    uniform mediump float interpolationFactor;

    void main(void)
    {
        mediump vec4 texture1Color = texture2D(qt_Texture0, texCoord.st);
        mediump vec4 texture2Color = texture2D(texture2, texCoord.st);
        mediump vec4 textureColor = mix(texture1Color, texture2Color,
                                        interpolationFactor);
        gl_FragColor = textureColor;
    }

In general, textures need to have the same name in the shaders as in the ShaderProgram element in order to be automatically connected, but qt_Texture0 is special-cased and very slightly faster, so it's good to use it first. mix() is another handy built-in GLSL function that interpolates linearly between two values or vectors. (You can find details of all the built-in functions in the official OpenGL Shading Language specification, available at http://www.khronos.org/opengl/ .)

Finally, let's make one more change to make this example pop. If you're a performance fanatic, it just might have rankled that we padded out our texture coordinates with two zeros, and passed them in for processing on every single pixel of our teapot. Let's make use of that space by putting a second set of coordinates in there to use with our second texture. Let's change our texture coordinate assignment to this:

    texCoord.st = vec2(-qt_MultiTexCoord0.s - textureOffsetX, -qt_MultiTexCoord0.t);
    texCoord.pq = vec2(-qt_MultiTexCoord0.s + textureOffsetX, -qt_MultiTexCoord0.t);

Now the top half of our vector contains coordinates spinning in the opposite direction. Back in the fragment shader, we just need to use these for our second texture color instead, so let's change the texture2Color assignment to this, and really fix that boring:

    mediump vec4 texture2Color = texture2D(texture2, texCoord.pq);

Varying Values (shader-tutorial-varying.qml)

The left-hand side of the texCoord assignments above is our varying variable, texCoord. Varying values are how the vertex shader communicates with the fragment shader, and the declaration has to be identical in both shaders. Varying values are calculated once for each vertex, but the values are interpolated across the shape. shader-tutorial-varying.qml shows this visually using the Pane class and a neat debugging trick - using the texture coordinates as a color. Even with only 4 vertexes, the texture coordinates are smeared smoothly across the shape.
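That smearing trick needs almost no code; a fragment shader along these lines is enough (a sketch of the idea, assuming the vertex shader passes qt_MultiTexCoord0 through without negating it):

    varying highp vec4 texCoord;

    void main(void)
    {
        // Interpret the interpolated texture coordinates as red and green.
        // Vertexes at (0,0), (1,0), (0,1) and (1,1) get pure corner colors;
        // every pixel in between gets a smooth blend of them.
        gl_FragColor = vec4(texCoord.st, 0.0, 1.0);
    }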
Vertexes and Matrices (shader-tutorial-2-vertex-shader.qml)

Let's go back to the vertex shader. As already mentioned, the vertex shader's primary function is to generate the final position of a vertex. First, let's look at qt_Vertex. This is the value we're getting out of our model, the actual points on our shape. Manipulating this value lets us change the position of the points somewhat independently. For this tutorial, we'll create a squashing effect that might work for something rising out of water, or something rubbery being squashed down. What we need to do is create a floor, where vertexes above the floor retain their position, and vertexes below it are clamped to (nearly) that value. Then we move the model relative to this floor to create a nice effect.

We need to use some knowledge that we have about the model for this to work - most notably its height and its bottom. A bit of experimenting suggests that 2.0 for the height and -1.0 for the bottom are close enough:

    const float modelSize = 2.0;
    const float modelBottom = -1.0;
    float newY = max(qt_Vertex.y - squashFactor * modelSize,
                     qt_Vertex.y * 0.01 + modelBottom);
    gl_Position = qt_ModelViewProjectionMatrix * vec4(qt_Vertex.x, newY, qt_Vertex.zw);

(squashFactor is another animated QML property on the ShaderProgram, hooked up to its uniform the same way textureOffsetX was.) We need to introduce a couple more GLSL things here. max() is one of the GLSL built-in functions, selecting the higher of its two arguments as you might expect. Selections like foo.xyzw are called swizzles, and are convenient and efficient ways to pull specific elements out of vector arguments. You can see in this example that we use swizzles to get the x, z, and w values out of our original vector and substitute in our own y value. For convenience, there are 3 equivalent sets of swizzles: foo.xyzw (coordinates), foo.rgba (colors), and foo.stpq (texture coordinates). As far as GLSL is concerned, though, they're all just vectors, so choosing one set over another is effectively just commenting your code. The vec4() function will accept whatever combination of values and swizzles you throw at it, and mold them back into a 4-element vector.

What this code is doing is moving the model down the screen (which, for us here, is along the y axis in the negative direction). We draw an imaginary line where the bottom of the model used to be, and if a vertex ends up below that line, we move it back to just past the line. "Just past" is important: graphics cards get confused if vertexes are too close together, and the result is ugly. Try taking out the "qt_Vertex.y * 0.01" if you'd like to see what it looks like.

Hopefully that makes the function of the qt_Vertex attribute clear, so next we'll look at the model/view/projection matrices. The model matrix is generally used to place an object in a scene. In the simplest case, this might just be "up and to the left a bit", but it's often inherited from a parent. It's very easy mathematically to combine the matrices in such a way that one object is always in the same relative position to another ("the hip bone's connected to the thigh bone"). The view matrix is generally used much like a camera is in a movie, for panning around a whole scene at once. Manipulating the view matrix is also commonly used for effects like mirrors. The projection matrix (qt_ProjectionMatrix) functions much like a camera does when you take a picture, converting all the 3D points in a scene into the 2D points on the screen.
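Incidentally, Qt3D also exposes the stages separately. Assuming the qt_ModelViewMatrix and qt_ProjectionMatrix uniforms are declared (the combined qt_ModelViewProjectionMatrix is just their product), the gl_Position line in the squash shader could equally be written as:

    uniform mediump mat4 qt_ModelViewMatrix;
    uniform mediump mat4 qt_ProjectionMatrix;

    // Same result as the single qt_ModelViewProjectionMatrix multiply:
    // model/view transformation first, then projection.
    gl_Position = qt_ProjectionMatrix *
                  (qt_ModelViewMatrix * vec4(qt_Vertex.x, newY, qt_Vertex.zw));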
We won't be exploring the matrices individually, but let's explore what happens if we use the same effect we just used on the vertex, this time after the transformation matrices are applied. Firstly, we're going to need multiple teapots to see the difference, so let's add those in. We want to move them all together, so we'll wrap them in an Item3D, and apply our animations to that item instead of to the teapots directly:

    Item3D {
        z: -8.0
        transform: [
            Rotation3D {
                NumberAnimation on angle {
                    running: true; loops: Animation.Infinite
                    from: 0; to: 360; duration: 5000
                }
                axis: Qt.vector3d(0, 0, 1.0)
            }
        ]
        TutorialTeapot { id: teapot1; effect: program; y: 2.0; x: 0.0 }
        TutorialTeapot { id: teapot2; effect: program; y: -1.0; x: -1.732 }
        TutorialTeapot { id: teapot3; effect: program; y: -1.0; x: 1.732 }
    }

In order to show the difference, what we want to do now is the same sort of effect as the previous example, only applied to the positions after the matrices have been applied, so our new vertex shader looks like this:

    attribute highp vec4 qt_Vertex;
    uniform mediump mat4 qt_ModelViewProjectionMatrix;
    attribute mediump vec4 qt_MultiTexCoord0;
    varying mediump vec4 texCoord;

    void main(void)
    {
        const float modelBottom = -4.0;

        vec4 workingPosition = qt_ModelViewProjectionMatrix * qt_Vertex;
        float newY = max(workingPosition.y, workingPosition.y * 0.15 + modelBottom);
        workingPosition.y = newY;
        gl_Position = workingPosition;

        texCoord = -qt_MultiTexCoord0;
    }

There's nothing new here; we're just tweaking a few numbers for the new effect and manipulating the vertexes after the matrices have been applied. The result is an imaginary line across the whole scene, and when any part of any teapot dips below that line, we deform it as though it's being squished or refracted. The obvious difference is that when you're manipulating qt_Vertex, the inputs, outputs, and changes are relative to the model; after the matrices are applied, they're relative to the whole scene. We'll leave adding a pretty watery surface as an exercise for the reader.

Lighting

Finally, we'll add lighting. We've left lighting till last because it requires a lot of additional variables, and it is not within the scope of this tutorial to explore them all individually. There are many better resources readily available, and the techniques already covered can be used to explore each element. For further reading, the full specification for the OpenGL Shading Language is available from the Khronos website at http://www.khronos.org/opengl/
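Still, to give a flavour of what a hand-written lighting shader involves, here is a minimal diffuse-only sketch. The hard-coded light direction and the exact qt_Normal/qt_NormalMatrix declarations are assumptions for illustration, not a recipe from the tutorial files. The vertex shader:

    attribute highp vec4 qt_Vertex;
    attribute highp vec3 qt_Normal;
    uniform mediump mat4 qt_ModelViewProjectionMatrix;
    uniform mediump mat3 qt_NormalMatrix;
    varying mediump float lightIntensity;

    void main(void)
    {
        // A fixed directional light in eye space, purely for illustration.
        vec3 lightDir = normalize(vec3(0.0, 0.3, 1.0));
        // Bring the model-space normal into eye space.
        vec3 normal = normalize(qt_NormalMatrix * qt_Normal);
        // Simple Lambertian (diffuse) term.
        lightIntensity = max(dot(normal, lightDir), 0.0);
        gl_Position = qt_ModelViewProjectionMatrix * qt_Vertex;
    }

and the matching fragment shader:

    varying mediump float lightIntensity;

    void main(void)
    {
        // Modulate a plain white base color by the diffuse term.
        gl_FragColor = vec4(vec3(lightIntensity), 1.0);
    }

Swapping the white base color for a texture2D() lookup combines this with everything covered above.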