Overview
This example shows how a ShaderEffect can be used to apply postprocessing effects, expressed in GLSL, to QML VideoOutput items.
It also shows how native code can be combined with QML to implement more advanced functionality - in this case, C++ code is used to calculate the QML frame rate. This value is rendered in QML in a semi-transparent item overlaid on the video content.
Finally, this application demonstrates the use of different top-level QML files to handle different physical screen sizes. On small-screen devices, menus are hidden by default and only appear when summoned by a gesture. Large-screen devices show a more traditional layout, in which menus are displayed around the video content pane.
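The launcher code which selects between the top-level files is not reproduced on this page, but the idea can be sketched as follows. The file names and the screen-size heuristic below are illustrative assumptions, not taken from the example; the example itself uses a QmlApplicationViewer (shown later on this page), while plain QQuickView is used here for brevity:

#include <QGuiApplication>
#include <QQuickView>
#include <QScreen>
#include <QUrl>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Hypothetical heuristic: treat anything narrower than 90 mm
    // as a small-screen (handheld) device.
    const bool smallScreen =
            app.primaryScreen()->physicalSize().width() < 90.0;

    QQuickView viewer;
    // Hypothetical file names; the example ships its own pair of
    // top-level QML files.
    viewer.setSource(QUrl(smallScreen
            ? "qrc:///qml/main-smallscreen.qml"
            : "qrc:///qml/main-largescreen.qml"));
    viewer.show();

    return app.exec();
}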
The following screenshots show shader effects being applied. In each case, the effect is implemented using a fragment shader.
Here we see an edge detection algorithm being applied to a video clip (Elephant's Dream from blender.org).
This image shows a page curl effect, applied to the same video clip.
Here we see a 'glow' effect (edge detection plus colour quantization) being applied to the camera viewfinder.
This image shows a 'lens magnification' effect applied to the viewfinder.
The application includes many more effects than the ones shown here - look for the Effect*.qml files among the example's source files to see the full range.
Application structure
Shader effects can be applied to video or viewfinder content using the ShaderEffect type, as shown in the following example, which applies a wiggly effect to the content:
import QtQuick 2.0
import QtMultimedia 5.0

Rectangle {
    width: 300
    height: 300
    color: "black"

    MediaPlayer {
        id: mediaPlayer
        source: "test.mp4"
        autoPlay: true
    }

    VideoOutput {
        id: video
        anchors.fill: parent
        source: mediaPlayer
    }

    ShaderEffect {
        property variant source: ShaderEffectSource { sourceItem: video; hideSource: true }
        property real wiggleAmount: 0.005
        anchors.fill: video

        fragmentShader: "
            varying highp vec2 qt_TexCoord0;
            uniform sampler2D source;
            uniform highp float wiggleAmount;
            void main(void)
            {
                highp vec2 wiggledTexCoord = qt_TexCoord0;
                wiggledTexCoord.s += sin(4.0 * 3.141592653589 * wiggledTexCoord.t) * wiggleAmount;
                gl_FragColor = texture2D(source, wiggledTexCoord.st);
            }
        "
    }
}
In this application, the usage of the ShaderEffect and VideoOutput types is a bit more complicated, for the following reasons:
- Each effect can be applied to either a VideoOutput or an Image item, so the type of the source item must be abstracted away from the effect implementation
- For some effects (such as the edge detection and glow examples shown in the screenshots above), the transformation is applied only to pixels to the left of a dividing line - this allows the effect to be easily compared with the untransformed image on the right
- Most effects have one or more parameters which can be modified by the user - these are controlled by sliders in the UI, which are bound to uniform values passed into the GLSL code (a minimal sketch of this binding follows the list)
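To illustrate the third point, here is a self-contained sketch of a slider driving a shader uniform, reusing the wiggle shader from above. The Qt Quick Controls Slider and the gradient stand-in for video content are assumptions made for brevity; the example implements its own slider and panel items:

import QtQuick 2.0
import QtQuick.Controls 1.0

Column {
    width: 300

    Item {
        width: 300; height: 280

        // Any source item works here; a gradient stands in for video.
        Rectangle {
            id: sourceItem
            anchors.fill: parent
            gradient: Gradient {
                GradientStop { position: 0.0; color: "steelblue" }
                GradientStop { position: 1.0; color: "black" }
            }
        }

        ShaderEffect {
            anchors.fill: parent
            property variant source: ShaderEffectSource { sourceItem: sourceItem; hideSource: true }
            // The property name matches the GLSL uniform, so Qt Quick
            // uploads the slider value automatically whenever it changes.
            property real wiggleAmount: slider.value
            fragmentShader: "
                varying highp vec2 qt_TexCoord0;
                uniform sampler2D source;
                uniform lowp float qt_Opacity;
                uniform highp float wiggleAmount;
                void main() {
                    highp vec2 tc = qt_TexCoord0;
                    tc.s += sin(4.0 * 3.141592653589 * tc.t) * wiggleAmount;
                    gl_FragColor = texture2D(source, tc) * qt_Opacity;
                }"
        }
    }

    Slider {
        id: slider
        width: parent.width
        minimumValue: 0.0
        maximumValue: 0.02
        value: 0.005    // same default as the wiggle example above
    }
}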
The abstraction of the source item type is achieved by the Content type, which uses a Loader to create a MediaPlayer, a Camera, or an Image:
import QtQuick 2.0

Rectangle {
    ...

    Loader {
        id: contentLoader
    }

    ...

    function openImage(path) {
        console.log("[qmlvideofx] Content.openImage \"" + path + "\"")
        stop()
        contentLoader.source = "ContentImage.qml"
        contentLoader.item.source = path
    }

    function openVideo(path) {
        console.log("[qmlvideofx] Content.openVideo \"" + path + "\"")
        stop()
        contentLoader.source = "ContentVideo.qml"
        contentLoader.item.mediaSource = path
    }

    function openCamera() {
        console.log("[qmlvideofx] Content.openCamera")
        stop()
        contentLoader.source = "ContentCamera.qml"
    }
}
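ContentVideo.qml itself is not reproduced on this page; a minimal sketch of what such a loadable item could look like, assuming the mediaSource alias that openVideo() above assigns to:

import QtQuick 2.0
import QtMultimedia 5.0

// Sketch only: a video item suitable for loading via the Loader above.
VideoOutput {
    property alias mediaSource: mediaPlayer.source

    source: mediaPlayer

    MediaPlayer {
        id: mediaPlayer
        autoPlay: true
    }
}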
Each effect is implemented as a QML item based on the Effect type, which in turn is based on the ShaderEffect type:
import QtQuick 2.0

ShaderEffect {
    property variant source
    property ListModel parameters: ListModel { }
    property bool divider: true
    property real dividerValue: 0.5
    property real targetWidth: 0
    property real targetHeight: 0
    property string fragmentShaderFilename
    property string vertexShaderFilename

    QtObject {
        id: d
        // Prefix prepended to every fragment shader so that precision
        // qualifiers compile on both OpenGL ES and desktop OpenGL.
        property string fragmentShaderCommon: "
            #ifdef GL_ES
                precision mediump float;
            #else
            #   define lowp
            #   define mediump
            #   define highp
            #endif // GL_ES
        "
    }

    onFragmentShaderFilenameChanged:
        fragmentShader = d.fragmentShaderCommon + fileReader.readFile(fragmentShaderFilename)
    onVertexShaderFilenameChanged:
        vertexShader = fileReader.readFile(vertexShaderFilename)
}
The interface of the Effect type allows derived effects to specify the number of parameters they support (and therefore the number of sliders that should be displayed), and whether a vertical dividing line should be drawn between the transformed and untransformed image regions. As an example, here is the implementation of the pixelation effect: it supports one parameter, which controls the pixelation granularity, and states that the divider should be displayed.
import QtQuick 2.0

Effect {
    parameters: ListModel {
        ListElement {
            name: "granularity"
            value: 0.5
        }
    }

    property real granularity: parameters.get(0).value * 20

    fragmentShaderFilename: "shaders/pixelate.fsh"
}
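The shader file shaders/pixelate.fsh ships with the example and is not reproduced here. The following is a hypothetical sketch of a pixelation shader written against the uniforms declared by Effect (source, divider, dividerValue, targetWidth, targetHeight) plus the granularity parameter above; precision handling is supplied by the fragmentShaderCommon prefix shown earlier:

// Hypothetical sketch, not the example's actual pixelate.fsh.
uniform sampler2D source;
uniform lowp float qt_Opacity;
uniform highp float granularity;
uniform highp float targetWidth;
uniform highp float targetHeight;
uniform bool divider;
uniform highp float dividerValue;
varying highp vec2 qt_TexCoord0;

void main()
{
    highp vec2 uv = qt_TexCoord0;
    // Pixelate only to the left of the divider (or everywhere when
    // the divider is hidden).
    if (granularity > 0.0 && (!divider || uv.x < dividerValue)) {
        highp vec2 cell = vec2(granularity / targetWidth,
                               granularity / targetHeight);
        // Snap the texture coordinate to the centre of its grid cell.
        uv = cell * floor(uv / cell) + 0.5 * cell;
    }
    gl_FragColor = qt_Opacity * texture2D(source, uv);
}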
The main.qml file shows a FileOpen item, which allows the user to select the input source, and an EffectSelectionPanel item, which lists each of the available shader effects. As described above, a Content item is used to load the appropriate input and effect type. A Divider item draws the vertical dividing line, which can be dragged left and right by the user. Finally, a ParameterPanel item renders the sliders corresponding to each effect parameter.
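A heavily simplified sketch of how these items might be assembled in main.qml; the signal and property names used in the handlers below are illustrative assumptions, not the example's actual interfaces:

import QtQuick 2.0

Rectangle {
    id: root
    width: 600
    height: 800

    Content {
        id: content
        anchors.fill: parent
    }

    FileOpen {
        id: fileOpen
        // Hypothetical signals; the real item's interface differs.
        onVideoSelected: content.openVideo(path)
        onImageSelected: content.openImage(path)
        onCameraSelected: content.openCamera()
    }

    EffectSelectionPanel {
        id: effectSelectionPanel
        // Hypothetical: hand the chosen Effect*.qml file to Content.
        onEffectSourceChanged: content.effectSource = effectSource
    }

    ParameterPanel {
        id: parameterPanel
    }
}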
Here is the source selection menu:
And here is the effect selection menu:
Calculating and displaying QML painting rate
The QML painting rate is calculated by the FrequencyMonitor class, which turns a stream of events, received via the notify() slot, into an instantaneous and an averaged frequency:
class FrequencyMonitor : public QObject
{
    Q_OBJECT
    Q_PROPERTY(qreal instantaneousFrequency READ instantaneousFrequency NOTIFY instantaneousFrequencyChanged)
    Q_PROPERTY(qreal averageFrequency READ averageFrequency NOTIFY averageFrequencyChanged)

public:
    ...
    static void qmlRegisterType();

public slots:
    Q_INVOKABLE void notify();
};
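The body of notify() is not shown on this page. A sketch of one way to derive the two frequencies, assuming private members m_timer (a QElapsedTimer), m_instantaneousFrequency and m_averageFrequency, and assuming the NOTIFY signals carry the new value; the example's real implementation differs in detail:

#include <QElapsedTimer>

void FrequencyMonitor::notify()
{
    if (!m_timer.isValid()) {
        m_timer.start();    // first event: just start timing
        return;
    }
    const qint64 elapsedMs = m_timer.restart();
    if (elapsedMs <= 0)
        return;
    m_instantaneousFrequency = 1000.0 / elapsedMs;
    // Exponential moving average: blend 5% of each new sample into
    // the running average.
    m_averageFrequency = 0.95 * m_averageFrequency
                       + 0.05 * m_instantaneousFrequency;
    emit instantaneousFrequencyChanged(m_instantaneousFrequency);
    emit averageFrequencyChanged(m_averageFrequency);
}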
The FrequencyMonitor class is exposed to QML like this:
void FrequencyMonitor::qmlRegisterType()
{
    ::qmlRegisterType<FrequencyMonitor>("FrequencyMonitor", 1, 0, "FrequencyMonitor");
}
and its data is displayed by defining a QML item called FrequencyItem, like this:
import QtQuick 2.0
import FrequencyMonitor 1.0

Rectangle {
    id: root
    ...

    function notify() {
        monitor.notify()
    }

    FrequencyMonitor {
        id: monitor
        onAverageFrequencyChanged: {
            averageFrequencyText.text = monitor.averageFrequency.toFixed(2)
        }
    }

    Text {
        id: labelText
        anchors {
            left: parent.left
            top: parent.top
            margins: 10
        }
        color: root.textColor
        font.pixelSize: 0.6 * root.textSize
        text: root.label
        width: root.width - 2 * anchors.margins
        elide: Text.ElideRight
    }

    Text {
        id: averageFrequencyText
        anchors {
            right: parent.right
            bottom: parent.bottom
            margins: 10
        }
        color: root.textColor
        font.pixelSize: root.textSize
    }
}
The result looks like this:
All that remains is to connect the afterRendering() signal of the QQuickView object to a JavaScript function, which will eventually call frequencyItem.notify():
QmlApplicationViewer viewer;
...
QQuickItem *rootObject = viewer.rootObject();
...
QObject::connect(&viewer, SIGNAL(afterRendering()),
                 rootObject, SLOT(qmlFramePainted()));
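On the QML side, the root item must expose a matching qmlFramePainted() function; functions declared on a QML item are callable through its meta-object, which is what makes the SIGNAL/SLOT connection above work. A minimal sketch, assuming the FrequencyItem is instantiated with the id frequencyItem:

import QtQuick 2.0

Rectangle {
    id: root

    FrequencyItem {
        id: frequencyItem
        anchors { top: parent.top; right: parent.right }
    }

    // Invoked via the connection shown above each time the scene
    // graph finishes rendering a frame.
    function qmlFramePainted() {
        frequencyItem.notify()
    }
}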