In this article the ShaderEffectItem element will be used together with the ShaderEffectSource element. The plugin lives at git://gitorious.org/qt-labs/qml1-shadersplugin.git and is fetched with:

```shell
git clone git://gitorious.org/qt-labs/qml1-shadersplugin.git
```

To build it, the project file needs both modules:

```
QT += declarative opengl
```

If everything compiles, there should be no problems during installation.

A few words about GLSL. A fragment (pixel) shader operates on the attributes of a single pixel: color (.r, .g, .b, .a), depth, texture coordinates (.x, .y, .z, .w or, equivalently, .s, .t, .p, .q). The entry point to a shader is the void main() function; if a program uses both types of shaders, there are two such entry points, one per shader. Global variables are initialized before main() is entered. GLSL defines special kinds of variables:

- uniform - connects the shader with external data (in the case of QML, these will be element properties); note that such variables are read-only in the shader;
- varying - needed to connect the fragment shader with the vertex shader, that is, to transfer data from the vertex shader to the fragment shader; the vertex shader may change them, while in the fragment shader they are read-only;
- attribute - per-vertex input data, available in the vertex shader;
- sampler2D - one of the GLSL types representing a texture (there are also sampler1D, sampler3D, samplerCube, sampler1DShadow, sampler2DShadow);
- vec4 texture2D(sampler2D s, vec2 coord) - a function that reads a texel from texture s at texture coordinates coord;
- gl_FragColor - the vector into which the final pixel color is written; it is available only in the fragment shader.

One more preparatory step: in the main.cpp generated by Qt Creator (Qt Quick Application wizard), the viewer is an instance of the QmlApplicationViewer class, which inherits QDeclarativeView. To render the scene through OpenGL, a QGLWidget is set as its viewport:

```cpp
QmlApplicationViewer viewer;
...
QGLWidget* glWidget = new QGLWidget(format);
...
viewer.setViewport(glWidget);
...
```

After each example I will provide a link to the full source code.

The ShaderEffectItem
element allows you to change how various QML elements are drawn on screen, using OpenGL mechanisms. It is available in the Qt.labs.shaders 1.0 module (since it is still under development), but you can already try it. For the vertex and fragment shader code it defines the string properties vertexShader and fragmentShader, respectively.

ShaderEffectSource is required to specify which QML component will be available inside the shader. We will mainly use its sourceItem and hideSource properties: the first points to a specific QML element (by its identifier) that the shaders will affect, and hideSource "says" that the original element is hidden while the shader effect is applied.

You can use ShaderEffectItems as sources for other ShaderEffectItems, but you should not declare a ShaderEffectItem as a child of the element set as its source, since this will most likely cause a drawing loop.

Custom properties (property) can be declared on the ShaderEffectItem, and they also become available as variables in the shader programs. This happens automatically when the names match and the variable in the shader is declared with the uniform qualifier already mentioned - the so-called binding. This will become immediately clear once we get to the examples.

The Rectangle element already has a gradient
property. Nevertheless, in the example below I want to show how the same gradient can be achieved with the shader mechanism. A Rectangle element with dimensions 360 by 360 is created, and a ShaderEffectItem element is added as its child with the anchors.fill property set to parent. Thus we say that the shader "covers" the entire parent element. The code is presented below:

```qml
import QtQuick 1.0
import Qt.labs.shaders 1.0

Rectangle {
    width: 360
    height: 360
    ShaderEffectItem {
        anchors.fill: parent
        fragmentShader: "
        varying highp vec2 qt_TexCoord0;
        void main(void) {
            lowp vec4 c0 = vec4( 1.0, 1.0, 1.0, 1.0 );
            lowp vec4 c1 = vec4( 1.0, 0.0, 0.0, 1.0 );
            gl_FragColor = mix( c0, c1, qt_TexCoord0.y );
        }
        "
    }
}
```

Let us take a closer look at the
fragmentShader property - it contains the text of the fragment shader program. First comes the declaration varying highp vec2 qt_TexCoord0: we receive this variable from the vertex shader; although we did not define a vertex shader ourselves, it has a default implementation, and we can take the data from there. As far as I understand, qt_TexCoord0 holds the texture coordinates of the scene as a whole (I will be glad if someone corrects me and says what it is properly called in computer-graphics terms). Now let us turn to the main function. In it we define two vectors: c0, containing white (colors are represented as RGBA), and c1, red. Then we assign to the output vector gl_FragColor the value computed for each pixel by the mix function, which linearly interpolates between two values:

mix( vec4 x, vec4 y, float a ) - expressed by the formula x * ( 1.0 - a ) + y * a

The a here is the .y component of the texture-coordinate vector, i.e. the coordinate along the y axis. Accordingly, the output will be as follows.

Since qt_TexCoord0.y
represents the coordinate along the y axis, the gradient runs from top to bottom. If, for example, we want a gradient from left to right instead, we need to replace the line

```glsl
gl_FragColor = mix( c0, c1, qt_TexCoord0.y );
```

with

```glsl
gl_FragColor = mix( c0, c1, qt_TexCoord0.x );
```

where .x means the coordinate along the x axis. And if we simply want to paint everything red, without any gradient, the code becomes the following (here absolutely every pixel is painted red):

```glsl
void main(void) {
    gl_FragColor = vec4( 1.0, 0.0, 0.0, 1.0 );
}
```
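The effect of mix() is easy to sanity-check outside GLSL. Below is a small Python sketch of the same per-channel formula x * ( 1.0 - a ) + y * a, purely illustrative (GLSL applies it componentwise to vec4 values):

```python
def mix(x, y, a):
    """GLSL-style mix(): per-component linear interpolation x * (1.0 - a) + y * a."""
    return [xi * (1.0 - a) + yi * a for xi, yi in zip(x, y)]

c0 = [1.0, 1.0, 1.0, 1.0]  # white, as RGBA
c1 = [1.0, 0.0, 0.0, 1.0]  # red

print(mix(c0, c1, 0.0))  # a = qt_TexCoord0.y at the top edge -> pure white
print(mix(c0, c1, 1.0))  # bottom edge -> pure red
print(mix(c0, c1, 0.5))  # middle of the item -> half-blend
```

At a = 0.5 every channel is averaged, which gives the pinkish middle band of the gradient.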
Instead of x and y, you can use the texture coordinates s and t respectively; the result will be the same. Source code is available here.

The next example animates an image of a planet. Here ShaderEffectSource is needed
to define a property in the Item element, for example under the name source. In the shader we then declare uniform lowp sampler2D source; thereby binding our texture (the image of the planet) to the shader code and gaining the ability to change its rendering. To create any animation, some data has to change over time; for this I will use the QML PropertyAnimation element. And what data do we need to change? Here I want to show how to replace the data of one pixel with the data of another and thereby obtain an animation effect. That is, we have a pixel with texture coordinates x, y (and its color data), and we substitute some neighboring pixel (with its own color data) instead, choosing it by an increment obtained from some function - let it be sin. Therefore, as the changing data it is convenient to have an angle from 0 to 360 degrees. Thus, if you look at the code below, the PropertyAnimation changes the angle property from 0.0 to 360.0:

```qml
import QtQuick 1.0
import Qt.labs.shaders 1.0

Item {
    width: img.width
    height: img.height
    Image {
        id: img
        source: "images/space.jpg"
    }
    ShaderEffectItem {
        property variant source: ShaderEffectSource { sourceItem: img; hideSource: true }
        anchors.fill: img
        fragmentShader: "
        varying highp vec2 qt_TexCoord0;
        uniform lowp sampler2D source;
        uniform highp float angle;
        void main() {
            highp float wave = 0.01;
            highp float wave_x = qt_TexCoord0.x + wave * sin( radians( angle + qt_TexCoord0.x * 360.0 ) );
            highp float wave_y = qt_TexCoord0.y + wave * sin( radians( angle + qt_TexCoord0.y * 360.0 ) );
            highp vec4 texpixel = texture2D( source, vec2( wave_x, wave_y ) );
            gl_FragColor = texpixel;
        }"
        property real angle: 0.0
        PropertyAnimation on angle {
            to: 360.0
            duration: 800
            loops: Animation.Infinite
        }
    }
}
```
The amplitude of the oscillation is set by the line highp float wave = 0.01. Why the radians function is needed, I think, requires no explanation. But if we substituted just the angle value, the picture would simply shift from side to side, while we want something more spectacular - a "beating". Texture coordinates range from 0 to 1, so for each pixel the sin argument effectively gets its own offset of up to 360 degrees on top of angle. Into wave_x and wave_y I write the coordinates of a pixel from some neighborhood, displaced along the x axis and the y axis respectively. With texture2D( source, vec2( wave_x, wave_y ) ) we read the values of that new pixel and write them into the already familiar gl_FragColor.
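To see why this produces a ripple rather than a uniform shift, here is the displacement formula transplanted into plain Python (an illustrative sketch, not part of the QML example):

```python
import math

WAVE = 0.01  # same amplitude as in the shader

def displaced(coord, angle):
    """Counterpart of wave_x / wave_y: offset a texture coordinate (0..1) by a sine term."""
    return coord + WAVE * math.sin(math.radians(angle + coord * 360.0))

# Each coordinate contributes its own phase (coord * 360), so at any fixed
# angle different pixels are shifted by different amounts - hence the "beating".
for coord in (0.0, 0.25, 0.5):
    print(coord, "->", displaced(coord, 90.0))
```

As the animation sweeps angle from 0 to 360, every pixel's offset oscillates with its own phase, which reads on screen as a wavy distortion.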
The next example is a button component, Button.qml, whose description I extended with shader handling. The fragment shader code is almost the same as in the example above; I only slightly increased the oscillation amplitude: wave = 0.02
```qml
import QtQuick 1.0
import Qt.labs.shaders 1.0

Item {
    width: but.width
    height: but.height
    property alias text: textItem.text
    Rectangle {
        id: but
        width: 130; height: 40
        border.width: 1
        radius: 5
        smooth: true
        gradient: Gradient {
            GradientStop { position: 0.0; color: "darkGray" }
            GradientStop { position: 0.5; color: "black" }
            GradientStop { position: 1.0; color: "darkGray" }
        }
        Text {
            id: textItem
            anchors.centerIn: parent
            font.pointSize: 20
            color: "white"
        }
        MouseArea {
            property bool ent: false
            id: moousearea
            anchors.fill: parent
            onEntered: { ent = true }
            onExited: {
                ent = false
                effect.angle = 0.0
            }
            hoverEnabled: true
        }
    }
    ShaderEffectItem {
        id: effect
        property variant source: ShaderEffectSource { sourceItem: but; hideSource: true }
        anchors.fill: but
        property real angle: 0.0
        PropertyAnimation on angle {
            id: prop1
            to: 360.0
            duration: 800
            loops: Animation.Infinite
            running: moousearea.ent
        }
        fragmentShader: "
        varying highp vec2 qt_TexCoord0;
        uniform lowp sampler2D source;
        uniform highp float angle;
        void main() {
            highp float wave = 0.02;
            highp float wave_x = qt_TexCoord0.x + wave * sin( radians( angle + qt_TexCoord0.x * 360.0 ) );
            highp float wave_y = qt_TexCoord0.y + wave * sin( radians( angle + qt_TexCoord0.y * 360.0 ) );
            highp vec4 texpixel = texture2D( source, vec2( wave_x, wave_y ) );
            gl_FragColor = texpixel;
        }"
    }
}
```
Such buttons can then be used like any other component, for example in a column:

```qml
import QtQuick 1.0
import Qt.labs.shaders 1.0

Item {
    width: 150
    height: 190
    Column {
        anchors.horizontalCenter: parent.horizontalCenter
        Button { text: "Apple" }
        Button { text: "Red" }
        Button { text: "Green" }
        Button { text: "Blue" }
    }
}
```
In the onExited handler it is necessary to reset the angle property of the effect element to 0.0, otherwise the angle used to pick the neighboring pixel would start not from 0 but from its last value, which is not quite what we expect. The result is this effect.

The next example turns the part of an image under the mouse cursor into grayscale. The following properties are declared in the ShaderEffectItem element:

property real xPos: 65.0
property real yPos: 65.0
property real radius: 50

A MouseArea element with a handler for the onPositionChanged event is defined to track the mouse movement. Below is the source code, with further explanation after it:

```qml
Rectangle {
    width: img.width
    height: img.height
    Image {
        id: img
        source: "images/nature.jpg"
    }
    ShaderEffectItem {
        id: effect
        anchors.fill: parent
        MouseArea {
            id: coords
            anchors.fill: parent
            onPositionChanged: {
                effect.xPos = mouse.x
                effect.yPos = coords.height - mouse.y
            }
        }
        property real xPos: 65.0
        property real yPos: 65.0
        property real radius: 50
        property int widthImage: img.width
        property int heightImage: img.height
        property variant source: ShaderEffectSource { sourceItem: img; hideSource: true }
        fragmentShader: "
        varying highp vec2 qt_TexCoord0;
        uniform highp float xPos;
        uniform highp float yPos;
        uniform highp float radius;
        uniform highp int widthImage;
        uniform highp int heightImage;
        uniform sampler2D source;
        void main(void) {
            // pixel position in image coordinates (computed inside main():
            // a global initialized from a varying is not valid GLSL)
            highp vec2 pixcoords = qt_TexCoord0.st * vec2( widthImage, heightImage );
            lowp vec4 texColor = texture2D( source, qt_TexCoord0.st );
            lowp float gray = dot( texColor, vec4( 0.6, 0.5, 0.1, 0.0 ) );
            if ( ( pow( ( xPos - pixcoords.x ), 2.0 ) + pow( ( yPos - pixcoords.y ), 2.0 ) ) < pow( radius, 2.0 ) ) {
                gl_FragColor = vec4( gray, gray, gray, texColor.a );
            } else {
                gl_FragColor = texture2D( source, qt_TexCoord0 );
            }
        }"
    }
}
```
In the shader we check whether the pixel with coordinates ( pixcoords.x; pixcoords.y ) falls into the circle with center at ( xPos; yPos ) and radius radius (the dot function computes the scalar product, here a weighted sum of the color channels that gives the gray value). If it does not, the pixel is left unchanged. Once again you can see how the properties of a QML element are bound to shader program variables - they have the same names and equivalent types: real is equivalent to highp float.
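The two pieces of per-pixel arithmetic - the inside-the-circle test and the dot-product desaturation - can be mirrored in Python for a quick check (an illustrative sketch; the sample pixel values are made up):

```python
def inside_circle(px, py, cx, cy, r):
    """The shader's condition: (cx - px)^2 + (cy - py)^2 < r^2."""
    return (cx - px) ** 2 + (cy - py) ** 2 < r ** 2

def to_gray(texcolor):
    """dot(texColor, vec4(0.6, 0.5, 0.1, 0.0)): a weighted sum of the RGB channels."""
    weights = (0.6, 0.5, 0.1, 0.0)
    return sum(c * w for c, w in zip(texcolor, weights))

# A pixel 30 px to the right of the default center (65, 65) lies inside
# the radius-50 circle, so its color collapses to a single gray value.
print(inside_circle(95.0, 65.0, 65.0, 65.0, 50.0))
# Pure red desaturates to 0.6 with these weights.
print(to_gray((1.0, 0.0, 0.0, 1.0)))
```

Note that the alpha weight is 0.0, so transparency does not influence the computed gray level.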
In the last example, two images will be defined in the ShaderEffectItem as ShaderEffectSource elements, under the names texture0 and texture1. In the fragment shader code these two images arrive as two textures, uniform sampler2D texture0 and uniform sampler2D texture1. In the variables s1 and s2 we get the texel of the first image and of the second, respectively, for each pixel, as shown in the code below:

```qml
import QtQuick 1.0
import Qt.labs.shaders 1.0

Rectangle {
    width: coffee.width
    height: coffee.height
    Image {
        id: coffee
        source: "images/coffee.jpg"
    }
    Image {
        id: granules
        source: "images/granules.jpg"
    }
    ShaderEffectItem {
        anchors.fill: parent
        id: effect
        property variant texture0: ShaderEffectSource { sourceItem: coffee; hideSource: true }
        property variant texture1: ShaderEffectSource { sourceItem: granules; hideSource: true }
        fragmentShader: "
        varying highp vec2 qt_TexCoord0;
        uniform sampler2D texture0;
        uniform sampler2D texture1;
        void main(void) {
            vec4 s1 = texture2D( texture0, qt_TexCoord0.st );
            vec4 s2 = texture2D( texture1, qt_TexCoord0.st );
            gl_FragColor = mix( vec4( s1.r, s1.g, s1.b, 1.0 ),
                                vec4( s2.r * 0.6, s2.g * 0.6, s2.b * 0.6, 0.4 ),
                                0.35 );
        }"
    }
}
```
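The blend that this shader performs for each pixel can again be traced in Python. The texel values below are made up purely for illustration; only the 0.6 dimming factor and the 0.35 mix weight come from the shader:

```python
def mix(x, y, a):
    """GLSL mix(): per-component x * (1.0 - a) + y * a."""
    return [xi * (1.0 - a) + yi * a for xi, yi in zip(x, y)]

s1 = [0.4, 0.3, 0.2, 1.0]  # hypothetical texel from texture0 (the coffee photo)
s2 = [1.0, 0.5, 0.0, 1.0]  # hypothetical texel from texture1 (the granules)

# dim s2 by 0.6 per channel, set its alpha to 0.4, then blend with weight 0.35
dimmed = [s2[0] * 0.6, s2[1] * 0.6, s2[2] * 0.6, 0.4]
result = mix([s1[0], s1[1], s1[2], 1.0], dimmed, 0.35)
print(result)
```

Because the mix weight is only 0.35 and s2 is dimmed beforehand, the first texture dominates the final color, which is exactly why the granules read as a background.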
Into gl_FragColor we write the already familiar (from the creation of the gradient) result of linearly interpolating two vectors containing the pixel color values. In addition, each color channel of texture s2 (the coffee granules) is multiplied by 0.6, since we need it as a background. As a result, we have the following.

Source: https://habr.com/ru/post/133828/