
AndEngine: an arbitrary landscape with a texture

We started writing a game for Android and, while working with AndEngine (for those who don't know, it is the most popular 2D graphics engine for Android), ran into the following task: there is a set of connected lines that represents the landscape (you can read about how to generate one here - gameprogrammer.com/fractal.html ). It looked like this:



[image: the generated landscape as a chain of connected lines]



But we don't need a “bridge”, we need a surface, and a textured one at that, so that it would look like this:




[image: the desired textured surface]



We started digging into AndEngine and it turned out that it can only work with textures in the form of sprites made of two triangles. That does not suit us at all: we do not know the size of the landscape in advance, so a 1:1 mapping of UV coordinates will not do, and besides, our surface is not a sprite but a non-convex polygon. So we had to build our own bicycle, because googling turned up nothing sensible for the main AndEngine branch. Fortunately, the engine has reasonable class interfaces and everything is logical; you just have to figure it out. We need our own class with a vertex buffer for the triangles and the UV coordinates that correspond to them. I will say right away that I am not going to explain why some functions are not overridden or why certain things are done in a particular way. In places AndEngine is a real architectural thicket, and I simply left things in the state in which they worked, because walking through the entire engine would take ten articles like this one and half a year of my life.

Let's go…



First, let's agree that you already have a list containing all the lines that make up the surface - the same “bridge” shown in the first screenshot.
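
The article never shows how those lines are stored, so purely for reference, here is a minimal, assumed sketch of what the Section and Line containers could look like. It is a guess based only on the accessors used later in buildVertexBuffer() (getX1(), getY1(), the .line field, and so on); the real classes in the project may well differ.

import java.util.ArrayList;
import java.util.List;

// Hypothetical data layout; only the members used later matter.
public class Line {
    private final float x1, y1, x2, y2;

    public Line(float x1, float y1, float x2, float y2) {
        this.x1 = x1; this.y1 = y1;
        this.x2 = x2; this.y2 = y2;
    }

    public float getX1() { return x1; }
    public float getY1() { return y1; }
    public float getX2() { return x2; }
    public float getY2() { return y2; }
}

// Later code accesses lines as sections.get(i).lines.get(j).line,
// so each list element is assumed to wrap a Line in a public field.
public class LineHolder {
    public Line line;

    public LineHolder(Line line) { this.line = line; }
}

public class Section {
    public List<LineHolder> lines = new ArrayList<LineHolder>();
}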



We begin to describe the class that will represent our surface:

private abstract class GroundShape extends Shape { 


For convenience, let's create an object for each vertex that stores its two-dimensional position in space and its UV coordinates.

protected class Vertex {
    float x, y; // position in world space
    float u, v; // texture coordinates
};

protected class MorphVertexBuffer extends VertexBuffer {
    public MorphVertexBuffer(int capacity) {
        // The geometry is built once, so a static draw buffer is enough.
        super(capacity, GL11.GL_STATIC_DRAW, true);
    }

    // Fills the buffer with the vertex positions.
    public void update(Vertex[] vertexes) {
        int j = 0;
        final float[] bufferData = new float[vertexes.length * 2];
        for (int i = 0; i < vertexes.length; ++i) {
            bufferData[j++] = vertexes[i].x;
            bufferData[j++] = vertexes[i].y;
        }
        final FastFloatBuffer buffer = this.getFloatBuffer();
        buffer.position(0);
        buffer.put(bufferData);
        buffer.position(0); // rewind so the buffer is read from the beginning :)
        super.setHardwareBufferNeedsUpdate();
    }
}


The code above also declares an inner class that represents a buffer of vertices which can be fed to the engine. We inherit from VertexBuffer to stay within the standard architecture, and add an update() method that fills the vertex buffer.

The next step is to create a type that describes a buffer with UV coordinate data and a texture mapping method.

protected class MorphTexture extends BufferObject {
    // The texture that will be mapped onto the surface.
    final ITexture mTexture;

    public MorphTexture(ITexture tex, int pCapacity) {
        super(pCapacity, GL11.GL_STATIC_DRAW, true);
        mTexture = tex;
    }

    // Fills the buffer with the UV coordinates of the vertices.
    public void ApplyUV(Vertex[] vertexes) {
        final float[] bufferData = new float[vertexes.length * 2];
        for (int i = 0, j = 0; i < vertexes.length; ++i) {
            bufferData[j++] = vertexes[i].u;
            bufferData[j++] = vertexes[i].v;
        }
        final FastFloatBuffer buffer = this.getFloatBuffer();
        buffer.position(0);
        buffer.put(bufferData);
        buffer.position(0); // rewind so the buffer is read from the beginning :)
        super.setHardwareBufferNeedsUpdate();
    }


Here we have again formed a buffer, which we can later feed to the engine.

Next comes a method that “applies” the texture to the object and points the UV coordinate pointer at the buffer we filled in ApplyUV():

    public void onApply(final GL10 pGL) {
        this.mTexture.bind(pGL); // essentially a wrapper around glBindTexture()
        if (GLHelper.EXTENSIONS_VERTEXBUFFEROBJECTS) {
            final GL11 gl11 = (GL11) pGL;
            selectOnHardware(gl11);
            GLHelper.texCoordZeroPointer(gl11);
        } else {
            GLHelper.texCoordPointer(pGL, getFloatBuffer());
        }
    }
}


Next, we declare fields for the vertex buffer and UV buffer objects described above.

MorphVertexBuffer m_Buffer;
MorphTexture m_TextureRegion;
int vertexesLimit; // how many vertices to draw.
protected BitmapTextureAtlas m_Texture; // the texture that will be stretched over the surface.


Let me remind you that the classes described above are inner classes of GroundShape, so we now continue with GroundShape itself, starting with its constructor. The constructor is trivial; the only thing of interest is that the texture to be applied to the surface is passed into it.

public GroundShape(BitmapTextureAtlas texture) {
    super(0, 0);
    m_Texture = texture;
}


Next, we describe the initialization function, which must be called in the child to initialize the vertex buffers and UV coordinates.

protected void Init() {
    Vertex[] vertexes = buildVertexBuffer(); // the descendant builds the triangle list.
    if (vertexes == null)
        return; // nothing to draw.
    vertexesLimit = vertexes.length;
    m_Buffer = new MorphVertexBuffer(vertexesLimit * 2);
    m_Buffer.update(vertexes);
    m_TextureRegion = new MorphTexture(m_Texture, vertexesLimit * 2);
    m_TextureRegion.ApplyUV(vertexes);
}


Since GroundShape is abstract, our descendants will have to override the buildVertexBuffer function, in which we must build the list of vertices (with their UV coordinates) and return it. Here it is:

  protected abstract Vertex[] buildVertexBuffer(); 


The next step is to override a few GroundShape methods to tell AndEngine what to draw for our surface and how.

@Override
protected void doDraw(final GL10 pGL, final Camera pCamera) {
    // Bind the texture and its UV coordinates.
    m_TextureRegion.onApply(pGL);
    // Let the base class do the actual drawing.
    super.doDraw(pGL, pCamera);
}

@Override
protected void onInitDraw(final GL10 pGL) {
    // Enable texturing and the texture coordinate array for the UV buffer.
    // GLHelper is an AndEngine utility class.
    super.onInitDraw(pGL);
    GLHelper.enableTextures(pGL);
    GLHelper.enableTexCoordArray(pGL);
}

@Override
protected void drawVertices(GL10 pGL, Camera arg1) {
    // Draw the surface as a plain triangle list.
    pGL.glDrawArrays(GL10.GL_TRIANGLES, 0, vertexesLimit);
}


While the UV coordinates we want to use are set in doDraw by calling onApply, to supply the vertices of the triangles themselves we do not need to call anything: we simply override getVertexBuffer and return our vertex buffer.

@Override
protected VertexBuffer getVertexBuffer() {
    return m_Buffer;
}


Below are the methods that are overridden with trivial defaults and have no meaning for us, but are a mandatory part of the inheritance.

@Override
public boolean collidesWith(IShape arg0) { return false; }

@Override
public float getBaseHeight() { return 0; }

@Override
public float getBaseWidth() { return 0; }

@Override
public float getHeight() { return 0; }

@Override
public float getWidth() { return 0; }

@Override
public boolean contains(float arg0, float arg1) { return false; }

@Override
protected boolean isCulled(Camera arg0) { return false; }

@Override
protected void onUpdateVertexBuffer() { }
}


Ok, we have sketched a class that tells AndEngine how to draw ANY “model” made of triangles and covered with a texture. Even though this is a 2D engine, it still works through OpenGL; sprites are simply drawn as two triangles.

By the way, note that in OpenGL ES on Android there is no GL_POLYGON, only triangles. The fastest way to draw them is GL_TRIANGLE_STRIP (you can read about it here - en.wikipedia.org/wiki/Triangle_strip ), but strips require the vertices in a specific order and bring their own headaches, which I did not want to deal with, so we will use GL_TRIANGLES (especially since later tests showed the performance gain was minimal). So here is what our surface should look like if you look at it “through” the triangles, compared to where we started:

[images: the surface rendered as a triangle wireframe, side by side with the original line “bridge”]
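
Just to make the difference between the two modes concrete, here is a small illustrative sketch that is not part of the project code; the coordinates are made-up numbers for a single quad.

// One quad drawn with GL_TRIANGLES: 6 vertices (x, y pairs),
// the two corners on the shared diagonal are listed twice.
float[] triangles = {
    0f, 0f,   1f, 0f,   0f, 1f,   // first triangle
    0f, 1f,   1f, 0f,   1f, 1f    // second triangle repeats two corners
};

// The same quad with GL_TRIANGLE_STRIP: only 4 vertices,
// but they must come in the zig-zag order the strip expects.
float[] strip = {
    0f, 0f,   1f, 0f,   0f, 1f,   1f, 1f
};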

Now we need to generate that triangle mesh from the list of lines that is passed to us. Let's create an object for this:

private class GroundSelf extends GroundShape {
    public GroundSelf(List<Section> sec, BitmapTextureAtlas texture) {
        super(texture);
        sections = sec; // the list of lines that make up the surface.
        Init();         // initialize the GroundShape buffers.
    }


And GroundShape::Init(), as we remember, calls buildVertexBuffer(), which every descendant must override. In this function we need to build all the vertices of every triangle and set their UV coordinates. Keep in mind that the texture is square while the ground is an arbitrary non-convex polygon; if we naively stretch the texture across all the triangles with 1:1 coordinates, it will be distorted beyond recognition. We need to be able to set scaling factors, and since the surface is much longer than it is tall, the U coordinates have to be scaled up by a larger coefficient.
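
To show how those factors play out, here is a tiny worked example with made-up numbers (the real maxU, maxV, hellY and section size are set in the code below); it only mirrors the arithmetic used in buildVertexBuffer().

// Hypothetical values, chosen just to make the arithmetic visible.
final float maxU = 4.0f;              // texture repeats 4 times along the section
final float maxV = 2.0f;              // and 2 times from the surface down to the bottom
final int   linesInSection = 20;

float stepU = maxU / linesInSection;  // 0.2 - U advances by this much per line

float hellY  = 800.0f;                // y of the world "bottom"
float startV = 300.0f;                // y of the first surface point
float valueV = hellY - startV;        // 500 - full V range in world units

float yTop = 250.0f;                  // y of some surface vertex
float vTop = maxV + ((startV - yTop) / valueV) * maxV; // 2.2 at that vertex
float vBottom = 0.0f;                 // V at the "bottom" is always 0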

I highly recommend that while you are setting up the texture mapping you use a picture with an obvious orientation - a compass of some kind - as the texture, so that you can easily verify the orientation of the texture coordinates.

In essence, in the buildVertexBuffer function you define all the triangles of your object and their UV coordinates:

@Override
protected Vertex[] buildVertexBuffer() {
    int vertexesCount = 0, i, j, k = 0;
    float hellY = 800.0f;        // the y coordinate of the "bottom" of the world.
    final float maxU = 4.0f;     // how many times the texture repeats along U
    final float maxV = 2.0f;     // and along V
    float stepU;                 // U step per line within a section

    // V is scaled from the first point of the first line down to hellY.
    float startV = sections.get(0).lines.get(0).line.getY1();
    float valueV = hellY - sections.get(0).lines.get(0).line.getY1();

    for (i = 0; i < sections.size(); ++i)
        vertexesCount += sections.get(i).lines.size() * 6;

    Vertex[] res = new Vertex[vertexesCount];
    Section tmpSection;
    Line tmpLine;

    for (i = 0; i < sections.size(); ++i) {
        tmpSection = sections.get(i);
        // Every line of the section becomes a quad stretching down to hellY,
        // that is two triangles, that is 6 vertices.
        for (j = 0; j < tmpSection.lines.size(); ++j) {
            tmpLine = tmpSection.lines.get(j).line;
            stepU = maxU / (float) tmpSection.lines.size();

            // First triangle: top-left, bottom-left, top-right.
            res[k] = new Vertex();
            res[k].x = tmpLine.getX1();
            res[k].y = tmpLine.getY1();
            res[k].u = (float) j * stepU;
            res[k++].v = maxV + ((startV - tmpLine.getY1()) / valueV) * maxV;

            res[k] = new Vertex();
            res[k].x = tmpLine.getX1();
            res[k].y = hellY;
            res[k].u = (float) j * stepU;
            res[k++].v = 0.0f;

            res[k] = new Vertex();
            res[k].x = tmpLine.getX2();
            res[k].y = tmpLine.getY2();
            res[k].u = (float) (j + 1) * stepU;
            res[k++].v = maxV + ((startV - tmpLine.getY2()) / valueV) * maxV;

            // Second triangle: top-right, bottom-left, bottom-right.
            res[k] = new Vertex();
            res[k].x = tmpLine.getX2();
            res[k].y = tmpLine.getY2();
            res[k].u = (float) (j + 1) * stepU;
            res[k++].v = maxV + ((startV - tmpLine.getY2()) / valueV) * maxV;

            res[k] = new Vertex();
            res[k].x = tmpLine.getX1();
            res[k].y = hellY;
            res[k].u = (float) j * stepU;
            res[k++].v = 0.0f;

            res[k] = new Vertex();
            res[k].x = tmpLine.getX2();
            res[k].y = hellY;
            res[k].u = (float) (j + 1) * stepU;
            res[k++].v = 0.0f;
        }
    }
    // All triangles are built, hand them back.
    return res;
}

List<Section> sections;
}


Surface created. Now we need to attach it to the world. This is done as usual in AndEngine:

// Create the surface.
grndSelf = new GroundSelf(sections, EvoGlobal.getTextureCache().get(EvoTextureCache.tex_ground).texture);
// Attach it to the AndEngine scene.
EvoGlobal.getWorld().getScene().attachChild(grndSelf);


Result:

[image: the resulting textured landscape]



I hope that whoever lands here from a Google search for a solution finds this useful.

Source: https://habr.com/ru/post/150042/


