
Nuances of developing a plugin for Unity

I recently had to write plugins for Unity. I had no prior experience, and I had been using the environment for only 2-3 months. During development of the plugin a lot of interesting points accumulated that are poorly covered on the Internet. I want to describe these points in detail for the pioneers, so that they do not step on the same rakes I stepped on many, many times.

This article should also be useful for experienced users: it covers useful tooling and the nuances of plugin development for OSX, Windows, iOS and Android.

Video playback in Unity has long been a weak spot. The built-in tools are very limited, and on mobile platforms they can play video only in full-screen mode, which is hardly acceptable for game development. At first we used third-party plugins, but they either lacked the necessary functionality or had bugs whose fixes took a long time to arrive (if they arrived at all). So we decided to write our own video decoder for Unity, with blackjack and... well, with features.

I will not publish the plugin itself or its code (sorry, trade secret), but I will go over the general principles. For the video decoder I took the vp8 and vp9 codecs, which decode the open, royalty-free WebM format. After decoding a video frame we receive data in the YUV color model, and we write each component into a separate texture. Essentially, this is where the plugin's work ends: in Unity itself a shader converts YUV into the RGB color model, which we then apply to the object.
You may ask: why a shader? Good question. At first I tried converting the color model in software on the CPU. For desktops this is acceptable and the performance hit is not particularly noticeable, but on mobile platforms the picture is radically different. On an iPad 2 in a working scene the software converter gave 8-12 FPS; with the conversion done in the shader we got 25-30 FPS, which is already a playable frame rate.
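The article does not show the shader itself, but the math it performs is standard. Below is a minimal CPU-side sketch of the same per-pixel YUV→RGB conversion, assuming full-range BT.601 coefficients (the exact coefficients depend on the video's color matrix, so treat them as an illustration, not the article's actual code):

```cpp
#include <algorithm>
#include <cstdint>

struct RGB { uint8_t r, g, b; };

static uint8_t ClampToByte(float v) {
    return static_cast<uint8_t>(std::min(255.0f, std::max(0.0f, v)));
}

// Convert one YUV pixel to RGB (full-range BT.601 coefficients).
// This is the CPU equivalent of what the fragment shader does with
// the three single-channel textures the plugin fills.
RGB YuvToRgb(uint8_t y, uint8_t u, uint8_t v) {
    float yf = static_cast<float>(y);
    float uf = static_cast<float>(u) - 128.0f;  // U and V are centered at 128
    float vf = static_cast<float>(v) - 128.0f;
    RGB out;
    out.r = ClampToByte(yf + 1.402f    * vf);
    out.g = ClampToByte(yf - 0.344136f * uf - 0.714136f * vf);
    out.b = ClampToByte(yf + 1.772f    * uf);
    return out;
}
```

Doing this per pixel on the CPU is exactly the work that killed the frame rate on the iPad 2; the GPU does it essentially for free during texture sampling.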

Let's move on to the nuances of plugin development.

General points


The documentation on writing plugins for Unity is rather scarce; everything is described in general terms (for iOS I discovered many of the nuances on my own). Link to the docs.

What is pleasing is that there are examples for the current IDEs and platforms (except iOS: apparently Apple did not pay the developers extra). The examples themselves are updated with each Unity release, but there is a fly in the ointment: the API changes often, and interfaces, defines and constants get renamed. For example, I took a fresh update and used a new header from it, then spent a long time figuring out why the plugin did not work on mobile platforms, until I noticed:

SUPPORT_OPENGLES    // old name of the define
SUPPORT_OPENGL_ES   // new name of the define

Probably the single most important point for all platforms, which must be taken into account immediately, is the rendering cycle. Rendering in Unity can run in a separate thread, which means the main thread must not work with textures. To handle this, the script API has the function GL.IssuePluginEvent, which at the right moment invokes a callback in which all work with rendering resources should be performed. For working with textures (creation, update, deletion) I recommend a coroutine that triggers the callback at the end of each frame:

 private IEnumerator MyCoroutine()
 {
     while (true)
     {
         yield return new WaitForEndOfFrame();
         GL.IssuePluginEvent(MyPlugin.GetRenderEventFunc(), magicNumber);
     }
 }

Interestingly, if you try to work with textures from the main thread, the game crashes only on the DX9 API, and even then not always.
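For context, here is a sketch of what the native side of that callback can look like. The function-pointer signature matches Unity's `UnityRenderingEvent` convention but is redeclared here so the sketch compiles standalone; the event id and function names are illustrative, not from the article:

```cpp
// Native side of GL.IssuePluginEvent: Unity calls the function pointer
// returned by GetRenderEventFunc() on the render thread, passing the
// event id ("magic number") supplied by the script.
typedef void (*UnityRenderingEvent)(int eventId);

static const int kUpdateTexturesEvent = 1;  // hypothetical event id
static int g_lastEvent = -1;

// All texture work (create/update/delete) belongs here, because this
// runs on the thread that owns the graphics context.
static void OnRenderEvent(int eventId) {
    g_lastEvent = eventId;
    if (eventId == kUpdateTexturesEvent) {
        // upload the decoded YUV planes into the textures here
    }
}

// Exported to C# and handed to GL.IssuePluginEvent.
extern "C" UnityRenderingEvent GetRenderEventFunc() {
    return OnRenderEvent;
}
```

The coroutine above then guarantees this code runs exactly once per frame, after rendering has finished.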

OSX


Probably the simplest and most hassle-free platform. The plugin builds quickly, and debugging is easy: in Xcode, do Debug → Attach to Process → Unity. You can set breakpoints, inspect the call stack on a crash, and so on.

There was only one interesting point. Recently Unity was updated to version 5.3.2. In the editor the main graphics API became OpenGL 4; the older version used OpenGL 2.1, which is now deprecated. In the updated version the editor simply stopped playing the video. A quick debug session showed that the call glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height, GL_ALPHA, GL_UNSIGNED_BYTE, buffer) returns the error GL_INVALID_ENUM. Judging by the OpenGL documentation, GL_RED has replaced the GL_ALPHA pixel format, but GL_RED does not exist in OpenGL 2.1... I had to fall back on a crutch:

 const GLubyte* strVersion = glGetString(GL_VERSION);
 m_oglVersion = (int)(strVersion[0] - '0');
 if (m_oglVersion >= 3)
     pixelFormat = GL_RED;
 else
     pixelFormat = GL_ALPHA;

And the most mysterious thing is that in the final build, compiled for OpenGL 4, everything works fine with the GL_ALPHA flag. I filed this nuance under "magic", but still did it properly.

The Unity Editor can be run on an older version of OpenGL. To do this, run in the console:

/Applications/Unity/Unity.app/Contents/MacOS/Unity -force-opengl

Among useful utilities I want to mention OpenGL Profiler, which is part of Graphics Tools (the tools can be downloaded from the Apple site in the Developer section). The profiler lets you fully track OpenGL state in the application: you can catch errors, inspect the textures in video memory (size, type, format) along with shaders and buffers, and set breakpoints on various events. A very useful tool for graphics work. Screenshot:



That is how I found out that the Unity Editor uses 1326 textures.

Windows


On this platform the OpenGL version of the plugin also builds without any problems, but I will dwell on DirectX 9 in more detail.

1. DirectX 9 has a feature known as the lost device; OpenGL and DirectX (starting with version 10) are free of this drawback. In essence, you lose control over graphics resources (textures, shaders, meshes in video memory, etc.). It turns out you have to handle this situation and, when it happens, load or create all the textures again. From what I have seen, many plugins do exactly that. I managed to cheat a little: I create the textures from Unity scripts and then pass their pointers to the plugin. This way I leave all resource management to Unity, which handles the device-loss situation excellently.

 MyTexture = new Texture2D(w, h, TextureFormat.Alpha8, false);
 MyPlugin.SetTexture(myVideo, MyTexture.GetNativeTexturePtr());

2. When everything seemed ready, an unexpected problem emerged: sometimes, and only on some videos, the picture was displayed with an offset, as shown in the screenshot:



Judging by the look of the image, the error could be in the algorithm copying data into the texture, in the texture coordinates, or related to texture wrapping. The documentation suggested that DirectX, as an optimization, may align texture rows by adding extra bytes. This information is stored in the structure:

 struct D3DLOCKED_RECT {
     INT   Pitch;
     void *pBits;
 };

Pitch is the number of bytes in one row of the texture, including alignment.
After a small tweak to the copying algorithm (the added pixels are filled with zeros) I got the desired result:

 for (int i = 0; i < height; ++i) {
     memcpy(pbyDst, pbySrc, sizeof(unsigned char) * width);
     pbyDst += locked_rect.Pitch;
     pbySrc += width;
 }
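The same copy can be exercised without DirectX by simulating a pitch-padded destination buffer. This is a standalone re-creation of the fix with illustrative names, not the plugin's actual code:

```cpp
#include <cstring>

// Copy a tightly packed 8-bit image into a destination whose rows are
// padded to `pitch` bytes, as D3DLOCKED_RECT::Pitch describes.
void CopyWithPitch(unsigned char* dst, int pitch,
                   const unsigned char* src, int width, int height) {
    for (int i = 0; i < height; ++i) {
        std::memcpy(dst, src, width);
        dst += pitch;   // skip the alignment padding at the end of each row
        src += width;   // source rows are tightly packed
    }
}
```

If you instead advance both pointers by `width` (the naive copy), every row after the first lands a few bytes early, producing exactly the kind of diagonal offset shown in the screenshot.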

To debug OpenGL on Windows I used the gDEBugger utility, which is similar in functionality to OpenGL Profiler on OSX:



Unfortunately, I did not find a similar utility for DX9. Having such a tool would have helped in finding the error with copying data into the texture.

iOS


There was no example project for this platform among the samples, and the documentation contains little useful information, mostly about accessing functions in the plugin.

I will focus on the important aspects:

1. In Xcode, create an ordinary iOS project of the Static Library type, link the OpenGL frameworks, and you can build the plugin.

2. The name of the final plugin file does not matter: Unity imports functions from all plugins located in the iOS folder:

 [DllImport("__Internal")] 

3. An important point: if another plugin has a function with the same name, the build will fail — the linker will complain about a duplicate implementation. Advice: name your functions so that no one else would think of the same name.

4. UnityPluginLoad(IUnityInterfaces* unityInterfaces), which should be called when the plugin is loaded, is not called! To find out when the plugin has started and get information about the current render device, you need to create your own controller inherited from UnityAppController and register in it the callbacks for plugin startup and for the render event. The resulting file should be placed in the folder with the iOS plugins. An example controller implementation that registers the functions:

 #import <UIKit/UIKit.h>
 #import "UnityAppController.h"

 extern "C" void MyPluginSetGraphicsDevice(void* device, int deviceType, int eventType);
 extern "C" void MyPluginRenderEvent(int marker);

 @interface MyPluginController : UnityAppController
 {
 }
 - (void)shouldAttachRenderDelegate;
 @end

 @implementation MyPluginController

 - (void)shouldAttachRenderDelegate
 {
     UnityRegisterRenderingPlugin(&MyPluginSetGraphicsDevice, &MyPluginRenderEvent);
 }

 @end

 IMPL_APP_CONTROLLER_SUBCLASS(MyPluginController)

5. If the plugin is built for several architectures, for convenience they can be combined into one static library:

lipo -arch armv7 build/libPlugin_armv7.a \
     -arch i386 build/libPlugin_i386.a \
     -create -output build/libPlugin.a

6. During testing I found out that negative texture coordinates do not make it from the vertex shader to the pixel shader — zeros always arrive. By default, textures are created with the CLAMP_TO_EDGE address mode, and in this case OpenGL ES clamps everything to the range [0..1]. On desktop platforms this is not observed.
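A simplified CPU-side model of that behavior (ignoring the half-texel inset real hardware applies) makes it obvious why a negative coordinate arrives as zero:

```cpp
#include <algorithm>

// What the CLAMP_TO_EDGE address mode effectively does to a texture
// coordinate: anything outside [0, 1] is pinned to the nearest edge,
// so a negative value written in the vertex shader reaches the
// fragment stage as 0. Simplified illustration, not driver code.
float ClampToEdge(float texCoord) {
    return std::min(1.0f, std::max(0.0f, texCoord));
}
```

If your shader trick relies on out-of-range coordinates, pass them through a separate varying or switch the sampler to a repeat/mirror mode instead.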

7. I noticed a serious bug: if you compile a project for iOS with Script Debugging enabled, then when the game crashes, Xcode crashes as well. As a result you get neither logs nor a call stack...

Debugging a plugin on iOS is a pleasure: in Xcode there is always a call stack on a crash, the console shows logs from both the scripts and the plugin, and if you add the plugin's .cpp files to the project, you can set breakpoints and use the full power of the lldb debugger! With scripts everything is much worse, so logging to the rescue.

Android


Building for Android requires the most tools:

- a normal IDE for editing C++ code. I used Xcode — in general, building for Android is somehow easier on a Mac;
- the NDK, to build the C++ code into a static library;
- Android Studio with all its dependencies such as Java, etc. The studio is needed for convenient logging of what is happening in the application.

Let's walk through interesting points:

1. The debugging situation for plugins on Android is rather sad, so I recommend thinking about writing logs to a file right away. You can, of course, dig in and set up remote debugging, but I did not have time for that and took the simpler route of viewing logs through Android Studio. For this, android/log.h provides the function __android_log_vprint, which works like printf. For convenience I wrapped it in a cross-platform way:

 static void DebugLog(const char* fmt, ...)
 {
     va_list argList;
     va_start(argList, fmt);
 #if UNITY_ANDROID
     __android_log_vprint(ANDROID_LOG_INFO, "MyPluginDebugLog", fmt, argList);
 #else
     vprintf(fmt, argList);
 #endif
     va_end(argList);
 }

I also advise you not to neglect asserts: when one fires, Android Studio lets you view the full call stack.

2. This platform has a plugin-naming peculiarity: the library must be called libMyPluginName.so — the lib prefix is required (more details can be found in the Unity documentation).

3. In an Android application all resources are stored in one bundle, which is a jar or zip file, so we cannot simply open a stream and start reading data as on other platforms. In addition to the video path we need Application.dataPath, which contains the path to the Android apk; only through it can we get to and open the required asset. From it we obtain the length of the file and its offset from the beginning of the bundle:

 var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
 var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity");
 var assetMng = activity.Call<AndroidJavaObject>("getAssets");
 var assetDesc = assetMng.Call<AndroidJavaObject>("openFd", myVideoPath);
 var offset = assetDesc.Call<long>("getStartOffset");
 var length = assetDesc.Call<long>("getLength");

Open a file stream on Application.dataPath with standard tools (fopen or whatever you prefer) and start reading the file from offset — that is our video. The length is needed to know where the video file ends and to stop reading.
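On the native side this reading pattern can be sketched as follows; the function name and reduced error handling are illustrative, not the plugin's actual code:

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Read `length` bytes starting at `offset` from a bundle file --
// the same pattern used to pull a video asset out of the apk,
// with offset/length obtained from the AssetFileDescriptor.
bool ReadAssetAt(const std::string& bundlePath, long offset, long length,
                 std::vector<unsigned char>* out) {
    FILE* f = std::fopen(bundlePath.c_str(), "rb");
    if (!f) return false;
    if (std::fseek(f, offset, SEEK_SET) != 0) {
        std::fclose(f);
        return false;
    }
    out->resize(static_cast<size_t>(length));
    size_t read = std::fread(out->data(), 1, out->size(), f);
    std::fclose(f);
    // Stop exactly at `length` bytes: past that point lie other assets.
    return read == out->size();
}
```

A decoder would of course read incrementally rather than slurp the whole file, but the fopen/fseek/bounded-fread skeleton is the same.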

4. I found a bug:

 s_DeviceType = s_Graphics->GetRenderer(); 

s_DeviceType always contains kUnityGfxRendererNull. Judging by the forums, this is a Unity bug. I wrapped the Android part of the code in a define where I set the value by default:

 s_DeviceType = kUnityGfxRendererOpenGLES 

When developing for Android, be prepared to constantly dig in the console and rebuild regularly. If you configure Android.mk and Application.mk correctly from the start, there should be no problems with the build.

Well, that is about all. I tried to dwell on all the important points that were not obvious at the beginning. Armed with this knowledge, you can design a proper plugin architecture in advance and will not have to rewrite the code several times.

In conclusion


By my preliminary estimates this work should have taken 2-3 weeks, but it took 2 months. Most of the time was spent figuring out the points described above. The most tedious and longest stage was Android: rebuilding the static libraries and the project took about 15 minutes, and debugging was done by adding new logs. So stock up on coffee and be patient. And do not forget about the frequent crashes and hangs of Unity itself.

I hope this material proves useful and helps save you valuable time. Criticism and questions are welcome!

Thanks for attention.

Source: https://habr.com/ru/post/277605/
