
Just recently, my friends and I launched our first iOS project, a snake game called Shadow Snake. I wanted to try iPad development, so we decided to port an already-finished project made in Flash, especially since its mechanics mapped very naturally onto touch controls.
Of course, the first idea was to try the AIR SDK. The project started, but the dynamic arcade turned into a "turn-based" one: the FPS was terribly low, partly because almost all the graphics in the Flash version were vector-based, and mobile AIR did not handle vectors well. I don't remember exactly which AIR was available at the time (2.x or the first 3.x versions), but we experimented both with the vector graphics and with another, raster-based project made in Flixel. The results did not please us. Progress has since moved on, I see; Adobe has even released a new compiler. In any case, at the time we decided to use Unity 3D.
Unity 3D had already proven itself well in mobile games by then and offered a good visual environment for building game scenes. Thanks to this, we could cleanly separate programming, resource preparation, and level design, and distribute these tasks among different team members. Besides, in my day job I had been working with C# for a long time, so I felt at home in Unity.
The remaining problem was how to transfer graphics and animations (including composite ones) to the new project.
Exporting graphics from Flash
How do you open an SWF and convert its vector images to PNG or JPEG? Right: use Flash itself. And how do you save the results to disk? Right: use AIR.
Rasterization was implemented with BitmapData and PNGEncoder, and the result was saved to a file using FileStream.
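The combination of these three classes can be sketched roughly as follows. This is my own minimal reconstruction in ActionScript 3 (assuming the as3corelib PNGEncoder and AIR's filesystem API; the function name and parameters are illustrative), not the project's actual code:

```actionscript
import flash.display.BitmapData;
import flash.display.MovieClip;
import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.geom.Matrix;
import flash.utils.ByteArray;
import com.adobe.images.PNGEncoder; // from as3corelib

// Rasterize one frame of a clip and write it to disk (illustrative sketch).
function exportFrame(clip:MovieClip, frame:int,
                     width:int, height:int, scale:Number,
                     fileName:String):void
{
    clip.gotoAndStop(frame);

    // Draw the clip's current frame into an off-screen, transparent bitmap.
    var bmp:BitmapData = new BitmapData(width, height, true, 0x00000000);
    bmp.draw(clip, new Matrix(scale, 0, 0, scale));

    // Encode to PNG and save via AIR's FileStream.
    var png:ByteArray = PNGEncoder.encode(bmp);
    var stream:FileStream = new FileStream();
    stream.open(File.documentsDirectory.resolvePath(fileName), FileMode.WRITE);
    stream.writeBytes(png);
    stream.close();

    bmp.dispose();
}
```

In a real exporter this function would be called in a loop over all frames of every exported clip, packing the results into sprite-sheet textures.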
The process had to be built so that animations could be exported with minimal manual effort. In particular, all metadata (animation names, frame markup, frame scale, looping parameters) had to be editable in Flash itself.
Adobe Flash does not allow attaching arbitrary extra information to a MovieClip, so I had to introduce several conventions. For example, the names of clips to be exported began with the prefix "e_"; this prefix was cut off when the final files were produced. Frame labels written in a specific format were used to mark up animations and set their properties. The result is a fairly convenient syntax, and the structure of each exported MovieClip looked like this:
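The original illustration of this layout did not survive; as a rough, hypothetical reconstruction in the spirit of the description (the label syntax shown here is my own invention, not the author's exact format), an exported clip might be organized like this:

```
e_Snake                                  // "e_" prefix marks the clip for export
  frame 1:  label "walk"                 // start of the "walk" animation
  frame 12: label "attack loop=true"     // next animation, with a looping flag
  frame 30: label "die scale=0.5"        // another animation, with a frame scale
```

The exporter would strip the prefix (producing "Snake"), split the timeline at each label, and read the key/value pairs as per-animation settings.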
The output of the application was a set of sprite sheets plus their descriptions in XML format. Those who are curious can dig into the source. The sources are unpolished and undocumented, as is :) — right now there is no time to turn them into a "product", but they can be pulled apart for spare parts.
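To give an idea of what such a descriptor can contain, here is a hypothetical XML fragment; the real schema lives in the linked sources, and every element and attribute name below is my own guess:

```xml
<!-- Hypothetical sprite-sheet descriptor (illustrative only). -->
<spriteSheet texture="Snake.png">
  <animation name="walk" fps="30" loop="true">
    <frame x="0"  y="0" width="64" height="64"/>
    <frame x="64" y="0" width="64" height="64"/>
  </animation>
  <animation name="die" fps="30" loop="false">
    <frame x="0" y="64" width="64" height="64"/>
  </animation>
</spriteSheet>
```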
Separately, I will note that stepping through frames while rasterizing a MovieClip loaded from an external SWF requires "specific" techniques. This is due to the behavior of nested clips: they can appear for a few frames and then disappear as the animation plays. It is therefore better to create the object anew for each pass. To obtain the list of available clips in the loaded file, the SWFExplorer library was used.
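A hedged sketch of how SWFExplorer fits in (the API names here are from memory and should be checked against the library itself; `appDomain` stands for the ApplicationDomain of the Loader that loaded the SWF):

```actionscript
import flash.display.MovieClip;
import flash.system.ApplicationDomain;
import flash.utils.ByteArray;
import org.bytearray.explorer.SWFExplorer;
import org.bytearray.explorer.events.SWFExplorerEvent;

// List exported clip classes in a loaded SWF and instantiate them.
function processExportedClips(swfBytes:ByteArray,
                              appDomain:ApplicationDomain):void
{
    var explorer:SWFExplorer = new SWFExplorer();
    explorer.addEventListener(SWFExplorerEvent.COMPLETE,
        function(e:SWFExplorerEvent):void {
            for each (var name:String in explorer.getDefinitions()) {
                if (name.indexOf("e_") != 0) continue; // only marked clips

                var cls:Class = appDomain.getDefinition(name) as Class;
                // A FRESH instance per rasterization pass, because nested
                // clips may appear and disappear mid-animation.
                var clip:MovieClip = new cls() as MovieClip;
                // ... rasterize the frames of `clip` here ...
            }
        });
    explorer.load(swfBytes);
}
```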
Importing graphics into Unity 3D
Once the resources are converted into a form convenient for processing, the sprite sheets and their metadata must be presented in a format Unity understands.
We decided not to use ready-made 2D frameworks for Unity, since the goal was to test the technology, gain new experience, and better understand how games are made in this environment. So two custom components were devised: AnimatedSprite and SpriteSheet.
The first was responsible for actually rendering the sprite in the game space. It created the standard Unity components MeshFilter and MeshRenderer and filled them with data from a SpriteSheet. This made it possible to use tasty features such as dynamic batching.
The SpriteSheet, in turn, held a reference to the associated texture, the list of animations, frames, and other settings: all the information exported from the SWF. We decided to store it using Unity's own facilities so that parameters could be tweaked with the environment's tools. SpriteSheets were generated automatically as ready-made prefabs, which were then placed in the scene.
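The pair of components described above could look roughly like this. This is my own minimal C# reconstruction, not the project's actual code; field and method names are illustrative:

```csharp
using UnityEngine;

// Holds the data exported from the SWF: the atlas texture, per-frame
// UV rects, and (in the real project) animation names, fps, looping, etc.
public class SpriteSheet : MonoBehaviour
{
    public Texture2D texture;
    public Rect[] frames; // UV rect of each frame within the atlas
}

// Renders one sprite by driving the standard MeshFilter/MeshRenderer pair.
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class AnimatedSprite : MonoBehaviour
{
    public SpriteSheet sheet;

    void Awake()
    {
        // Sharing one material per atlas keeps sprites eligible
        // for Unity's dynamic batching.
        GetComponent<MeshFilter>().mesh = BuildQuad(sheet.frames[0]);
    }

    static Mesh BuildQuad(Rect uv)
    {
        var mesh = new Mesh();
        mesh.vertices = new[] {
            new Vector3(0, 0), new Vector3(1, 0),
            new Vector3(0, 1), new Vector3(1, 1) };
        mesh.uv = new[] {
            new Vector2(uv.xMin, uv.yMin), new Vector2(uv.xMax, uv.yMin),
            new Vector2(uv.xMin, uv.yMax), new Vector2(uv.xMax, uv.yMax) };
        mesh.triangles = new[] { 0, 2, 1, 2, 3, 1 };
        return mesh;
    }
}
```

Switching animation frames then amounts to rewriting the quad's UVs from the corresponding rect in `frames`.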
The output of the AIR converter consisted of PNG and XML files, so at the time it was most convenient to put the sprite-sheet creation code in a custom texture import post-processor (AssetPostprocessor). This code checked the location of the texture file and looked for an XML description next to it; if one was found, the handler generated (or, when the texture was re-imported, updated) the corresponding prefabs and put them into the appropriate folder.
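The skeleton of such a hook might look like this. This is editor-only code and a hedged sketch; the actual XML parsing and prefab generation lived in the project's converter:

```csharp
using System.IO;
using UnityEditor;
using UnityEngine;

// Runs automatically whenever a texture is (re)imported into the project.
public class SpriteSheetPostprocessor : AssetPostprocessor
{
    void OnPostprocessTexture(Texture2D texture)
    {
        // Look for a sibling XML descriptor produced by the AIR converter.
        string xmlPath = Path.ChangeExtension(assetPath, ".xml");
        if (!File.Exists(xmlPath))
            return;

        // Here: parse the XML and create or update the SpriteSheet prefab.
        // On re-import, the existing prefab is updated rather than recreated.
        Debug.Log("Generating sprite-sheet prefab for " + assetPath);
    }
}
```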
The first problems: the level with a bomb
We encountered the first problems when we ported the bomb, an object that destroys all the enemies on the screen. The explosion effect originally created a bright halo over the whole screen, so naturally, when exported, the texture size exceeded all reasonable limits. We had not grown up to Carmack-style megatextures yet :), and iOS drew a maximum of 2048x2048.
Similar problems began to appear with other objects: animations with various translucent effects (flashes, glows, halos) took up a huge amount of space. The frame size within one strip was fixed (and one strip could hold several animations), so a single large frame blew up the whole sheet, while most of the texture stayed empty.
The first difficulty was solved simply by removing the problematic effects from the animation and replacing them with Unity particle systems. This required writing extra logic, but there was no other way out. For the second case, we had to introduce the concept of a composite SpriteSheet: an object that behaves like a regular sprite sheet from the point of view of the in-game API, but actually contains many other sprite sheets with separate textures. When marking up a Flash clip, the labels indicated which composite SpriteSheet the clip belongs to. This way, we moved all the heavy animations into separate resources.
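The core of the composite idea is an index translation: one continuous frame index on the outside, several underlying textures on the inside. A minimal sketch of that translation (my own, with illustrative names; the real component would delegate to the child sheets themselves):

```csharp
using UnityEngine;

// Presents one continuous frame index across several underlying sheets,
// each with its own texture. Only the frame counts matter for the lookup.
public class CompositeSpriteSheet : MonoBehaviour
{
    public int[] frameCounts; // frames held by each child sheet, in order

    // Map a global frame index to (child sheet index, local frame index).
    public bool Resolve(int frame, out int partIndex, out int localFrame)
    {
        for (int i = 0; i < frameCounts.Length; i++)
        {
            if (frame < frameCounts[i])
            {
                partIndex = i;
                localFrame = frame;
                return true;
            }
            frame -= frameCounts[i];
        }
        partIndex = -1;
        localFrame = -1;
        return false; // frame index out of range
    }
}
```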
Conclusion
So we ended up with a process that, in a couple of clicks, turned Flash animations into assets ready for use. Of course, it could have been done better and more correctly; for example, at the time I did not know about the existence of ScriptableObject, which is better suited for these tasks.
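For comparison, the same sheet data stored as a ScriptableObject asset instead of a MonoBehaviour on a prefab might look like this (a sketch with illustrative field names):

```csharp
using UnityEngine;

// Sprite-sheet data as a standalone asset rather than a prefab component.
public class SpriteSheetAsset : ScriptableObject
{
    public Texture2D texture;
    public Rect[] frames;
    public string[] animationNames;
}
```

ScriptableObject assets serialize the data once per sheet, instead of duplicating it in every prefab instance, which is why it suits this kind of shared, import-generated data better.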
I deliberately did not paste walls of code here; anyone who needs to solve a similar problem will easily find all the necessary documentation (I have tried to link to the materials used). We solved the task we set ourselves: we can easily use vector animations created in Flash, prepare textures for different game resolutions, and get many other bonuses. Of course, Unity 4 added a fly in the ointment, but those problems have also been resolved.
If anyone has comments or questions, ask in the comments.