
Exploring the new UI in Unity 4.6 beta

The other day I started studying the new UI in Unity 4.6 beta. I had, of course, watched all the video tutorials on the official website, but they say nothing about how the new UI actually works. I could not find any documentation either, so naturally I wanted to figure out for myself how it all fits together. Briefly, here is what I have understood so far:

Based on plain trial and error, the central object without which you cannot build a UI is the Canvas. It is responsible for drawing the interface elements and forwarding events to them. A Canvas has three UI rendering options: ScreenSpace - Overlay, ScreenSpace - Camera and WorldSpace.
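
As a small illustration (my own example, not from the original article), here is how a Canvas can be created from code and switched between these three modes; Canvas.renderMode, the RenderMode enum and Canvas.worldCamera are the standard Unity API, the class name is mine:

using UnityEngine;
using UnityEngine.UI;

public class CanvasSetup : MonoBehaviour
{
    void Start()
    {
        // Create a Canvas from script and choose one of the three render modes.
        var go = new GameObject("MyCanvas");
        var canvas = go.AddComponent<Canvas>();

        // ScreenSpace - Overlay: drawn directly on top of the screen.
        canvas.renderMode = RenderMode.ScreenSpaceOverlay;

        // The other two modes need a camera assigned:
        // canvas.renderMode = RenderMode.ScreenSpaceCamera;
        // canvas.renderMode = RenderMode.WorldSpace;
        // canvas.worldCamera = Camera.main;

        // A GraphicRaycaster is normally added alongside the Canvas so the UI can receive clicks.
        go.AddComponent<GraphicRaycaster>();
    }
}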

ScreenSpace - Overlay and ScreenSpace - Camera


As the names suggest, these modes work in screen coordinates. This lets you build a pixel-perfect interface, but if you need the interface to look the same at every resolution, or you want to create a 3D UI, these options will not suit you.

WorldSpace


This mode draws the elements in world space, so if no camera is looking at them, you simply will not see them. This mode interested me because it lets you create a UI that looks the same regardless of screen resolution. The only small problem is that, unlike the previous modes, the Canvas does not react to a change in the screen aspect ratio. But that is solved by a simple script that adjusts the width/height of the canvas at startup.
using UnityEngine;
using System.Collections;

public class CanvasHelper : MonoBehaviour
{
    // Reference aspect ratios the canvas was designed for (landscape and portrait).
    private const float ETHALON_2x3_LANDSCAPE = 1.3333333333333f;
    private const float ETHALON_2x3_PORTRAIT = 0.666666666666f;

    public Canvas canvas;

    // Use this for initialization
    void Start()
    {
        var ethalon = Screen.orientation == ScreenOrientation.Landscape
            ? ETHALON_2x3_LANDSCAPE
            : ETHALON_2x3_PORTRAIT;

        var cam = canvas.worldCamera;
        var rectTransform = canvas.transform as RectTransform;
        var delta = rectTransform.sizeDelta;

        // Stretch the canvas along the long axis so the real aspect ratio matches the reference one.
        if (Screen.orientation == ScreenOrientation.Landscape)
            delta.x *= cam.aspect / ethalon;
        else
            delta.y *= cam.aspect / ethalon;

        rectTransform.sizeDelta = delta;
    }
}
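
To use it, attach the script to any object in the scene, drag the world-space Canvas into the canvas field and make sure the Canvas has its rendering camera assigned; the size is corrected once in Start.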


I will not describe the individual elements; you can get to know them by the same trial-and-error method anyway. But the message system deserves a closer look.

For your UI to work, there must be an EventSystem in the scene. EventSystem is a component that handles user events and passes them to the UI. The actual event handling happens in the InputModule components. I have come across two of them: StandaloneInputModule for PC, consoles and the web, and TouchInputModule for phones and tablets. Judging by their settings, they can be at least partially interchangeable.
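
A minimal sketch (my own, not from the article) of making sure such an object exists; EventSystem, StandaloneInputModule and TouchInputModule all live in UnityEngine.EventSystems:

using UnityEngine;
using UnityEngine.EventSystems;

public class EventSystemBootstrap : MonoBehaviour
{
    void Awake()
    {
        // If the scene was built without an EventSystem, create one with both input modules.
        if (EventSystem.current == null)
        {
            var go = new GameObject("EventSystem");
            go.AddComponent<EventSystem>();
            go.AddComponent<StandaloneInputModule>();   // mouse / keyboard / gamepad
            go.AddComponent<TouchInputModule>();        // touch screens
        }
    }
}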

The InputModule catches user events and passes them to the EventSystem, which in turn forwards everything to the UI. But how does it decide which element was actually hit? That is the job of the GraphicRaycaster.

GraphicRaycaster


This component sits on the Canvas and, in response to a mouse click or touch, determines which object the event should be sent to. In total the new UI has three kinds of raycasters: one for 2D physics, one for 3D physics and one for graphic elements. By default the last one is added to the Canvas game object.
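
For the physics variants, the usual pattern (again my example, not from the article) is to put the raycaster on the camera so that scene objects with colliders can receive the same pointer events:

using UnityEngine;
using UnityEngine.EventSystems;

[RequireComponent(typeof(Camera))]
public class WorldRaycastSetup : MonoBehaviour
{
    void Awake()
    {
        // PhysicsRaycaster lets 3D objects with colliders receive UI-style pointer events;
        // Physics2DRaycaster does the same for 2D colliders.
        if (GetComponent<PhysicsRaycaster>() == null)
            gameObject.AddComponent<PhysicsRaycaster>();
    }
}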

This raycaster has one big drawback: for an object to receive an event, it must have a graphic component. In other words, if you want a transparent area of the screen that triggers some action when pressed, you have to add a graphic component and make it fully transparent. In my opinion this is very inconvenient; it is good that the system can be extended with your own raycasters.
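
Here is a rough sketch of that workaround (the class name and the onClick field are mine, not from the article): a fully transparent Image keeps the GraphicRaycaster happy, and a handler implementing IPointerClickHandler reacts to the click.

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;

[RequireComponent(typeof(Image))]
public class TransparentClickArea : MonoBehaviour, IPointerClickHandler
{
    // Listeners can be wired up in the inspector.
    public UnityEngine.Events.UnityEvent onClick;

    void Awake()
    {
        // The Image stays in place so the GraphicRaycaster can hit it, but it draws nothing visible.
        GetComponent<Image>().color = new Color(0f, 0f, 0f, 0f);
    }

    public void OnPointerClick(PointerEventData eventData)
    {
        if (onClick != null)
            onClick.Invoke();
    }
}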

A bit about the code


The UI system can be divided into three parts: the event generation system (UnityEngine.Events), which is new in Unity 4.6 and affects not only the UI but also the physics and rendering systems; the event capture system (UnityEngine.EventSystems); and the UI logic itself (UnityEngine.UI). The first of these is part of the main engine library, while the other two belong to the UI library.

Events


This namespace contains the classes that describe the basic structure of an event. There are two key types: UnityAction (the delegate representing a single callback) and UnityEvent (the event itself, which can hold several listeners). They can take up to four parameters of the following types: EventDefined, Void, Object, Int, Float, String, Bool (based on the description in the PersistentListenerMode enum).
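
A small sketch of how these two types fit together (my own example; the class and field names are not from the article):

using UnityEngine;
using UnityEngine.Events;

public class EventsDemo : MonoBehaviour
{
    // A UnityEvent is serializable, so listeners can also be assigned in the inspector.
    public UnityEvent onSomethingHappened = new UnityEvent();

    void Start()
    {
        // UnityAction is the delegate type used when adding listeners from code.
        UnityAction handler = () => Debug.Log("Something happened");
        onSomethingHappened.AddListener(handler);

        onSomethingHappened.Invoke();
    }
}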

EventSystems


This namespace contains the classes and interfaces that provide event handling for UI elements, with a separate interface for each kind of event. The physics raycasters also live here, along with UIBehaviour, the base class describing the behaviour of UI elements, and the input modules.
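
As an illustration of the per-event interfaces (my own example), here is a component that reacts to the pointer entering and leaving it. For it to fire, the object still needs a graphic, or a collider plus a physics raycaster, as described above.

using UnityEngine;
using UnityEngine.EventSystems;

public class HoverLogger : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    // Each interface corresponds to one event that the EventSystem can deliver to this object.
    public void OnPointerEnter(PointerEventData eventData)
    {
        Debug.Log("Pointer entered " + name);
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        Debug.Log("Pointer left " + name);
    }
}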

UI


This namespace holds the classes for the UI elements themselves, including GraphicRaycaster, as well as a number of event-related interfaces. I have only just started exploring it; this is where the key to writing your own UI elements lies.

I will write the next part once I figure out how to create my own controls. Thanks for reading; if you have similar experience, I would be glad to read about it in the comments.

Source: https://habr.com/ru/post/236971/

