
How a cloud gaming platform works for B2B and B2C clients: solutions for a great picture and the fight for the last mile

Cloud gaming is called one of the key technologies to watch right now. In six years the market is expected to grow tenfold: from $45 million in 2018 to $450 million in 2024. The tech giants have already rushed to develop the niche: Google and Nvidia have launched beta versions of their cloud gaming services, while Microsoft, EA, Ubisoft, Amazon and Verizon are getting ready to enter the scene.

For gamers this means they will soon be able to stop spending money on hardware upgrades and run demanding games on weak computers. But is it good for the rest of the ecosystem? Below we explain why cloud gaming will increase their earnings, and how we built a technology that makes it easy to enter this promising market.



Publishers, developers, TV manufacturers and telecom operators: why they all need cloud gaming


Game publishers and developers want to deliver their product to as many players as possible, as quickly as possible. Right now, according to our data, 70% of potential buyers never reach the game: they give up waiting for the client to download and for the multi-gigabyte installer. At the same time, 60% of users, judging by their video cards, simply cannot run AAA-level games on their computers at acceptable quality. Cloud gaming can solve this problem: far from cutting publishers' and developers' earnings, it will help them grow a paying audience.
TV and set-top box manufacturers are also eyeing cloud gaming. In the era of smart homes and voice assistants they have to compete ever harder for the user's attention, and gaming functionality is a key way to win it. With integrated cloud gaming, their customers will be able to run modern games directly on the TV, paying the manufacturer for the service.



Another potentially active participant in the ecosystem is telecom operators. Their way to increase revenue is to provide value-added services, and gaming is one such service that operators are already actively rolling out: Rostelecom has launched its "Game" tariff, and Akado sells access to our Playkey service. This is not limited to broadband operators. With the spread of 5G, mobile operators will also be able to make cloud gaming an additional source of income.

Despite the bright prospects, entering the market is not easy. None of the existing services, including the tech giants' products, has yet fully overcome the "last mile" problem: because of imperfections in the network inside the house or apartment, the user's internet speed is not enough for cloud gaming to work correctly.


See how the WiFi signal fades as it spreads from the router through the apartment.

Players who have been on the market for a long time and have deep resources are gradually chipping away at this problem. But starting your own cloud gaming service from scratch in 2019 means spending a lot of money and time, with a real risk of never building an effective solution. To help all ecosystem participants grow in this rapidly expanding market, we developed a technology that lets you launch a cloud gaming service quickly and without high costs.

How we built a technology that makes it easy to launch your own cloud gaming service


Playkey started developing its cloud gaming technology back in 2012. The commercial launch took place in 2014, and by 2016, 2.5 million players had used the service at least once. Throughout development we saw interest not only from gamers but also from TV set-top box manufacturers and telecom operators. We even ran several pilot projects with NetByNet and Er-Telecom. In 2018 we decided that our product might have a B2B future.

Developing a custom cloud gaming integration for each company, as we did in the pilot projects, does not scale. Each such implementation took three to six months. Why? Everyone has different hardware and operating systems: one partner needs cloud gaming on an Android set-top box, another wants an iFrame in the web interface of a personal account for streaming to computers. On top of that, everyone has their own design, billing (a brave new world of its own!) and other peculiarities. It became clear that we either had to grow the development team tenfold or build a universal, boxed B2B solution.

In March 2019 we launched RemoteClick. It is software that companies can install on their servers to get a working cloud gaming service. What does it look like for the user? On the familiar website he sees a button that launches the game in the cloud. On click, the game starts on the company's server, and the user sees the stream and can play remotely. This is how it might look in popular digital game stores.





An active fight for quality. And a passive one too


Now let's look at how RemoteClick copes with the numerous technical barriers. First-wave cloud gaming (OnLive, for example) was killed by the poor quality of users' internet connections. Back in 2010, the average connection speed in the USA was only 4.7 Mbit/s. By 2017 it had grown to 18.7 Mbit/s, and soon 5G will be everywhere and a new era will begin. Yet even though the infrastructure is broadly ready for cloud gaming, the "last mile" problem mentioned above remains.

One side of it, which we call objective, is that the user really does have network problems. For example, the operator does not deliver the advertised maximum speed. Or the user is on 2.4 GHz WiFi next to a noisy microwave and a wireless mouse.

The other side, which we call subjective, is that the user does not even suspect he has network problems (he doesn't know that he doesn't know!). At best, he is sure that since the operator sold him a 100 Mbit/s plan, his connection runs at 100 Mbit/s. At worst, he has no idea what a router is and divides the internet into "blue" and "color". A real case from our support team.


The "blue" and "color" internet.

But both sides of the "last mile" problem are solvable. In RemoteClick we use active and passive mechanisms for this. Below is a detailed account of how they deal with the obstacles.

Active mechanisms


1. Efficient error-correcting coding of the transmitted data, aka redundancy (FEC, Forward Error Correction)

When transmitting video data from the server to the client, we use error-correcting coding. It lets us restore the original data even when part of it is lost to network problems. What makes our solution effective?

  1. Speed. Encoding and decoding are very fast: even on "weak" computers the operation takes no more than 1 ms per 0.5 MB of data. So coding adds almost no delay when playing through the cloud, and it is hard to overstate how much that matters.
  2. Maximum recovery potential, meaning the ratio of recoverable loss to added redundancy. In our case the ratio is 1. Suppose you need to transfer 1 MB of video. If we add 300 KB of extra data during encoding (this is the redundancy), then to restore the original megabyte the decoder needs only any 1 MB out of the 1.3 MB the server sent. In other words, we can lose 300 KB and still recover the original data: 300/300 = 1, the maximum possible efficiency.
  3. Flexible redundancy settings. We can configure a separate redundancy level for every video frame sent over the network: for example, when we notice network problems, we can raise or lower the redundancy.
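The simplest way to get a feel for FEC is XOR parity: append one redundant packet equal to the XOR of the data packets, and any single lost packet can be rebuilt from the survivors. This is a toy sketch of ours, not Playkey's code; production systems use stronger codes (Reed-Solomon, fountain codes) that tolerate multiple losses, but the one-to-one ratio of recoverable loss to redundancy is the same idea.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode_with_parity(packets):
    """Append one parity packet: the XOR of all equal-length data packets."""
    parity = packets[0]
    for p in packets[1:]:
        parity = xor_bytes(parity, p)
    return packets + [parity]

def recover(received):
    """received holds the data+parity packets with exactly one lost (None).
    XOR-ing all surviving packets reconstructs the missing one."""
    lost = received.index(None)
    acc = None
    for i, p in enumerate(received):
        if i != lost:
            acc = p if acc is None else xor_bytes(acc, p)
    restored = list(received)
    restored[lost] = acc
    return restored[:-1]  # drop parity, return the original data packets

# Lose any one of the four transmitted packets and still recover the data:
data = [bytes([i]) * 1024 for i in range(3)]
sent = encode_with_parity(data)
sent[1] = None                     # packet lost in transit
assert recover(sent) == data
```

Note that the lost packet may just as well be the parity packet itself: recovery still returns the original data.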


Playing Doom through Playkey on a Core i3, 4 GB RAM, MSI GeForce GTX 750.

2. Data retransmission

An alternative way to deal with losses is to request the data again. For example, if the server and the user are both in Moscow, the transmission delay will not exceed 5 ms. At that value the client application has time to request and receive the lost piece of data from the server, transparently to the user. Our system decides on its own when to apply redundancy and when to retransmit.
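A possible decision rule (our sketch; the actual heuristics are not published) is to retransmit when a round trip fits comfortably inside one frame interval, and to pre-pay with FEC redundancy otherwise:

```python
def choose_loss_strategy(rtt_ms: float, loss_rate: float,
                         frame_interval_ms: float = 1000 / 60):
    """Return ('retransmit', 0.0) when a re-request can arrive within the
    frame budget, else ('fec', redundancy) sized to the observed loss."""
    if rtt_ms <= frame_interval_ms / 2:        # e.g. a 5 ms intra-city link
        return ("retransmit", 0.0)
    # Redundancy proportional to measured loss, with headroom, capped at 50%.
    return ("fec", round(min(0.5, 1.5 * loss_rate + 0.05), 3))

assert choose_loss_strategy(5, 0.01) == ("retransmit", 0.0)   # Moscow case
assert choose_loss_strategy(40, 0.10) == ("fec", 0.2)         # distant/lossy link
```

The 1.5x multiplier and 5% headroom are illustrative assumptions; the point is that cheap retransmission wins on short paths, while redundancy wins when waiting a round trip would already miss the frame.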

3. Customized data transfer

To choose the best way to deal with losses, our algorithm analyzes the user's network connection and adjusts the data transfer system individually for each case.

Among other things, the algorithm looks at the connection type.


If you rank connections by losses and delays, a wire is of course the most reliable: over Ethernet losses are rare and last-mile delays are extremely unlikely. Then comes 5 GHz WiFi, and only then 2.4 GHz WiFi. Mobile connections are, frankly, a mess for now; we are waiting for 5G.



When WiFi is used, the system automatically configures the user's adapter, switching it to the mode best suited to cloud play (for example, turning off power saving).
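As an illustration of per-connection tuning (the profile names and numbers below are our assumptions, not Playkey's published values), the transport settings could be a simple table keyed by connection type, ordered by how lossy each medium typically is:

```python
# Hypothetical transport profiles; redundancy rises as the link gets lossier.
PROFILES = {
    "ethernet":    {"fec_redundancy": 0.05, "prefer_retransmit": True},
    "wifi_5ghz":   {"fec_redundancy": 0.15, "prefer_retransmit": True},
    "wifi_2.4ghz": {"fec_redundancy": 0.30, "prefer_retransmit": False},
    "mobile":      {"fec_redundancy": 0.40, "prefer_retransmit": False},
}

def tune_transport(connection_type: str) -> dict:
    """Fall back to the most conservative common profile for unknown links."""
    return PROFILES.get(connection_type, PROFILES["wifi_2.4ghz"])

# The wire needs the least insurance, mobile the most:
assert (tune_transport("ethernet")["fec_redundancy"]
        < tune_transport("mobile")["fec_redundancy"])
```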

4. Individual encoder settings

Streaming video exists thanks to codecs, programs for compressing and restoring video. Uncompressed, one second of video easily exceeds a hundred megabytes; a codec shrinks that by an order of magnitude. Our arsenal includes the H.264 and H.265 codecs.

H.264 is the most popular: all major video card manufacturers have supported it in hardware for over a decade. H.265 is its daring young successor, with hardware support appearing about five years ago. Encoding and decoding H.265 takes more resources, but the quality of a compressed frame is noticeably higher than with H.264. And without any increase in size!



Which codec should we choose, and which encoding parameters should we set for a specific user, given his hardware? It is a non-trivial task, and it is solved automatically: the system analyzes the hardware's capabilities, sets optimal parameters for the encoder, and selects a decoder on the client side.
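A minimal sketch of that selection logic (our simplification; the real system tunes many more encoder parameters): prefer hardware H.265 when both ends support it, fall back to the universally supported H.264, and scale the target bitrate to the client's resolution.

```python
def pick_codec(server_hw: set, client_hw: set) -> str:
    """H.265 gives better quality per bit, but only if both the server encoder
    and the client decoder support it in hardware; otherwise use H.264."""
    if "h265" in server_hw and "h265" in client_hw:
        return "h265"
    return "h264"

def target_bitrate_kbps(width: int, height: int, fps: int = 60) -> int:
    """Illustrative rule of thumb (our assumption): ~0.1 bit/pixel/frame."""
    return int(width * height * fps * 0.1 / 1000)

# A client with no H.265 decoder forces the fallback:
assert pick_codec({"h264", "h265"}, {"h264"}) == "h264"
# A 1080p stream needs a noticeably fatter pipe than 720p:
assert target_bitrate_kbps(1920, 1080) > target_bitrate_kbps(1280, 720)
```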

5. Loss compensation

We hate to admit it, but even we are not perfect. Some data lost in the depths of the network cannot be recovered, and there is no time to resend it. Even then there is a way out.

For example, bitrate adjustment. Our algorithm constantly monitors the amount of data sent from server to client. It records every shortfall and even predicts likely future losses. Its job is to notice in time, and ideally to predict, when losses will reach a critical level and start producing visible interference on the user's screen, and at that moment to adjust the amount of data sent (the bitrate).
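The idea can be sketched as an additive-increase/multiplicative-decrease controller (our illustration; the thresholds and step sizes are assumptions, and the real algorithm predicts losses rather than only reacting):

```python
class BitrateController:
    """Back off sharply when loss crosses the 'visible artifacts' threshold,
    then probe back upward in small additive steps while the network is clean."""

    def __init__(self, bitrate_kbps: int = 15000,
                 floor: int = 2000, ceiling: int = 50000):
        self.bitrate = bitrate_kbps
        self.floor, self.ceiling = floor, ceiling

    def on_loss_report(self, loss_rate: float) -> int:
        if loss_rate > 0.02:                      # losses about to show on screen
            self.bitrate = max(self.floor, int(self.bitrate * 0.7))
        else:                                     # clean interval: probe upward
            self.bitrate = min(self.ceiling, self.bitrate + 250)
        return self.bitrate

ctrl = BitrateController(10000)
assert ctrl.on_loss_report(0.10) == 7000   # sharp cut on heavy loss
assert ctrl.on_loss_report(0.00) == 7250   # gentle recovery afterwards
```

The asymmetry is deliberate: dropping quality for a moment is far less jarring than a frozen or smeared screen, so the controller cuts fast and recovers slowly.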



We also use invalidation of incomplete frames and a reference-frame mechanism in the video stream. Both tools reduce the number of visible artifacts: even with serious transmission problems, the picture on screen stays acceptable and the game stays playable.

6. Distributed sending

Spreading data transmission out in time also improves streaming quality. How it is distributed depends on specific network indicators: the presence of losses, ping and other factors. Our algorithm analyzes them and picks the best option. Sometimes spreading sends across just a few milliseconds cuts losses several times over.
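This technique is commonly known as packet pacing. A sketch under our own assumptions (the MTU, spread fraction and even spacing are illustrative): instead of bursting a frame's packets back-to-back, give each one a small send offset inside the frame interval, so shallow router buffers on the last mile are less likely to overflow.

```python
def pacing_offsets_ms(frame_bytes: int, mtu: int = 1200,
                      frame_interval_ms: float = 1000 / 60,
                      spread: float = 0.5):
    """Return a send offset for each packet of one frame, spreading them
    evenly over a fraction of the frame interval."""
    n_packets = -(-frame_bytes // mtu)            # ceiling division
    if n_packets <= 1:
        return [0.0]
    gap = frame_interval_ms * spread / n_packets
    return [round(i * gap, 3) for i in range(n_packets)]

offsets = pacing_offsets_ms(6000)                 # 5 packets of <= 1200 bytes
assert len(offsets) == 5 and offsets[0] == 0.0
assert offsets[-1] < 1000 / 60                    # still inside one frame
```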

7. Delay reduction

One of the key characteristics of playing through the cloud is delay: the smaller it is, the more comfortable the game. The delay can be divided into two parts: network delay and system delay.


Network delay depends on the infrastructure, and fighting it is hard: if the mouse has gnawed through the wire, no ritual dancing will help. But system delay can be cut several times over, dramatically changing the quality of cloud gaming for the player. Besides the error-correcting coding and personalized settings already mentioned, we use two more mechanisms.

  1. Fast acquisition of data from input devices (keyboard, mouse) on the client side. Even on weak computers this takes 1-2 ms.
  2. Drawing the system cursor on the client. The mouse pointer is processed not on the remote server but in the Playkey client on the user's computer, that is, without the slightest delay. True, this does not affect the actual in-game control, but human perception is what matters here.


Drawing the cursor without delay in Playkey, shown in Apex Legends.

With our technology, at 0 ms network delay and a 60 FPS video stream, the delay of the entire system does not exceed 35 ms.
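For intuition, that figure can be decomposed into a budget. The per-component numbers below are purely our illustrative assumptions, not Playkey measurements; only the 35 ms cap and the 60 FPS stream come from the text above.

```python
def system_latency_ms(fps: int = 60, input_read_ms: float = 2.0,
                      capture_ms: float = 3.0, encode_ms: float = 5.0,
                      decode_ms: float = 4.0, render_ms: float = 8.0) -> float:
    """Sum an illustrative pipeline: input read, frame capture, encode,
    decode, render, plus an average half-frame of buffering at the display."""
    half_frame = 1000 / fps / 2
    return (input_read_ms + capture_ms + encode_ms
            + decode_ms + render_ms + half_frame)

# With these assumed components the total stays under the stated 35 ms cap:
assert system_latency_ms() < 35
```

The half-frame term is why the stream's FPS matters so much: at 30 FPS the same pipeline would carry roughly 8 ms of extra average buffering.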

Passive mechanisms


In our experience, many users have little idea how their devices connect to the internet. In interviews with players it turned out that some do not know what a router is. And that's fine! You don't have to understand internal combustion engines to drive a car, and a player shouldn't need sysadmin knowledge to play.

Still, some technical points are worth conveying so the player can remove the barriers on his side himself. And we help him.

1. Indicating 5 GHz WiFi support

We wrote above that we can see the WiFi standard in use: 5 GHz or 2.4 GHz. We also know whether the network adapter in the user's device can operate at 5 GHz, and if it can, we recommend switching to that band. We cannot change the frequency ourselves, since we cannot see the router's characteristics.
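On Windows, one way to detect 5 GHz capability (a sketch of ours, not necessarily how it is done in the product) is to parse the radio types reported by `netsh wlan show drivers`: 802.11a/ac/ax imply 5 GHz support, while 802.11b/g are 2.4 GHz-only.

```python
import re

FIVE_GHZ_RADIOS = {"802.11a", "802.11ac", "802.11ax"}

def supports_5ghz(netsh_output: str) -> bool:
    """Check the 'Radio types supported' line from `netsh wlan show drivers`.
    (802.11n can run on either band, so by itself it proves nothing.)"""
    m = re.search(r"Radio types supported\s*:\s*(.+)", netsh_output)
    if not m:
        return False
    return bool(set(m.group(1).split()) & FIVE_GHZ_RADIOS)

sample = ("Interface name: Wi-Fi\n"
          "    Radio types supported   : 802.11b 802.11g 802.11n")
assert not supports_5ghz(sample)                       # 2.4 GHz-only adapter
assert supports_5ghz("Radio types supported : 802.11n 802.11ac")
```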

2. Indicating WiFi signal strength

Some users may have a weak WiFi signal even when the internet seems to work well and at acceptable speed. The problem surfaces precisely during cloud gaming, which puts the network to a real test.

Signal strength suffers from obstacles, such as walls, and from interference from other devices; microwaves, again, emit plenty of it. The result is packet loss that is unnoticeable in everyday browsing but critical for playing through the cloud. In such cases we warn the user about the interference and suggest moving closer to the router and switching off the "noisy" devices.

3. Indicating traffic consumers

Even on a good network, other applications may be consuming too much traffic: for example, a YouTube video playing alongside the cloud game, or torrents downloading. Our application identifies the culprits and warns the player about them.
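A crude way to spot such "thieves" without per-process accounting (our sketch; the article does not describe the actual detection method) is to compare the interface's receive counters with what the stream itself consumed over the same interval:

```python
def foreign_traffic_kbps(iface_rx_bytes: int, stream_rx_bytes: int,
                         interval_s: float) -> float:
    """Bytes the interface received that the game stream did not account for,
    converted to kilobits per second."""
    foreign = max(0, iface_rx_bytes - stream_rx_bytes)
    return foreign * 8 / 1000 / interval_s

def congestion_warning(iface_rx_bytes: int, stream_rx_bytes: int,
                       interval_s: float, threshold_kbps: float = 3000) -> bool:
    """Warn when non-stream downloads (video, torrents...) exceed a budget."""
    return (foreign_traffic_kbps(iface_rx_bytes, stream_rx_bytes, interval_s)
            > threshold_kbps)

# 2 MB arrived in one second but the stream consumed only 1 MB:
assert congestion_warning(2_000_000, 1_000_000, 1.0)      # ~8000 kbps foreign
assert not congestion_warning(1_050_000, 1_000_000, 1.0)  # ~400 kbps, harmless
```

The 3000 kbps threshold is an assumed budget; a real client would derive it from the stream's current bitrate and the measured link capacity.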


Fears from the past: debunking the myths about cloud gaming


Cloud games, as a fundamentally new way of consuming game content, have been trying to break into the market for almost a decade. As with any innovation, their story is a series of small victories and loud defeats. No wonder that over the years cloud gaming has become overgrown with myths and prejudices. At the dawn of the technology they were reasonable; today they are completely groundless.

Myth 1. The picture in the cloud is worse than the original, like playing over YouTube


Today, in a technically mature cloud solution, the original picture and the cloud picture are almost identical: there is no difference to the naked eye. Individually tuning the encoder to the player's hardware, plus the set of loss-fighting mechanisms, closes the question. On a good network there is no frame blurring and there are no graphic artifacts. We even take resolution into account: there is no point streaming a 1080p picture if the player's display is 720p.

Below are two Apex Legends videos from our channel: one is gameplay recorded on a PC, the other through Playkey.

Apex Legends on PC

Apex Legends on Playkey


Myth 2. Unstable quality


Network conditions are indeed inconsistent, but this problem has been solved. We dynamically adapt the encoder settings to the quality of the user's network, and special image-capture methods keep FPS at a consistently acceptable level.

How does it work? The game has a 3D engine that builds the 3D world, but the user is shown a flat image. For him to see it, a snapshot is created for every frame, a sort of photograph of how the 3D world looks from a certain point. This picture is stored in a video memory buffer. We capture it from video memory and hand it to the encoder, which compresses it. And so on, frame after frame.

Our technology captures and encodes the image in a single thread, which increases FPS. If these processes run in parallel threads instead (a fairly popular solution on the cloud gaming market), the encoder constantly has to query the capture side, picks up new frames with a delay and, accordingly, sends them with a delay.
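The single-thread scheme can be sketched as a sequential loop: capture and encode run back-to-back, so the encoder always compresses the frame that was just captured, never a stale one from a queue. The function names are ours, and simple stubs stand in for real VRAM capture and a hardware encoder.

```python
import time

def stream_frames(capture_frame, encode_frame, send_packet,
                  n_frames: int, fps: int = 60) -> None:
    """Sequential capture -> encode -> send in a single thread, sleeping off
    whatever remains of each frame interval."""
    interval = 1.0 / fps
    for _ in range(n_frames):
        started = time.perf_counter()
        frame = capture_frame()        # grab the rendered frame (VRAM stub)
        packet = encode_frame(frame)   # compress immediately, same thread
        send_packet(packet)
        leftover = interval - (time.perf_counter() - started)
        if leftover > 0:
            time.sleep(leftover)

# Stubs to demonstrate the call order; a real pipeline would use GPU APIs.
sent = []
stream_frames(lambda: "frame", lambda f: f"encoded({f})", sent.append,
              n_frames=3, fps=240)
assert sent == ["encoded(frame)"] * 3
```

In the parallel variant, capture would push frames into a queue and a separate encoder thread would pop them; every hand-off adds queueing delay, which is exactly what the sequential loop avoids.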


The video at the top of the screen was obtained using single-thread capture and encoding.

Myth 3. Input lag will make you a pushover in multiplayer


Control delay is normally a few milliseconds and usually invisible to the end user. Still, there is a tiny gap between moving the mouse and the cursor moving. It does not affect anything, but it creates a negative impression. Drawing the cursor directly on the user's device, as described above, removes this drawback. Beyond that, the total system delay of 30-35 ms is so small that neither the player nor his opponents in a match notice anything: the outcome of the fight is decided by skill alone. Proof below.


A streamer dominating through Playkey.

What's next


Cloud gaming is already a reality: Playkey, PlayStation Now and Shadow are working services with their own audiences and places in the market. And like many young markets, cloud gaming will grow rapidly in the coming years.

One of the most likely scenarios, as we see it, is game publishers and telecom operators getting cloud gaming services of their own. Some will develop them in-house; others will use ready-made boxed solutions like RemoteClick. The more players in the market, the faster the cloud-based way of consuming game content becomes mainstream.

Source: https://habr.com/ru/post/449280/

