Let's compare shared hosting and cloud hosting.

To begin with, let's recall a curious and rather fundamental story from economics. In the 1970s, George Akerlof, a future Nobel Prize winner, described an influential economic model. He called it "The Market for Lemons", hinting in advance at what it would be about.
So what did he do?
He examined the used car market. Forty years ago, just as now, that market sold not only cars that their owners had cherished and maintained, but also cars in fairly poor condition. It was the latter that were called "lemons". The whole model rested on one important fact: the buyer never knows for certain whether he is buying a "lemon" or a good, solid car.
"Lemon" in America was first called a used car, and later this term began to call any thing, as which one can not be sure
In fact, the problem is solved quite simply: you just have to specify ALL the parameters of the potential purchase that matter to you, not only the mileage but also, say, the service history. The fewer parameters we ask about, the more likely the purchase will disappoint us.
The same rule works very well with, for example, shared hosting. Lots of memory in the package? Slow SATA drives come bundled with it. SSD hosting? It comes with an Atom processor. And other unpleasant combinations.
If we do not pin down the selection criteria we actually need, then, by the "law of lemons", we get a dud on exactly the criteria we forgot to ask about.

What does this mean for us in the case of cloud services?
The parameters that should really matter to us are the processor clock speed, bus speed, memory type and capacity, and ECC support. But what do we see in the descriptions? The amount of disk space (without IOPS), the number of cores (which cores?), the amount of RAM...
It turns out we see only "marketing" parameters that, in fact, tell us almost nothing. This means that most attempts to compare cloud services head-on (for example, by the price list) are doomed to failure or self-deception. There it is: a marketing victory in IT! We have stopped looking at the essence of the cloud and measure only marketing values.
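One way out of this trap is to measure rather than read the price list. As a minimal illustration (not any provider's tool; the scratch-file path, file size, and read count are arbitrary assumptions), here is a Python sketch that probes the random-read rate of the disk actually backing an instance:

```python
import os
import random
import time

# Minimal random-read probe (illustrative only, not a substitute for fio).
# Unix-only (os.pread). Without direct I/O the page cache inflates the
# numbers, so treat the result as a rough, comparative figure.

PATH = "iops_test.bin"          # hypothetical scratch file
FILE_SIZE = 256 * 1024 * 1024   # 256 MiB test file
BLOCK = 4096                    # 4 KiB, a typical database page size
READS = 2000

# Create a test file filled with random data.
with open(PATH, "wb") as f:
    f.write(os.urandom(FILE_SIZE))

# Time random 4 KiB reads at random offsets.
blocks = FILE_SIZE // BLOCK
fd = os.open(PATH, os.O_RDONLY)
start = time.perf_counter()
for _ in range(READS):
    offset = random.randrange(blocks) * BLOCK
    os.pread(fd, BLOCK, offset)
elapsed = time.perf_counter() - start
os.close(fd)
os.remove(PATH)

print(f"~{READS / elapsed:.0f} random reads/s (page cache included)")
```

For serious measurements you would reach for a tool like fio with direct I/O; the point is only that observed IOPS, not advertised gigabytes, is the kind of parameter the "law of lemons" says we must ask about.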
So if we choose on the principle of "more RAM for less money", we walk into a trap set in advance. But why would a cloud hide its real hardware?
Take a provider as an example. Say I am a regional provider with a dedicated 1 Gbit/s channel. I sell it to twenty users, each "as a gigabit" (or to fifty users, depending on my luck). As long as they don't all simultaneously fire up torrents capable of saturating the whole channel, no one will notice that the gigabit has long been communal rather than personal.
Shared hosting does the same thing when selling disk space and memory. In practice, not all sites are being viewed at any given moment, so the provider can sell one server six or seven times over.
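The arithmetic of such overselling is trivial; this small sketch (using the ratios from the examples above) shows what each customer is actually guaranteed in the worst case, when everyone demands their "full" allocation at once:

```python
# Worst-case guaranteed share under oversubscription
# (ratios taken from the examples above).

def guaranteed_share(total, customers):
    """What each customer gets if all of them demand their full allocation."""
    return total / customers

# A 1 Gbit/s channel sold to 20 (or 50) users "as a gigabit each":
for users in (20, 50):
    mbit = guaranteed_share(1000, users)
    print(f"{users} users on 1 Gbit/s -> {mbit:.0f} Mbit/s guaranteed each")

# One server sold 7 times: each tenant is guaranteed only 1/7 of it.
print(f"Server sold 7x -> {guaranteed_share(100, 7):.1f}% of capacity each")
```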
The cloud works in a similar way. The switch to abstract "units" lets the provider hide the resources actually being consumed, and we start measuring the world "in parrots" (that is, in arbitrary units). The same scheme lets them sell one server several times over. And so we get yet another unknown variable when working with the cloud: we don't know how many resources the cloud actually has, or how loaded they are right now. We don't even know how many real resources we consume ourselves.
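On Linux guests there is at least one way to peek behind the curtain: the kernel exposes "steal" time, the CPU time the hypervisor gave to other tenants while our virtual CPU wanted to run. A minimal sketch (Linux-only, since it reads /proc/stat) that reports the steal share over a short window:

```python
import time

# The first line of /proc/stat holds aggregate CPU counters:
# user nice system idle iowait irq softirq steal guest guest_nice.
# Field index 7 (after stripping the "cpu" label) is steal time:
# ticks during which the hypervisor ran someone else instead of us.

def cpu_times():
    with open("/proc/stat") as f:
        fields = f.readline().split()[1:]  # aggregate "cpu" row
    return [int(x) for x in fields]

before = cpu_times()
time.sleep(5)
after = cpu_times()

deltas = [a - b for a, b in zip(after, before)]
total = sum(deltas)
steal = deltas[7] if len(deltas) > 7 else 0
print(f"steal: {100 * steal / total:.1f}% of CPU time over 5 s")
```

A steal share that stays high for long stretches means a noticeable slice of the CPU we "bought" is going to neighbors on an oversold host.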
Partly aware of the weakness of such a scheme, most cloud services try to tie customers to themselves with additional services. For example, once you use the Amazon cloud, you instantly become a user of their load balancer, and so on. Dedicated servers are the same at every provider, hence the low margins and fierce competition. The clouds, for now, remain one of a kind at each provider.