Hi, GT! Modern computing systems let you do genuinely interesting things, and the flood of research reports from the most varied industries - all the way up to astrology and alternative credit scoring - suggests that reliable, high-performance memory will be in ever greater demand for Big Data class tasks.

It's no secret that the amount of data keeps growing: IDC analysts have already counted more than 20 zettabytes of information in computer systems, with more than 100 zettabytes expected by 2020 and over 160 zettabytes by 2025. Spending on big data analytics and storage is keeping pace: in 2017 alone it grew by 12.4% to $150 billion, and in three years' time around $210 billion will be spent on working with Big Data.
But our task today is not to count money. We just wanted to show that high-performance systems are being developed at a rapid pace. Against this background, many interesting projects are emerging that aim to digitize and analyze data from unexpected industries. What field of knowledge could be foggier than astrology? Yet even in this sector, machine learning can be applied to check which hypotheses hold up and which are pure speculation.
Bitcoin Forecast
An interesting thought occurred to a group of scientists: why not use machine learning to test the craziest hypotheses? For example, let's test astrology - after all, historical data sets have already been accumulated, and the positions of the planets have long been known (ephemerides are calculated with great accuracy for all the important celestial bodies).
As the object of their experiment, the project's authors chose nothing less than the Bitcoin exchange rate against the dollar. And why not: Bitcoin is in fashion today, everything related to the blockchain draws great interest, and the cryptocurrency's price history is well documented. They set about building a machine learning model based on the method of solar returns - the annual cycles preceding each event under study. This astrological approach takes the "birth" date of any event and traces the dynamics the system goes through over the year. For Bitcoin, for example, the cycles begin on August 18. At each point a chart is calculated and compared with the chart of the original event. Astrological forecasting is, in general, a complicated business, and the accompanying illustration makes that clear.

The researchers tested the astrological model with machine learning tools and came to impressive results. They split the known data on cryptocurrency rate fluctuations into a training set (70% of the data) and a test set. Using the K-nearest neighbors algorithm, the model achieved 94% accuracy on the test set. The news caused a stir online at the time and was picked up by bitcoin.com. Of course, the model was fairly simple: the system only assessed whether the rate rose or fell over the month and compared that with the existing data. Still, a result above 90% is not bad at all. And since no one can predict exactly what will happen to Bitcoin anyway, why should astrology be any worse than other methods, if its results are backed by machine learning?
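For readers curious what such an experiment might look like in code, here is a minimal sketch, assuming scikit-learn and purely synthetic data: in place of real ephemeris-derived features and real BTC/USD history it uses random planetary longitudes and random up/down labels, but the pipeline - a 70/30 split and a K-nearest neighbors classifier scored on the held-out test set - follows the description above.

```python
# Sketch of the experiment: monthly "planetary position" features vs. a binary
# up/down label for the BTC/USD rate, classified with K-nearest neighbors.
# Both the features and the labels below are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

n_months, n_planets = 96, 10
X = rng.uniform(0.0, 360.0, size=(n_months, n_planets))  # ecliptic longitudes, degrees
y = rng.integers(0, 2, size=n_months)                    # 1 = rate rose over the month, 0 = fell

# 70% of the data for training, the rest held out for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.7, random_state=0, stratify=y
)

model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train, y_train)

print("accuracy on the test set:", accuracy_score(y_test, model.predict(X_test)))
```

With real data, the feature matrix would be filled from an ephemeris for each monthly point of the solar-return cycle, and the labels from the actual monthly rate movements.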
So what will happen to Bitcoin?
After analyzing the cryptocurrency's natal chart, astrologers concluded that the latest dizzying growth in the Bitcoin rate could have been predicted. A caveat is in order, though: what could be predicted was not the pace or intensity of the growth, but the mere fact of it.

According to preliminary expectations, from November 2017 to February 2018 the cryptocurrency is in for even greater growth. From March to May 2018, however, it can expect restrictions from governments and financial systems, which is likely to drive its value down. But don't forget that we are talking about the simplest possible model, and a "fall" here simply means the absence of growth over that period. Will it come true? We'll see; time will tell.
Credit risk in Africa? Blockchain and big data again!
But let's not limit ourselves to astrology, which many regard with skepticism. Let's come back down to earth, where big data and machine learning algorithms are solving other important problems. In African countries, for example, cash circulation suffers from huge problems: high volatility and thriving gray and black markets. Almost 100% of payments and transactions there are made in cash, and barter is still in use.
Unlike in Western countries, in many regions of the continent it is impossible to establish a person's exact address, let alone their purchase history or official salary. All sorts of charitable initiatives and collective fundraising efforts are often useless as well: the funds simply never reach the recipients.

The fintech startup Humaniq has set out to bring together these disconnected people across African countries; the company has created a universal social platform for both individuals and small businesses.
At the moment, the company's mobile platform hosts a cryptocurrency wallet, a secure instant messaging service, and a biometric user identification system based on facial photos and facial expressions. Within the platform, both commission-free internal transactions in the HMQ cryptocurrency and external transfers via crypto exchanges are possible.
The company plans to open an API so that third-party services can be launched on top of the application: financial services, including microcredit and charity, as well as services related to remote work and crowdfunding.
The open API makes it possible to feed the accumulated big data into neural networks for further processing - for example, in microfinance, to analyze a user's profile, verify their identity, and establish the borrower's credit rating. A platform with verified user data, combined with large-scale data collection and analysis, promises a new era in social statistics, transparent systems of civil voting, and the delivery of government and commercial services.
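As an illustration only - this is not Humaniq's actual API or model - here is a minimal sketch of the kind of scoring service a third party might build on such data: a small neural network that maps a handful of hypothetical profile features onto an estimated probability of repaying a microloan.

```python
# Sketch of a microcredit scoring model on top of platform data.
# All feature names and the data itself are hypothetical placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

# Synthetic stand-ins for profile features pulled via an open API:
# [transaction count, average HMQ transfer size, account age (days),
#  biometric verifications passed]
X = rng.normal(size=(500, 4))
# Synthetic repayment outcomes loosely tied to the first and third features.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

scorer = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
)
scorer.fit(X, y)

applicant = rng.normal(size=(1, 4))
print("estimated repayment probability:", scorer.predict_proba(applicant)[0, 1])
```

In a real deployment, the features would of course come from the platform's verified user data rather than random numbers, and the model would be validated far more carefully before any lending decision.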

Why are we telling you all this?
Of course, astrology, blockchain, and microfinance are just a few curious examples of big data analytics in action. In reality, neural networks and artificial intelligence are spreading into more and more areas of human knowledge. But for such software to work, powerful multiprocessor systems are needed - systems capable of loading huge samples of data into memory at once. And even if the work of astrologers and the building of psychological profiles only starts demanding servers with large amounts of RAM tomorrow, in more traditional areas, where machine learning took root long ago, the demand for fast processors, modern architectures, and capacious, reliable memory will be enormous. Here Kingston already has its own offerings: professional memory for heavy workloads that lets neural networks run without failures.
Kingston's professional data center memory kits pass through 35 quality-control checkpoints over the entire production cycle, including a 24-hour stress test at 100 degrees and elevated voltage.
Kingston's standard test program for memory modules covers compliance with the specification, component quality, operation under adverse environmental conditions, compatibility, and reliability. Independent benchmarks, including ServerBench, NetBench, and WebBench, are used to assess quality. In addition, all DRAM chips used in Kingston memory go through a full testing cycle at specialized companies such as AVL (Advanced Validation Labs). The list of tests used to verify server memory reliability includes repeated heating and cooling cycles, operation at high humidity, and more. Ultimately, every single cell is verified - painstaking work, given that a 16 GB memory module contains 136 billion of them! Such testing is already required when building a data center today, and tomorrow, when customers demand continuous data streams, the requirements on memory will only grow.

We have already covered the usual Unbuffered ECC DIMM, Registered DIMM, and Load Reduced DIMM (LRDIMM) modules in previous posts. And the general push toward Big Data, even in astrology, suggests that reliable, capacious memory will be in great demand in the coming years.
Even cooler things are coming next! Subscribe and stay with us!
For more information about Kingston and HyperX products, visit the company's official website.