
IBM and MIT want to teach artificial intelligence to see and hear like people



IBM has begun a long-term partnership with the Massachusetts Institute of Technology (MIT), specifically with its Department of Brain and Cognitive Sciences. As part of the new project, a laboratory has been created that will develop cognitive computing systems capable of understanding and analyzing data from external sources just as humans do. First of all, this means audio and visual information. The computer should be able to use the data it obtains to build a picture of the world around it.

According to experts, cognitive "smart" systems can be used in fields such as healthcare, education, and entertainment. Simply put, the joint IBM-MIT team will teach machines to understand what they see and hear. For example, it is not difficult for a person to describe everything they saw in a short clip. Machines cannot yet cope with this task because they lack reliable image and pattern recognition.
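As a rough illustration of the recognition task described above (and not the IBM-MIT system itself), a short clip can be summarized by running a pretrained image classifier over sampled frames. The file path, sampling interval, and choice of ResNet-50 below are assumptions made only for this sketch.

# Minimal sketch, not the IBM-MIT system: classify sampled frames of a short
# clip with a pretrained image classifier as a stand-in for "understanding
# what the machine sees". The file name and sampling rate are hypothetical.
import cv2                              # pip install opencv-python
import torch
from torchvision import models, transforms

weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()       # resize/crop/normalize as the model expects
labels = weights.meta["categories"]     # ImageNet class names

def describe_clip(path: str, every_n: int = 30) -> list[str]:
    """Return the top-1 ImageNet label for every n-th frame of the clip."""
    cap = cv2.VideoCapture(path)
    seen, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            tensor = preprocess(transforms.ToPILImage()(rgb)).unsqueeze(0)
            with torch.no_grad():
                top = model(tensor).softmax(dim=1).argmax(dim=1).item()
            seen.append(labels[top])
        idx += 1
    cap.release()
    return seen

# Example with a hypothetical file: print(describe_clip("short_clip.mp4"))

A real "describe the clip" system would go further, generating sentences rather than per-frame labels, but the frame-level recognition shown here is the kind of building block the paragraph refers to.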

The joint IBM-MIT team will, among other tasks, work on developing such systems. In addition to understanding what they see, the machines will be trained to draw conclusions and make predictions from the data they obtain. A person can guess what will happen next in a video or film; computers cannot yet do this.
A team of specialists, including neuroscientists, machine learning researchers, programmers, and representatives of other professions, is working on the project. In addition to video, the machines will be trained to understand audio signals, as discussed above. Visual information alone is not enough to form a complete picture of what is happening around (or on the screen). Therefore, the computers must also be able to analyze audio streams, processing such data with special algorithms.
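For illustration only (this is not the laboratory's actual pipeline), audio understanding systems commonly start by turning the raw stream into log-mel spectrogram features that a downstream model can consume. The file name and parameter values below are assumptions.

# Illustrative front end for audio analysis: load a recording and compute
# log-mel spectrogram features. The file "scene.wav" is hypothetical.
import librosa                          # pip install librosa
import numpy as np

def audio_features(path: str, sr: int = 16_000, n_mels: int = 64) -> np.ndarray:
    """Load audio and return an (n_mels, frames) matrix of log-mel features."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_fft=1024,
                                         hop_length=512, n_mels=n_mels)
    return librosa.power_to_db(mel, ref=np.max)

# A downstream classifier (for example a small CNN) would take this matrix
# as input: features = audio_features("scene.wav"); print(features.shape)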

“In a world where people and machines have long worked together, a breakthrough in computer vision could lead to meaningful results in healthcare,” said Guru Banavar, a representative of IBM Research. “By combining advances in brain science and computer science, we can solve a number of complex problems.”



The joint group is headed by Professor James DiCarlo, head of the Department of Brain and Cognitive Sciences (BCS) at MIT. The project will also make active use of the IBM Watson cognitive platform.

“We already understand quite a lot about what an AI should be, but all current developments have one drawback: computers and humans interpret information about the outside world in different ways,” said DiCarlo.

“Our scientists are eager to get to work together with the scientists and engineers from IBM. Our goal is to create a new generation of cognitive systems. We believe that computer vision and hearing are important components of such systems,” says DiCarlo. The ability of computer systems to quickly draw conclusions from what they see and hear, combined with the ability to predict future events, could be very useful. For example, “smart” robots could provide professional care for people or work in manufacturing.

During the project, the computer systems will “communicate” with IBM-MIT staff, exchanging information of various types. The cognitive systems that the scientists hope to develop will have to understand a person to one degree or another. A key element of success in developing such systems is the integration of machine learning, computer vision, reasoning, and specialized algorithms for working with external data.
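A minimal sketch of that integration idea, under the assumption of a simple late-fusion design (not the laboratory's actual architecture): visual and audio feature vectors are projected into a shared space and combined for one joint prediction. All dimensions and the class count below are illustrative.

# Illustrative late-fusion model combining a visual embedding and an audio
# embedding into one prediction. Sizes and architecture are assumptions.
import torch
import torch.nn as nn

class AudioVisualFusion(nn.Module):
    def __init__(self, vis_dim: int = 2048, aud_dim: int = 128, n_classes: int = 10):
        super().__init__()
        self.vis_proj = nn.Linear(vis_dim, 256)   # project visual features
        self.aud_proj = nn.Linear(aud_dim, 256)   # project audio features
        self.classifier = nn.Sequential(
            nn.ReLU(),
            nn.Linear(512, n_classes),            # joint decision over both modalities
        )

    def forward(self, visual: torch.Tensor, audio: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.vis_proj(visual), self.aud_proj(audio)], dim=-1)
        return self.classifier(fused)

# Example with random stand-in features for one clip:
model = AudioVisualFusion()
logits = model(torch.randn(1, 2048), torch.randn(1, 128))
print(logits.shape)  # torch.Size([1, 10])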

Our company now works with more than 250 universities around the world; 3,000 researchers and 13 laboratories on six continents participate in various IBM projects.

Source: https://habr.com/ru/post/398207/

