
OpenAI's AI has learned to write poems, articles, and news



Chatbots are still not very good at holding a conversation with people (although they are constantly improving at this), but they handle text itself much better. You can check this claim with the help of a web application built on top of artificial intelligence (in its weak form).

If a user starts writing a news article, the bot can complete it. The technology also supports "communication" with a person through written exchanges reasonably well: ask it "What should I do today?" and the program will give a fairly coherent answer. All of this is available as a web service, TalkToTransformer.com.


It was developed by Canadian engineer Adam King. It is worth noting that he built the front end of the service, while at its core is the AI developed by the research organization OpenAI. Earlier this year, OpenAI introduced its language AI system, GPT-2, and TalkToTransformer is an opportunity to try this system out.

Previously, it was available for testing only to selected scientists and journalists. The service is called "Transformer" after the type of neural network architecture that GPT-2 is based on.



If you want to get acquainted with the language capabilities of AI, there is no better option than TalkToTransformer. The service is quite flexible: it can recognize many types of textual input, including recipes, program code, and song lyrics, and it can identify characters from various literary works, including Harry Potter and The Lord of the Rings.

At the same time, the system's capabilities are limited: it cannot "think" on a large scale and acts superficially. The texts the AI writes can have storylines and characters (if it is a story), but none of it is logically connected: characters appear and disappear, and their actions are random.

Dialogues are built on the same random principle. If a dialogue turns out more or less coherent, that is luck rather than a real capability of the service. Nevertheless, the AI produces simpler texts quite well, drawing on the web sources and other material it was trained on.

It was previously reported on Habr that GPT-2 was trained on ordinary web pages (about 8 million pages, 40 GB of text). Only sites with a good rating on Reddit were included in the sample of training sources; this was done to keep spam and advertising resources from clogging the data source.
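The selection idea the article describes can be sketched roughly as follows. This is a hypothetical illustration, not OpenAI's actual pipeline; the one grounded detail is that the WebText dataset kept outbound links from Reddit posts with at least 3 karma:

```python
# Hypothetical sketch of WebText-style source selection: keep only pages
# linked from Reddit posts whose score clears a threshold (OpenAI used a
# minimum of 3 karma). Illustrative only; not OpenAI's actual pipeline.
MIN_REDDIT_SCORE = 3

def select_training_urls(reddit_links):
    """Yield unique URLs from (url, score) pairs that pass the filter."""
    seen = set()
    for url, score in reddit_links:
        if score < MIN_REDDIT_SCORE:
            continue  # drop low-rated links to filter out spam and ads
        if url not in seen:
            seen.add(url)
            yield url

# Example usage:
links = [("https://example.com/article", 5), ("https://spam.example", 0)]
print(list(select_training_urls(links)))  # ['https://example.com/article']
```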

To start a dialogue, you give the beginning of a phrase, for example, "Mars is ...", after which the system completes the sentence. The network can produce answers without any special training for a specific task.
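For illustration, here is a minimal sketch of this kind of prompt completion, using the open-source Hugging Face transformers library with the public "gpt2" weights. This is an assumption for the example: TalkToTransformer itself runs on OpenAI's original code, not this library.

```python
# A minimal sketch of prompt completion with GPT-2 via Hugging Face
# transformers (assumed stand-in for the service's own stack).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Mars is"  # the beginning of the phrase, as in the article's example
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; no task-specific fine-tuning is needed.
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Each run samples a different continuation, which matches the random, loosely connected output the article describes.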


Source: https://habr.com/ru/post/452304/

