Habr, hello! Out of respect for those of you who can't stand advertising here, we'll say it right away: yes, this is an advertising post. Feel free to scroll on. For those who believe advertising isn't always harmful and sometimes helps us make important decisions, welcome under the cut.
For the past couple of years, the field of artificial intelligence has been delivering a steady stream of news about ever newer achievements. AI has learned to classify images better than humans. AI has learned to recognize speech better than humans. AI has beaten humans at Go. AI has learned to draw pictures, compose music, and write journalism. AI has learned to recognize a person's emotions, gender, and age. AI has even learned to be a racist on Twitter.

Most of these achievements are due to two factors: the emergence of large, high-quality datasets and the arrival of the necessary computing power. The deep learning algorithms themselves have by and large remained the same, with various modifications.
Behind these algorithms lies serious mathematics, so many people perceive deep learning as something very heavy, like rocket science. Diving into the subject really isn't easy: there is a mass of information, presented at very different levels of accessibility. You can spend several years on the theoretical study of various topics and never get to practice or to building your own projects.
We are convinced that the elephant should be eaten one bite at a time. To start using the achievements of deep neural networks, it is sometimes enough to understand the intuition behind an algorithm at a good level, rather than knowing every formula in detail. Later, as you grow in the field, you will of course have to master that too, but it will come easier with solid practical experience behind you.
Guided by roughly these principles, we designed our educational program for Deep Learning. It looks like this:
Day 1. From 10:00 to 20:00
- An overview of the current capabilities of neural networks
- Basics of Neural Networks
- Principles of image classification. Convolutional networks (CNNs)
- Case studies. Analysis of famous models: LeNet, AlexNet, ...
- Practice: the Caffe library. Building your own neural-network classifier from scratch
- Using convolutional networks for other tasks (style transfer, detection / segmentation, text classification)
- Case studies: Transferring image style. How do algorithms behind services like Prisma work?
Lab assignment: a round-the-clock competition for the best image classifier, using a virtual GPU machine.
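To give a flavor of what the CNN part of the program is about, here is a toy sketch (not course material; the image and kernel are made-up constants) of the operation at the heart of convolutional networks: sliding a small kernel over an image to produce a feature map. Like most deep learning frameworks, it actually computes cross-correlation, which is conventionally called "convolution" in this context.

```python
# Toy sketch of the "convolution" in a CNN: a 3x3 kernel slides over a
# tiny grayscale image and responds strongly wherever a vertical edge is.

def conv2d(image, kernel):
    """Valid (no-padding) 2D convolution of a 2D list by a 2D kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Elementwise product of the kernel with the window under it
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

# A dark left half next to a bright right half: a vertical edge
image = [[0, 0, 1, 1]] * 4
sobel_x = [[-1, 0, 1],   # classic Sobel kernel for vertical edges
           [-2, 0, 2],
           [-1, 0, 1]]
feature_map = conv2d(image, sobel_x)  # strong response on the edge
```

In a real CNN the kernels are not hand-crafted like this Sobel filter; they are learned from data, with many kernels per layer and nonlinearities and pooling in between.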
Day 2. From 10:00 to 20:00
- Review of the lab assignment and awards for the winners
- Basics of Recurrent Networks (RNN).
- Text classification with neural networks. Word2vec, doc2vec. Fully connected, convolutional, and recurrent networks for classification.
- Practice: the Keras/Theano libraries. Sentiment analysis of texts using RNNs.
- Sequence Learning and the seq2seq paradigm. Examples of problems solved with the help of seq2seq: translation, text generation, speech recognition
- Case study: “Create a chat bot”. Text generation in dialogs
- Multimodal learning. Combining convolutional and recurrent networks. Case study: generating image descriptions
- Master class on applying deep learning in business
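The recurrent networks covered on Day 2 can be boiled down to one idea: a hidden state that is updated at every step of a sequence. Here is a toy sketch (not course material; the weights are made-up constants, where a real RNN would learn them) of a single Elman-style recurrent cell unrolled over a short sequence:

```python
# Toy sketch of a recurrent cell: the hidden state carries information
# from earlier inputs forward through the sequence.
import math

def rnn_step(x, h, w_xh, w_hh, b):
    """One RNN step: new hidden state = tanh(w_xh*x + w_hh*h + b)."""
    return math.tanh(w_xh * x + w_hh * h + b)

def run_rnn(sequence, w_xh=0.5, w_hh=0.8, b=0.0):
    h = 0.0          # initial hidden state
    states = []
    for x in sequence:
        h = rnn_step(x, h, w_xh, w_hh, b)
        states.append(h)
    return states

# Feed an impulse followed by zeros: the hidden state "remembers" the
# first input and decays gradually instead of forgetting it instantly.
states = run_rnn([1.0, 0.0, 0.0])
```

This memory effect is exactly why RNNs suit sequence tasks like text classification and seq2seq translation; practical models use vector-valued states and gated cells (LSTM/GRU) built on the same principle.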
Lots of practice. Lots about business. Just enough theory to start solving problems and applying all of this in your own work. Full days on Saturdays: November 26 and December 3.
To get the most out of the program, you should know the basics of machine learning, the Python language, and have basic Unix command-line skills.
By the way, the program partner is IBM Bluemix, which has an excellent fleet of modern GPU machines. IBM Bluemix will provide each participant with a GPU virtual machine, available around the clock for the 8 days of the program.
Register for the program here.
See you!