Thoughts on the technology entry barrier in 2018, with a simple mobile app as an example (and more)
I was once a fifth-grader, and now it strikes me that between me and the kids starting fifth grade today lies a huge gap in access to technology. And since technology develops faster and faster, I wonder what things will look like when today's fifth-graders reach my age.
In this short article, using a simple iPhone application as an example, I want to show just how accessible this technology has become.
A bit of a digression
About 20 years have passed since my eyes widened to the limit at the sight of an animated paper clip on a computer screen offering to help me. Clippy seemed omnipotent back then; to me, there was little difference between what it could do and magic.
Just 13 years ago, I held a PDA (a pocket personal computer) in my hands for the first time. A computer. Pocket-sized. With Windows. It ran not from a 220 V outlet but from a battery, and it had Internet access. Without wires and without Web Plus access cards. The same Internet to which our computer club group once received five minutes of access as a reward for outstanding achievement. We were allowed to visit one site (over a 33 kbps modem). The whole group spent a long time discussing which site it would be. Votes, debates.
Fast forward to May 2018
California is hosting the Google I/O 2018 conference. Among dozens of other announcements: ML Kit has been added to the Firebase service, a tool that makes it possible to recognize the content of pictures, faces, text and much more, and even run TensorFlow models, on a smartphone or in the cloud. Big deal, right? As if we didn't already know about machine learning and neural networks.
Okay, let's build an app that recognizes text. Open Xcode and create a new project. Create a Podfile with the lines below and run `pod install`:
pod 'Firebase/Core'
pod 'Firebase/MLVision'
pod 'Firebase/MLVisionTextModel'
The interface is simple:
one UIImageView, where we will display the picture from the camera,
and two UIButton: the first launches the camera, the second starts recognition.
To give the app access to the iPhone camera, add the NSCameraUsageDescription key to Info.plist.
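The entry looks roughly like this (the description string is arbitrary and is shown to the user in the permission prompt):

```xml
<key>NSCameraUsageDescription</key>
<string>The camera is needed to photograph text for recognition</string>
```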
Create a view controller, connect the UIImageView outlet to it and wire up the actions for our two UIButton. And if we manage to recognize anything, the smartphone will read it out loud.
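A minimal sketch of such a controller, assuming the Firebase ML Kit iOS API of that time (`Vision.vision()`, `onDeviceTextRecognizer()`) and the system AVSpeechSynthesizer for voicing the result; outlet, action and class names here are my own choice:

```swift
import UIKit
import AVFoundation
import Firebase

class ViewController: UIViewController,
                      UIImagePickerControllerDelegate,
                      UINavigationControllerDelegate {

    @IBOutlet weak var imageView: UIImageView!

    // Entry point to ML Kit's vision APIs.
    lazy var vision = Vision.vision()
    // Speech synthesizer to voice the recognized text.
    let synthesizer = AVSpeechSynthesizer()

    // First button: open the camera.
    @IBAction func takePhoto(_ sender: UIButton) {
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = self
        present(picker, animated: true)
    }

    // Show the captured photo in our UIImageView.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let image = info[.originalImage] as? UIImage {
            imageView.image = image
        }
        picker.dismiss(animated: true)
    }

    // Second button: recognize text on the device, print and speak it.
    @IBAction func recognize(_ sender: UIButton) {
        guard let image = imageView.image else { return }
        let textRecognizer = vision.onDeviceTextRecognizer()
        let visionImage = VisionImage(image: image)
        textRecognizer.process(visionImage) { result, error in
            guard error == nil, let result = result else { return }
            print(result.text)
            let utterance = AVSpeechUtterance(string: result.text)
            self.synthesizer.speak(utterance)
        }
    }
}
```

The key point is `onDeviceTextRecognizer()`: recognition runs locally on the phone, which is why the demo below works in airplane mode.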
Run it, scribble some text with a pen on a sheet of paper (the content itself is unimportant), photograph it,
and look at the console:
Oh yes! And all of this works in airplane mode, that is, without Internet access: both the handwriting recognition and the speech synthesis.
Tools, methods and tasks that seemed dauntingly complex yesterday are available today, quickly and for free, right out of the box. And what will things look like in another 20 years, or in 50?