To solve this problem, I needed to choose a channel for sending commands to the vehicle and a channel for streaming video from the vehicle to a client in the browser.
I considered many different options until I hit on the idea of using an Android phone as the vehicle's brain.
This solution immediately brings several advantages: the phone has a camera, so a video broadcast can probably be organized from it; it has WiFi, which can carry commands to and/or from the vehicle; and it weighs very little.
Of course, many other problems immediately arise: how to program the phone, and how to connect motors to it.
To connect the phone to the outside world, I used its audio jack. I prepared several short WAV files containing sine tones of different frequencies: 1000 Hz, 1200 Hz, 1400 Hz, and so on. Playing the right file at the right moment is the command.
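Such tone files are easy to generate with Python's standard library. Here is a minimal sketch; the file names, durations, and the mapping of commands to frequencies are my assumptions for illustration, not taken from the project sources:

```python
# Generate short mono WAV files with sine tones of different frequencies.
# These serve as the "commands" played through the phone's audio jack.
import math
import struct
import wave

SAMPLE_RATE = 44100  # a common playback rate for phones

def write_tone(path, freq_hz, duration_s=0.5, amplitude=0.8):
    """Write a mono 16-bit WAV file containing a sine tone."""
    n_samples = int(SAMPLE_RATE * duration_s)
    frames = b"".join(
        struct.pack("<h", int(amplitude * 32767 *
                              math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)))
        for i in range(n_samples)
    )
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)           # mono
        wav.setsampwidth(2)           # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(frames)

# Hypothetical command-to-frequency mapping for illustration.
for name, freq in [("start.wav", 1000), ("stop.wav", 1200), ("left.wav", 1400)]:
    write_tone(name, freq)
```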
To the audio connector I attached the Mars Rover board (there have been several articles about it on Habr). This is a board for beginners experimenting with Altera's FPGAs. Of course, the FPGA has no ADC for digitizing audio, but I don't need one: it is enough that the FPGA's input pin toggles at a certain threshold whenever an audio signal is present. And that is exactly what happens. Then I made a project for the FPGA in the Altera Quartus II environment that measures the frequency of the input signal and interprets it as a command.
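The idea behind the frequency measurement can be sketched in Python: the FPGA effectively sees the audio only as a 1-bit signal that toggles at the input threshold, and counting rising edges over a fixed window gives the frequency. The tolerance band and the command table below are my assumptions, not the actual Quartus project:

```python
# Simulate the FPGA's frequency measurement: threshold the input into a
# bit stream, count rising edges over a known time window.
import math

SAMPLE_RATE = 44100

def measure_freq(samples, sample_rate=SAMPLE_RATE, threshold=0.0):
    """Estimate frequency by counting rising edges of the thresholded signal."""
    bits = [s > threshold for s in samples]
    rising = sum(1 for a, b in zip(bits, bits[1:]) if b and not a)
    duration = len(samples) / sample_rate
    return rising / duration

def decode_command(freq_hz, tolerance=50.0):
    """Map a measured frequency onto a command, as the FPGA logic does."""
    table = {1000: "start", 1200: "stop", 1400: "left"}  # assumed mapping
    for f, cmd in table.items():
        if abs(freq_hz - f) <= tolerance:
            return cmd
    return None
```

For example, a 0.1 s burst of a 1200 Hz sine contains 120 periods, so 120 rising edges are counted and the estimate comes out at 1200 Hz.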
At first I simply turned on different LEDs while playing different sound files on the phone, which convinced me that the idea works. Then I connected the motors to the board.
I had no experience in phone programming, so I had to read and search the Internet a lot. In the end I settled on Python: it turns out you can write small scripts practically on the phone itself and run them right there. To do this, you install SL4A (Scripting Layer for Android) on the phone, plus Python support for it. You can read more at http://code.google.com/p/android-scripting/ — the downloads section there has SL4A itself, along with API descriptions and examples.
Studying the example Python scripts for the phone, I realized that everything should work out: I had a rather powerful tool in my hands.
Here is a simple example. To make the phone start streaming video, you run a simple script on the phone via SL4A:
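The original article's script is not reproduced here, but a minimal sketch using SL4A's WebCamFacade might look like this; the exact arguments (resolution level, JPEG quality, port) should be checked against the SL4A API reference, and the script only runs on a device with SL4A and Python for Android installed:

```python
# SL4A script: start an MJPEG stream from the phone's camera.
# Runs only on an Android device with SL4A + Python for Android.
import android

droid = android.Android()
# Assumed arguments: resolutionLevel=0 (lowest), jpegQuality=50, port=9091
droid.webcamStart(0, 50, 9091)
# The stream is then reachable at http://<phone-ip>:9091/
```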
After that, a browser can connect to the phone on port 9091. True, there were problems with Firefox, but Chrome displays the stream quite well. Another subtlety is that smooth video needs a fast phone: my HTC Wildfire S is not very fast, so the video sometimes stuttered.
Experimenting further, I decided it would be right to run a "web server" directly on the phone. The server is also implemented as a Python script. It serves pages as HTML frames: one frame holds the video, the other an HTML form with the buttons "Start", "Stop", "Left", "Right", "Back". When a button is pressed, the client browser sends an HTTP GET request to the server running on the phone. The server interprets the GET request and plays the corresponding sound file. The board recognizes the frequency of the tone and turns the motors on or off. Something like that.
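A sketch of such a server, using Python's standard library: the frameset layout, the GET path `/cmd?do=...`, and the mapping from commands to WAV files are my assumptions based on the description above, and the SL4A playback call is left as a comment so the request-parsing logic runs anywhere:

```python
# Minimal control server in the spirit of the one described above.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Assumed mapping from button commands to the prepared tone files.
COMMAND_FILES = {
    "start": "start.wav",
    "stop": "stop.wav",
    "left": "left.wav",
    "right": "right.wav",
    "back": "back.wav",
}

# Two frames: the video stream and the control form (URLs are placeholders).
PAGE = """<html><frameset cols="50%,50%">
<frame src="http://phone:9091/">   <!-- MJPEG video stream -->
<frame src="/controls">            <!-- HTML form with command buttons -->
</frameset></html>"""

def command_from_request(path):
    """Extract the WAV file for a GET request like /cmd?do=left."""
    query = parse_qs(urlparse(path).query)
    cmd = query.get("do", [None])[0]
    return COMMAND_FILES.get(cmd)

class ControlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        wav = command_from_request(self.path)
        if wav is not None:
            pass  # on the phone: play the tone via SL4A, e.g. droid.mediaPlay(wav)
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE.encode())

# To run on the phone:
# HTTPServer(("", 8080), ControlHandler).serve_forever()
```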
After testing each module individually, I could assemble and test the whole "system": a crawler-type vehicle with the Mars Rover board and an Android phone:
Here is a video demonstration:
All project sources are, of course, open and can be found here.