
Neurocoop, Part 1: Installing the Raspberry Pi and the camera in the chicken coop and setting them up

image
Big Brother is watching you, bird!


Articles about the neurocoop
  1. Intro to learning about neural networks
  2. Hardware, software and config for monitoring the chickens
  3. A bot that posts events from the life of the chickens - without a neural network
  4. Dataset markup
  5. A working model for recognizing the chickens in the coop
  6. The result - a working bot that recognizes the chickens in the coop




The idea came a long time ago. Some people have the idea of heating their coops with graphics cards mining cryptocurrency, which is great, of course, but others have ideas about image and sound recognition, about neural networks and their real-world use.


We once read an article about a Japanese engineer who helped his father sort cucumbers with a neural network, and decided to analyze how our parents' chickens lay eggs, sending reports to a messenger - an idea done purely for fun.



In general, there are a lot of plans. Movement near the nest may mean that a bird has climbed into the nest or crawled out of it. This is easy to detect with OpenCV, and we already know how, thanks to this blog .


But what if we recognize each bird and analyze which one is not laying? Evaluate the productivity of each individual chicken? If a bird does not lay and has no other good reason to rest (for example, short daylight hours or molting), then maybe it's time to cook chicken soup?


Just imagine the message: “It seems to us that bird ch11 is not laying without a good reason; perhaps we should consider its further fate.” And then it turns out that bird ch11 is our old cat Cranberry, who simply lives with the chickens.


Hackathon


The thought that it all sounds great gave me no rest. The first experience with motion detection ( on cars outside the window ) was not bad, and the hardware was now sitting idle. Everything always happens suddenly, so one fine Thursday I bought tickets for a Friday night flight to my parents and flew over for the weekend to set up data collection for the neurocoop.


The main difficulty was the lack of wired internet and the impossibility of getting it there at all (it is the wilderness, what can you do). But when you don't know what you are signing up for, you hope for the best.


In addition, there were no power outlets in the coop. My parents, of course, control the switches from the house, turning the light and signaling on and off. My father responded to the request to run an outlet into the coop, and it materialized there very quickly.


The main part of the equipment is a Raspberry Pi 3 with a camera module, a power supply and a USB fan (without a fan, image processing heats the processor right up to 80 degrees). In addition, something had to give the Pi internet access.


So, the alternatives for a hotspot were a Huawei 3G/4G modem and an old Android Xperia. The modem is good because it does not need a separate power source, and bad because out of the box it only works with Windows. There are, of course, articles about how to get it going on Linux, but somehow I did not want to.


With time tightly limited (a day left before departure), the phone was chosen.
The provider does not offer a static IP service in this region, so the IP turned out to be dynamic, which we decided to pin down using a dynamic DNS service.


And all of a sudden (who would have doubted it), it didn't work. The IP is not just dynamic, it is a gray dynamic one, sitting behind the provider's NAT. This means it cannot be reached from the outside; the ports are closed.


In parallel, a Python script for capturing images and transferring them to the server was being written, but it was still raw.


In the meantime, half the time available was already spent.


A friend suggested that there is a beautiful thing called a reverse SSH tunnel (ssh back-connect), which, in general, saved us from despair. There was very little time left, so it was not possible to fully understand how everything works; it just had to work at least somehow.
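In its simplest form such a reverse tunnel is one command run on the Pi (a sketch, using the same userB@hostB, server port bbbb and tunnel port 2222 that appear in the autossh setup described further down):

 # run on the Pi: port 2222 on the home server will forward back to the Pi's own sshd on port 22
 ssh -N -R 2222:localhost:22 userB@hostB -p bbbb

After that, the Pi can be reached through the home server, and the gray IP no longer matters.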


Before leaving, we set up cron jobs that brought up the SSH tunnel, measured the temperature and sent an email alarm if something happened, and the entire setup moved into the coop. The internet there is still poor, but it exists. It turned out that it is quite dark in there and nothing is visible in the photos. My father promised to adjust the lighting as soon as he had time. For the time being, the camera was turned off.


The main thing is that it was now possible to connect to the Pi from anywhere with internet access.


More about setting up


Having recovered a little from the hackathon-style forced march, I undertook to tune this whole thing up further. After reading some guides (keywords: permanent autossh), I tried to set up autossh instead of the plain reverse ssh, which was unstable and kept alive by cron. At first nothing came of autossh, so I kept using the first cron-based solution, but the problem of connections piling up forced me to make friends with autossh.


To get everything started, you only need to create an executable script (if you don't know how, google "create executable file linux") on the remote device with the gray dynamic IP and add the following line to it:


/usr/bin/autossh -M 0 -o ServerAliveInterval=50 -o ServerAliveCountMax=2 -nNTf -R 2222:localhost:22 userB@hostB -p bbbb 

In this line, 2222 can be replaced with any free port; replace userB with the user on your home server (that is, the one that is not in the chicken coop), hostB with the host of your home server, and bbbb with the SSH port of your home server if it differs from the standard one (22).
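For completeness, the whole autosshtunnel.sh can be as small as this (a sketch; the file name and path are simply the ones used in the cron line below):

 #!/bin/bash
 # keep the reverse tunnel from the coop Pi to the home server alive;
 # -M 0 disables autossh's own monitoring port and relies on the
 # ServerAlive options to notice a dead connection
 /usr/bin/autossh -M 0 -o ServerAliveInterval=50 -o ServerAliveCountMax=2 -nNTf -R 2222:localhost:22 userB@hostB -p bbbb

Do not forget to make it executable with chmod +x autosshtunnel.sh.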


You can read up on the command's parameters yourself if you are interested or want to change something. Next we add to cron (crontab -e) a line (if you are unfamiliar with cron, here are a few introductory links collected by friends: 1 2 3 4 ) that will run autossh on reboot:


 @reboot /path/to/script/autosshtunnel.sh 

Now, if you access the home server from yet another remote machine, make sure the session does not get dropped. That is, I connect to the server from a laptop, and from the server I knock on the chicken coop; in that case I specify the keep-alive parameters both when connecting to the server and when connecting to the coop.


This is done according to this pattern:


 ssh -o TCPKeepAlive=yes -o ServerAliveInterval=50 user@box.example.com 

I connect to the system in the hen house like this:


 ssh -o TCPKeepAlive=yes -o ServerAliveInterval=50 sshuser@localhost -p 2222 
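The same keep-alive options can also be put into ~/.ssh/config so they do not have to be retyped every time (a sketch; the Host aliases homeserver and coop are made up for illustration):

 # ~/.ssh/config on the laptop / home server
 Host homeserver
     HostName box.example.com
     User user
     TCPKeepAlive yes
     ServerAliveInterval 50

 Host coop
     HostName localhost
     Port 2222
     User sshuser
     TCPKeepAlive yes
     ServerAliveInterval 50

After that the two commands above shrink to ssh homeserver (from the laptop) and ssh coop (from the home server).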

That covered remote access; now a few quick words about temperature alarms. To set up email alerts on Debian-like systems such as Ubuntu and Raspbian, just follow this guide : you only need to install ssmtp and fix its config, that's all. The simplest script that emails you about overheating on Raspbian might look like this:


 TEMPERATURE="$(/opt/vc/bin/vcgencmd measure_temp)"
 NTEMPERATURE="$(echo $TEMPERATURE | tr -dc '0-9.')"
 LIMIT="61.0"
 if [ $(echo "$NTEMPERATURE > $LIMIT" | bc) -ne 0 ]; then
     echo "The critical CPU temperature has been reached $NTEMPERATURE" | sudo /usr/bin/ssmtp -vvv somename@somehost.com
 fi

Then all that remains is to pack this script into an executable file and add it to cron. While it is not hot, I run the script every two minutes.
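The corresponding crontab entry (crontab -e) for a two-minute interval might look like this (a sketch; the script name and path are hypothetical):

 */2 * * * * /path/to/script/tempalarm.sh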


Now let's talk about the main script, with which we collect images. We consider an image conditionally useful if we notice motion in it. Analytics and recognition will be bolted onto these images later. We have already mentioned the useful blog from which we took the script as a basis, having rewritten it a little.


The guide already describes what is needed for this to work, but I will repeat that you need to make an OpenCV build . This can take a long time (in my case it took 5 hours). In addition, you need to install the other libraries mentioned there, for example numpy and imutils - no pitfalls there.
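If, as here, OpenCV lives in a virtualenv called cv (see the launch snippet at the end of the article), the extra libraries can be installed into it roughly like this (a sketch, not an exhaustive list):

 workon cv
 pip install numpy imutils "picamera[array]"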


We rewrote the main script to fit our needs and made the following changes:

  • switched from Python 2 to Python 3;
  • instead of Dropbox, we use our own server;
  • both the original and a compressed frame are saved.


The finished version of pi_surveillance.py looks like this (well, except that the constants should still be moved out of the script into the config):


 # import the necessary packages
 import sys
 sys.path.append('/usr/local/lib/python2.7/site-packages')
 from pyimagesearch.tempimage import TempImage
 from picamera.array import PiRGBArray
 from picamera import PiCamera
 import argparse
 import warnings
 import datetime
 import imutils
 import json
 import time
 import cv2
 import os

 # construct the argument parser and parse the arguments
 ap = argparse.ArgumentParser()
 ap.add_argument("-c", "--conf", required=True,
                 help="path to the JSON configuration file")
 args = vars(ap.parse_args())

 # filter warnings, load the configuration and check if we are going to use server
 warnings.filterwarnings("ignore")
 conf = json.load(open(args["conf"]))
 client = None

 if conf["use_server"]:
     # we do not use Dropbox
     print("[INFO] you are using server")

 # initialize the camera and grab a reference to the raw camera capture
 camera = PiCamera()
 camera.resolution = tuple(conf["resolution"])
 camera.framerate = conf["fps"]
 rawCapture = PiRGBArray(camera, size=tuple(conf["resolution"]))

 # allow the camera to warmup, then initialize the average frame, last
 # uploaded timestamp, and frame motion counter
 print("[INFO] warming up...")
 time.sleep(conf["camera_warmup_time"])
 avg = None
 lastUploaded = datetime.datetime.now()
 motionCounter = 0

 # capture frames from the camera
 for f in camera.capture_continuous(rawCapture, format="bgr", use_video_port=True):
     # grab the raw NumPy array representing the image and initialize
     # the timestamp and occupied/unoccupied text
     frame = f.array
     timestamp = datetime.datetime.now()
     text = "Unoccupied"

     # resize the frame
     frame = imutils.resize(frame, width=1920)
     frameorig = imutils.resize(frame, width=1920)

     # convert it to grayscale, and blur it
     gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
     gray = cv2.GaussianBlur(gray, (21, 21), 0)

     # if the average frame is None, initialize it
     if avg is None:
         print("[INFO] starting background model...")
         avg = gray.copy().astype("float")
         rawCapture.truncate(0)
         continue

     # accumulate the weighted average between the current frame and
     # previous frames, then compute the difference between the current
     # frame and running average
     cv2.accumulateWeighted(gray, avg, 0.5)
     frameDelta = cv2.absdiff(gray, cv2.convertScaleAbs(avg))

     # threshold the delta image, dilate the thresholded image to fill
     # in holes, then find contours on thresholded image
     thresh = cv2.threshold(frameDelta, conf["delta_thresh"], 255,
                            cv2.THRESH_BINARY)[1]
     thresh = cv2.dilate(thresh, None, iterations=2)
     cnts = cv2.findContours(thresh.copy(), cv2.RETR_EXTERNAL,
                             cv2.CHAIN_APPROX_SIMPLE)
     cnts = cnts[0] if imutils.is_cv2() else cnts[1]

     # loop over the contours
     # check if there is at least one contour, which is large enough
     # I know this isn't the best practice
     # I know about bool variables
     # I know about other things too. I just don't actually care
     # Yes, I am a liar, 'cause if I did not care,
     # I wouldn't write anything of those ^
     for c in cnts:
         # if the contour is too small, ignore it
         if cv2.contourArea(c) < conf["min_area"]:
             continue

         text = "Occupied"
         print("[INFO] room is occupied, motion counter is {mc}".format(mc=motionCounter))

         # initiate timestamp
         ts = timestamp.strftime("%A-%d-%B-%Y-%I:%M:%S%p")
         ts1 = timestamp.strftime("%A-%d-%B-%Y")

         # let's create paths on a server
         pathorig = "{base_path}/{timestamp}/origs".format(
             base_path=conf["server_base_path"], timestamp=ts1)
         pathres = "{base_path}/{timestamp}/res".format(
             base_path=conf["server_base_path"], timestamp=ts1)
         os.system('ssh -p bbbb "%s" "%s %s"' % ("userB@hostB", "sudo mkdir -p", pathorig))
         os.system('ssh -p bbbb "%s" "%s %s"' % ("userB@hostB", "sudo mkdir -p", pathres))

     # upload images on a server
     if (text == "Occupied"):
         motionCounter += 1

         if motionCounter >= conf["min_motion_frames"] and (timestamp - lastUploaded).seconds >= conf["min_upload_seconds"]:
             print("[INFO] time to upload, motion counter is {mc}".format(mc=motionCounter))

             # upload original
             t = TempImage()
             cv2.imwrite(t.path, frameorig)
             os.system('scp -P bbbb "%s" "%s:%s"' % (t.path, "userB@hostB", pathorig))
             t.cleanup()

             # upload resized image of 512 px
             framec = imutils.resize(frame, width=512)
             tc = TempImage()
             cv2.imwrite(tc.path, framec)
             os.system('scp -P bbbb "%s" "%s:%s"' % (tc.path, "userB@hostB", pathres))
             tc.cleanup()

             # reset motionCounter
             motionCounter = 0
             lastUploaded = datetime.datetime.now()

     # otherwise, the room is not occupied
     else:
         motionCounter = 0

     # check to see if the frames should be displayed to screen
     if conf["show_video"]:
         # display the security feed
         cv2.imshow("Security Feed", frame)
         key = cv2.waitKey(1) & 0xFF

         # if the `q` key is pressed, break from the loop
         if key == ord("q"):
             break

     # clear the stream in preparation for the next frame
     rawCapture.truncate(0)

This is how our config looks now:


 {
     "show_video": false,
     "use_server": true,
     "server_base_path": "/media/server/PIC_LOGS",
     "min_upload_seconds": 1.0,
     "min_motion_frames": 3,
     "camera_warmup_time": 2.5,
     "delta_thresh": 5,
     "resolution": [1920, 1080],
     "fps": 16,
     "min_area": 6000
 }

And here is tempimage.py:


 # import the necessary packages
 import uuid
 import os
 import datetime

 class TempImage:
     def __init__(self, basePath="./temps", ext=".jpg"):
         # construct the file path
         timestamp = datetime.datetime.now()
         ts = timestamp.strftime("-%I:%M:%S%p")
         self.path = "{base_path}/{rand}{tmstp}{ext}".format(base_path=basePath,
             rand=str(uuid.uuid4())[:8], tmstp=ts, ext=ext)

     def cleanup(self):
         # remove the file
         os.remove(self.path)

The first image was of a chicken tail in a nest. An excellent May gift for a lifelong introvert who, even in good weather, stares at a console. The image was genuinely pleasing, despite the darkness, the absence of the bird's head in the frame and the script's rough edges. Here is that chicken tail ( Just think: a thousand kilometers away from you, a chicken crawled into a nest, not suspecting that you are watching it. ):


image

Then the lighting was adjusted, and I started receiving noticeably more inspiring photos.


image
image

The script is launched, taking into account that OpenCV is installed in the virtual environment cv, like this (we still need to figure out how to send it to the background):


 source ~/.profile
 workon cv
 cd ~/chickencoop
 python3 /home/sshuser/chickencoop/pi_surveillance.py --conf conf.json
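We have not sorted out backgrounding yet, but a minimal way to do it would be nohup (or a tmux/screen session), something like this (a sketch; the log file name is made up):

 nohup python3 /home/sshuser/chickencoop/pi_surveillance.py --conf conf.json > ~/chickencoop/surveillance.log 2>&1 &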

image

To be continued...

Source: https://habr.com/ru/post/327978/

