Resources Link
RobotName: Adeept_RaspClaws
RobotURL: https://github.com/adeept/Adeept_RaspClaws
RobotGit: https://github.com/adeept/Adeept_RaspClaws.git
[Official Raspberry Pi website] https://www.raspberrypi.org/downloads/
[Official website] https://www.adeept.com/
[GitHub] https://github.com/adeept/Adeept_RaspClaws
[Image file and Documentation for structure assembly] https://www.adeept.com/learn/detail-37.
Components List
Acrylic Plates
The acrylic plates are fragile, so please handle them carefully during assembly to avoid breaking them. Each acrylic plate is covered with a layer of protective film; you need to remove it first. Some holes in the acrylic may contain residue, so clean them before use.
Machinery Parts
Electronic Parts
Tools
Self-prepared Parts
18650 battery specification: a lithium battery above 3000 mAh and without overcurrent protection is recommended. The power supply must deliver a current above 3 A.
1. Premise
1.1 STEAM and Raspberry Pi
STEAM stands for Science, Technology, Engineering, Arts and Mathematics. It is a transdisciplinary, practice-oriented approach to education. As a board designed for computer programming education, the Raspberry Pi has many advantages over other robot development boards, so it is used here for the function control of the robot.
1.2 About the Documentation
This documentation is a software installation and operation guide for the Python robot product.
2. Raspberry Pi System Installation and Development Environment Establishment
2.1 Install an Operating System for the Raspberry Pi
2.1.1 Method A: Write 'Raspbian' to the SD Card with Raspberry Pi Imager
Raspberry Pi Imager is an SD-card image writing tool developed by the Raspberry Pi Foundation.
●Insert the SD card into the card reader and connect the card reader to your computer.
●Run Raspberry Pi Imager and select CHOOSE OS -> Raspbian (other) -> Raspbian Full - A port of Debian with desktop and recommended applications.
●Click CHOOSE SD CARD and select the SD card to write Raspbian Full to. Please note that writing the image will automatically delete any files already on the SD card.
●Click WRITE and wait for the writing to finish.
●Do not remove the SD card after writing completes; we will use it to configure SSH and the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi, and boot, the WiFi configuration without any peripherals may fail in the following steps.
2.1.2 Method B: Download the Image File Raspbian and Write It to the SD Card Manually
●Since the image file is downloaded inside Raspberry Pi Imager in 2.1.1, the download can take a long time on a slow network in some places.
4. Download the image file `Raspbian`
- Torrent file: [Raspbian - Raspbian Buster with desktop and recommended software]
- Zip file: [Raspbian - Raspbian Buster with desktop and recommended software]
5. Unzip the file. Note that the path to the extracted `.img` file should contain only English characters; no special characters are allowed.
6. Write the downloaded image file `Raspbian` to the SD card with `Raspberry Pi Imager`
7. Leave the SD card connected after writing completes; we will use it to configure SSH and the WiFi connection later.
●On the Raspberry Pi website [Official Raspberry Pi website], navigate through Downloads -> Raspbian -> Raspbian Buster with desktop and recommended software, and click the torrent or zip file to download it. Unzip the file after downloading. Note that the path to the extracted .img file should contain only English characters, with no special characters; otherwise Raspberry Pi Imager may not open the .img file. It is recommended to save the .img file to the root directory of the C:\ or D:\ drive, but do not save .
●Do not remove the SD card after writing completes; we will use it to configure SSH and the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi, and boot it up, the WiFi configuration without any peripherals may fail in the following steps.
2.1.3 Method C: Manually Download the Image File Provided by Us and Write It to the SD Card (Not Recommended)
●The Raspbian image file downloaded in 2.1.1 and 2.1.2 is from the official source, with some preinstalled software.
3. Install `Raspberry Pi Imager`
4. Download the image file `Adeept_RaspClaws` - [Image file for the Adeept_RaspClaws Robot]
5. Unzip the file. Note that the path to the extracted `.img` file should contain only English characters; no special characters are allowed.
6. Write the downloaded image file `Adeept_RaspClaws` to the SD card with `Raspberry Pi Imager`
7. Leave the SD card connected after writing completes; we will use it to configure SSH and the WiFi connection later.
will automatically delete all existing files on the SD card.
●Click WRITE and wait for the writing to finish.
●Do not remove the SD card after writing completes; we will use it to configure the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi, and boot it up, the WiFi configuration without any peripherals may fail in the following steps.
2.
2.2.1 Method A: Enable SSH with Peripherals
●If you used 2.1.3 (manually downloading the image file we provide and writing it to the SD card) to write the Raspberry Pi operating system, you do not need this section to enable SSH, because the SSH service in that image is already enabled.
●If you have connected a mouse, keyboard, and monitor to the Raspberry Pi, follow these steps to enable SSH.
1.
●If you haven't connected a monitor to the Raspberry Pi, follow these steps to enable SSH.
1. Do not remove the SD card after `Raspberry Pi Imager` writes the image file.
2. Create a file named `ssh` in any directory, with no file extension. You may create a file `ssh.txt` and delete the `.txt` (make sure that, under Folder Options, the box "Hide extensions for known file types" is unchecked). You then have an `ssh` file without an extension.
3.
}
4. Fill in your own information for `Insert country code here`, `Name of your WiFi`, and `Password for your WiFi`. Pay attention to capitalization. Refer to the example below:
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=US
network={
    ssid="MyName"
    psk="12345678"
}
5. Save and exit. Copy `wpa_supplicant.conf` to the root directory of the SD card.
6. If you've already copied the file `ssh` to the SD card as instructed in **2.
3 Log In to the Raspberry Pi and Install the App
●If you followed the steps in 2.2.1 and 2.3.1 for the SSH and WiFi configuration, you may remove the peripherals now and use SSH to control the Raspberry Pi remotely later on.
●If you followed the steps in 2.2.2 and 2.3.2, you may now insert the SD card into the Raspberry Pi and boot it up. The Raspberry Pi will boot and connect to WiFi automatically when powered on, with no need for peripherals.
●If you use the operation steps of 2.1.
of the Raspberry Pi, `raspberry` (pay attention to capitalization). The screen does not change while you are typing, but that does not mean nothing is being entered. Press Enter after you finish typing.
●You are now logged into the Raspberry Pi.
3.2 Log into the Raspberry Pi (Linux or macOS)
●Before connecting to the Raspberry Pi via SSH, you need to know its IP address.
●For lower versions of Windows, SSH is not built in; you may log into the Raspberry Pi by referring to the official Raspberry Pi documentation [SSH using Windows].
●Before connecting to the Raspberry Pi via SSH, you need to know its IP address. Check the management interface of your router, or download the app `Network Scanner` and search for a device named `RASPBERRY` or `Raspberry Pi Foundation` to get the IP address.
wait until it's done.
3.5 Install the Corresponding Dependent Libraries
●Follow the steps below to install the libraries if you wrote the image file to the SD card following 2.1.1 (write 'Raspbian' to the SD card with `Raspberry Pi Imager`) or 2.1.2 (download the image file `Raspbian` and write it to the SD card manually).
●A script is provided that installs all required dependent libraries and configures the camera and other auto-start programs.
●If the page fails to load, log into the Raspberry Pi via SSH and type in the command below to end the program that auto-runs at boot and release its resources; otherwise you may see issues such as camera initialization failure or occupied ports.
sudo killall python3
●Type in the command below to run `webServer.py`:
sudo python3 Adeept_RaspClaws/server/webServer.py
●Check whether there are any errors and solve them based on the instructions in the Q&A section below.
4 Raspberry Pi Structure Assembly and Precautions
4.
If your servo does not return to its original position automatically, you can run the server.py file manually and then try connecting the servo.
●Preparations before Assembly
1. Connect the Raspberry Pi Camera and the ribbon cable.
2. Connect the Raspberry Pi Camera to the Raspberry Pi.
3. Fix four M2.5x10+6 copper standoffs on the Raspberry Pi.
4. Connect the driver board to the MPU-6050.
5. Connect the 18650 battery holder set to the Adeept Motor HAT.
●Install and Remove Batteries
6. Connect the servos to the Adeept Motor HAT.
●Before switching on, you need to insert the configured SD card into the Raspberry Pi. For details, please refer to the third chapter of this document. Otherwise, the servos will not rotate to the middle position after booting. In the next installation steps, the servos need to be connected to the robot HAT, and the Raspberry Pi will automatically adjust each servo to the correct angle.
●Install the arm
●Install the feet
Fix a debugged servo to the acrylic plate
Fix the robot feet
●Install the camera
Fix the Raspberry Pi Camera on the acrylic plates
Connect and assemble the Pi Camera
Fix the light module on the acrylic plates
4.2 Tips for Structural Assembly
●Since many servos are used in this product, servo installation is critical for the robot. Before installing a rocker arm on a servo, you need to power the servo and let the servo shaft rotate to the central position, so that the rocker arm installed at the designated angle will be in the central position.
●Generally the Raspberry Pi will auto-run `webServer.py` when booting; when `webServer.
●You can also use a power lithium battery to power the Motor HAT. The Motor HAT supports power supplies below 15 V.
●You can use a USB cable to supply power to the Motor HAT when installing the servo rocker arms during structural assembly. After the robot software is installed, the Raspberry Pi will control the Motor HAT to output the neutral-position signal on all servo ports after it starts up.
5 Controlling the Robot via the Web App
●The web app is developed so common users can control the robot in an easier way. The web app is convenient to use: you may use it to wirelessly control the robot from any device with a web browser (Google Chrome was used for testing).
●Generally the Raspberry Pi will auto-run `webServer.py` when booting and establish a web server in the LAN. You may then use any other computer, mobile phone, or tablet in the same LAN to visit the web page and control the robot.
Raspberry Pi.
●The `Actions` window controls unique functions of the robot:
·`MOTION GET`: Motion detection function based on OpenCV. When objects move within the camera's view, the program will circle the moving part in the `Video` window, and the LED lights on the robot will change accordingly.
·`AUTOMATIC`: Switches between the fast and slow gaits of the robot. Click it to switch the robot from the fast gait to the slow gait.
6 Common Problems and Solutions (Q&A)
●Where to find the IP address of the Raspberry Pi?
Before connecting to the Raspberry Pi via SSH, you need to know its IP address. Check the management interface of your router, or download the app `Network Scanner` and search for a device named `RASPBERRY` or `Raspberry Pi Foundation` to get the IP address.
●The servo doesn't return to the central position when connected to the driver board.
In general, the Raspberry Pi will auto-run `webServer.py` when booting, and `webServer.py` will control the servo ports to send a rotate-to-central-position signal. When assembling a servo, you can connect it to any servo port at any time.
●After running the server, I get an error and can't find config.txt.
This is because the installation script did not copy config.txt to the specified location due to permission problems during installation. The new version of webServer does not use this file; only the old version of the server uses it. To copy config.txt from the server folder to /etc/ on the Raspberry Pi, use the following command:
sudo cp -f /home/pi/Adeept_RaspClaws/server/config.txt /etc/config.
7 Set a Program to Start Automatically
7.1 Set a Specified Program to Run Automatically at Boot
●This section only introduces the auto-run method used by our products. If you need more information about auto-running programs on the Raspberry Pi, you can refer to the document Auto-Run from itechfy.
●If you have used the operation steps of 3.5 or 3.6, the script has already configured the program to run automatically at startup.
●For example, if we want to replace webServer.py with server.py, we only need to edit the following: replace
sudo python3 [RobotName]/server/webServer.py
with
sudo python3 [RobotName]/server/server.py
●Save and exit, and the robot will automatically run server.py instead of webServer.py the next time it is turned on.
●server.py is the socket server used with the Python GUI.
8 Remote Operation of the Raspberry Pi via MobaXterm
●To make daily use of the Raspberry Pi more convenient, we usually do not connect peripherals such as a mouse, keyboard, or monitor to it. Since our Raspberry Pi is installed inside the robot, controlling it with peripherals would seriously reduce the efficiency of programming and testing. Therefore, we introduce a method of programming on the Raspberry Pi remotely.
●There are many ways to program on the Raspberry Pi.
to the Raspberry Pi again. If there is no saved username and password, you will need to input them; if the IP address of the Raspberry Pi has changed, you need to start a new session.
●After a successful login, the left column is replaced with a file transfer system, which allows you to interact with the file system inside the Raspberry Pi. If you want to return to session selection, just click Sessions.
9 How to Control the WS2812 RGB LED
●The WS2812 LED is a commonly used module on our robot products. There are three WS2812 lights on each module. Please pay attention when connecting: the signal line is directional and must be led out from the Raspberry Pi to the WS2812 module's input.
    self.LED_COUNT, self.LED_PIN, self.LED_FREQ_HZ, self.LED_DMA,
    self.LED_INVERT, self.LED_BRIGHTNESS, self.LED_CHANNEL)
self.strip.begin()

def colorWipe(self, R, G, B):
    # This function is used to change the color of the LED lights
    color = Color(R, G, B)
    for i in range(self.strip.numPixels()):
        # Only one LED's color can be set at a time, so we loop
        self.strip.setPixelColor(i, color)
        self.strip.
●Instantiate the object and execute the method. The function colorWipe() takes three parameters, R, G, and B, which correspond to the brightness of the three primary colors red, green, and blue. The value range is 0-255; the larger the value, the higher the brightness of the corresponding color channel. If the values of the three color channels are equal, white light is emitted. A specific example is as follows:
if __name__ == '__main__':
    LED = LED()
    try:
        while 1:
            LED.
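As a side note, the Color() helper used above typically packs the three 8-bit channels into a single 24-bit integer, with red in the high byte. A minimal pure-Python sketch of this packing (the helper name `pack_color` is ours, not part of the library):

```python
def pack_color(r, g, b):
    """Pack 8-bit R, G, B channels into one 24-bit integer,
    red in the high byte, as WS2812 libraries typically do."""
    for c in (r, g, b):
        if not 0 <= c <= 255:
            raise ValueError("channel values must be 0-255")
    return (r << 16) | (g << 8) | b

print(hex(pack_color(255, 0, 0)))      # 0xff0000 (red)
print(hex(pack_color(255, 255, 255)))  # 0xffffff (white)
```

This also makes the white-light rule in the text concrete: equal channel values such as (255, 255, 255) light all three primary colors at the same brightness.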
10 How to Control the Servo
10.1 Rotate the Servo to a Certain Angle
●Since a servo's rotation angle can be controlled with a PWM signal, it is one of the most commonly used modules on robot products. Walking robots, robotic arms, and gimbals are all driven by servos. On our Raspberry Pi driver board, the Motor HAT has a dedicated PCA9685 chip for controlling servos. The Raspberry Pi uses I2C to communicate with the PCA9685.
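To make the PWM arithmetic concrete: at the usual 50 Hz servo frequency each period is 20 ms, and the PCA9685's 12-bit resolution divides it into 4096 ticks. A hedged sketch of the conversion from pulse width to ticks (the helper name is ours; 50 Hz and 4096 are the typical PCA9685 servo settings, not values taken from the robot's code):

```python
def us_to_ticks(pulse_us, freq_hz=50, resolution=4096):
    """Convert a servo pulse width in microseconds to PCA9685 ticks.

    At 50 Hz the period is 20,000 us, divided into 4096 ticks.
    """
    period_us = 1_000_000 / freq_hz
    return round(pulse_us * resolution / period_us)

# A typical servo centers near a 1.5 ms pulse:
print(us_to_ticks(1500))  # 307 ticks, close to the init_pwm value of 300 used later
```

This is why the initialization values in the next chapter sit around 300: that is roughly the tick count of a center-position pulse.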
●pwm.
10.3 Non-blocking Control
●You can find the RPIservo.py file in the server folder of the robot product. Copy it to the same folder as the program you want to run, and then you can use this method in your program:
import RPIservo  # Library that uses multiple threads to control the servos
import time

sc = RPIservo.ServoCtrl()  # Instantiate the servo control object
sc.start()  # Start this thread; when the servos do not move, the thread is suspended
while 1:
    sc.
11 Calling the API to Control the Robot
●The movement and other functions of walking robots are much more complicated than those of wheeled robots:
•A wheeled robot only needs to control the high and low levels of a few GPIO ports to control its movement.
init_pwm7 = 300
init_pwm8 = 300
init_pwm9 = 300
init_pwm10 = 300
init_pwm11 = 300
init_pwm12 = 300
init_pwm13 = 300
init_pwm14 = 300
init_pwm15 = 300
●The code responsible for initialization is in move.py. It reads the initial values init_pwm from RPIservo.py through exec, using a for loop to take out all the initial values and assign them to the variables pwm0, pwm1, ...:
for i in range(0, 16):
    exec('pwm%d = RPIservo.init_pwm%d' % (i, i))

def init_all():
    pwm.set_pwm(0, 0, pwm0)
    pwm.set_pwm(1, 0, pwm1)
    pwm.
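As an aside, the same initialization can be expressed without exec by keeping the per-channel neutral values in a list. This is an illustrative sketch with our own names, not the robot's actual code; `set_pwm(channel, on, off)` mirrors the PCA9685 driver's call shape:

```python
# Hypothetical rewrite: keep the 16 per-channel neutral values in a list
init_pwm = [300] * 16

def init_all(set_pwm):
    """Send every servo channel its neutral value.

    set_pwm(channel, on, off) stands in for the PCA9685 driver call.
    """
    for channel, value in enumerate(init_pwm):
        set_pwm(channel, 0, value)

# Demo with a recording stand-in for the real driver call:
sent = []
init_all(lambda ch, on, off: sent.append((ch, on, off)))
print(sent[0])   # (0, 0, 300)
print(len(sent)) # 16
```

A list avoids the 16 separate module-level variables and lets each leg's neutral value be tuned in one place.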
def robotCtrl(command_input, response):
    global direction_command, turn_command
    if 'forward' == command_input:  # move forward
        direction_command = 'forward'
        move.commandInput(direction_command)
    elif 'backward' == command_input:  # move backward
        direction_command = 'backward'
        move.commandInput(direction_command)
    elif 'DS' in command_input:  # stop
        direction_command = 'stand'
        move.commandInput(direction_command)
    elif 'left' == command_input:  # turn left
        turn_command = 'left'
        move.
●In webServer.py, the functionSelect() function starts the robot's special functions:
def functionSelect(command_input, response):
    global direction_command, turn_command, SmoothMode, steadyMode, functionMode
    if 'scan' == command_input:
        pass
    elif 'findColor' == command_input:
        flask_app.modeselect('findColor')
    elif 'motionGet' == command_input:  # moving object detection
        flask_app.modeselect('watchDog')
    elif 'stopCV' == command_input:
        flask_app.modeselect('none')
        switch.switch(1, 0)
        switch.
●We control the robot to move forward, move backward, turn left, turn right, switch between fast and slow gaits, and maintain self-balance in move.py. When webServer.py receives the forward command `command_input`, it calls the commandInput() function in move.py.
        step_set = 1
else:
    pass
if turn_command != 'no':  # gait for turning left and right
    if SmoothMode:
        dove(step_set, 35, 0.001, DPI, turn_command)
        step_set += 1
        if step_set == 5:
            step_set = 1
    else:
        move(step_set, 35, turn_command)
        time.sleep(0.1)
        step_set += 1
        if step_set == 5:
            step_set = 1
else:
    pass
if turn_command == 'no' and direction_command == 'stand':
    stand()
    step_set = 1
    pass
# Self-balancing
else:
    steady_X()
    steady()

class RobotM(threading.
def commandInput(command_input):
    global direction_command, turn_command, SmoothMode, steadyMode
    if 'forward' == command_input:
        direction_command = 'forward'  # move forward
        rm.resume()
    elif 'backward' == command_input:
        direction_command = 'backward'  # move backward
        rm.resume()
    elif 'stand' in command_input:
        direction_command = 'stand'  # stand at attention
        rm.pause()
    elif 'left' == command_input:
        turn_command = 'left'  # turn left
        rm.
12 Automatic Stabilization Function
●The robot's automatic stabilization function is based on the MPU6050. After starting this function, you can place the robot on a panel and then tilt the panel; the robot will keep its body level by changing the height of the corresponding legs. While this function is active, the robot cannot perform other movements. Click the button again to close the automatic stabilization function.
X = accelerometer_data['x']
X = kalman_filter_X.kalman(X)
Y = accelerometer_data['y']
Y = kalman_filter_Y.kalman(Y)
'''
Carry out Kalman filtering on the data
'''
X_fix_output += -X_pid.GenOut(X - target_X)
X_fix_output = ctrl_range(X_fix_output, steady_range_Max, -steady_range_Max)
Y_fix_output += -Y_pid.
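The correction loop above accumulates the negated output of a PID controller and then clamps it. A minimal sketch of that pattern, using a proportional-only stand-in for `GenOut` and a clamp like `ctrl_range` (the names mirror the snippet, but the bodies here are illustrative assumptions, not the library code):

```python
class SimplePID:
    """Proportional-only controller, standing in for the PID used above."""
    def __init__(self, kp=1.0):
        self.kp = kp

    def GenOut(self, error):
        # Real PID would also add integral and derivative terms
        return self.kp * error

def ctrl_range(value, max_value, min_value):
    """Clamp value into [min_value, max_value]."""
    return max(min_value, min(max_value, value))

pid = SimplePID(kp=0.5)
fix_output = 0.0
fix_output += -pid.GenOut(10.0 - 0.0)  # a tilt error of 10 units on one axis
fix_output = ctrl_range(fix_output, 20, -20)
print(fix_output)  # -5.0: the leg height is pushed against the tilt
```

The sign flip is the point: a positive tilt error produces a negative correction, driving the leg heights back toward level, and the clamp keeps the correction within the mechanical range of the legs.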
13 Gait Generation Method of a Hexapod Robot
●For a hexapod robot, gait generation is almost the most complicated part of the program, because it needs to coordinate dozens of servos moving at the same time, while keeping at least three legs on the ground at every moment when walking forward or backward. This means that at most three legs can be in the swing phase at any moment, and at least three legs must be in the support phase.
    return
if command == 'no':
    '''
    When the speed is positive, the robot moves forward.
    When the speed is negative, the robot moves backward.
speed_I = speed - speed_I
dove_Left_I(-speed_I, 3*speed_II)
dove_Right_II(-speed_I, 3*speed_II)
dove_Left_III(-speed_I, 3*speed_II)
dove_Right_I(speed_I, -10)
dove_Left_II(speed_I, -10)
dove_Right_III(speed_I, -10)
time.

speed_II = speed_I
speed_I = speed - speed_I
dove_Left_I(speed_II, 3*(speed-speed_II))
dove_Right_II(speed_II, 3*(speed-speed_II))
dove_Left_III(speed_II, 3*(speed-speed_II))
dove_Right_I(-speed_II, -10)
dove_Left_II(-speed_II, -10)
dove_Right_III(-speed_II, -10)
time.

speed_II = speed_I
speed_I = speed - speed_I
dove_Left_I(speed_I, -10)
dove_Right_II(speed_I, -10)
dove_Left_III(speed_I, -10)
dove_Right_I(-speed_I, 3*speed_II)
dove_Left_II(-speed_I, 3*speed_II)
dove_Right_III(-speed_I, 3*speed_II)
time.

speed_II = speed_I
speed_I = speed - speed_I
dove_Left_I(-speed_II, -10)
dove_Right_II(-speed_II, -10)
dove_Left_III(-speed_II, -10)
dove_Right_I(speed_II, 3*(speed-speed_II))
dove_Left_II(speed_II, 3*(speed-speed_II))
dove_Right_III(speed_II, 3*(speed-speed_II))
time.

for speed_I in range(0, (speed+int(speed/dpi)), int(speed/dpi)):
    if move_stu and command == 'no':
        speed_II = speed_I
        speed_I = speed - speed_I
        dove_Left_I(speed_I, 3*speed_II)
        dove_Right_II(speed_I, 3*speed_II)
        dove_Left_III(speed_I, 3*speed_II)
        dove_Right_I(-speed_I, -10)
        dove_Left_II(-speed_I, -10)
        dove_Right_III(-speed_I, -10)
        time.

    pass
elif step_input == 4:
    for speed_I in range(0, (speed+int(speed/dpi)), int(speed/dpi)):
        if move_stu and command == 'no':
            speed_II = speed_I
            speed_I = speed - speed_I
            dove_Left_I(speed_II, -10)
            dove_Right_II(speed_II, -10)
            dove_Left_III(speed_II, -10)
            dove_Right_I(-speed_II, 3*(speed-speed_II))
            dove_Left_II(-speed_II, 3*(speed-speed_II))
            dove_Right_III(-speed_II, 3*(speed-speed_II))
            time.
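The alternating tripod pattern driven by the dove_* calls above can be reduced to a small scheduling sketch. This is an illustration with our own names (the real code interleaves servo commands with timing and leg-height interpolation): the six legs form two tripods that exchange the swing and support roles halfway through the four-step cycle, so at least three legs are always on the ground.

```python
# Two tripods: Left_I, Right_II, Left_III vs Right_I, Left_II, Right_III
TRIPOD_A = ["Left_I", "Right_II", "Left_III"]
TRIPOD_B = ["Right_I", "Left_II", "Right_III"]

def gait_phase(step):
    """Return (swing_legs, support_legs) for a 4-step gait cycle.

    Steps 1-2: tripod A swings forward while tripod B supports and
    pushes backward; steps 3-4: the roles are exchanged.
    """
    if step in (1, 2):
        return TRIPOD_A, TRIPOD_B
    return TRIPOD_B, TRIPOD_A

for step in range(1, 5):
    swing, support = gait_phase(step)
    print(step, "swing:", swing, "support:", support)
```

The opposite signs on the dove_* speed arguments in the listing correspond exactly to this role split: the swing tripod moves its feet forward through the air while the support tripod pushes the body forward along the ground.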
14 Make a Police Light or Breathing Light
14.1 Multi-threading Introduction
●This chapter introduces the use of multi-threading to achieve some effects with the WS2812 LED lights. Multi-threading is a commonly used technique in robot projects: because robots have high requirements for real-time response, a long-running task should not block communication in the main thread.
●Multi-threading is similar to executing multiple different programs or tasks simultaneously.
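The pause/resume mechanism used by the light thread below relies on threading.Event: wait() blocks while the flag is cleared and returns immediately while it is set. A minimal standalone sketch of that pattern (the class and attribute names here are ours):

```python
import threading
import time

class Worker(threading.Thread):
    """A thread that can be paused and resumed with a threading.Event."""
    def __init__(self):
        super().__init__(daemon=True)
        self.__flag = threading.Event()  # cleared -> run() waits
        self.counter = 0

    def pause(self):
        self.__flag.clear()

    def resume(self):
        self.__flag.set()

    def run(self):
        while True:
            self.__flag.wait()   # suspend here while paused
            self.counter += 1
            time.sleep(0.01)

w = Worker()
w.start()            # starts paused: counter stays at 0
time.sleep(0.1)
w.resume()           # the loop begins counting
time.sleep(0.2)
w.pause()
print(w.counter > 0)  # True
```

This is exactly how the RobotLight class below suspends itself when no light effect is active: the flag is cleared, so run() parks on wait() and consumes no CPU.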
'''
Use the Threading module to create threads: inherit directly from threading.Thread, then override the __init__ method and the run method
'''
class RobotLight(threading.Thread):
    def __init__(self, *args, **kwargs):
        '''
        Initialize some settings for the LED lights
        '''
        self.LED_COUNT = 16   # Number of LED pixels.
        self.LED_PIN = 12     # GPIO pin connected to the pixels (18 uses PWM!).
        self.LED_DMA = 10     # DMA channel to use for generating signal (try 10)
        self.LED_FREQ_HZ = 800000
        self.
        super(RobotLight, self).__init__(*args, **kwargs)
        self.__flag = threading.Event()
        self.__flag.clear()

    # Define functions which animate LEDs in various ways.
    def setColor(self, R, G, B):
        '''
        Set the color of all lights
        '''
        color = Color(int(R), int(G), int(B))
        for i in range(self.strip.numPixels()):
            self.strip.setPixelColor(i, color)
            self.strip.
        '''
        Call this function to turn on the police light mode
        '''
        self.lightMode = 'police'
        self.resume()

    def policeProcessing(self):
        '''
        The specific realization of the police light mode
        '''
        while self.lightMode == 'police':
            # Blue flashes 3 times
            for i in range(0, 3):
                self.setSomeColor(0, 0, 255, [0,1,2,3,4,5,6,7,8,9,10,11])
                time.sleep(0.05)
                self.setSomeColor(0, 0, 0, [0,1,2,3,4,5,6,7,8,9,10,11])
                time.sleep(0.05)
            if self.lightMode != 'police':
                break
            time.sleep(0.
        self.colorBreathB = B_input
        self.resume()

    def breathProcessing(self):
        '''
        Specific realization of the breathing light
        '''
        while self.lightMode == 'breath':
            # All lights gradually brighten
            for i in range(0, self.breathSteps):
                if self.lightMode != 'breath':
                    break
                self.setColor(self.colorBreathR*i/self.breathSteps,
                              self.colorBreathG*i/self.breathSteps,
                              self.colorBreathB*i/self.breathSteps)
                time.sleep(0.03)
            # All lights gradually darken
            for i in range(0, self.breathSteps):
                if self.
    def run(self):
        '''
        Function for the multi-threaded task
        '''
        while 1:
            self.__flag.wait()
            self.lightChange()
            pass

if __name__ == '__main__':
    RL = RobotLight()  # Instantiate the object that controls the LED lights
    RL.start()         # Start the thread
    '''
    Start breathing light mode and stop after 15 seconds
    '''
    RL.breath(70, 70, 255)
    time.sleep(15)
    RL.pause()
    '''
    Pause for 2 seconds
    '''
    time.sleep(2)
    '''
    Start the police light mode and stop after 15 seconds
    '''
    RL.police()
    time.sleep(15)
    RL.pause()
14.
RL = robotLight.RobotLight()  # Instantiate the object that controls the LED lights
RL.start()  # Start the thread
'''
Start breathing light mode and stop after 15 seconds
'''
RL.breath(70, 70, 255)
time.sleep(15)
RL.pause()
'''
Pause for 2 seconds
'''
time.sleep(2)
'''
Start the police light mode and stop after 15 seconds
'''
RL.police()
time.sleep(15)
RL.
15 Real-Time Video Transmission
●Real-time video and OpenCV functions are advantages of a Raspberry Pi robot. This chapter introduces the method used for real-time video. There are, in fact, many ways to transfer the images collected by the Raspberry Pi camera to other devices over the network. The robot uses the open-source project [flask-video-streaming] from GitHub, under the MIT open source license; you can click the link to view the source code of the project.
display the Raspberry Pi camera's image on the page in real time.
sudo nano app.py
●Here we use nano, which comes with Raspbian, to open app.py for editing in the console. Since we only need to comment and uncomment a few lines, there is no need to use another IDE for editing.
●After opening the file, we comment out the code:
if os.environ.get('CAMERA'):
    Camera = import_module('camera_' + os.environ['CAMERA']).
from importlib import import_module
import os
from flask import Flask, render_template, Response

# import camera driver
'''
if os.environ.get('CAMERA'):
    Camera = import_module('camera_' + os.environ['CAMERA']).Camera
else:
    from camera import Camera
'''

# Raspberry Pi camera module (requires picamera package)
from camera_pi import Camera

app = Flask(__name__)

@app.route('/')
def index():
    """Video streaming home page."""
    return render_template('index.
sudo python3 app.py
●Open a browser on a device on the same local area network as the Raspberry Pi (we use Google Chrome for testing), and enter the IP address of the Raspberry Pi plus the video streaming port number 5000 in the address bar, as in the following example:
192.168.3.157:5000
●Now you can see the page created by the Raspberry Pi in the browser of your computer or mobile phone. After loading successfully, the page will display the real-time image from the Raspberry Pi camera.
16 Why OpenCV Uses Multi-threading to Process Video Frames
●The OpenCV functions are based on the GitHub project flask-video-streaming; we changed camera_opencv.py to perform the OpenCV-related operations.
16.1 Single-Thread Processing of Video Frames
●First, we introduce the process of single-threaded processing of video frames. Let's start with a simple version, so that you will understand why OpenCV uses multiple threads to process video frames.
make it abnormally stuck.
16.2 Multi-thread Processing of Video Frames
●Next, the process of multi-threaded processing of video frames is introduced.
●Process explanation: in order to improve the frame rate, we separate the analysis of each video frame from the acquisition and display process and place it in a background thread that executes and generates the drawing information.
●We change the complete code of camera_opencv.
import threading
import imutils

class CVThread(threading.Thread):
    '''
    This class processes the OpenCV analysis of video frames in a background
    thread. For the basic usage principles of the multi-threaded class,
    please refer to 14.2
    '''
    def __init__(self, *args, **kwargs):
        self.CVThreading = 0
        super(CVThread, self).__init__(*args, **kwargs)
        self.__flag = threading.Event()
        self.__flag.clear()

    def mode(self, imgInput):
        '''
        This method passes in the video frames that need to be processed
        '''
        self.
        self.CVThreading = 0

    def resume(self):
        '''
        Resume the thread
        '''
        self.__flag.set()

    def run(self):
        '''
        Process video frames in the background thread
        '''
        while 1:
            self.__flag.wait()
            self.CVThreading = 1
            self.doOpenCV(self.imgCV)

class Camera(BaseCamera):
    video_source = 0

    def __init__(self):
        if os.environ.get('OPENCV_CAMERA_SOURCE'):
            Camera.set_video_source(int(os.environ['OPENCV_CAMERA_SOURCE']))
        super(Camera, self).__init__()

    @staticmethod
    def set_video_source(source):
        Camera.
        img = camera.read()
        if cvt.CVThreading:
            '''
            If OpenCV is processing video frames, skip
            '''
            pass
        else:
            '''
            If OpenCV is not processing video frames, give the processing
            thread a new video frame and resume the thread
            '''
            cvt.mode(img)
            cvt.resume()
        '''
        Draw elements on the screen
        '''
        img = cvt.elementDraw(img)
        # encode as a jpeg image and return it
        yield cv2.imencode('.jpg', img)[1].tobytes()
●The above is the code principle of using multi-threading to process OpenCV.
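The frame-skipping idea above can be demonstrated without a camera or OpenCV: the producer hands a frame to the worker only when the worker is idle, so slow analysis never blocks frame delivery. A standalone sketch with our own names (the sleep calls stand in for real frame capture and analysis times):

```python
import threading
import time

class AnalysisThread(threading.Thread):
    """Background worker that analyses at most one frame at a time."""
    def __init__(self):
        super().__init__(daemon=True)
        self.busy = False
        self._event = threading.Event()
        self.processed = 0
        self.frame = None

    def submit(self, frame):
        self.frame = frame
        self.busy = True
        self._event.set()

    def run(self):
        while True:
            self._event.wait()
            self._event.clear()
            time.sleep(0.05)      # stand-in for slow OpenCV analysis
            self.processed += 1
            self.busy = False

worker = AnalysisThread()
worker.start()
delivered = 0
for frame in range(20):           # stand-in for the camera.read() loop
    if not worker.busy:
        worker.submit(frame)      # worker idle: hand over the new frame
    delivered += 1                # every frame is still displayed
    time.sleep(0.01)
print(delivered)                  # 20: display never blocked on analysis
```

The display loop keeps its full frame rate while the worker quietly drops the frames it cannot keep up with, which is exactly why the stream stays smooth even when the OpenCV analysis is slow.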
17 Learn to Use OpenCV
●The real-time video transmission function comes from the open-source GitHub project flask-video-streaming, under the MIT open source license.
●First, prepare two .py files in the same folder on the Raspberry Pi. The code is as follows:
·app.py
#!/usr/bin/env python3
from importlib import import_module
import os
from flask import Flask, render_template, Response
from camera_opencv import Camera

app = Flask(__name__)

def gen(camera):
    while True:
        frame = camera.
    from thread import get_ident
except ImportError:
    from _thread import get_ident

class CameraEvent(object):
    """An Event-like class that signals all active clients when a new frame is
    available.
    """
    def __init__(self):
        self.events = {}

    def wait(self):
        """Invoked from each client's thread to wait for the next frame."""
        ident = get_ident()
        if ident not in self.events:
            # this is a new client
            # add an entry for it in the self.events dict
            # each entry has two elements, a threading.
        """Invoked from each client's thread after a frame was processed."""
        self.events[get_ident()][0].clear()

class BaseCamera(object):
    thread = None  # background thread that reads frames from camera
    frame = None  # current frame is stored here by background thread
    last_access = 0  # time of last client access to the camera
    event = CameraEvent()

    def __init__(self):
        """Start the background camera thread if it isn't running yet."""
        if BaseCamera.thread is None:
            BaseCamera.last_access = time.
        for frame in frames_iterator:
            BaseCamera.frame = frame
            BaseCamera.event.set()  # send signal to clients
            time.sleep(0)

            # if there hasn't been any clients asking for frames in
            # the last 10 seconds then stop the thread
            if time.time() - BaseCamera.last_access > 10:
                frames_iterator.close()
                print('Stopping camera thread due to inactivity.')
                break
        BaseCamera.thread = None
●When you use the follow-up tutorials to develop a certain OpenCV-related function, you only need to put the corresponding camera_opencv.
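The 10-second inactivity rule in the loop above can be isolated into a tiny helper to make the logic easier to see. IdleStopper is a hypothetical name for this sketch, not a class in the robot code; the short 0.2-second timeout is only so the example finishes quickly:

```python
import time

class IdleStopper:
    """Sketch of the shutdown rule used by BaseCamera: stop the camera
    thread once no client has requested a frame for `timeout` seconds."""
    def __init__(self, timeout=10.0):
        self.timeout = timeout
        self.last_access = time.time()

    def touch(self):
        # called whenever a client asks for a frame
        self.last_access = time.time()

    def should_stop(self):
        return time.time() - self.last_access > self.timeout

stopper = IdleStopper(timeout=0.2)
busy = stopper.should_stop()      # False: a client just connected
time.sleep(0.3)
idle = stopper.should_stop()      # True: no requests for more than 0.2 s
```

Stopping the background thread this way means the camera (and the CPU time spent encoding frames) is only in use while at least one browser tab is actually watching the stream.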
18 Using OpenCV to Realize Color Recognition and Tracking
18.1 Color Recognition and Color Space
●For the development preparation and operation of the OpenCV functions, please refer to 17.
●Create camera_opencv.py in the same folder as the app.py and base_camera.py from 17. The code related to the OpenCV color tracking function introduced in this chapter is written in camera_opencv.py.
●For safety reasons, this routine does not control the motor or servo motion, and only outputs the OpenCV calculation results.
general process is as follows.
18.3 Specific Code
●camera_opencv.
Set target color, HSV color space
'''
colorUpper = np.array([44, 255, 255])
colorLower = np.array([24, 100, 100])

font = cv2.FONT_HERSHEY_SIMPLEX

class Camera(BaseCamera):
    video_source = 0

    def __init__(self):
        if os.environ.get('OPENCV_CAMERA_SOURCE'):
            Camera.set_video_source(int(os.environ['OPENCV_CAMERA_SOURCE']))
        super(Camera, self).__init__()

    @staticmethod
    def set_video_source(source):
        Camera.video_source = source

    @staticmethod
    def frames():
        camera = cv2.VideoCapture(Camera.
            '''
            c = max(cnts, key=cv2.contourArea)
            ((box_x, box_y), radius) = cv2.minEnclosingCircle(c)
            M = cv2.moments(c)
            center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
            X = int(box_x)
            Y = int(box_y)
            '''
            Get the center point coordinates of the target color object and output
            '''
            print('Target color object detected')
            print('X:%d'%X)
            print('Y:%d'%Y)
            print('-------')
            '''
            Write text on the screen: Target Detected
            '''
            cv2.putText(img,'Target Detected',(40,60), font, 0.5,(255,255,255),1,cv2.
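The center computation above uses image moments: m00 is the blob's area (its pixel count), while m10 and m01 are the sums of the pixels' x and y coordinates, so m10/m00 and m01/m00 give the centroid. The pure-Python sketch below mirrors that arithmetic; cv2.moments works on a contour's filled region, and here a plain list of pixel coordinates stands in for it:

```python
def centroid(points):
    """Centroid of a set of pixels from raw moments, mirroring the
    M["m10"] / M["m00"] and M["m01"] / M["m00"] expressions above."""
    m00 = len(points)                    # zeroth moment: pixel count
    m10 = sum(x for x, _ in points)      # first moment in x
    m01 = sum(y for _, y in points)      # first moment in y
    return (int(m10 / m00), int(m01 / m00))

# A square blob with corners at (0,0) and (2,2) has its center at (1,1)
print(centroid([(0, 0), (2, 0), (0, 2), (2, 2)]))  # → (1, 1)
```

The routine above also computes a minimum enclosing circle; its center (box_x, box_y) usually lands very close to the moments-based centroid, but the moments version is weighted by the blob's actual shape rather than its outer boundary.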
18.4 HSV Color Component Range in OpenCV

HSV\color  Black  Grey  White  Red      Orange  Yellow  Green  Cyan  Blue  Purple
H_min      0      0     0      0|156    11      26      35     78    100   125
H_max      180    180   180    10|180   25      34      77     99    124   155
S_min      0      0     0      43       43      43      43     43    43    43
S_max      255    43    30     255      255     255     255    255   255   255
V_min      0      46    221    46       46      46      46     46    46    46
V_max      46     220   255    255      255     255     255    255   255   255
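Red occupies both ends of OpenCV's 0–180 hue scale, which is why its entries read 0|156 and 10|180: matching red needs two hue intervals. The small pure-Python check below illustrates how to apply the table; the function name and the RED/YELLOW constants are inventions for this example (in camera_opencv.py the equivalent test is done in bulk with cv2.inRange):

```python
def in_hsv_range(h, s, v, h_ranges, s_range, v_range):
    """True if the HSV pixel falls inside the color's ranges from the
    table; h_ranges is a list of (min, max) pairs so that red's
    wrap-around hue can be expressed as two intervals."""
    return (any(lo <= h <= hi for lo, hi in h_ranges)
            and s_range[0] <= s <= s_range[1]
            and v_range[0] <= v <= v_range[1])

# Ranges copied from the table above
RED    = ([(0, 10), (156, 180)], (43, 255), (46, 255))
YELLOW = ([(26, 34)],            (43, 255), (46, 255))

print(in_hsv_range(170, 200, 200, *RED))   # a deep red pixel → True
print(in_hsv_range(30, 200, 200, *RED))    # a yellow pixel → False
```

Note that colorLower/colorUpper in the code earlier ([24, 100, 100] to [44, 255, 255]) select a yellow-green band slightly wider than the table's Yellow column; the table gives conservative starting values that you can tune for your lighting.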
19 Machine Line Tracking Based on OpenCV
19.1 Visual Line Inspection Process
●For the development preparation and operation of the OpenCV functions, please refer to 17.
●Create camera_opencv.py in the same folder as the app.py and base_camera.py from 17. The code related to the OpenCV visual line tracking function introduced in this chapter is written in camera_opencv.py.
●For safety reasons, this routine does not control the motor or servo motion, and only outputs the OpenCV calculation results.
19.2 Specific Code
import os
import cv2
from base_camera import BaseCamera
import numpy as np
import time
import threading
import imutils

'''
Set the color of the line: 255 is a white line, 0 is a black line
'''
lineColorSet = 255

'''
Set the vertical position of the reference row; the larger the value, the
lower the row, but it cannot be greater than the vertical resolution of the
video (default 480)
'''
linePos = 380

class Camera(BaseCamera):
    video_source = 0

    def __init__(self):
        if os.environ.
        '''
        Convert the picture to grayscale, and then binarize it (the value of
        each pixel in the picture becomes either 0 or 255)
        '''
        img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        retval, img = cv2.threshold(img, 0, 255, cv2.THRESH_OTSU)
        img = cv2.erode(img, None, iterations=6)  # use erosion to remove noise
        colorPos = img[linePos]  # get the array of pixel values in row linePos
        try:
            lineColorCount_Pos = np.sum(colorPos == lineColorSet)  # number of line-color pixels (the line width)
            lineIndex_Pos = np.
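What the np.sum / np.where pair computes on the reference row can be shown with plain Python lists. find_line_center below is a hypothetical helper that mirrors that logic; the real code does the same index arithmetic on the array returned by np.where:

```python
def find_line_center(row, line_color=255):
    """Return (width, center_index) of the line-colored pixels in one
    binarized image row, mimicking np.sum(colorPos == lineColorSet)
    for the width and the first/last-index math for the center."""
    indices = [i for i, px in enumerate(row) if px == line_color]
    if not indices:
        return 0, None            # no line found in this row
    width = len(indices)
    center = (indices[0] + indices[-1]) // 2
    return width, center

# A white line 4 pixels wide starting at column 10 on a black background
row = [0] * 10 + [255] * 4 + [0] * 6
print(find_line_center(row))      # → (4, 11)
```

Comparing the center index against the middle of the frame tells the robot whether the line has drifted left or right, which is the signal a line-following controller would act on.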
20 Create a Wi-Fi Hotspot on the Raspberry Pi
●The method our robot products use to turn on a Wi-Fi hotspot is based on the GitHub project create_ap. Our installation script automatically installs this program and its dependent libraries. If you have not run our installation script, you can use the following command to install create_ap:
sudo git clone https://github.com/oblique/create_ap.
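Once cloned, the create_ap project's own README documents installation and basic usage. As a sketch (the access-point name and password below are placeholders; run these on the Raspberry Pi itself):

```
# install create_ap after cloning it
cd create_ap
sudo make install

# start an access point on wlan0 that shares the eth0 connection
# (network name and password are examples only)
sudo create_ap wlan0 eth0 MyRobotAP MyPassword
```

If the Pi has no wired connection to share, create_ap also supports running a hotspot without internet sharing; see the project's README for the exact options.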
21 Install GUI Dependencies under Windows
●Our older robot programs all provide a desktop GUI program to control the robot. The GUI program is written in Python, but this method has a higher barrier to entry and is not recommended for novices.
●This GUI program is only suitable for the Windows system for the time being. It is included in the client directory of the robot software package and is generally called GUI.py. 21.
22 How to Use the GUI
●Because the web application and the GUI use separate server programs, if you want to use the GUI to control the robot product, you need to manually run server.py (the method is the same as manually running webserver.py, except that the target is changed to server.py).
●When server.py on the Raspberry Pi is running successfully, enter the IP address of the Raspberry Pi in the GUI control terminal on the PC, and then click Connect to control the robot.
23 Enable UART
●UART is a commonly used communication protocol between devices. Using UART, you can let MCUs such as the Arduino, STM32, or ESP32 communicate with the Raspberry Pi, which can make your robot more powerful.
●However, on some Raspberry Pis the UART that is enabled by default is not a full-featured UART, so you need to refer to the following steps to enable the full-featured UART.
23.1 Mini UART and CPU Core Frequency
●The baud rate of the mini UART is linked to the core frequency of the VPU on the VC4 GPU. This means that, as the VPU frequency governor varies the core frequency, the baud rate of the mini UART also changes, which makes the mini UART of limited use in the default state. By default, if the mini UART is selected as the primary UART, it is disabled. To enable it, add enable_uart=1 to config.txt.
23.3 UART Output on GPIO Pins
●By default, the UART transmit and receive pins are GPIO 14 and GPIO 15 respectively, which are pins 8 and 10 on the GPIO header.
23.4 UARTs and Device Tree
●Various UART Device Tree Overlay definitions can be found in the kernel GitHub tree. The two most useful overlays are disable-bt and miniuart-bt.
●disable-bt disables the Bluetooth device and restores UART0/ttyAMA0 to GPIOs 14 and 15.
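Putting 23.1 and 23.4 together, a config.txt that enables the primary UART and hands the full-featured PL011 back to the GPIO header could look like the sketch below (whether you want Bluetooth disabled depends on your setup, and the file location can vary between OS versions):

```
# /boot/config.txt

# enable the primary UART
enable_uart=1

# restore PL011 (UART0/ttyAMA0) to GPIOs 14 and 15,
# disabling the onboard Bluetooth device
dtoverlay=disable-bt
```

Reboot after editing config.txt for the changes to take effect.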
●The mini UART has smaller FIFOs. Combined with the lack of flow control, this makes it more prone to losing characters at higher baud rates. It is also generally less capable than the PL011, mainly because its baud rate is linked to the VPU clock speed.
24 Control Your Adeept_RaspClaws with an Android Device
●If you want to use a mobile phone or tablet to control the robot, we first recommend the WEB application, because it has more functions, is maintained and updated more frequently, and, most importantly, is cross-platform: no matter whether you use the Android system or the iOS system, as long as Google Chrome is installed, you can use the WEB application to control the robot.
●It should be noted that the port number is 5000 when using the WEB application, 10223 when using the GUI program, and 10123 when using the mobile APP.
●The controller on the left can control the robot to move forward, backward, left, and right, and the controller on the right can control the robot's other movements. You can change the specific operations by editing appserver.py.
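Assuming the GUI and APP clients talk to the server over plain TCP sockets on the ports listed above, a minimal client would look like the sketch below. The send_command helper and the 'forward' command string are assumptions for illustration only; check appserver.py for the actual command names and protocol the robot expects:

```python
import socket

def send_command(host, port, command):
    """Open a TCP connection, send one UTF-8 command string, and close.
    Per the port list above, 10123 would be the mobile APP server and
    10223 the GUI server; the command vocabulary is defined by the
    server scripts, so 'forward' here is only a placeholder."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect((host, port))
        client.sendall(command.encode('utf-8'))

# Example (with a robot reachable at 192.168.0.10):
# send_command('192.168.0.10', 10123, 'forward')
```

A sketch like this is mainly useful for experimenting with your own control scripts once you have read how appserver.py parses incoming data.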