RM4 – Eyes Remake

This project was by far the most challenging of the four remakes. Even with the help of the example code, I still struggled a lot.

At first, I had trouble understanding the code provided, so I tried to use blob detection to detect the head as a whole, with the blob coordinates as the values used to turn the servos. However, the blob detection did not work: it was difficult to filter out a single blob, and it was unpredictable which part of the face the blob was actually tracking. This method was too unreliable, so I decided to go back to the example code we were provided with.
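A minimal sketch of that abandoned blob idea, assuming OpenCV's SimpleBlobDetector (the area threshold and frame source here are placeholders, not my exact values):

```python
import cv2

# Sketch of the abandoned blob approach, using OpenCV's SimpleBlobDetector.
params = cv2.SimpleBlobDetector_Params()
params.filterByArea = True
params.minArea = 1500                 # placeholder threshold

detector = cv2.SimpleBlobDetector_create(params)
frame = cv2.imread("frame.jpg")       # stand-in for a camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Several keypoints can fire at once, and nothing guarantees any of
# them sits on the head, which is why this turned out so unreliable.
for kp in detector.detect(gray):
    print(kp.pt)                      # candidate blob centres (x, y)
```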

Testing with ZigSim

I decided to work on the pan axis only for the remake because I was not confident I could make two axes work. First, I needed to test the servos with ZigSim to make sure that the servos actually worked and that the two scripts could communicate. In the example code for the servos, I changed the rpi.ip to the IP address of my Raspberry Pi as follows:
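In sketch form, the change amounts to something like this, assuming the servo script runs an OSC server bound to the Pi's address (the "/pan" address and port 8000 are placeholders, not the example's exact names):

```python
from pythonosc import dispatcher, osc_server

rpi_ip = "172.20.10.12"        # changed to my Raspberry Pi's IP address

def on_pan(address, angle):
    print(address, angle)      # the servo-driving code hooks in here

disp = dispatcher.Dispatcher()
disp.map("/pan", on_pan)       # "/pan" is a placeholder OSC address
server = osc_server.BlockingOSCUDPServer((rpi_ip, 8000), disp)
server.serve_forever()
```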

I also connected my wires as red (#4), yellow (#5), and black (#9). These are physical pin numbers, not GPIO numbers. Hence, I also had to change the pinPan number to 3, since the yellow signal wire on physical pin 5 corresponds to GPIO 3:
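A hedged sketch of what the servo setup looks like with pigpio, assuming the example script drives the pan servo through a variable named pinPan (the pulse-width mapping is the typical 500–2500 µs hobby-servo range, not necessarily the example's exact numbers):

```python
import pigpio

pinPan = 3                     # yellow signal wire: physical pin 5 = GPIO 3

pi = pigpio.pi()               # needs the pigpio daemon ("sudo pigpiod")

def set_pan(angle):
    # Map 0-180 degrees onto the 500-2500 us pulses typical of hobby servos.
    pulse = 500 + (angle / 180.0) * 2000
    pi.set_servo_pulsewidth(pinPan, pulse)

set_pan(90)                    # centre the servo as a quick test
```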

On the sensing side, there was not much to change except for the rpi.ip, which is "172.20.10.12", and the IP address of my computer, "172.20.10.11", which was used to connect ZigSim to my computer. One very important thing to note is to always enter "sudo pigpiod" in the terminal whenever PyCharm is restarted. I kept forgetting this, resulting in an AttributeError whenever I tried to get the motor and sensing scripts to communicate.
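Putting those addresses together, the sensing side looks roughly like this sketch, assuming python-osc: the script listens for ZigSim on my computer's address and forwards values on to the Pi (the ports and the "/pan" forwarding address are placeholders):

```python
from pythonosc import dispatcher, osc_server, udp_client

rpi_client = udp_client.SimpleUDPClient("172.20.10.12", 8000)  # to the Pi

def forward(address, *args):
    # For illustration: pass the first value of each ZigSim message on
    # to the servo script. ZigSim's actual addresses embed a device ID,
    # so a catch-all handler is used here.
    rpi_client.send_message("/pan", args[0])

disp = dispatcher.Dispatcher()
disp.set_default_handler(forward)

# Bind to this computer's address, where ZigSim delivers its messages.
server = osc_server.BlockingOSCUDPServer(("172.20.10.11", 9000), disp)
server.serve_forever()
```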


The next obstacle was the movement of the servos. I managed to get the communication between ZigSim and the servos going, but the rotation of the servos was unstable and laggy. I thought it was a problem with the code, but it turned out that ZigSim's message rate was set to 1 message per second, which resulted in a slow response on the servo side. I changed the message rate to 60 per second and the movement became smooth. From this I could deduce that the code had no problem, and I could move on to the facial recognition.


Testing with Facial Recognition

On the servo side, I did not have to change the code, so I focused on editing the sensing script. First, I removed all the lines relating to ZigSim, because I no longer needed them. I then added the facial recognition script from RM3 to the sensing script as shown below, and edited the msg.add_arg(pAngle) line from before so that the coordinates of the facial recognition point are the values sent to the servo script.
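The edit boils down to something like this sketch, assuming the example builds its OSC messages with python-osc (which matches the msg.add_arg(...) call); the "/pan" address and port are placeholders:

```python
from pythonosc import osc_message_builder, udp_client

client = udp_client.UDPClient("172.20.10.12", 8000)   # port assumed

pAngle = 0.5   # example value: now taken from the face-tracking point
               # instead of from ZigSim

msg = osc_message_builder.OscMessageBuilder(address="/pan")
msg.add_arg(pAngle)            # same call as before, new data source
client.send(msg.build())
```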

I compared the facial recognition script from RM3 to the one I changed in RM4:

Comparing the two, I deduced that RM3's normx plays the same role as pAngle here, so I added an equivalent definition for pAngle to the script:
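In other words, something along these lines, assuming normx in RM3 was the landmark's horizontal pixel position divided by the frame width (x and w are illustrative names):

```python
x, w = 320, 640     # illustrative landmark x position and frame width
pAngle = x / w      # mirrors RM3's normx; gives 0.5 for mid-frame
```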

I finally managed to get the facial recognition going. For the purposes of this remake, I only needed the coordinates of one point, so I chose point #30 because it is located in the middle of the face.
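For context, a sketch of pulling out that single point, assuming RM3 used dlib's 68-point shape predictor (in that numbering, point #30 sits on the nose, roughly mid-face); the model filename is dlib's stock one:

```python
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

frame = cv2.imread("frame.jpg")           # stand-in for a camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

for face in detector(gray):
    landmarks = predictor(gray, face)
    x = landmarks.part(30).x              # point #30, middle of the face
    pAngle = x / frame.shape[1]           # normalized as described above
```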

Eventually, I managed to get the two scripts to communicate, but for some reason the servos were not moving. I compared the values sent when ZigSim was used against the ones sent by the facial recognition, and found that while ZigSim's angles ranged from 0 to 180, the facial recognition values hovered around 0.5. My first guess was that the values were in radians; searching online, most of the information on servo rotation was in degrees anyway, so I converted the values to degrees by scaling them up, and it worked. (In hindsight, given the normx-style definition above, the values were most likely normalized 0–1 coordinates rather than radians, so multiplying by 180 maps them onto the servo's 0–180° range.)
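The fix itself is a one-line scaling on the sending side, sketched here:

```python
def to_servo_degrees(pAngle):
    # pAngle arrives as roughly 0.0-1.0 (about 0.5 for a centred face);
    # multiplying by 180 maps it onto the servo's 0-180 degree range.
    return pAngle * 180.0

print(to_servo_degrees(0.5))   # 90.0 -> servo centred
```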

Final Output

I thought that the rotation of the servo in response to our head movements resembled a CCTV camera, so I made one and stuck it on the servo.
