Automatic Webcam Tracking

For my mini project I wanted to make something that would move a webcam around to keep an object in the centre of the screen. To do this I used two servos to change the x and y axis orientation of the webcam, but before attaching the webcam I needed to learn how to find objects in a webcam feed and isolate them, which I did using OpenCV and Python. The first program I made reads in the feed from a webcam and displays three images: the normal feed; a mask made by thresholding the hue, saturation and value (HSV) version of the feed to isolate the colour I was looking for; and a feed showing only that colour. In the code these are named frame, mask and res respectively. I referenced this website as an example. From the code on that website I was able to adapt it to find dark blue, scan through the mask looking for the top-left-most white pixel, and send that position through the serial port, to be received later by my PIC. I scanned through every pixel by checking each pixel value along the x axis, and once every pixel on a row had been checked it moved on to the next row. This continued until either a white pixel was found or the code reached the end of the frame, at which point it scanned through again. The pixel values of the mask were 0 if black and 255 if white, white representing a dark blue pixel in the original frame. The code and a picture are shown below:

import cv2
import numpy as np
import serial

# Setup Serial port and speed
ser = serial.Serial(2, baudrate=38400)
# Setup Webcam feed
cap = cv2.VideoCapture(1)
# Serial and webcam port may be different

# Take one frame of the webcam
_, frame = cap.read()
# Find area of frame and Channel
y,x,c = frame.shape

# Find the limits of area being scanned, this case one third
# of the image
xlim = int(round(x*0.3))
ylim = int(round(y*0.3))
ymin = ylim
ymax = y - ylim
xmin = xlim
xmax = x - xlim 

# Find target area
error = 15
xtarget = int(round(x/2))
ytarget = int(round(y/2))
xtargetmin = xtarget - error
xtargetmax = xtarget + error
ytargetmin = ytarget - error
ytargetmax = ytarget + error
print xmin,xmax,ymin,ymax, xtarget, ytarget

# define range of blue color in HSV
# Different boundaries for different light settings
#lower_blue = np.array([80,50,10])
#upper_blue = np.array([170,255,255])
lower_blue = np.array([110,100,10])
upper_blue = np.array([170,255,255])

# Loop through reading in frames of the webcam,
# convert it to hsv, then highlight the blue range,
# then scan through the area with the limits and send
# first white value coordinates through serial port
while True:

	# Take each frame
	_, frame = cap.read()

	# Smooth it
	frame = cv2.blur(frame,(5,5))

	# Convert BGR to HSV
	hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

	# Threshold the HSV image to get only blue colors
	mask = cv2.inRange(hsv, lower_blue, upper_blue)

	# Keep only the blue parts of the original frame
	res = cv2.bitwise_and(frame, frame, mask=mask)

	# Print red and blue rectangle to show scanning and target area
	cv2.rectangle(frame, (xmin,ymin), (xmax,ymax), (255,0,0), 2)
	cv2.rectangle(frame, (xtargetmin,ytargetmin), (xtargetmax,ytargetmax), (0,0,255), 2)

	# Show the frames
	cv2.imshow('frame', frame)
	cv2.imshow('mask', mask)
	cv2.imshow('res', res)

	# Scan through frame looking for white
	y = ymin
	while y < ymax:
		x = xmin
		while x < xmax:
			if mask[y,x] > 0: # if pixel is not black send coordinates
				ser.write('x%03dy%03d' % (x,y))
				print('x%03dy%03d' % (x,y))
				x,y = xmax, ymax
			x = x + 1
		y = y + 1

	k = cv2.waitKey(5) & 0xFF
	if k == 27:			# wait for ESC key to exit
		break
	elif k == ord('s'): # wait for 's' key to save and exit
		cv2.imwrite('frame.png', frame)
		break

cv2.destroyAllWindows()


A problem I had with this was that the more of the image I scanned, the slower the frame rate became, so to get around this I decreased the scanning area. If the webcam was continuously seeing the object, the frame rate would increase because the scanning loop ended every time it saw white. Another problem was getting the HSV ranges right for the object: different light settings changed the object's values, so I had to keep adjusting them. To find the ranges for each light setting I used GIMP; if you are interested in doing it yourself you can follow the steps below.
The way I did it was to take a screenshot of the feed, open it in GIMP, then select Colors, Components and Decompose. From there select HSV and the image will split into three layers: hue, saturation and value (turn off the other two layers when working on one). Select Colors again, then Levels, then Edit Settings As Curve. From here move the curve along the x axis to change the range, and adjust this range until only your object is highlighted, or as close as you can get, because other colours could fall in the same range. Once you have done this for all three channels, take another screenshot of the object in shadow or in light, and keep repeating until only your object shows up in the feed. Pictures of this are shown below:
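As an alternative to the GIMP route, the same kind of range can be estimated in code: crop a patch of the HSV image that contains only the object, then take per-channel minima and maxima with some margin. This is a sketch of my own, not part of the original project (the function name and the margin value are made up):

```python
import numpy as np

def hsv_range(hsv_roi, margin=10):
    """Estimate inRange() bounds from an HSV patch covering only the object.

    hsv_roi: an (H, W, 3) array of HSV pixels, e.g. a crop of the
    cv2.cvtColor(frame, cv2.COLOR_BGR2HSV) image containing just the object.
    """
    pixels = hsv_roi.reshape(-1, 3).astype(int)
    # Widen the observed range a little so small lighting changes still match
    lower = np.clip(pixels.min(axis=0) - margin, 0, 255)
    upper = np.clip(pixels.max(axis=0) + margin, 0, 255)
    return lower, upper
```

The resulting arrays drop straight into cv2.inRange(hsv, lower, upper); as with the GIMP method, you would still sample a patch per lighting condition.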

Now that the webcam was done I started working with the servos. Previously I used bit banging, but this time I decided to control them with pulse width modulation (PWM). I had to play around with the values to get a formula; the one I ended up with was PDC = -9.44(a) + 2150, "a" being the angle I want the servo to move to. Both servos use the same formula. I found it by adjusting the PDC until I hit a known angle such as 180 or 0 degrees, and from two known angles I could use the equation of a line to find all the others, as the relationship between servo angle and pulse width was linear. The circuit diagram is shown below; it includes the serial to USB converter (UB232R):
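The two-point calibration described above can be written out explicitly. The calibration pairs below are back-calculated from the formula in the text (PDC = -9.44a + 2150) rather than being separate measurements:

```python
def fit_line(a1, p1, a2, p2):
    # Equation of a line through two (angle, duty cycle) calibration points
    slope = (p2 - p1) / float(a2 - a1)
    intercept = p1 - slope * a1
    return slope, intercept

# Two points consistent with the formula in the text: a=0 -> 2150, a=180 -> ~451
slope, intercept = fit_line(0, 2150, 180, 451)

def angle_to_pdc(angle):
    # Convert a servo angle in degrees to the PDC register value
    return slope * angle + intercept
```

With these two points the slope comes out at about -9.44 and the intercept at 2150, matching the formula, and 90 degrees lands on roughly PDC = 1300, the start position used in the PIC code.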

webcam circuit

After getting the servos sorted I needed to get the PIC to read in values from the serial port to control the servo. The way I did this was by sending string like this “x180y090” through the UB232R serial to usb, this then sent the signal to the PIC to read. The PIC would wait until a “x” was sent then put the next three numbers into the x angle variable and repeated the same for the y angle. The code I used for reading in the serial values was referenced from here. To test this I used python on my computer to manually send in angles to the PIC to move the servos. A video of this is shown below:
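The fixed-width "x180y090" format is easy to get subtly wrong (a dropped leading zero breaks the three-digit assumption the PIC makes), so a small PC-side round trip is a useful sanity check. parse_msg here is a hypothetical Python mirror of the PIC parsing, written for illustration:

```python
def make_msg(x, y):
    # Pack coordinates the way the OpenCV script sends them, e.g. "x180y090"
    return 'x%03dy%03d' % (x, y)

def parse_msg(msg):
    # Mirror of the PIC side: three digits after 'x', then three after 'y'
    if msg[0] != 'x' or msg[4] != 'y':
        raise ValueError('malformed message: %r' % msg)
    return int(msg[1:4]), int(msg[5:8])
```

Any (x, y) pair under 1000 should survive the round trip unchanged; the %03d padding is what keeps the message exactly eight characters long.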

If you are unsure which port your serial to USB converter is on, you can download the software "ComPrinter", which will display what serial port is being used; it can be found here. When I got that working, the last thing was to connect the Python/OpenCV code to the PIC and calibrate the servos to move a certain angle depending on how far the first white pixel was from the middle of the screen. I subtracted the location of this white pixel from the target and called it the error; if the error was outside the target area, the servo would move towards the pixel's location until it was inside the area. When scanning below the target area the frame rate would drop because the code had to scan more pixels to find the object, so to compensate I moved the y axis servo more when moving up, to speed up the response. The final PIC code and videos are below:

// This code reads in serial values taken from a webcam; these values are the location
// of an object, and tell the PIC to turn the servo-mounted webcam towards the object
// Written by Ronan Byrne
// Last updated 03-12-2015

#include <xc.h>
#include <stdio.h>
#include <libpic30.h>

// Configuration settings
_FOSC(CSW_FSCM_OFF & FRC_PLL16); // Fosc=16x7.5MHz, Fcy=30MHz
_FWDT(WDT_OFF);                  // Watchdog timer off
_FBORPOR(MCLR_DIS);              // Disable reset pin

void setup();

int main()
{
    // Variable declarations; the targets assume the centre of a 640x480 frame
    char c;
    int d1, d2, d3, x, y;
    int xerror, yerror;
    int xtarget = 320, ytarget = 240;
    float xangle = 90, yangle = 90;

    // Setup PWM and Serial Port
    setup();

    while (1)
    {
        if (U1STAbits.URXDA == 1)
        {
            c = U1RXREG;
            // check that character is 'x'
            if (c != 'x') continue;

            // process three digits of x value
            while(U1STAbits.URXDA == 0);
            c = U1RXREG;
            d1 = c - 48;
            while(U1STAbits.URXDA == 0);
            c = U1RXREG;
            d2 = c - 48;
            while(U1STAbits.URXDA == 0);
            c = U1RXREG;
            d3 = c - 48;
            x = 100*d1 + d2*10 + d3;

            // wait for 'y' character
            while(U1STAbits.URXDA == 0);
            c = U1RXREG;
            if (c != 'y') continue;

            // process three digits of y value
            while(U1STAbits.URXDA == 0);
            c = U1RXREG;
            d1 = c - 48;
            while(U1STAbits.URXDA == 0);
            c = U1RXREG;
            d2 = c - 48;
            while(U1STAbits.URXDA == 0);
            c = U1RXREG;
            d3 = c - 48;
            y = 100*d1 + d2*10 + d3;
            printf("x,y value is %d,%d\n", x,y);

            // Find the error
            xerror = xtarget - x;
            yerror = ytarget - y;

            // Check if error is outside target range, if yes change servo angle
            // (y moves further upwards to compensate for the slower scan below target)
            if (xerror > 15) xangle = xangle - 1;
            else if (xerror < -15) xangle = xangle + 0.5;
            if (yerror > 15) yangle = yangle + 0.5;
            else if (yerror < -15) yangle = yangle - 1;

            // Limit the angles the servo can go
            if (xangle > 180) xangle = 180;
            if (xangle < 0) xangle = 0;
            if (yangle > 180) yangle = 180;
            if (yangle < 0) yangle = 0;

            // Convert angle to PWM
            PDC1 = -9.44*xangle + 2150;
            PDC2 = -9.44*yangle + 2150;
        }
    }
    return 0;
}

// Setup the PWM and Serial port and move Servos to start position
void setup()
{
    // Setup UART
    U1BRG = 48;            // 38400 baud @ 30 MIPS
    U1MODEbits.UARTEN = 1; // Enable UART

    // Configure PWM for free running mode
    //   PWM period = Tcy * prescale * PTPER = 33.3ns * 64 * PTPER
    //   PWM pulse width = (Tcy/2) * prescale * PDCx
    PWMCON1 = 0x00FF;     // Enable all PWM pairs in complementary mode
    PTCONbits.PTCKPS = 3; // prescale=1:64 (0=1:1, 1=1:4, 2=1:16, 3=1:64)
    PTPER = 9470;         // 20ms PWM period (15-bit period value)
    PDC1 = 1300;          // ~1.4ms pulse width on PWM channel 1 (90 degrees)
    PDC2 = 1300;          // ~1.4ms pulse width on PWM channel 2 (90 degrees)
    PTMR = 0;             // Clear 15-bit PWM timer counter
    PTCONbits.PTEN = 1;   // Enable PWM time base to start generating pulses
    __delay32(60000000);  // Allow time (2s) for servos to get to this position
}

I’m not sure why the first two videos are much slower than the third, but they still demonstrate the project working. From the videos you can see that the response can be slow and can drag the frame rate down as well, but the project does what I wanted it to do and I overcame the obstacles I encountered along the way. If I had more time I would like to find a better way to scan through the screen so I could take in the whole frame and look for a blob of white pixels rather than the first white pixel the code saw, which would help with ignoring noise in the frame; to calibrate the servos to move more smoothly and not overshoot the target as much; and finally to have the webcam zoom in on the target depending on how far away it was. Applications for this project could include vloggers keeping themselves in frame while recording, or a turret for something like airsoft that would track and try to shoot players wearing certain colours.
I may return to this project and make the improvements I mentioned but right now I’m happy with how it turned out. Also in the future I hope to make a post about how I made the power supply seen in the videos.


One thought on “Automatic Webcam Tracking”

  1. This is brilliant stuff, Ronan. You had to tackle quite a few different challenges to get this up and running, so I’m delighted to see it working so well in the third video. Well done.


