Pedestrian Detection using Histogram of Oriented Gradients (HoG) and a Random Forests Classifier

This entry is part 1 of 1 in the series Pedestrian Detection

1,178 images of pedestrians (positive samples) and 4,530 negative samples, all 64×128 pixels, were extracted from the INRIA dataset and divided randomly into a training set and a validation set. HoG features computed on the training images were used to train a Random Forests classifier, with the number of trees set to 100 and the maximum depth of each tree set to 5.

For detection, the search was carried out only over one scale (for testing purposes), on a separate set of images collected from the lab.

Completely unoptimized code on a Core 2 Duo laptop already runs at 20 fps.

Enclosure for Bug’s µC + Environment Sensing

This entry is part 10 of 10 in the series Android Robot

Printed out an enclosure for Bug’s circuit boards.

Enclosure for circuit boards

Also connected the two IR sensors on the bumper to the IOIO board for environment sensing. The phone’s display now changes colour depending on whether something is detected in front of either of the two sensors.
Here’s the video:

However, the motors have stopped turning. I mistakenly connected the sensors’ positive terminals to Vin on the board instead of the 3.3V terminal, and that seems to have damaged something. Oh well! I’ve just ordered a replacement IOIO board and a motor driver board.

Connecting DC motors to the IOIO through the TB6612FNG, a motor driver break out board

This entry is part 3 of 10 in the series Android Robot

The IOIO has the capability to output Pulse Width Modulated (PWM) signals on its pins. You can vary the duty cycle of PWM signals to control the speed of DC motors. It’s recommended that you don’t connect the IOIO directly to motors, but use a motor driver breakout board in between.
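A quick back-of-the-envelope illustration of why the duty cycle controls speed: averaged over a PWM period, the motor sees the supply voltage scaled by the duty cycle. The numbers below are illustrative, using a 5V supply.

```python
V_SUPPLY = 5.0  # volts; illustrative supply value for the motor rail

def effective_voltage(duty_cycle):
    """Average voltage the motor sees for a given PWM duty cycle (0..1)."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return V_SUPPLY * duty_cycle

print(effective_voltage(0.25))  # 1.25
print(effective_voltage(0.5))   # 2.5
print(effective_voltage(1.0))   # 5.0
```

So ramping the duty cycle from 0 to 1 ramps the average voltage, and hence the motor speed, from zero to full.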

The board I’m using to power the DC motors on my Android robot is the TB6612FNG.

And this is the circuit diagram for connecting the IOIO to the TB6612FNG, modified from the book Getting Started With IOIO, by Simon Monk.

IOIO to TB6612FNG motor driver board circuit diagram, modified from the book Getting Started with IOIO (2011 Simon Monk)

The only modifications I’ve made to Simon’s original circuit diagram are:

1. I’m using a 9V power supply instead of his 6V.
2. Consequently, I’ve connected Vm, the motor power supply, to the 5V output on the IOIO board instead of directly to the batteries, as the motors I’m using are 6V DC motors.

So, the following are the steps to do this:

1. Solder a header onto the side of the TB6612FNG board with pins Vm to GND.

Solder header onto motor driver board

2. Solder 7 pins onto the IOIO board at outputs 39 to 45. Do this so that the pins stick up from the top of the IOIO board and the TB6612FNG can later be threaded onto them.

Solder 7 pins on IOIO outputs 39-45.

3. Solder the TB6612FNG to the IOIO using the pins from step 2, so that the outputs 39 to 45 on the IOIO are connected to inputs PWMA to GND on the TB6612FNG.

Solder the TB6612FNG to the IOIO pins

4. Finally, wire all connections as shown in the circuit diagram shown earlier.

IOIO connected to DC motors on robot through the TB6612FNG

Note that I haven’t yet soldered on a switch in the image above.

Also, the PowerTech power supply I was using in my previous post turned out to be woefully inadequate for powering the motors, as it supplied a mere 500 mA. So I used a 9V alkaline battery instead.

Watch the video of the robot motors being driven from the phone here:

If your motors turn in the wrong direction, just reverse the motor output polarities on the pins A1/A2 and B1/B2 of the TB6612FNG.

And here’s the Java code to run the motors from an Android app.

/**
 * Modified from code from the book Getting Started with IOIO (2011 Simon Monk)
 * by Jay Chakravarty 2012
 *
 * Simple program to connect to two motors and drive them forward 
 * using Pulse Width Modulation (PWM)
 */

package com.ioiobook.rover;

import ioio.lib.api.DigitalOutput;
import ioio.lib.api.PwmOutput;
import ioio.lib.api.exception.ConnectionLostException;
//import ioio.lib.util.AbstractIOIOActivity;

import ioio.lib.util.BaseIOIOLooper;
import ioio.lib.util.IOIOLooper;

import ioio.lib.util.android.IOIOActivity;

import android.os.Bundle;

public class MainActivity extends IOIOActivity {

    private static final int AIN1_PIN = 41;
    private static final int AIN2_PIN = 40;
    private static final int PWMA_PIN = 39;
    private static final int PWMB_PIN = 45;
    private static final int BIN2_PIN = 44;
    private static final int BIN1_PIN = 43;
    private static final int STBY_PIN = 42;
    private static final int LED_PIN = 0;

    private static final int PWM_FREQ = 100;

    private float left_ = 0;
    private float right_ = 0;

    //private RoverControlView roverControlView_;

    /**
     * Called when the activity is first created. Here we normally initialize
     * our GUI.
     */
   @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        //roverControlView_ = new RoverControlView(this);
        //roverControlView_.setBackgroundColor(Color.WHITE);
        //setContentView(roverControlView_);
    }

    /**
     * This is the thread on which all the IOIO activity happens. It will be run
     * every time the application is resumed and aborted when it is paused. The
     * method setup() will be called right after a connection with the IOIO has
     * been established (which might happen several times!). Then, loop() will
     * be called repetitively until the IOIO gets disconnected.
     */
    class Looper extends BaseIOIOLooper {
        private DigitalOutput ain1_;
        private DigitalOutput ain2_;
        private PwmOutput pwma_;
        private PwmOutput pwmb_;
        private PwmOutput led_;
        private DigitalOutput bin2_;
        private DigitalOutput bin1_;
        private DigitalOutput stby_;

        /**
         * Called every time a connection with IOIO has been established.
         * Typically used to open pins.
         * 
         * @throws ConnectionLostException
         *             When IOIO connection is lost.
         * 
         * @see ioio.lib.util.AbstractIOIOActivity.IOIOThread#setup()
         */
        @Override
        protected void setup() throws ConnectionLostException {
            ain1_ = ioio_.openDigitalOutput(AIN1_PIN);
            ain2_ = ioio_.openDigitalOutput(AIN2_PIN);
            pwma_ = ioio_.openPwmOutput(PWMA_PIN, PWM_FREQ);
            pwmb_ = ioio_.openPwmOutput(PWMB_PIN, PWM_FREQ);
            //pwma_ = ioio.openPwmOutput(new DigitalOutput.Spec(PWMA_PIN, OPEN_DRAIN), PWM_FREQ);
            
            
            bin2_ = ioio_.openDigitalOutput(BIN2_PIN);
            bin1_ = ioio_.openDigitalOutput(BIN1_PIN);
            stby_ = ioio_.openDigitalOutput(STBY_PIN);
            stby_.write(true); // On
            led_  = ioio_.openPwmOutput(LED_PIN,PWM_FREQ);
        }

        /**
         * Called repetitively while the IOIO is connected.
         * 
         * @throws ConnectionLostException
         *             When IOIO connection is lost.
         * @throws InterruptedException
         * 
         * @see ioio.lib.util.AbstractIOIOActivity.IOIOThread#loop()
         */
        
        @Override
        public void loop() throws ConnectionLostException {
            led_.setDutyCycle(0.9f);

            // Set the PWM duty cycle for the 1st motor.
            // Note: the duty cycle should not exceed 1.0.
            // Vary it between 0 and 1 to achieve different speeds.
            pwma_.setDutyCycle(1.0f);
            ain1_.write(false);
            ain2_.write(true);

            // Set the PWM duty cycle for the 2nd motor.
            pwmb_.setDutyCycle(1.0f);
            bin1_.write(false);
            bin2_.write(true);

            try {
                Thread.sleep(10);
            } catch (InterruptedException e) {
                // Restore the interrupt flag so the looper can shut down cleanly
                Thread.currentThread().interrupt();
            }
        }

    }

    /**
     * A method to create our IOIO thread.
     * 
     * @see ioio.lib.util.AbstractIOIOActivity#createIOIOThread()
     */
    @Override
    protected IOIOLooper createIOIOLooper() {
        return new Looper();
    }

}

Occupancy Grid

This entry is part 3 of 3 in the series Python Robotics Simulator

The easiest way for a robot to map its environment is with an occupancy grid. As the robot moves through its environment, the readings returned by its laser range finder (LRF) at different angles around it are marked off on a map. Each time a cell is marked as occupied, its occupancy probability increases. This simulation shows the creation of an occupancy grid map. In the real world, however, odometry is not perfect and the robot’s position (as estimated by odometry) drifts from its real position, resulting in a skewed map. Hence the need for a loop-closing technique like SLAM (Simultaneous Localization & Mapping), which corrects for this error when the robot doubles back on itself and revisits a previously known location.
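A minimal counting-based sketch of the idea (not the simulator’s actual code; the grid size, helper names, and example beam are made up): cells a beam passes through are observed free, the beam endpoint is observed occupied, and repeated observations sharpen the estimate.

```python
import numpy as np

GRID = 50
hits = np.zeros((GRID, GRID))    # times a cell was observed occupied
visits = np.zeros((GRID, GRID))  # times a cell was observed at all

def update(free_cells, endpoint):
    """Integrate one laser beam: cells the beam passed through are
    observed free; the cell at the beam endpoint is observed occupied."""
    for r, c in free_cells:
        visits[r, c] += 1
    r, c = endpoint
    visits[r, c] += 1
    hits[r, c] += 1

def occupancy():
    """Occupancy probability per cell; never-observed cells stay at 0.5."""
    return np.where(visits > 0, hits / np.maximum(visits, 1), 0.5)

# Two sweeps that see the same obstacle: repeated marks sharpen the map
for _ in range(2):
    update(free_cells=[(10, c) for c in range(10, 20)], endpoint=(10, 20))

p = occupancy()
print(p[10, 20], p[10, 15], p[0, 0])  # 1.0 0.0 0.5
```

A skewed map results precisely because `free_cells` and `endpoint` are computed from the (drifting) odometry pose, so repeated passes no longer land on the same cells.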

Occupancy Grid Simulation


See the simulation here:

Particle Filter Localization for a Simulated Mobile Robot

This entry is part 2 of 3 in the series Python Robotics Simulator

This simulation shows a robot (red) moving through a simulated environment and a Particle Filter (PF) trying to localize it in that environment. Initially, the particles (shown in different shades of grey indicating their probabilities) do not know the location of the robot and try to localize it from scratch. This is called global localization. After the robot moves about in its environment, the particles coalesce around the robot’s correct position. Thereafter, it is easier for the PF to track the position of the robot, which is called position tracking. Note that the environment is quite symmetric, which is why the PF has difficulty finding the robot in the beginning, but it gets there in the end.

Particle Filter Localization for a Simulated Mobile Robot


The measurements are simulated laser range finder readings (see previous post: Simulated Laser Range Finder Readings Using Python & OpenCV).
The code is written in Python using the OpenCV library (Python wrapper). It is adapted from PF code written by Juan Carlos Kuri Pinto and the Udacity team for Udacity’s online Autonomous Driving class (http://www.udacity-forums.com/cs373/questions/14131/unit-3-graphical-demo-of-robot-localization-based-on-particle-filters).
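The predict-weight-resample loop the simulation runs can be sketched in a toy 1-D setting (this is not the adapted Udacity code; the wall position, noise values, and names are all illustrative): particles start spread everywhere (global localization) and coalesce as measurements arrive.

```python
import numpy as np

rng = np.random.default_rng(1)

WALL = 20.0                            # a wall at x = 20; "range" = distance to it
N = 500
particles = rng.uniform(0.0, WALL, N)  # global localization: spread everywhere

def pf_step(particles, control, z, motion_noise=0.1, sense_noise=0.5):
    # 1. Predict: apply the odometry control with added noise
    particles = particles + control + rng.normal(0.0, motion_noise, len(particles))
    # 2. Weight: Gaussian likelihood of the range measurement z
    expected = WALL - particles
    w = np.exp(-0.5 * ((expected - z) / sense_noise) ** 2)
    w /= w.sum()
    # 3. Resample particles in proportion to their weights
    return particles[rng.choice(len(particles), size=len(particles), p=w)]

true_x = 2.0
for _ in range(20):                    # robot moves +0.5 per step
    true_x += 0.5
    z = (WALL - true_x) + rng.normal(0.0, 0.5)
    particles = pf_step(particles, 0.5, z)

# After enough steps the particles coalesce around the robot's true position
print(true_x, particles.mean())
```

In the 2-D simulation the measurement model compares a whole vector of simulated LRF ranges per particle rather than a single distance, but the three steps are the same.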

See the video of the simulation here:

I’ll put the code up as soon as I’ve tidied it up a bit.