ROBOTC.net Blog  

ROBOTC News

Archive for November, 2011

VEX Mecanum Drive using ROBOTC


One cool new product for the VEX Robotics system is the set of VEX Mecanum Wheels.

Where traditional wheels only allow your robot to move forward, backward, and turn, mecanum wheels allow your robot to move in all four directions - forward, backward, left, and right - while still maintaining the ability to turn. Your robot is able to accomplish this thanks to the wheels' unique design. Each wheel (which must be connected to its own, independent motor) contains rollers offset by 45 degrees from the wheel and the robot's drive train. Instead of the motors introducing force in just the forward-backward directions, each wheel introduces force at a 45 degree angle (45, 135, 225, or 315 degrees). The force vectors of the different wheels can then be combined and/or cancelled out to create motion in the X and Y axes.

Here’s a video of the Mecanum wheels in action (the code running on the robot can be found later in the post):

To give the wheels a test run, we built the robot below. Notice that if you’re looking down on the robot, the rollers of the wheels should point inward to create an X. If you’re looking at the bottom of the robot, they should point outward to create a baseball diamond. There are two sets of two wheels - not all four are identical.

[scrollGallery id=4]

When you decide to incorporate your own mecanum drive system, be sure to follow the X/diamond rule or your robot will not behave as expected. Also know that equal weight distribution and consistent build quality are far more important with a mecanum drive than with a traditional drive. Remember that the robot moves by adding and/or cancelling out the different force vectors produced by the four individual motors – if one or two motors can’t match the others, the entire motion of the robot will be thrown off.

A fairly simple program can be written to test the mecanum drive system. This program uses the left joystick on the remote control to move the robot forward-backward and left-right. The X-axis on the right joystick controls the rotation of the robot.

#pragma config(Motor,  port2,           frontRight,    tmotorNormal, openLoop)
#pragma config(Motor,  port3,           backRight,     tmotorNormal, openLoop)
#pragma config(Motor,  port4,           frontLeft,     tmotorNormal, openLoop, reversed)
#pragma config(Motor,  port5,           backLeft,      tmotorNormal, openLoop, reversed)
//*!!Code automatically generated by 'ROBOTC' configuration wizard               !!*//

/*+++++++++++++++++++++++++++++++++++++++++++++| Notes |++++++++++++++++++++++++++++++++++++++++++++
Mecanum Drive - Basic
- This program allows you to remotely control a robot with mecanum wheels.
- The left joystick Y-axis controls the robot's forward and backward movement.
- The left joystick X-axis controls the robot's left and right movement.
- The right joystick X-axis controls the robot's rotation.

[I/O Port]          [Name]              [Type]                [Description]
Motor Port 2        frontRight          VEX Motor             Front Right motor
Motor Port 3        backRight           VEX Motor             Back Right motor
Motor Port 4        frontLeft           VEX Motor             Front Left motor
Motor Port 5        backLeft            VEX Motor             Back Left motor
--------------------------------------------------------------------------------------------------*/

task main()
{
  //Loop Forever
  while(1 == 1)
  {
    //Remote Control Commands
    motor[frontRight] = vexRT[Ch3] - vexRT[Ch1] - vexRT[Ch4];
    motor[backRight]  = vexRT[Ch3] - vexRT[Ch1] + vexRT[Ch4];
    motor[frontLeft]  = vexRT[Ch3] + vexRT[Ch1] + vexRT[Ch4];
    motor[backLeft]   = vexRT[Ch3] + vexRT[Ch1] - vexRT[Ch4];
  }
}

The program above, however, isn’t perfect. If the joysticks don’t center perfectly at zero, or if the driver unintentionally moves the joysticks along multiple axes, the movement of the robot will be thrown off. With a few variables and a little bit of logic, we can ignore these erroneous values and provide smoother control of the robot:

#pragma config(Motor,  port2,           frontRight,    tmotorNormal, openLoop)
#pragma config(Motor,  port3,           backRight,     tmotorNormal, openLoop)
#pragma config(Motor,  port4,           frontLeft,     tmotorNormal, openLoop, reversed)
#pragma config(Motor,  port5,           backLeft,      tmotorNormal, openLoop, reversed)
//*!!Code automatically generated by 'ROBOTC' configuration wizard               !!*//

/*+++++++++++++++++++++++++++++++++++++++++++++| Notes |++++++++++++++++++++++++++++++++++++++++++++
Mecanum Drive with Deadzone Thresholds
- This program allows you to remotely control a robot with mecanum wheels.
- The left joystick Y-axis controls the robot's forward and backward movement.
- The left joystick X-axis controls the robot's left and right movement.
- The right joystick X-axis controls the robot's rotation.
- This program incorporates a threshold/deadzone that allows very low Joystick values to be ignored.
This allows the robot to ignore values from the Joysticks when they fail to center at 0,
and provides a margin of error for the driver when they only want the robot to move in one axis.

[I/O Port]          [Name]              [Type]                [Description]
Motor Port 2        frontRight          VEX Motor             Front Right motor
Motor Port 3        backRight           VEX Motor             Back Right motor
Motor Port 4        frontLeft           VEX Motor             Front Left motor
Motor Port 5        backLeft            VEX Motor             Back Left motor
--------------------------------------------------------------------------------------------------*/

task main()
{
  //Create "deadzone" variables. Adjust threshold value to increase/decrease deadzone
  int X2 = 0, Y1 = 0, X1 = 0, threshold = 15;

  //Loop Forever
  while(1 == 1)
  {
    //Create "deadzone" for Y1/Ch3
    if(abs(vexRT[Ch3]) > threshold)
      Y1 = vexRT[Ch3];
    else
      Y1 = 0;

    //Create "deadzone" for X1/Ch4
    if(abs(vexRT[Ch4]) > threshold)
      X1 = vexRT[Ch4];
    else
      X1 = 0;

    //Create "deadzone" for X2/Ch1
    if(abs(vexRT[Ch1]) > threshold)
      X2 = vexRT[Ch1];
    else
      X2 = 0;

    //Remote Control Commands
    motor[frontRight] = Y1 - X2 - X1;
    motor[backRight]  = Y1 - X2 + X1;
    motor[frontLeft]  = Y1 + X2 + X1;
    motor[backLeft]   = Y1 + X2 - X1;
  }
}

Both of the above pieces of sample code will be included in ROBOTC 3.05. The VEX Mecanum Wheels, along with all of your other hardware needs, can be purchased from the Robomatter store.

Written by Jesse Flot

November 22nd, 2011 at 2:36 pm

Michigan VEX Regionals


A few weeks ago I attended the VEX Regionals in Monroe, Michigan, held at Monroe High School. This was the school's first year hosting a VEX competition, and Steve Ketron did a nice job organizing the event.

There were about two dozen teams, and the robots they built were phenomenal.

I’ve been fortunate enough to go to several VEX competitions over the years, and this one had a very friendly and relaxed atmosphere. I’d chalk that up to the friendly teams from around Monroe that came, and also to Steve’s planning for the competition.

Photos

I took several hundred photos and shot video at the event. I will post the videos as soon as I can get them all onto YouTube.

Note: If you want a photo removed, I completely understand! Just comment on this post letting me know which photo to remove and I will remove it immediately. Your comment will not show up below so it will be anonymous.

[scrollGallery id=2]

Written by Vu Nguyen

November 22nd, 2011 at 9:29 am

LEGO Street View Car v2.0


Thanks to Mark over at www.mastincrosbie.com for creating this incredible project and providing the information. Also thanks to Xander for providing the community with ROBOTC drivers for the sensors mentioned.

You might remember the original Lego Street View Car I built in April. It was very popular at the Google Zeitgeist event earlier this year.

I wanted to re-build the car to use only the Lego Mindstorms NXT motors. I was also keen to make it look more… car-like. The result, after 4 months of experimentation, is version 2.0 of the Lego Street View Car.

As you can see this version of the car is styled to look realistic. I also decided to use my iPhone to capture images on the car. With iOS 5 the iPhone will upload any photos to PhotoStream so I can access them directly in iPhoto.

The car uses the Dexter Industries dGPS sensor to record the current GPS coordinates.

The KML file that records the path taken by the car is transmitted using the Dexter Industries Wifi sensor once the car is within wireless network range.

Design details

The LEGO Street View Car is controlled manually using a second NXT acting as a Bluetooth remote. The remote control allows me to control the drive speed and steering of the car. I can also brake the car to stop it from colliding with obstacles. Finally, pressing a button on the remote triggers the car to capture an image.

Every time an image is captured, the current latitude and longitude are recorded from the dGPS. The NXT creates a KML-format file in the flash filesystem, which is then uploaded from the NXT to a PC. Opening the KML file in Google Earth shows the path that the car drove, with a placemark for every picture taken along the way. Click on a placemark to see the picture.

For each GPS coordinate I create a KML Placemark entry that embeds descriptive HTML code using the CDATA tag. The image link in the HTML refers to the last image captured on disk.

The images are captured by triggering the camera on my iPhone. I use an app called SoundSnap which triggers the camera when a loud sound is heard by the phone. By placing the iPhone over the NXT speaker I can trigger the iPhone camera by playing a loud tone on the NXT. While this is not ideal (Bluetooth would be better) it does the job for now.

To get the photos from the iPhone I use the PhotoStream feature in iOS 5. I select the pictures in iPhoto and export them to my laptop. The iPhone will only upload photos when I am in range of a wireless network.

Finally the Dexter Industries Wifi sensor is used to wirelessly transmit the KML file to my laptop over the wireless network.


<Placemark>
  <name>LSVC Snapshot 1</name>
  <description><![CDATA[<img src='Images/IMG_1.jpg' width=640 height=480> ]]></description>
  <Point>
    <coordinates> -6.185952, 53.446190, 0</coordinates>
  </Point>
</Placemark>

<Placemark>
  <name>LSVC Snapshot 2</name>
  <description><![CDATA[<img src='Images/IMG_2.jpg' width=640 height=480> ]]></description>
  <Point>
    <coordinates> -6.185952, 53.446190, 0</coordinates>
  </Point>
</Placemark>

The snippet from the KML file gives you an idea of what each placemark should look like.

Once the car has finished driving, press the orange button on the NXT to save the KML file. This writes a <LineString> entry which records the actual path of the car. A LineString is simply a list of coordinates that defines a path along the Earth’s surface in Google Earth. For example:


<Placemark>
  <name>LSVC Path</name>
  <description>LSVC Path</description>
  <styleUrl>#yellowLineGreenPoly</styleUrl>
  <LineString>
    <extrude>10</extrude>
    <tessellate>10</tessellate>
    <altitudeMode>clampToGround</altitudeMode>
    <coordinates>
      -6.185952, 53.446190, 0
      -6.185952, 53.446180, 0
    </coordinates>
  </LineString>
</Placemark>

This defines a path of two coordinates not far from where I live.

From the NXT to Google Earth

How do we get the pictures and KML file from the NXT and into Google Earth? First of all we need to get all the data in one place. The KML file refers to the relative path of each image, so we can package the KML file and the images into a single directory.

An example of the output produced is shown below. In this test case I started indoors in my house and took a few pictures. As you can see, the dGPS has trouble getting an accurate reading indoors, so those pictures appear scattered around the map. I then drove the car outside and continued capturing pictures as I drove. From Snapshot 10 onwards, the placemarks line up much more closely with where the car actually was.

Video

I shot some video of the car driving outside my house. It was a windy dull day, so the video is a little dark. The fun part is seeing the view from on-board the car!

More videos are coming soon…

Photos

[nggallery id=1]

Written by Vu Nguyen

November 14th, 2011 at 1:11 pm

3rd Party Sensors with VEX: Sharp IR Rangefinder


The VEX Robotics System includes a growing, powerful set of official sensors. That said, the electronics world is overflowing with other types of cool and useful sensors. The great news is, both the VEX PIC and the VEX Cortex microcontrollers are extremely versatile and can be used with many of these sensors. Both the VEX PIC and VEX Cortex can be used with Analog Sensors that operate between 0 and 5 volts (+/- .5 volts), and have Power, Ground, and Signal lines.

One such sensor is the Sharp GP2D12 IR Rangefinder.

The Sharp IR Rangefinder allows your robot to determine the distance to the nearest object, much like the VEX Ultrasonic Rangefinder. Unlike the VEX Ultrasonic Rangefinder, which uses sound waves to measure distance, the Sharp IR Rangefinder uses infrared light. On one side of the sensor, an infrared LED shines a beam of light, which is reflected back by the closest object and detected by the receiver on the other side of the sensor. The sensor then uses a method called triangulation to determine how far away the object is.

The basis for triangulation is that objects at different distances will reflect the infrared beam back to the receiver at different angles. The varying angles produce different voltage levels in the sensor, and in turn sensor values that can be used to calculate distance. See below:

The Sharp IR Rangefinder provides very reliable distance values for objects between 10 and 80 centimeters away. One big advantage of the IR Rangefinder is that it’s not affected by the soft or angled objects that cause the VEX Ultrasonic Rangefinder to fail.

One challenge with using the sensor is that the raw values provided do not directly correlate to useful distance values, and they’re also non-linear. This means that we must perform a calculation on the raw sensor data first. Documentation for the sensor tells us that we can calculate the distance using the following formula:

Voltage = 1 / ( Distance + 0.42 )

Which, when rearranged gives us:

Distance = (1/Voltage) – 0.42

One additional challenge is that we can’t directly access the voltage returned by the sensor in our program. What we can access is the sensor value, which is proportional to the voltage – we’ll just have to take a conversion factor into account.

Below is a sample program written for the Sharp IR Rangefinder. It contains the function “IRValue()”, which performs the necessary calculation and returns useful distance data in centimeters. Note that the “sensorPotentiometer” type is used, since there is no “IR Rangefinder” sensor type in ROBOTC, and the Potentiometer type returns the full, unmodified analog data (unlike other sensor types, such as the Gyro).

#pragma config(Sensor, in5,    sharp,               sensorPotentiometer)
#pragma config(Motor,  port2,           rightMotor,    tmotorNormal, openLoop, reversed)
#pragma config(Motor,  port3,           leftMotor,     tmotorNormal, openLoop)
//*!!Code automatically generated by 'ROBOTC' configuration wizard               !!*//

//Function uses data from the IR Rangefinder to calculate and return distance
int IRValue(tSensors sensorPort, float conversionFactor = .0000469616, float k = .42)
{
  return (1.0 / (SensorValue[sensorPort] * conversionFactor)) - k;
}

task main()
{
  wait1Msec(2000);
  clearLCDLine(0);
  clearLCDLine(1);

  //While true...
  while(true)
  {
    //If the robot is greater than 25 cm away...
    if(IRValue(sharp) > 25)
    {
      //Move Forward
      motor[rightMotor] = 30;
      motor[leftMotor]  = 30;
    }
    else
    {
      //Stop
      motor[rightMotor] = 0;
      motor[leftMotor]  = 0;
    }

    //Display IR Sensor Data to LCD
    displayLCDString(0, 0, "IR Value:");
    displayLCDNumber(0, 9, IRValue(sharp), 3);
    displayNextLCDString("cm");
  }
}

Our Sharp IR Rangefinder was previously wired for an older microcontroller called the Handy Board. Like the VEX Cortex and PIC, the Handy Board uses Power, Ground, and Signal lines, so we were able to use simple jumper wires to adapt the sensor to the Cortex.

Front of the robot with Sharp IR Rangefinder:

Top of the robot, with connection from Sharp IR Rangefinder to VEX Cortex Analog Port 5:

And finally, here’s a video of the code from above running on the robot:

For more information on interpreting and using the data provided by the Sharp IR Rangefinder, check out this tutorial.

Written by Jesse Flot

November 8th, 2011 at 11:13 am