ROBOTC.net Blog  

ROBOTC News

Archive for June, 2011

Bionic NXTPod 3.0 by DiMastero


[Thanks to DiMastero for submitting this project!]

Introduction

Festo, founded in 1925, is a German engineering-driven company based in Esslingen am Neckar. Festo sells both pneumatic and electric actuators, and provides solutions ranging from assembly lines to full automation, using both Festo and third-party components. It also runs an R&D initiative, the Bionic Learning Network, which has produced some amazing projects including SmartBird (“bird flight deciphered”), AquaJelly, Robotino XT and much more. [source]

They also created the Bionic Tripod 3.0, an arm-like robot based on four flexible rods actuated from below. By moving the actuators to different positions, the rods bend and move the adaptive gripper to any position quickly and energy efficiently.

“Festo – Bionic Tripod 3.0” demonstration video

The tripod has been partially replicated before, but I’ve found no evidence of it being built entirely out of Lego Mindstorms. Cue the Bionic NXTPod 3.0.

[Image: Bionic NXTPod 3.0]

Hardware

  • 2 Lego Mindstorms NXT intelligent bricks – one 1.0 and one 2.0
  • 5 Lego Mindstorms NXT motors
  • 4 Lego Mindstorms NXT touch sensors
  • 1 Lego Mindstorms NXT 1.0 light sensor
  • 1 Lego Power Functions (PF) LED light
  • 1 Lego pneumatic actuator, switch and pump

The robot itself consists of these parts:

  • 4 actuators
  • 4 flexible rods
  • the pneumatic grabber
  • the main structure
  • PF LEDs and a light sensor for communication

Mechanically, the NXTPod’s most important parts are the four actuators. Each is built around a single NXT servo motor, which drives a worm gear along a four-part gear rack, moving a sledge up or down a 14-stud axle. An actuator can travel up to 19 motor rotations in either direction, taking about 11 seconds at the default speed of 75%.
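
To make the motion concrete, here is a minimal ROBOTC sketch of how one such actuator could be driven to a target number of rotations. This is an illustration only, not DiMastero’s actual code (which is linked at the end of the post); the motor name, constants and the use of the encoder target are assumptions.

// Sketch only: move one linear actuator a given number of motor rotations.
const int MAX_ROTATIONS = 19;   // mechanical travel limit of the sledge
const int DEFAULT_SPEED = 75;   // default speed mentioned above

void moveActuator(tMotor actuator, int rotations)
{
  if (rotations > MAX_ROTATIONS)          // never overrun the gear rack
    rotations = MAX_ROTATIONS;

  nMotorEncoder[actuator]       = 0;
  nMotorEncoderTarget[actuator] = rotations * 360;   // target in degrees
  motor[actuator]               = DEFAULT_SPEED;

  while (nMotorRunState[actuator] != runStateIdle)   // wait for the move to finish
    wait1Msec(10);

  motor[actuator] = 0;
}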

[Image: initial design of one of the four linear actuators; it has since been improved]

The fifth motor does double duty to perform a single task: it flips the pneumatic switch and drives the pump, opening or closing the gripper.

[Image: the gripper with its motor attached, final design]

Each NXT takes care of two of the actuators, which are color coded to make programming easier – the master controls the red and blue motors, while the slave handles the black and beige ones. The slave also controls the pneumatic gripper at the tripod’s top.

[Image: the four color-coded actuators (red, blue, black, beige), highlighted]

To connect the NXTs, the master has an NXT-PF cable on motor port B that controls the LEDs mounted in front of the slave’s light sensor. The drawback of this setup is that the master can’t get any feedback from the slave, so it has to account for how long the slave takes to perform each action to avoid overlapping commands.

Programming

Generally, the NXTs are set up in a master-slave configuration, where the master sends commands to the slave using the LEDs and light sensor, and then waits for the slave to finish before sending a new task. This is how it works:

  1. The slave is started up by the user
  2. The master is started up by the user, and turns the LEDs on for a tenth of a second
  3. Once the light turns off again, both the master and the slave calibrate their motors by moving the actuators down until the touch sensors are pressed
  4. Once it is calibrated, the slave waits for the LEDs to turn on again, so it knows a command is coming
  5. The master calibrates and waits out the remainder of the eleven seconds to make sure the slave has calibrated as well, so it doesn’t send any commands before the slave is ready
  6. The master reads the block of code containing the positions for all four actuators and the gripper, and converts this into 12 binary bytes
  7. The LEDs are turned on and, after a short wait, the master flashes them on and off ten times a second; a full message takes 1300 mSecs, or 1.3 seconds (a sketch of this LED signalling appears after this list)
  8. When the slave receives the bytes, it decodes them
  9. Both of the NXTs start simultaneously, and go to their positions.
  10. At the same time, the master calculates how many degrees the slave has to turn, and converts it into the approximate waiting time to, again, avoid overlapping commands
  11. Steps 6-10 repeat until the master has run through all of the blocks of code, after which it shuts down. The slave has to be turned off manually; the slave must be restarted every time the master finishes, or it will interpret the calibration command incorrectly.
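
Since the post does not include the communication routine itself, here is a minimal, hypothetical sketch of how the master’s side of that LED signalling might look in ROBOTC. The bit count, timing constants and the use of motor port B to power the PF LEDs are assumptions based on the description above, not DiMastero’s actual code.

// Hypothetical sketch of the master's LED signalling (not the real code):
// power the PF LEDs on motor port B to send a value one bit at a time.
void sendValue(int value, int numBits)
{
  motor[motorB] = 100;                    // LEDs on: announce that a command is coming
  wait1Msec(100);

  for (int i = numBits - 1; i >= 0; i--)
  {
    bool bitSet = ((value >> i) & 1) == 1;
    if (bitSet)
      motor[motorB] = 100;                // 1 = LEDs on
    else
      motor[motorB] = 0;                  // 0 = LEDs off
    wait1Msec(100);                       // ten bits per second, as described above
  }

  motor[motorB] = 0;                      // idle with the LEDs off
}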

The robot is very easy to program; the user only has to provide a short block of code per tripod position:

motor_blue_rotations = 1;
motor_red_rotations = 1;
motor_black_rotations = 10;
motor_beige_rotations = 10;
gripper_operation = 0;
wait_for_completion();
wait10Msec(100);

The program does the rest of the work, making sure the robot never overruns its limits and recalibrating whenever it can. You can download the ROBOTC code from my downloads page. Below is a short demonstration video:

“Bionic NXTPod 3.0” demonstration video

Completion date: 2011/06/12

Last updated: 2011/06/12

Disclaimer: This site is neither owned nor endorsed by Festo Group. The Bionic Learning Network, SmartBird, AquaJelly and Robotino are all copyrighted by Festo. The Bionic Tripod 3.0, on which this project is based, is also copyrighted by Festo.

Written by Vu Nguyen

June 17th, 2011 at 10:00 am

Posted in Cool projects, NXT

Bring on the Heat: Thermal Imaging with the NXT


[Image: Lookin' Hot!]

I built a pan-and-tilt rig for the Dexter Industries Thermal IR Sensor with a great deal of gearing down, so I could take a lot of measurements as the rig moved around. Initially I had it set for about 40×40 measurements, but those didn’t look that great and I wanted a bit more, so I reprogrammed it to spit out data at a resolution of about 90×80.

The data from the thermal sensor was streamed to the debug output console in ROBOTC, from which I copied and pasted it into an Excel worksheet. I made some 3D graphs from the thermal data and they looked pretty cool.
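
For readers who want to try something similar, here is a minimal sketch of the streaming side in ROBOTC. It is not the actual program (that is linked at the end of the post); readThermalC() is a hypothetical stand-in for the Thermal Infrared Sensor driver call, and the motor and port names are illustrative.

// Hypothetical stand-in for the real Thermal Infrared Sensor driver call;
// the actual driver ships with the Driver Suite mentioned below.
float readThermalC()
{
  return SensorValue[S1] / 10.0;   // placeholder conversion, not the real one
}

// Sweep the pan axis in small steps and write one comma-separated row of
// readings to the debug stream, ready to paste into Excel.
void scanRow(int samplesPerRow)
{
  for (int i = 0; i < samplesPerRow; i++)
  {
    writeDebugStream("%.1f", readThermalC());
    if (i < samplesPerRow - 1)
      writeDebugStream(",");         // comma-separate values within a row

    motor[motorA] = 20;              // creep to the next sample position
    wait1Msec(25);
    motor[motorA] = 0;
  }
  writeDebugStreamLine("");          // end of this row
}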

[Images: Excel 3D graph of the cold glass (left) and the candle flame (right)]

The left one is a cold glass and the right one is a candle.  I wasn’t really happy with the results of the graphs so I decided to quickly whip up a .Net app to read my CSV data and make some more traditional thermal images.  A few hours later, the results really did look very cool.
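
The heart of such a renderer is just a mapping from temperature to color. As a rough illustration (the real renderer is the .Net app, downloadable below), a simple blue-to-red false-color mapping in C-style code might look like this; the function name and the linear gradient are purely illustrative.

// Illustrative only: map a temperature to a crude blue-to-red false color,
// given the minimum and maximum temperatures found in the CSV data.
// Returns a packed 0xRRGGBB value.
long tempToColor(float t, float tMin, float tMax)
{
  float f = (t - tMin) / (tMax - tMin);   // normalize to the 0..1 range
  if (f < 0) f = 0;
  if (f > 1) f = 1;

  int r = (int)(255 * f);                 // hotter -> more red
  int b = (int)(255 * (1 - f));           // colder -> more blue
  int g = 0;

  return ((long)r << 16) | ((long)g << 8) | b;
}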

[Images: thermal image of the cold glass (left) and the candle flame (right)]

Again, the left one is the cold glass and the right one is the candle.  Now that you have a thermal image, you can see the heat from the candle a lot more clearly. I made a quick video of the whole rig so you can get an idea.

A few days after the initial post about my thermal imaging system using the Thermal Infrared Sensor, I made some improvements to both the speed and accuracy of the whole thing. I made the sensor sampling interval time-based, rather than encoder-value-based, which proved a lot better at getting consistent sampling rates. I also doubled the horizontal motor speed so I would be more likely to still be awake by the time it was done taking an image.
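
Here is a minimal ROBOTC sketch of that change, assuming a timer-driven loop; the interval, the port name and the sensor read are illustrative rather than taken from the actual program.

// Sample on a fixed timer instead of on encoder counts, so readings are
// evenly spaced in time even if the motor speed varies slightly.
const int SAMPLE_INTERVAL_MS = 50;   // illustrative sampling interval

task main()
{
  clearTimer(T1);
  while (true)
  {
    if (time1[T1] >= SAMPLE_INTERVAL_MS)   // time-based trigger
    {
      clearTimer(T1);
      writeDebugStream("%d,", SensorValue[S1]);   // placeholder for the IR reading
    }
  }
}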

The left image was made with the old system, the right one with the new system. The new image is a lot less fuzzy, and there are no black gaps where the number of samples in a row fell short of the maximum.

[Images: thermal image from the old system (left) and the improved system (right)]

Perhaps there are other ways to improve the program but I am quite happy with how this has turned out.

The driver and program will be part of the next Driver Suite version. You can download a preliminary driver and this program from here: [LINK].  The .Net program and CSV files can be downloaded here: [LINK]. You will need Visual Studio to compile it.  You can download a free (Express) version of C# from the Microsoft website.

Written by Xander Soldaat

June 16th, 2011 at 4:47 pm

Controlling the MINDS-i Lunar Rover with a VEX Cortex


Article written by Steve Comer

Remote control cars are great for having fun. They can be driven off-road, taken off jumps, and raced, among other things. VEX robots are great for learning. They can be used to teach programming, math, problem solving, and other engineering skills. What do you get if you put them together?

Well, I can tell you. You get a rugged 4WD truck that is still tons of fun to drive around outside, but can also be used as a teaching tool.

Follow this link for more photos: http://s1081.photobucket.com/albums/j353/comeste10/VEX%20Rover%20Extras/

I started off with a MINDS-i Lunar Rover kit which is driven by a 7.2V DC motor and steered with a standard hobby servo. I removed the solar panel from the Rover and in its place put a VEX Cortex microcontroller and an LCD screen. On each side, I attached a VEX flashlight from the VEXplorer kit and I mounted an ultrasonic sensor to the front. It just so happens that VEX bolts and nuts fit quite easily into the beams of the MINDS-i rover.

I did all the programming in RobotC. See bottom of the page to view my RobotC code.

In order to control the stock motor and servo with the Cortex, I had to make a few modifications. I soldered the motor’s two wires to a 2-pin header, which I then connected to the Cortex through a VEX motor controller.

For the servo, I used three single male-to-male jumper cables.

The video demonstrates the rover in autonomous mode where it makes use of the ultrasonic sensor to avoid bumping into walls. Remote control is also demonstrated using the VEXnet controller over Wi-Fi.

This is just a small sampling of the possibilities with this kind of combination platform. Don’t let my initial direction limit you. It would be great to see some new combination robots. Get out there and start building!

This is my RobotC Code for the behaviors seen in the video.

AUTONOMOUS MODE:


#pragma config(UART_Usage, UART2, VEX_2x16_LCD)
#pragma config(Sensor, dgtl1,  sonar,               sensorSONAR_inch)
#pragma config(Motor,  port1,           L,             tmotorServoStandard, openLoop)
#pragma config(Motor,  port2,           servo,         tmotorNormal, openLoop)
#pragma config(Motor,  port3,           drive,         tmotorNormal, openLoop)
#pragma config(Motor,  port10,          R,             tmotorNormal, openLoop)
//*!!Code automatically generated by 'ROBOTC' configuration wizard               !!*//
/////////////////////////////////////////////////////////////////////////
//  Author  : Steven Comer
//  Program : Rover drives straight until near an object; it then slows,
//            stops, then backs up and turns.
//  Updated : 8 June 2011 @ 10:20 AM
/////////////////////////////////////////////////////////////////////////
task main()
{
  //pause at start and turn on headlights
  wait1Msec(2000);
  motor[R] = -127;
  motor[L] = -127;

  while(true)
  {
    clearLCDLine(0);
    displayNextLCDNumber(SensorValue(sonar), 3);

    //++++++++++++++++++++++++++++++CLEAR++++++++++++++++++++++++++++++++
    if( SensorValue(sonar) > 20 || SensorValue(sonar) == -1 )
    {
      motor[servo] = -2;
      motor[drive] = 50;
    }
    //+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    //+++++++++++++++++++++++++++++APPROACH++++++++++++++++++++++++++++++
    else if( SensorValue(sonar) <= 20 && SensorValue(sonar) > 15 )
    {
      motor[drive] = SensorValue(sonar) + 25;   //power decreases as the rover closes in
    }
    //+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    //+++++++++++++++++++++++++++STOP AND TURN+++++++++++++++++++++++++++
    else if( SensorValue(sonar) <= 15 )
    {
      //stop
      motor[drive] = 0;
      wait1Msec(500);

      //back up and turn
      motor[servo] = random[50] + 60;   //random steering position
      motor[drive] = -50;
      wait1Msec(1000);
    }
    //+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
  }
}

REMOTE CONTROL MODE:


#pragma config(UART_Usage, UART2, VEX_2x16_LCD)
#pragma config(Sensor, dgtl1,  sonar,               sensorSONAR_inch)
#pragma config(Motor,  port1,           L,             tmotorServoStandard, openLoop)
#pragma config(Motor,  port2,           servo,         tmotorNormal, openLoop)
#pragma config(Motor,  port3,           drive,         tmotorNormal, openLoop)
#pragma config(Motor,  port10,          R,             tmotorNormal, openLoop)
//*!!Code automatically generated by 'ROBOTC' configuration wizard               !!*//
/////////////////////////////////////////////////////////////////////////
//  Author  : Steven Comer
//  Program : Remote control rover with VEXnet controller
//  Notes   : Throttle is Ch 2 (right joystick)
//            Steering is Ch 4 (left joystick)
//            Headlights are Ch 5U/5D and 6U/6D (on/off)
//  Updated : 10 June 2011 @ 12:30 PM
/////////////////////////////////////////////////////////////////////////
task main()
{
  while(true)
  {
    //right headlight
    if(vexRT[Btn6U] == 1)
      motor[R] = -127;
    if(vexRT[Btn6D] == 1)
      motor[R] = 0;

    //left headlight
    if(vexRT[Btn5U] == 1)
      motor[L] = -127;
    if(vexRT[Btn5D] == 1)
      motor[L] = 0;

    //driving
    motor[servo] = vexRT[Ch4];   //steering
    motor[drive] = vexRT[Ch2];   //throttle

    //LCD screen
    displayLCDCenteredString(0, "VEX");
    displayLCDCenteredString(1, "ROBOTICS");
  }
}

Written by Jesse Flot

June 9th, 2011 at 3:31 pm

Posted in Cool projects, VEX


Mars Rover NXT/VEX Robot with Rocker-Bogie Suspension


[Submitted by Fealves78 from the ROBOTC forums]

Fealves78 submitted an incredible-looking Mars Rover robot using (eek!) a combination of both NXT and VEX parts.

Both the robot and the joystick are controlled by NXT bricks. The robot also uses 6 regular motors, 6 servos, a VEX camera, and a set of VEX lights.

Here is a new video of the robot being demonstrated at the National Space Foundation. The robot can go over rocks too!

Rocker-Bogie Suspension

Rocker-Bogie Suspension (Source http://en.wikipedia.org/wiki/File:Rocker-bogie.jpg)

The rocker-bogie suspension is actually a popular setup, and it was used on the Mars rovers (hence this robot’s name!). It’s still favored by NASA for its Mars robots.

The suspension gets the “rocker” part of its name from the rocking motion of the larger links in the system. The two sides of the chassis are connected via a differential, which lets each rocker move up and down independently of the other. As a result, the robot can drive over uneven terrain as well as rocks.

The word “bogie” refers to the links that have a drive wheel at each end. Bogies were commonly used as load wheels in the tracks of army tanks, acting as idlers that distribute the load over the terrain; they were also quite common on the trailers of semi-trailer trucks.

[See more at the Rocker-Bogie page at Wikipedia]

Here’s a video of the robot running over a grassy area:

What inspired you to build the robot?

I am a graduate Computer Science student, and robotics is one of my interests. I teach robotics to kids in Colorado Springs through Trailblazer Elementary School. My students’ ages range from 6 to 11 years old, and this is their first year studying with me. We were inspired to build the Mars Rover robot by the Space Foundation (SF), which is located in Colorado Springs, CO. Through a grant with Boeing, the Space Foundation donated 2 NXT robotics kits to our school, and I provided the VEX kit for the students myself. The SF then challenged us to build a demo robot using some of the materials they had provided, and the Mars Rover was our first big project.

How long did it take to build the robot?

The students spent about a week researching the design of the robot structure. It took 2 weeks to put it together and 2 more weeks to program the robot using ROBOTC. We also used the NXTSERVO-V2 from Mindsensors.com to control the robot’s 12 motors, 2 lights, and camera.

What are your future plans with the robot?

All the work that we are doing is volunteer work. We started with one teacher, one school (Trailblazer), and 16 kids at the beginning of 2011. Now, with the help of graduate students from Colorado Technical University (CTU), the IEEE chapter from that school, and companies like the Space Foundation and MITRE, we are expanding to 40 kids and 3 schools by the end of the year. We are also willing to help teachers from elementary, middle, and high schools who want to bring robotics into the classroom as a means to make science accessible to their students and to motivate them towards STEM education. Most of the schools have neither the materials nor the budget to start a robotics club. We are surviving on small donations and volunteer work. If you or anyone you know is interested in helping, please let us know.

In the little time we have been working with these kids, both their regular teachers and their parents are noticing an improvement in the kids’ interest in science and in their grades. For us, the CTU volunteers (students and IEEE members), this is a way to gain work experience and give back to the community.

Written by Vu Nguyen

June 6th, 2011 at 10:57 am

Posted in Cool projects, NXT, VEX