ROBOTC.net Blog  

ROBOTC News

Archive for the ‘Cool projects’ Category

Michael’s Macro Mouse Project

with 2 comments

This project was submitted by ROBOTC user Michael B. He uses an NXT robot equipped with a HiTechnic EOPD (Electro Optical Proximity Detector) to determine the robot's surroundings and then intelligently create and navigate a path through the maze.

From the creator:

It shows a robot solving a maze very similar to the Micromouse challenge. It's an excellent application of 2D arrays. It's also the most accessible task I could conceive of that would require students to build robots that remember stuff about their surroundings, relate that information and build on it, and then use it to make intelligent decisions.
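
To give a flavor of the 2D-array idea, here's a minimal ROBOTC sketch (an editorial illustration only, not Michael's actual code; the maze dimensions and the recordCell helper are invented):

// Sketch only: a 2D array as the robot's memory of the maze.
// Each cell stores what the robot has learned so far.
int mazeMap[5][7];        // 0 = unexplored, 1 = open, 2 = wall (sizes made up)

// Record the EOPD's verdict about the cell at (x, y).
void recordCell(int x, int y, bool sawWall)
{
  if (x < 0 || x >= 5 || y < 0 || y >= 7)
    return;               // ignore readings outside the maze
  mazeMap[x][y] = sawWall ? 2 : 1;
}

task main()
{
  // Example: the EOPD saw a wall in the cell ahead of the start square.
  recordCell(0, 1, true);
  nxtDisplayTextLine(0, "cell 0,1 = %d", mazeMap[0][1]);
  wait1Msec(2000);
}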

 

Here’s a video of the Macro Mouse in action, with lots of additional detail:

Written by Jesse Flot

June 11th, 2012 at 9:36 am

Posted in Cool projects, NXT


Bucket ‘o’ Bricks Brick Sorter

without comments

NeXT-Generation, over on the ROBOTC forums, posted a very cool project he’s been working on for the last two months.  It’s an automated brick sorter made with a combination of Mindstorms NXT, Power Functions and Pneumatics.


YouTube Direct Link 

The video might be long but it’s well worth watching!

Naturally, we asked him questions about his creation:

What motivated you to make this?

I wanted to build a robot that was interactive and would entertain smaller kids, while being mechanically interesting to older ones and even adults. Here’s what happened: I planned for it to be able to “learn” where the colors were supposed to go. You could tell it whether it put the brick in the right or wrong area until it learned where they all belonged. But mechanical glitches in the construction that I didn’t have time to fix prevented that from happening. I probably would have made another console with the other NXT with yes/no buttons, and it could have made sounds and used the display to interact.

How long did it take?

Well, if you count the total time it’s been built, about two months. But here’s the catch: I’ve really only been working on it for about one month, because I got sick twice over the last two months, so in total I was out of it for about a month. During that time I was also working on other stuff. Probably about a week was lost to messing with my Boe-Bot and Pololu 3Pi.

Do you have any plans for future improvements or modifications?

I plan to revisit the same kind of concept, but with no deadline so that I can work out any problems that come up.

What is the average air speed of a laden swallow?

The average airspeed of a laden swallow is 42.

A very cool project, indeed!

Written by Xander Soldaat

April 30th, 2012 at 11:46 am

Posted in Cool projects


Neat video of an NXT game made in ROBOTC

with one comment

Here’s a neat video we found of a user who made a Pong / Brick Breaker type game in ROBOTC.

They’re using a wheel to control the platform and keep the ball from exiting the screen. Take a look!
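
For anyone curious how a game like this might read its input, here's a minimal ROBOTC sketch (not the user's actual code; the scaling and paddle size are invented) that treats a free-spinning motor's encoder as the control wheel for an on-screen paddle:

// Sketch only: use a free-spinning motor's encoder as a "wheel" controller
// for a paddle drawn on the NXT's 100x64-pixel screen.
task main()
{
  int paddleX;
  nMotorEncoder[motorA] = 0;                  // the wheel the player turns

  while (true)
  {
    // Map the encoder angle onto the screen, clamping so the
    // 20-pixel-wide paddle stays fully visible.
    paddleX = nMotorEncoder[motorA] / 4;      // ~4 degrees per pixel
    if (paddleX < 0)  paddleX = 0;
    if (paddleX > 80) paddleX = 80;

    eraseDisplay();
    nxtDrawRect(paddleX, 4, paddleX + 20, 0); // paddle along the bottom edge
    wait1Msec(20);
  }
}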

Written by Vu Nguyen

April 24th, 2012 at 10:10 pm

Posted in Cool projects, NXT

NXT Tortoise feeding robot

without comments

[Thanks to -adrian- for submitting this project!]

Description:

Based on a modified version of the NXT shootbot, the automatic tortoise feeder has three main components: a top-mounted food hopper with a motor-operated paddle that dispenses food, a color sensor for line following, and a reverse-mounted touch sensor which acts like a pull trigger. The touch sensor has a colored ball mounted on it which entices the tortoises to bite at it. Once that happens, the robot dispenses the food, executes the line following program for several seconds, and stops. At first the tortoises would just bite at the ball because it was brightly colored, but after only a few goes I believe they’ve figured out that pulling on it gives food, which is a pretty impressive feat of reptile intelligence as far as I can tell.
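
As a rough illustration of that trigger-dispense-drive cycle, here's a minimal ROBOTC sketch (editorial, not adrian's actual program: the ports, thresholds, and timings are invented, and a plain light sensor stands in for the color sensor):

// Sketch only: wait for a pull on the trigger, dispense food,
// then line-follow for a few seconds and stop.
#pragma config(Sensor, S1, touchTrigger, sensorTouch)
#pragma config(Sensor, S2, lineSensor,   sensorLightActive)

task main()
{
  while (true)
  {
    // Wait for a tortoise to pull the ball on the reversed touch sensor.
    while (SensorValue[touchTrigger] == 0)
      wait1Msec(10);

    // Dispense: briefly run the hopper paddle motor.
    motor[motorA] = 50;
    wait1Msec(1000);
    motor[motorA] = 0;

    // Simple two-state line follow for five seconds, then stop.
    ClearTimer(T1);
    while (time1[T1] < 5000)
    {
      if (SensorValue[lineSensor] < 45)   // dark: over the line
      {
        motor[motorB] = 40;  motor[motorC] = 10;
      }
      else                                // light: drifted off the line
      {
        motor[motorB] = 10;  motor[motorC] = 40;
      }
    }
    motor[motorB] = 0;
    motor[motorC] = 0;
  }
}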

What inspired you to build this robot?

I found out about NXT after watching a video of the CubeStormer robot that was sent around at work, and immediately thought it would be fun to build a robot that could interact with our two pet redfoot tortoises. Reptiles aren’t particularly trainable animals, though ours are very food motivated, so a robotic feeder seemed like a fun project to try. Fortunately tortoises are relatively slow moving and benign, so building something to interact with them wasn’t that difficult. I also hadn’t seen any examples of NXT robots interacting with animals (though I think a friend of mine used the remote control shootbot to terrorize his cats).

How long did it take you to make this?

Hard to say, as I started and stopped several times. The programming took about a day once I went through the ROBOTC tutorials from Carnegie Mellon. I almost gave up initially trying to program it with the included NXT-G software and left the project alone until I found out about ROBOTC. The construction took maybe a week or two? I tried a few different designs before the current one, all of which had various problems. It took a while to figure out a way to mount the touch sensor so that a tortoise could trigger it.

What are your future plans with the robot?

I’d like to try a modified mechanism for dispensing the food. The vertically mounted hopper and the irregular size of tortoise pellets make the amount dispensed each time really difficult to control. My current idea is to mount the dispenser horizontally and use either one of the rubber treads or maybe a track from a LEGO Technic set to dispense the food more like a conveyor belt. I might also try a different way to move the robot around than a line follow, possibly the distance sensor and some simple wall avoidance.

Written by Vu Nguyen

February 13th, 2012 at 12:11 pm

Posted in Cool projects, NXT

VEX Balancing Robot

with 2 comments

[Thanks to hmoor14 for submitting this project!]

hmoor14 put together a fun little (OK, it’s not THAT little…) robot. It’s a VEX robot that is able to keep upright while simultaneously acting as a punching bag! Take a look:

I asked hmoor14 a few questions about his robot:

1) What inspired you to build this robot?

I wanted to start learning about robots and how to control them. So, when I saw a video on a balancing robot, I decided I would try that project.

2) How long did it take you to make this?

This was my first robot, so it probably took longer than it should have!
I pretty much did it over the Christmas holidays and then some, so about a month part-time. Most of the time was not actually spent building the robot but learning how to design it and testing the pieces. Just getting around the deadzone in the motors took me a few days.
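
For readers who run into the same thing: below some minimum power a motor simply doesn't move, so a common workaround is to remap small commands past that threshold. Here's a minimal sketch of the general idea (not hmoor14's code; the threshold and port are invented):

// Sketch only: remap motor power around the deadzone, where commands
// below a minimum magnitude produce no motion at all.
int MAX_POWER = 127;        // VEX motor power range (use 100 on the NXT)
int DEADZONE  = 15;         // hypothetical minimum power that moves the robot

int compensate(int power)
{
  if (power == 0)
    return 0;               // a true stop stays a stop
  if (power > 0)
    return DEADZONE + power * (MAX_POWER - DEADZONE) / MAX_POWER;
  return -DEADZONE + power * (MAX_POWER - DEADZONE) / MAX_POWER;
}

task main()
{
  // Example: a small balance correction of +5 becomes a usable power level.
  motor[port2] = compensate(5);
  wait1Msec(500);
  motor[port2] = 0;
}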

3) What are your future plans with the robot?

I’m fixing to take it apart; I need the parts for my next robot :( But I am going to keep what I’ve learned (which was so, so much).

Close up of the robot:

Great job hmoor14!

Written by Vu Nguyen

February 9th, 2012 at 11:17 am

Mindsensors RCX Multiplexer controlled via Android and ROBOTC

with one comment

[All work done by Burf, original link: http://www.burf.org.uk/2012/01/01/mindsensors-rcx-multiplexer-controlled-via-android-and-robotc/]

We found another one of Burf’s projects on his blog. If you don’t know Burf, he was the creator of a previous Cool Project on our blog, LEGO George.

Here’s another amazing project of his, one that utilizes the RCX Multiplexer and an Android phone!

His blog reads,

——————————————————————————————————————————————

As you may be aware, I have been building a robot called Wheeler out of old parts (old grey bricks, RCX 9V motors, etc.). I was hoping to have it finished over the Christmas break, but I hit a small issue with driving the wheels under the new weight of the body. Anyway, what I did manage to get up and running is the top half of Wheeler and the controller, which is an Android phone (Dell Streak).

Mindsensors RCX Multiplexer

I was utterly impressed with the Mindsensors.com RCX Multiplexer, and with how fast I was up and running using Xander’s driver suite (check BotBench). I wish there was a way to run the RCX Multiplexer off the NXT power supply, but that’s a small thing compared to how useful it is. I wish I had 3 more of them so that I could control 16 RCX motors!

Android NXT Remote Control

So, to try and work out how to control the NXT via Android, I stumbled across the NXT Remote Control project, which is free to download. This uses LEGO’s Direct Commands to control the 3 motor ports on the NXT, which means it bypasses your own code and you have no control over it. However, what I managed to do is reduce it down to a very simple program that sends messages to the NXT which you can deal with in your own program. In ROBOTC, it sends messages that are compatible with the messageParm command, so you can send a message ID and 2 params to the NXT and deal with them in ROBOTC any way you want to. Code will be available soon once I have tidied it up.
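
Until Burf publishes his code, here's a rough sketch of what the NXT's receiving side might look like. It leans on ROBOTC's NXT message-passing globals (message, messageParm, ClearMessage) as described in the ROBOTC documentation; the command IDs and their meanings are invented, and exact names can vary between ROBOTC versions:

// Sketch only: react to Bluetooth messages sent from the phone.
// Assumes the Bluetooth connection is already established.
task main()
{
  while (true)
  {
    if (message != 0)                 // a message has arrived (0 = none)
    {
      int id     = message;           // the message ID
      int param1 = messageParm[0];    // first parameter
      int param2 = messageParm[1];    // second parameter

      if (id == 1)                    // hypothetical "drive" command
      {
        motor[motorB] = param1;       // left-side power from the phone
        motor[motorC] = param2;       // right-side power from the phone
      }

      ClearMessage();                 // mark the message as handled
    }
    wait1Msec(10);
  }
}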

Written by Vu Nguyen

January 20th, 2012 at 2:24 pm

Posted in Cool projects, NXT

Skype-Controlled Mindstorms NXT Car

without comments

First of all, let me introduce myself: I’m Leon (aka dimastero/dimasterooo), and I was recently invited to contribute to this blog. So, as my first post, I’d like to tell you about my new Skype-controlled LEGO Mindstorms NXT Car.

I’ve been creating websites for a while now, and I was trying to think of a way to combine that with Mindstorms NXT. This project is the result. The project’s webpage is fairly simple – it’s got three arrows (one forward, two to the sides), a start button, and a stop button, along with instructions. Clicking the start button will begin a Skype conversation with my computer, after which you should share your screen; the NXT standing in front of my computer can then “see” the webpage with the arrows via your computer.

That’s where the cool part kicks in – when you click any one of the arrows or the stop button, the page changes to a different shade of gray. This shade of gray is picked up by the NXT, which turns it into a Bluetooth message for the other NXT on the car. The car then drives in the direction the user tells it to, while remaining within a fenced-off area where the webcam can see it.
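
Here's a minimal ROBOTC sketch of how the screen-watching NXT might decode those gray shades (an editorial illustration, not Leon's code: the thresholds and command IDs are invented, sendMessage is ROBOTC's NXT Bluetooth messaging call, and the connection setup is omitted):

// Sketch only: read the gray shade of the web page with the light sensor
// and forward a matching command to the car's NXT over Bluetooth.
#pragma config(Sensor, S1, screenSensor, sensorLightInactive)

task main()
{
  while (true)
  {
    int shade = SensorValue[screenSensor];
    int command;

    if (shade > 55)      command = 1;   // lightest gray: forward
    else if (shade > 45) command = 2;   // mid gray: left
    else if (shade > 35) command = 3;   // darker gray: right
    else                 command = 4;   // near black: stop

    sendMessage(command);               // Bluetooth link assumed established
    wait1Msec(100);
  }
}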

So, until January the 18th, you can drive a LEGO Mindstorms NXT car, from the comfort of your own home. To learn how and find out more about this project, click the link below:

http://worldofmindstorms.com/2012/01/04/interactive-skype-controlled-mindstorms-nxt-car/

Written by DiMastero

January 10th, 2012 at 9:00 am

Posted in Cool projects, NXT

Facial recognition using an NXT and an iPhone

with 2 comments

This is a robot that uses face recognition to follow a human around. It uses an iPhone in conjunction with an NXT. Take a look!

You can download the Xcode Project and ROBOTC code here:  http://code.google.com/p/follow-me-robot/

How it works

The iOS code uses iOS 5’s face detection algorithm to find the position of the face within the video frame. I then needed a way to communicate with the NXT robot and steer it. Since I didn’t want to go through the trouble of communicating with it over Bluetooth (and I don’t know how to do it!), I chose to communicate with the NXT using the Light Sensor that comes with the NXT.

If I want the robot to go to the left, I dim the lower portion of the iPhone screen, and if I want it to go to the right, I increase its intensity. Also, when the phone does not see a face, I turn the lower portion of the screen black. This tells the robot to stop moving forward and spin in place until it finds a face.

In the ROBOTC code, I also make use of the sound sensor to start and stop the robot.  A loud sound is used to toggle between start and stop.

The ROBOTC and iOS code is very simple.

ROBOTC code

(Code subject to change. Download the latest version of the code!)


#pragma config(Sensor, S1,     lightSensor,         sensorLightInactive)
#pragma config(Sensor, S2,     soundSensor,         sensorSoundDB)
#pragma config(Motor,  motorA,          mA,            tmotorNormal, PIDControl, encoder)

task main()
{
  wait1Msec(50);                          // wait 50 ms for the light sensor to initialize

  /* Debug loop for watching raw light readings:
  float x;
  while (1)
    x = SensorValue[lightSensor];
  */

  float minLight, maxLight, d, a, c, v, alpha = 0.01, stopGo = 0.0;
  int l, sound, startMotors = 0, lostFace, faceFound = 0;

  a = 0.60;                               // steering gain
  minLight = 9;                           // darkest expected screen reading
  maxLight = 34;                          // brightest expected screen reading
  lostFace = 5;                           // readings below this mean no face is visible
  v = 20;                                 // base forward speed

  c = (minLight + maxLight) / 2.0;        // center of the brightness range

  while (1)
  {
    // A loud sound toggles the motors between start and stop.
    sound = SensorValue[soundSensor];
    if (sound > 85)
    {
      startMotors++;
      startMotors %= 2;
      wait10Msec(50);                     // debounce for 500 ms
    }

    // Screen brightness encodes the face's horizontal position.
    l = SensorValue[lightSensor];
    d = a * (l - c);                      // steering term

    // A near-black screen means the face was lost.
    faceFound = (l > lostFace) ? 1 : 0;

    // Low-pass filter so the robot ramps smoothly between spinning and driving.
    stopGo = alpha * faceFound + (1 - alpha) * stopGo;

    motor[motorB] = (-d + v * stopGo) * startMotors;
    motor[motorC] = ( d + v * stopGo) * startMotors;
  }
}

Written by ramin

January 9th, 2012 at 8:58 am

Posted in Cool projects, NXT

Line tracking and book climbing NXT robot

without comments

Here’s a video that a ROBOTC user shared with us. The NXT robot is able to track a line and also climb over a book that sits along the path. Take a look:

 

Written by Vu Nguyen

December 8th, 2011 at 3:00 pm

Posted in Cool projects, NXT

LEGO Street View Car v2.0

with one comment

Thanks to Mark over at www.mastincrosbie.com for creating this incredible project and providing the information. Also thanks to Xander for providing the community with drivers to use the mentioned sensors in ROBOTC.

You might remember the original Lego Street View Car I built in April. It was very popular at the Google Zeitgeist event earlier this year.

I wanted to rebuild the car to use only the LEGO Mindstorms NXT motors. I was also keen to make it look more… car-like. The result, after 4 months of experimentation, is version 2.0 of the Lego Street View Car.

As you can see this version of the car is styled to look realistic. I also decided to use my iPhone to capture images on the car. With iOS 5 the iPhone will upload any photos to PhotoStream so I can access them directly in iPhoto.

The car uses the Dexter Industries dGPS sensor to record the current GPS coordinates.

The KML file that records the path taken by the car is transmitted using the Dexter Industries Wifi sensor once the car is within wireless network range.

Design details

The LEGO Street View Car is controlled manually using a second NXT acting as a Bluetooth remote. The remote control allows me to control the drive speed and steering of the car. I can also brake the car to stop it from colliding with obstacles. Finally, pressing a button on the remote triggers the car to capture an image.

Every time an image is captured, the current latitude and longitude are recorded from the dGPS. The NXT creates a KML-format file in the flash filesystem, which is then uploaded from the NXT to a PC. Opening the KML file in Google Earth shows the path that the car drove, with a placemark for every picture taken along the way. Click on a placemark to see the picture.

For each GPS coordinate I create a KML Placemark entry that embeds descriptive HTML code using the CDATA tag. The image link in the HTML refers to the last image captured on disk.

The images are captured by triggering the camera on my iPhone. I use an app called SoundSnap which triggers the camera when a loud sound is heard by the phone. By placing the iPhone over the NXT speaker, I can trigger the iPhone camera by playing a loud tone on the NXT. While this is not ideal (Bluetooth would be better), it does the job for now.
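
The tone trigger itself is simple; here's a minimal ROBOTC sketch of the idea (an editorial illustration, not Mark's code: the frequency, timing, and button mapping are guesses):

// Sketch only: play a loud tone so the SoundSnap app on the iPhone
// (resting on the NXT speaker) snaps a photo.
void triggerCamera()
{
  PlayTone(880, 50);     // 880 Hz for 500 ms (PlayTone durations are 10 ms ticks)
  wait1Msec(700);        // give the phone a moment to take the picture
}

task main()
{
  // For demonstration, take one snapshot per press of the orange button.
  while (true)
  {
    while (nNxtButtonPressed != 3)      // 3 = the orange Enter button
      wait1Msec(10);
    triggerCamera();
    while (nNxtButtonPressed == 3)      // wait for the button to be released
      wait1Msec(10);
  }
}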

To get the photos from the iPhone I use the PhotoStream feature in iOS 5. I select the pictures in iPhoto and export them to my laptop. The iPhone will only upload photos when I am in range of a wireless network.

Finally the Dexter Industries Wifi sensor is used to wirelessly transmit the KML file to my laptop over the wireless network.


<Placemark>
  <name>LSVC Snapshot 1</name>
  <description><![CDATA[<img src='Images/IMG_1.jpg' width=640 height=480>]]></description>
  <Point>
    <coordinates>-6.185952, 53.446190, 0</coordinates>
  </Point>
</Placemark>

<Placemark>
  <name>LSVC Snapshot 2</name>
  <description><![CDATA[<img src='Images/IMG_2.jpg' width=640 height=480>]]></description>
  <Point>
    <coordinates>-6.185952, 53.446190, 0</coordinates>
  </Point>
</Placemark>

This snippet from the KML file gives you an idea of what each placemark looks like.

Once the car has finished driving, press the orange button on the NXT to save the KML file. This writes a <LineString> entry which records the actual path of the car. A path string is simply a list of coordinates that defines a path in Google Earth along the Earth’s surface. For example:


<Placemark>
  <name>LSVC Path</name>
  <description>LSVC Path</description>
  <styleUrl>#yellowLineGreenPoly</styleUrl>
  <LineString>
    <extrude>10</extrude>
    <tessellate>10</tessellate>
    <altitudeMode>clampToGround</altitudeMode>
    <coordinates>
      -6.185952, 53.446190, 0
      -6.185952, 53.446180, 0
    </coordinates>
  </LineString>
</Placemark>

This is a path between two coordinates not far from where I live.

From the NXT to Google Earth

How do we get the pictures and KML file from the NXT and into Google Earth? First of all we need to get all the data in one place. The KML file refers to the relative path of each image, so we can package the KML file and the images into a single directory.

An example of the output produced is shown below. In this test case I started indoors in my house and took a few pictures. As you can see, the dGPS has trouble getting an accurate reading indoors, so the pictures appear to be scattered around the map. I then drove the car outside and started to capture pictures as I drove. From Snapshot 10 onwards, the image locations become more realistic, matching where the car actually was.

Video

I shot some video of the car driving outside my house. It was a windy dull day, so the video is a little dark. The fun part is seeing the view from on-board the car!

More videos are coming soon…

Photos


Written by Vu Nguyen

November 14th, 2011 at 1:11 pm