|The LINUX.COM Article Archive|
|Originally Published: Monday, 15 October 2001||Author: S.A. Hayes, Linux.com|
|Published to: develop_articles/Development Articles||Page: 3/6|
Linux.com Interview: Mark Micire of the Center for Robot-Assisted Search and Rescue at the University of South Florida.
Mark Micire helps build and deploy the robots used to speed the search for survivors in the rubble of the World Trade Center. The CRASAR lab that creates these remarkable robots uses Linux for most of its computing needs. Linux, and the geeks, researchers, and professors who use it, are speeding the development of critical technologies like USAR robotics, technologies that help save lives. Linux.com got in touch with this lab at the University of South Florida to get a closer look at who they are, what they do, and what they do it with.
|The Robots||<< Page 3 of 6 >>|
The Robots
Linux.com: How many kinds of USAR robots are there?
Mark Micire: Many. I wouldn't even try to enumerate the different "kinds" and capabilities. The USF team has three primary ones as described earlier. We were one of four robotics teams in NY, and there were at least five other kinds of robots associated with those teams.
Credit for all photo media in this article goes to the Center for Robot-Assisted Search and Rescue.
Linux.com: Do they all run Linux?
Mark Micire: No. Some don't run anything at all, as in the case of the Inuktuns. Of the ones with processors, I can pretty safely say yes; Linux in some form.
Linux.com: What are the robots capable of, physically and "intellectually"?
Mark Micire: Part 1: Physical
The physical capabilities depend on the robot. We have many. I'll try to summarize the USAR ones:
Inuktun Microtracs: These were our New York work force. Rugged and strong little beasts, but relatively stupid. They have a Motorola HC11 for motor and logistic control. Originally designed for chemical inspection, they are no larger than a shoe box and have a tether to provide power and feedback to the teleoperator. Because of their size and "packability" they were able to crawl into very small spaces. Some of them have the ability to change shape which is very useful in tight spaces. Sensing includes visible-light cameras, two-way audio, and twin halogens.
iRobot Urbans: Although not used in NY, these are used a lot in our training. These have Pentium II class processor boards and are able to perform more complex tasks like image processing and control assistance. They have the ability to climb stairs and are fairly rugged. They can be equipped with a wide range of sensing capabilities including sonar range finding, laser range finding, visible-light cameras, infrared-thermal imaging cameras, and halogen lighting.
iRobot ATRV2: Again, went to NY, but not used. This is the "mother ship" for our three Urbans. It has off road knobby wheels and is very powerful. With four deep-cycle marine batteries and a geared motor for each wheel, it can run for hours. It has the capability to carry three of the Urbans into a hazardous site and deploy them with a large docking arm. Again, Pentium II class processors allow this machine to perform higher level tasks. Because of its large size, just about any sensor can be mounted on this platform. All of the above mentioned sensors have been on this system at one time or another.
Part 2: Intellectual
As for "intellectual" abilities, we have worked on many projects. Our main focus in USAR has been in victim detection and sensor fusion. Through our training we found that one of the hardest tasks is interpreting all of the data coming from the sensors. The AI in our system is used for fusing the various sensors in such a way that the user is alerted to victims that might otherwise have been missed if viewed by a single sensor. Another goal of this fusion is the intelligent selection of sensors given different environmental characteristics or sensor failure.
I will let a simple example illustrate. Let's take one of our Urban platforms and put a color camera and a thermal imaging camera on it. Now, rather than make the operator view two video streams, we fuse the two together. The image processing on the thermal camera attempts to look for something in the image around 98 degrees. The color camera is looking for something flesh colored.
Individually, a detection in either classification might not be a human. Seeing both of these characteristics in the same region, though, is significant. Now repeat this over three or four sensors and you quickly create a very efficient victim-detection system that can alert the operator and let them know that there is possibly something here they should look at.
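The fusion idea Micire describes can be sketched in a few lines of Python. This is a hypothetical illustration, not CRASAR's actual code: each sensor "votes" on image regions (here keyed by made-up labels like "A1"), and only regions flagged by multiple sensors are raised to the operator.

```python
# Hypothetical sketch of multi-sensor victim detection by voting.
# Region labels, temperatures, and color classes are invented for
# illustration; the real system fuses live video streams.

def thermal_hits(thermal_frame, target_f=98.0, tol_f=4.0):
    """Regions whose temperature is near human body temperature (F)."""
    return {region for region, temp_f in thermal_frame.items()
            if abs(temp_f - target_f) <= tol_f}

def color_hits(color_frame, skin_classes=("skin",)):
    """Regions whose dominant color is classified as flesh-toned."""
    return {region for region, color in color_frame.items()
            if color in skin_classes}

def fuse(thermal_frame, color_frame, threshold=2):
    """Alert only on regions flagged by at least `threshold` sensors."""
    votes = {}
    for region in thermal_hits(thermal_frame) | set():
        votes[region] = votes.get(region, 0) + 1
    for region in color_hits(color_frame):
        votes[region] = votes.get(region, 0) + 1
    return sorted(r for r, v in votes.items() if v >= threshold)

thermal = {"A1": 97.5, "A2": 120.0, "B1": 70.0}   # degrees F per region
color   = {"A1": "skin", "A2": "gray", "B1": "skin"}
print(fuse(thermal, color))  # only A1 is both warm and flesh-toned
```

Adding a third or fourth sensor is just another voting pass over `votes`, which is why the approach scales to the "three or four sensors" Micire mentions.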
Linux.com: Wow! So, what's the complete sensor set?
Mark Micire: Things we have now:
Color cameras - good old-fashioned color CCD cameras.
Omnidirectional cameras - cameras that can see 360 degrees.
Infrared imaging cameras - cameras that can see heat.
Infrared thermal meters - probes that give relatively accurate "point" temperatures.
Laser range finders - use an infrared laser to give a 180-degree range plane that shows the robot's proximity to things in the environment.
Sonar range finders - use ultrasonic pulses to give a point distance to objects in the environment.
Two-way audio - speakers and microphones to communicate and listen for victims.
GPS - global positioning system.
Tilt sensor - gives pitch and roll characteristics.
Thermometer - temperature inside the robot.
Compass - magnetic compass.
Possibilities in the near future:
Air quality meters - to detect flammables, oxygen levels, etc.
Bio-stat sensors - to give victim health status.
Penetrating microwave radar - to look through walls and floors for movement and even heartbeats.
Linux.com: For those who haven't seen them yet, how do the robots move, for the most part? Did you try any alternate movement designs that might be interesting to mention?
Mark Micire: All of our USAR robots are tracked vehicles with the exception of the ATRV2. All use tracked steering. Because we do not actually manufacture the robots here at USF, we have tried to find what similar industries we might be able to exploit. As I said before, the Inuktuns were designed for chemical inspection, such as power plant pipes and tanks. By taking that system and adapting it for our own needs, we have been able to take commercial robots and turn them into specialized USAR robots.
Linux.com: What are the robots best at in USAR at the moment?
Mark Micire: In NY, our niche was very small voids in the pile. Two things contributed to this. First, some of these holes were so deep that you could not see to the bottom but were too small for even a search dog to get into. Our little Inuktuns filled this need very quickly. Second, the heat in the bottom of some of these holes was very intense. Even if you could get a dog or person in the hole, you would not want to expose anyone to that kind of heat. In one case, one of the tracks was heated enough that it fell off the robot while in the hole.
Long term, sensing will be the big winner for robots. By adding AI into the sensing capabilities of the robots, we can make victim sensing much easier to use and more accurate at the same time.
Linux.com: Do the robots move rubble and debris intelligently yet?
Mark Micire: Right now, our robots are primarily tele-operated in rubble. We have done some "guarded motion" work that uses the range sensing capabilities of the robot to help the operator keep from running into walls and debris.
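The "guarded motion" idea above can be sketched as a simple speed governor: the operator's commanded speed passes through unchanged when the way ahead is clear, ramps down as range sensors report a closer obstacle, and is refused entirely at a stop distance. This is a hypothetical illustration of the technique, not CRASAR's implementation; the distances and function names are invented.

```python
# Hypothetical "guarded motion" sketch: clamp the teleoperator's
# commanded forward speed based on the closest range reading.

def guarded_speed(commanded, min_range_m, stop_m=0.3, slow_m=1.5):
    """Return a safe forward speed given the nearest obstacle distance.

    commanded   -- operator's requested speed (m/s)
    min_range_m -- closest reading from sonar/laser range finders (m)
    stop_m      -- refuse forward motion inside this distance
    slow_m      -- obey the operator fully beyond this distance
    """
    if min_range_m <= stop_m:
        return 0.0            # too close: block forward motion
    if min_range_m >= slow_m:
        return commanded      # clear ahead: obey the operator
    # Linearly ramp speed down between slow_m and stop_m.
    scale = (min_range_m - stop_m) / (slow_m - stop_m)
    return commanded * scale

print(guarded_speed(1.0, 2.0))   # clear path: full speed
print(guarded_speed(1.0, 0.9))   # obstacle nearby: half speed
print(guarded_speed(1.0, 0.2))   # obstacle too close: stopped
```

The operator still steers; the governor only subtracts speed, which is why this style of assistance works well for teleoperation in rubble where full autonomy isn't yet possible.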
Linux.com: What might these kinds of robots be able to do 5 or 10 years from now?
Mark Micire: All of the above with regards to sensing, and hopefully more. There is a lot that needs to be done mechanically before the robots can navigate the rubble piles autonomously. Once that hurdle is cleared, the potential is mind-boggling. In 10 years I hope to be reading about the first human rescued by a robot.