Monthly Technical Report

April-May 1999


We have combined the technical reports for April and May 1999. We did not send a report for April 1999 since the deadline for sending it was shortly before the Third Quarterly IPR Meeting.

Dr. Nahid Sidki and Mr. Joe Urda visited our lab on April 9. During this visit, we introduced the members of our group to them, described the experimental set-up we have assembled in our lab, and demonstrated some of our software running on the robots, as well as other algorithms running in simulation.

Based on a request made by Col. John Blitch during a site visit to our lab on March 17, we prepared a video that demonstrates:

  1. the range data gathered by the SICK laser as the robot moves,
  2. the different trade-offs in the next-best view algorithm for map-building,
  3. how our target-finding algorithm in 2D adapts to different properties of the sensor (specifically, omnidirectional sensors versus sensors with cone vision), and
  4. our target-finding algorithm for an aerial observer.

We have given a copy of this video to Dr. Sidki.

We also have responded to concerns expressed by Col. John Blitch by writing a memo describing how we address the issue of adapting our algorithms to perform robustly and efficiently in three dimensions.

We attended the Third Quarterly IPR Meeting held in Detroit, MI, from May 10-13, 1999. We presented the milestones we achieved in the last quarter. During the demonstration at TARDEC on May 12, we showed our next-best-view and art gallery algorithms running in simulation. Unfortunately, we were unable to show a robotic demonstration over the Internet, since the TARDEC Internet connection went down just before the demonstrations started. However, we had successfully run our demo shortly before the connection failed.

In the last two months, we have achieved progress on several fronts.

We have set up initial contacts for collaboration with SRI, SAIC, and JPL. In particular, we are combining SRI's expertise in perception with our expertise in planning, in the context of map-building and visual tracking. With SAIC, we have started initial discussions on sharing our software for target finding and target tracking.

We have completed the implementation of our next-best-view technique for building 2D maps. As we demonstrated during the Third Quarterly IPR Meeting, this algorithm computes a small set of positions from which the robot needs to observe the environment in order to build a complete map. The algorithm has the useful property of being sensor-independent: it adapts its performance to the specific sensor properties that are given to it as input. We have experimentally demonstrated this adaptivity of the planner.
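
As a rough sketch (not the project's actual implementation), the greedy selection at the heart of such a next-best-view planner might look as follows. The grid, candidate poses, and the `omni_sensor` model are all illustrative assumptions; the point is that the sensor model is passed in as a parameter, which is what makes the planner sensor-independent.

```python
# Hypothetical greedy next-best-view selection on a 2D occupancy grid.
# The sensor model is an input, illustrating sensor independence.

def visible_cells(pose, grid, sensor_model):
    """Return the set of grid cells the sensor would observe from `pose`."""
    return {cell for cell in grid if sensor_model(pose, cell)}

def next_best_view(candidates, grid, known, sensor_model):
    """Pick the candidate pose that reveals the most unknown cells."""
    def gain(pose):
        return len(visible_cells(pose, grid, sensor_model) - known)
    return max(candidates, key=gain)

# Illustrative sensor: omnidirectional with limited range.
def omni_sensor(pose, cell, max_range=2.0):
    (px, py), (cx, cy) = pose, cell
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5 <= max_range

grid = {(x, y) for x in range(5) for y in range(5)}
known = visible_cells((0, 0), grid, omni_sensor)          # cells seen so far
pose = next_best_view([(0, 4), (4, 4), (2, 2)], grid, known, omni_sensor)
```

Swapping `omni_sensor` for a model with a conical field of view changes which poses are chosen without touching the planner itself.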

We have also made significant progress on target-tracking. On top of a basic target-tracking algorithm running on our SuperScout robots equipped with pan-tilt cameras, we have implemented a more sophisticated target-tracking planner that uses the model of the environment constructed in the map-building phase in order to track the target more robustly. Our algorithm computes how quickly the target can move out of the visibility region of the tracker and moves the tracker in such a way as to maximize the escape time of the target. We are now engaged in porting this algorithm to the SuperScout.
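
To illustrate the escape-time idea in simplified form (this is a sketch under strong assumptions, not our planner): if the visibility region is approximated by a disk of fixed radius around the tracker, the escape time is the distance from the target to the region's boundary divided by the target's maximum speed, and the tracker greedily picks the motion that maximizes it. The real planner instead uses the visibility region induced by the map built earlier.

```python
# Hypothetical one-step greedy escape-time maximization.
# Visibility region simplified to a disk of radius `radius` around the
# tracker; all names and parameters are illustrative.

def escape_time(tracker, target, radius, target_speed=1.0):
    """Time the target needs to leave the tracker's visibility disk."""
    dx, dy = target[0] - tracker[0], target[1] - tracker[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return max(radius - dist, 0.0) / target_speed

def best_tracker_move(tracker, target, moves, radius):
    """Choose the tracker motion that maximizes the target's escape time."""
    return max(moves, key=lambda m: escape_time(
        (tracker[0] + m[0], tracker[1] + m[1]), target, radius))
```

For example, with the tracker at the origin, the target at (3, 0), and candidate moves of unit steps, the planner moves toward the target, since that pushes the target farther from the visibility boundary.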

We have acquired a new SICK laser sensor. This sensor can be used in several ways:

  1. mounted horizontally on a second SuperScout robot, it will allow us to perform multi-robot map-building experiments;
  2. mounted vertically on a SuperScout, it will allow us to build 3D models of an environment; and
  3. mounted horizontally on the same SuperScout as the first laser sensor, it will make it possible to build 2D maps in two different horizontal "slices" of the environment.
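
For the second configuration, the geometry is simple enough to sketch: each beam of a vertically mounted scanner, combined with the robot's 2D pose, yields one 3D point, so driving the robot sweeps out a 3D model. The function below is an illustrative sketch, not our actual software; the pose convention and `sensor_height` are assumptions.

```python
import math

def scan_to_3d(pose, scan, sensor_height=0.5):
    """Convert one vertical laser scan into world-frame 3D points.

    pose = (x, y, heading) of the robot in the plane;
    scan = list of (elevation_angle, range) beam readings, with the
    scan plane assumed to contain the robot's heading direction.
    """
    x, y, th = pose
    points = []
    for phi, r in scan:
        fwd = r * math.cos(phi)                 # distance along heading
        z = sensor_height + r * math.sin(phi)   # height above the floor
        points.append((x + fwd * math.cos(th),
                       y + fwd * math.sin(th),
                       z))
    return points
```

Accumulating these points over many poses as the robot moves produces the 3D model; accuracy then depends mainly on how well the robot's pose is estimated.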

In mid-June, one undergraduate student and two master's students will join our TMR group. They will work on our TMR project for the summer quarter, and we plan to have them take on some of the programming tasks that remain to be completed in the TMR project.