We are now performing three-dimensional scans with the Sick laser mounted vertically on our SuperScout robot. The scanner runs continuously while the robot performs a 360-degree rotation; a full scan at half-degree resolution in both dimensions takes 30 seconds. As with the 2D scanner, we developed a "daemon" that provides a transparent TCP/IP interface for any client that wants to obtain scan results. The results can be sent either as a cloud of points or, to compress the data, as a set of vertical polygonal lines. Pictures of the results obtained in both formats can be found at:
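For illustration, the vertical-polyline compression can be thought of as a standard polyline simplification over one scan column. The sketch below uses a generic Ramer-Douglas-Peucker pass; the tolerance and the (height, range) point format are assumptions for the example, not our daemon's actual encoding:

```python
import math

def rdp(points, eps):
    """Ramer-Douglas-Peucker polyline simplification: keep only the points
    needed to stay within eps of the original column of readings."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1.0
    # find the interior point farthest from the chord joining the endpoints
    dmax, imax = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / norm
        if d > dmax:
            dmax, imax = d, i
    if dmax <= eps:
        return [points[0], points[-1]]
    # otherwise split at the farthest point and recurse on both halves
    left = rdp(points[:imax + 1], eps)
    right = rdp(points[imax:], eps)
    return left[:-1] + right

# one vertical scan column as (height, range) pairs: a flat wall, then a corner
column = [(0.0, 2.0), (0.5, 2.0), (1.0, 2.0), (1.5, 1.5), (2.0, 1.0)]
print(rdp(column, 0.05))  # -> [(0.0, 2.0), (1.0, 2.0), (2.0, 1.0)]
```

Five raw readings compress to three vertices while preserving the corner, which is the kind of saving the polygonal-line format provides over the raw point cloud.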
Because the laser is not mounted exactly at the center of the robot, we need to carefully calibrate its position; otherwise, map stitching cannot work properly. We performed this calibration for the 2D setup and are currently doing it for the 3D setup.
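To illustrate why this calibration matters, the sketch below maps a range/bearing reading from the laser's own frame into the robot frame given a calibrated mounting offset; the function name and the example offsets are hypothetical, not our actual calibration code:

```python
import math

def laser_to_robot(r, bearing, dx, dy, dtheta):
    """Transform one range/bearing reading from the laser frame into the
    robot frame, given the laser's mounting offset (dx, dy) and mounting
    angle dtheta. Illustrative sketch; names are not the project's API."""
    # point in the laser's own frame
    xl = r * math.cos(bearing)
    yl = r * math.sin(bearing)
    # rotate by the mounting angle, then translate by the mounting offset
    xr = dx + xl * math.cos(dtheta) - yl * math.sin(dtheta)
    yr = dy + xl * math.sin(dtheta) + yl * math.cos(dtheta)
    return xr, yr

# a laser mounted 10 cm ahead of the robot center shifts every reading
print(laser_to_robot(1.0, 0.0, 0.1, 0.0, 0.0))  # -> (1.1, 0.0)
```

With an uncalibrated offset, every scan point is shifted by this unmodeled transform, and scans taken at different robot headings no longer align, which is exactly what breaks the stitching.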
We can now stitch maps of parallel lines more accurately by using corners in the overlaid maps. We have integrated our line-fitting code into the map-stitching algorithm. In doing so, we discovered that the SRI data we had been working with has a bias incompatible with how the laser scanner actually works: there should be only radial noise and no angular noise. Because our line-fitting code makes assumptions about the structure of the noise, we were obtaining suboptimal results on the SRI data. We have therefore been building our own data bank on which to test the map-stitching algorithm, and line fitting now works as expected.
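The radial-noise assumption can be made concrete with a small simulation: the bearing of each return is exact (it is fixed by the scanner's mirror position), and only the measured range is perturbed. This is an illustrative model with assumed parameters, not our line-fitting code:

```python
import math
import random

def noisy_reading(true_range, bearing, sigma_r, rng=None):
    """Simulate one laser return under the radial-noise model: the bearing
    is exact, and only the range carries Gaussian noise of std sigma_r.
    Illustrative only; sigma_r is an assumed, not measured, value."""
    rng = rng or random.Random()
    r = true_range + rng.gauss(0.0, sigma_r)
    return r * math.cos(bearing), r * math.sin(bearing)

# the simulated point lies exactly on the true bearing ray
x, y = noisy_reading(2.0, 0.5, 0.01, random.Random(0))
```

A data set with angular noise violates this model, which is why line fitting tuned to radial-only noise underperformed on the biased SRI data.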
The camera we use for target tracking, a Sony EVI-D30, is equipped with pan/tilt/zoom capabilities. We now take advantage of the pan capability to follow the target even when the observer is not moving; in practice, the camera behaves as though it had a 180-degree vision cone. The camera motion is implemented at the level of the visual target server and adds negligible CPU overhead.
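A proportional pan correction of the kind such a server could apply can be sketched as follows; the field-of-view figure, gain, image width, and linear pixel-to-angle mapping are simplifying assumptions for the example, not the server's actual control law:

```python
def pan_correction(target_x, image_width=640, fov_deg=48.8, gain=0.5):
    """Compute a pan adjustment (degrees) that drives the tracked target
    toward the image center. fov_deg approximates a wide-angle horizontal
    field of view; gain and the linear mapping are assumptions."""
    offset_px = target_x - image_width / 2.0   # signed offset from center
    deg_per_px = fov_deg / image_width         # crude pixel-to-angle scale
    return gain * offset_px * deg_per_px

print(pan_correction(320))  # target centered -> 0.0, no pan needed
```

Repeating this small correction each frame keeps the target near the image center, so the effective field of regard is limited by the pan range rather than by the lens.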
After several setbacks, we finally got the controller to work, at the same time removing the oscillations that occasionally occurred and speeding up its behavior. We are now working on making its motion smoother, so as to reduce the time wasted in rotations.
We found a bug in the software library provided by Nomadic that made the controller consume far more CPU time than necessary (up to 50% of the CPU cycles). With our fix, the controller now uses less than 1% of the CPU cycles. This matters because it means we will not need to add another CPU on board the robot.
We have started working on the problem of two observers and two targets. We have devised several candidate strategies for collaboration between the observers and plan to implement them soon. We have also done system-level work to allow two observers on different machines to communicate with a planner running on a third.
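One way to structure the observer-to-planner messages in such a setup is a simple line-delimited JSON exchange over TCP; the message schema below is a hypothetical sketch for illustration, not our actual protocol:

```python
import json

def encode_sighting(observer_id, target_id, x, y, t):
    """Encode one target sighting as a single newline-terminated JSON
    message for the planner. Field names are illustrative assumptions."""
    msg = {"obs": observer_id, "tgt": target_id, "x": x, "y": y, "t": t}
    return (json.dumps(msg) + "\n").encode("utf-8")

def decode_sighting(line):
    """Decode one message line back into a dictionary on the planner side."""
    return json.loads(line.decode("utf-8"))

# round trip: what an observer sends is what the planner reads
line = encode_sighting("obs1", "tgt2", 1.5, -0.5, 123.0)
print(decode_sighting(line))
```

A line-delimited text protocol keeps the two observers and the planner loosely coupled: each process only needs a socket and a parser, regardless of which machine it runs on.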
We compiled a partial list of deliverables, available at:
This list is being updated as the research proceeds.