Cyber Robotics Lab.



Director: Prof. Seinosuke Narita

Our laboratory conducts research on tele-robotics and on the analysis of spatial structure for robot path planning. We have developed a virtual robot simulator for empirical research on tele-robotics and have proposed a method for analyzing spatial structure from scene images taken from a monocular viewpoint.






 



Figures: Virtual WABIAN, view from the robot (top), and walking simulation (right)

Virtual robot simulator



We have developed a software test bed for tele-robotics that takes into account delays in remote control and communication overheads. Using the actual robot WABIAN, we carried out an empirical study of tele-robotics.

A remote PC is connected over a LAN to the host that runs the robot in a virtual space, so the robot's behavior can be observed on the remote system. Moreover, the robot can be controlled remotely over the Internet from anywhere, provided that DirectX and the software for operating the robot are installed on the terminal. In addition, the communication delay, including its random fluctuations, can be set freely in the simulator. This makes it possible to estimate the maximum delay under which the virtual robot can be operated from a remote site without a sense of incongruity.
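As a rough illustration of how such a variable delay might be modeled, the following C sketch holds each remote command back by a base latency plus a random jitter before applying it to the virtual robot. The names and timing values here are hypothetical and are not taken from the actual simulator.

    /* Minimal sketch (hypothetical, not the simulator's code): delay each
       remote command by a base latency plus random jitter before applying it. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>
    #include <unistd.h>   /* usleep() on POSIX systems */

    #define BASE_DELAY_MS 100   /* assumed fixed network latency */
    #define JITTER_MS      80   /* assumed maximum random fluctuation */

    static void apply_command(const char *cmd)
    {
        /* Placeholder for forwarding the command to the virtual robot. */
        printf("applying command: %s\n", cmd);
    }

    static void delayed_apply(const char *cmd)
    {
        int jitter   = rand() % (JITTER_MS + 1);   /* random change in delay */
        int delay_ms = BASE_DELAY_MS + jitter;     /* total simulated delay  */
        usleep((useconds_t)delay_ms * 1000);       /* hold the command back  */
        apply_command(cmd);
    }

    int main(void)
    {
        srand((unsigned)time(NULL));
        const char *commands[] = { "step_forward", "turn_left", "stop" };
        for (int i = 0; i < 3; i++)
            delayed_apply(commands[i]);
        return 0;
    }

Sweeping BASE_DELAY_MS and JITTER_MS in this way corresponds to varying the delay settings of the simulator in order to find the largest delay that operators can tolerate without a sense of incongruity.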







 



Figures: original image, extracted edges, detected parallel lines, and the resulting environmental map


Analysis of spatial structure from a scene image



We are researching the analysis of spatial structure from a scene image taken from a monocular viewpoint, aiming at the realization of an autonomous mobile robot. In space recognition by stereo vision, corresponding points are detected between two images captured by two or more cameras, the parallax is analyzed, and distances are measured. In the monocular case, however, neither corresponding points nor parallax is available, so we use line information along the edges (outlines) of objects in the planar image. We then analyze the space by exploiting the property of the "vanishing point" (disappearance point): the extensions of two lines that are parallel in three-dimensional space always intersect at a single point when projected onto a plane.
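As a small numerical illustration of this property, the C fragment below (a sketch, not the code used in our system) computes the intersection of the extensions of two image-plane segments; for segments that are projections of parallel three-dimensional lines, this intersection is the vanishing point. The example coordinates are hypothetical.

    /* Sketch: intersection of the extensions of two 2-D lines, each given by
       two points. For projections of parallel 3-D lines, this intersection is
       the vanishing (disappearance) point. Example values are hypothetical. */
    #include <stdio.h>
    #include <math.h>

    typedef struct { double x, y; } Point;

    /* Returns 1 and writes the intersection to *out, or 0 if the two image
       lines are (numerically) parallel and have no finite intersection. */
    static int line_intersection(Point a1, Point a2, Point b1, Point b2, Point *out)
    {
        double d1x = a2.x - a1.x, d1y = a2.y - a1.y;   /* direction of line A */
        double d2x = b2.x - b1.x, d2y = b2.y - b1.y;   /* direction of line B */
        double denom = d1x * d2y - d1y * d2x;          /* 2-D cross product   */
        if (fabs(denom) < 1e-9)
            return 0;
        double t = ((b1.x - a1.x) * d2y - (b1.y - a1.y) * d2x) / denom;
        out->x = a1.x + t * d1x;
        out->y = a1.y + t * d1y;
        return 1;
    }

    int main(void)
    {
        /* Two segments that might come from the left and right floor edges. */
        Point p;
        if (line_intersection((Point){0, 440}, (Point){160, 280},
                              (Point){640, 440}, (Point){480, 280}, &p))
            printf("vanishing point: (%.1f, %.1f)\n", p.x, p.y);   /* (320, 120) */
        return 0;
    }

Segments whose extensions pass near a common intersection point of this kind are grouped together, which is how the parallel lines are detected in the processing described below.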

First, a Sobel filter is applied to the original image to extract the edges. Next, the extracted line segments are emphasized, supplemented, and merged, and spurious segments are removed. Segments whose extensions intersect at a single point (the vanishing point) are then detected as parallel lines. Finally, spatial components such as the floor, ceiling, walls, and corners are identified from the intersections and contact points of these parallel lines, and an environmental map is created.
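The edge-extraction step can be sketched roughly as follows, assuming an 8-bit grayscale image stored row by row; this is an illustrative fragment, not the actual program used in the experiments.

    /* Sketch of the edge-extraction step: 3x3 Sobel gradient magnitude on an
       8-bit grayscale image stored row-major. Illustrative only. */
    #include <math.h>

    void sobel_edges(const unsigned char *in, unsigned char *out, int w, int h)
    {
        /* Horizontal and vertical Sobel kernels. */
        static const int kx[3][3] = { {-1, 0, 1}, {-2, 0, 2}, {-1, 0, 1} };
        static const int ky[3][3] = { {-1, -2, -1}, { 0, 0, 0}, { 1, 2, 1} };

        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                int gx = 0, gy = 0;
                for (int j = -1; j <= 1; j++)
                    for (int i = -1; i <= 1; i++) {
                        int p = in[(y + j) * w + (x + i)];
                        gx += kx[j + 1][i + 1] * p;
                        gy += ky[j + 1][i + 1] * p;
                    }
                int mag = (int)sqrt((double)(gx * gx + gy * gy));
                out[y * w + x] = (unsigned char)(mag > 255 ? 255 : mag);
            }
        }
    }

A threshold on the resulting gradient magnitudes would then give the edge points from which line segments are traced and merged in the subsequent steps.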

The processing described above was implemented in the C language, and we ran experiments on a computer using 40 images of a room taken beforehand. As a result, we confirmed that absolute coordinates, such as those used in strict spatial analysis by stereo vision, are not necessary for this technique, and that it is effective for estimating the range of passable paths in rooms where many parallel lines exist.












Copyright by Humanoid Robotics Institute, Waseda University. All rights reserved.