FUSION Project

Sensor merging of image, lidar and navigation data to characterize unknown and dynamic environments (SENSOR MERGING)



CLIENTS

BUDGET

€882,000 (50% funding)

PROGRAMME

Research and Development Projects

PROJECT START/END

January 2010 / December 2012

DESCRIPTION

The objective of the project is to develop visual perception techniques for autonomous robots used in aerial, terrestrial or aquatic missions in unknown environments. On the one hand, we aim to develop an optical system that avoids collision risk; on the other, we seek to generate, at the same time, a 3D map of the environment. Image-based systems are an attractive solution given their low cost and weight.

Thus, the project addresses the problem of dense three-dimensional reconstruction from a sequence of images with monocular vision. This 3D reconstruction presents some barriers, which have been overcome by merging this information with that provided by an active optical system (2D lidar) and other localization sensors (GPS and an inertial unit).
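One of those barriers is that monocular reconstruction recovers depth only up to an unknown global scale; sparse lidar ranges can anchor it. The sketch below is illustrative only (function names and the median-ratio strategy are assumptions, not the project's actual algorithm): it estimates the scale factor from a few image points that have both a monocular depth and a lidar range, then rescales the whole monocular depth set.

```python
from statistics import median

# Illustrative sketch: resolving the scale ambiguity of monocular depth
# with sparse lidar ranges. Assumes mono_depths[i] and lidar_ranges[i]
# refer to the same scene point (data association is outside this sketch).

def estimate_scale(mono_depths, lidar_ranges):
    """Median ratio of lidar range to monocular (up-to-scale) depth."""
    return median(l / m for m, l in zip(mono_depths, lidar_ranges))

def rescale(mono_depths, scale):
    """Apply the recovered metric scale to every monocular depth."""
    return [d * scale for d in mono_depths]
```

The median is used rather than the mean so a few bad point matches do not corrupt the estimated scale.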

By developing suitable processing algorithms and data-fusion techniques, it is possible:

  • To characterize the motion of the robot itself (3 angular and 3 linear speed components).
  • To segment scene elements with independent motion (for instance, other robots) and characterize their movements as well.
  • To assess the risk of collision while the vehicle is moving.
  • To generate maps of a given area, including a probabilistic image of site occupancy, by processing cumulative time information.
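The last point, accumulating observations over time into an occupancy probability, is commonly done with a log-odds update per map cell. The following is a minimal sketch of that standard technique, not the project's implementation; the sensor-model probabilities (0.7 for a hit, 0.4 for a miss) and the dictionary cell keys are illustrative assumptions.

```python
import math

# Sensor model in log-odds form: how much one observation shifts our
# belief that a cell is occupied (values here are illustrative).
L_HIT = math.log(0.7 / 0.3)
L_MISS = math.log(0.4 / 0.6)

class OccupancyMap:
    """Probabilistic occupancy map accumulating evidence over time."""

    def __init__(self):
        self.log_odds = {}  # cell key -> accumulated log-odds (0 = unknown)

    def update(self, cell, hit):
        """Fold one observation (hit = obstacle seen) into the cell."""
        self.log_odds[cell] = self.log_odds.get(cell, 0.0) + (L_HIT if hit else L_MISS)

    def probability(self, cell):
        """Occupancy probability recovered from accumulated log-odds."""
        l = self.log_odds.get(cell, 0.0)
        return math.exp(l) / (1.0 + math.exp(l))
```

Working in log-odds makes each time step a simple addition, which is why cumulative time information integrates so cheaply; an octree (as used in the project) would replace the flat dictionary to store cells at multiple resolutions.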

ACTIVITY DEVELOPED BY AINTECH

  • Development of the visual odometry algorithms.
  • Intrinsic and extrinsic sensor calibration.
  • Sensors synchronisation.
  • Probabilistic area occupation maps (octrees).
  • Analysis and characterization of independent robot movements.
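Visual odometry produces the per-frame linear and angular speeds mentioned above; integrating them yields the trajectory needed for mapping. As a hedged illustration (a planar dead-reckoning toy, assuming constant speed within each time step, not the project's 6-component 3D formulation):

```python
import math

# Illustrative sketch: dead-reckoning a planar (x, y, heading) trajectory
# from per-step linear speed v and angular speed omega, as a visual
# odometry pipeline might estimate them frame to frame.

def integrate(start_pose, motions, dt):
    """Accumulate (v, omega) pairs into a list of poses."""
    x, y, theta = start_pose
    path = [(x, y, theta)]
    for v, omega in motions:
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        path.append((x, y, theta))
    return path
```

In the full 3D case the same accumulation runs over 3 linear and 3 angular speed components, with rotation handled by a proper 3D parameterization instead of a single heading angle.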

PHOTOS

Image 1. Sensors and associated coordinate systems

Image 2. a) Scene, and area occupation maps at resolutions of b) 0.1 m, c) 0.2 m and d) 0.4 m, built from the lidar data and sparse optical flow. The occupation map is drawn from a different perspective to show that, despite working with 2D projections of a 3D space, this projection can be reverted using the motion information (see the trees).

Image 3. Detection and characterization of independent movements by means of optical-flow filtering.






Asociación de la Industria Navarra
Ctra. Pamplona, 1 - Edificio AIN | 31191 | Cordovilla | Pamplona | Tel. 948 421 101 | Fax 948 421 100 | email: ain@ain.es