
Information Fusion of Multi-Sensor Images

Author(s): Yu-Jin Zhang (Tsinghua University, Beijing, China)
Copyright: 2009
Pages: 7
Source title: Encyclopedia of Information Science and Technology, Second Edition
Source Author(s)/Editor(s): Mehdi Khosrow-Pour, D.B.A. (Information Resources Management Association, USA)
DOI: 10.4018/978-1-60566-026-4.ch307


Abstract

Human perception of the outside world results from the joint action of the brain and many sense organs. For example, the intelligent robots currently under investigation can carry many sensors for vision, hearing, taste, smell, touch, pain, heat, force, slide, and proximity (Luo, 2002). All of these sensors provide different profiles of information about the same scene in the same environment. Suitable techniques are therefore required to coordinate the various sensors and combine the information they obtain; this is the subject of the theories and methods of multi-sensor fusion. Multi-sensor information fusion is a basic ability of human beings. A single sensor can provide only incomplete, inaccurate, vague, or uncertain information, and sometimes the information obtained by different sensors can even be contradictory. Human beings have the ability to combine the information obtained by different organs and then make estimates and decisions about the environment and events. Using a computer to perform multi-sensor information fusion can be regarded as a simulation of how the human brain treats complex problems. Multi-sensor information fusion operates on data coming from various sensors to obtain results that are more comprehensive, accurate, and robust than those obtained from any single sensor. Fusion can be defined as the process of jointly treating data acquired from multiple sensors, as well as sorting, optimizing, and conforming these data, in order to increase the ability to extract information and to improve decision capability. Fusion can extend the coverage of information in space and time, reduce fuzziness, increase the reliability of decision making, and improve the robustness of systems. Image fusion is a particular type of multi-sensor fusion that takes images as its operating objects. In the more general sense of image engineering (Zhang, 2006), the combination of multi-resolution images can also be counted as a fusion process. In this article, however, the emphasis is put on the information fusion of multi-sensor images.
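As a minimal illustration of the pixel-level fusion idea described above, the sketch below combines two registered, same-size single-band images with a weighted-average rule and a max-selection rule. The function names, the 0.5 weight, and the synthetic "visible" and "infrared" inputs are assumptions made for illustration only; they are not the specific methods discussed in the chapter.

import numpy as np

def fuse_weighted_average(img_a, img_b, alpha=0.5):
    """Pixel-level fusion of two registered, same-size grayscale images
    by weighted averaging (illustrative rule, not the chapter's method)."""
    a = img_a.astype(np.float64)
    b = img_b.astype(np.float64)
    fused = alpha * a + (1.0 - alpha) * b
    return np.clip(fused, 0, 255).astype(np.uint8)

def fuse_max_select(img_a, img_b):
    """Alternative rule: keep, at each pixel, the larger response,
    i.e. favour whichever sensor gives the stronger local signal."""
    return np.maximum(img_a, img_b)

if __name__ == "__main__":
    # Two synthetic 'sensor' images standing in for, say, a visible-light
    # and an infrared capture of the same scene (random data for demonstration).
    rng = np.random.default_rng(0)
    visible = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    infrared = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

    print(fuse_weighted_average(visible, infrared).shape)  # (64, 64)
    print(fuse_max_select(visible, infrared).dtype)        # uint8

In practice the combination rule is chosen per application: averaging suppresses sensor noise, while selection rules preserve the strongest response from either source.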
