
Title: Object Detection and Image Recovery Techniques: A Review


Author Name:
Upasana Sharma, Dr. Vineet Richariya
Abstract: Human gestures are expressive, meaningful body motions involving physical movements of the fingers, hands, arms, head, or body, with the intent to convey information or to interact with the environment. With the rapid development of computer technology, human-computer interaction has become a ubiquitous activity in daily life, and much attention has been focused in recent years on translating human gestures into a computer-understandable form. Many gesture tracking and recognition technologies have been proposed, and they fall broadly into two categories: vision-based detection and inertial-sensor-based detection. Vision-based detection is the traditional method for gesture recognition, but in certain applications it is inconvenient to use, for example when the user is moving in the dark or when continuous monitoring over a long period is required. Nowadays, with the falling prices of inertial sensor systems and the rapid development of wireless sensor networks, it is increasingly convenient to use inertial sensors to detect human activities under such conditions, although they remain somewhat expensive. Image registration has been an active research field in image processing and still faces several challenges. Image registration can be described as the process of geometrically aligning two images: the reference image and the sensed image. Most of the applications above require high efficiency and accuracy; for instance, a registration accuracy of better than one-fifth of a pixel is required to keep the change-detection error below 10%. However, most existing methods struggle to satisfy this registration accuracy, which is why this area still holds promising research directions.
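As a concrete illustration of the registration step described above, the sketch below estimates the translational offset between a reference image and a sensed image using phase correlation, one classical frequency-domain registration technique. This is a minimal sketch, not the method surveyed in the paper: it assumes a purely circular translational shift and recovers only integer-pixel offsets (sub-pixel accuracy, as discussed above, would additionally require interpolating around the correlation peak). The function name and the test images are illustrative.

```python
import numpy as np

def register_translation(reference, sensed):
    """Estimate the (row, col) shift of `sensed` relative to `reference`
    by locating the peak of the normalized cross-power spectrum."""
    R = np.fft.fft2(reference)
    S = np.fft.fft2(sensed)
    # Cross-power spectrum, normalized to keep only phase information.
    cross_power = np.conj(R) * S
    cross_power /= np.abs(cross_power) + 1e-12  # avoid division by zero
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Wrap shifts larger than half the image size to negative offsets.
    shifts = np.array(peak, dtype=float)
    for axis, size in enumerate(correlation.shape):
        if shifts[axis] > size // 2:
            shifts[axis] -= size
    return shifts

# Tiny demonstration: circularly shift a random image and recover the offset.
rng = np.random.default_rng(0)
reference = rng.random((64, 64))
sensed = np.roll(reference, shift=(3, -5), axis=(0, 1))
print(register_translation(reference, sensed))  # recovers the (3, -5) shift
```

Because the demonstration uses an exact circular shift, the correlation surface is a clean peak and the offset is recovered exactly; on real sensed imagery, noise, rotation, and intensity differences make the problem considerably harder, which is what motivates the accuracy discussion above.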