Implementation of Image Processing System using Handover Technique with Map Reduce Based on Big Data in the Cloud Environment

Mehraj Ali and John Kumar

Department of Computer Application, Thiagarajar College of Engineering, India


Abstract: Cloud computing is one of the emerging techniques for processing big data and is also known as service on demand. Big data refers to data sets that are too large or complex to handle with conventional tools. Processing big data, such as MRI and DICOM images, normally takes considerable time. Demanding tasks such as handling big data can be addressed using the concepts of Hadoop, and enhancing Hadoop helps the user process large sets of images. The Hadoop Distributed File System (HDFS) and MapReduce are the two main components used in this enhancement. HDFS is Hadoop's file storage system, used for storing and retrieving data. MapReduce is the combination of two functions, map and reduce: map splits the input, and reduce integrates the outputs of the map phase. Recently, medical experts have experienced problems such as machine failure and fault tolerance while processing results for scanned data. A unique optimized time-scheduling algorithm, called the Dynamic Handover Reduce Function (DHRF) algorithm, is introduced in the reduce function. Enhancing Hadoop and the cloud environment and introducing DHRF help overcome these processing risks, producing an optimized result with less waiting time and a lower error percentage in the output image.
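The abstract describes the generic MapReduce pattern: map splits the input into intermediate pairs, and reduce integrates the mapped output. A minimal sketch of that pattern in plain Python (an illustration of the general map/reduce flow, not the paper's DHRF implementation or Hadoop's API) might look like:

```python
from collections import defaultdict

def map_phase(records):
    """Map: split each input record into (key, value) pairs."""
    for record in records:
        for token in record.split():
            yield (token, 1)

def reduce_phase(pairs):
    """Reduce: integrate the mapped pairs by key."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# Toy input standing in for split image metadata records.
records = ["scan one", "scan two", "scan one"]
counts = reduce_phase(map_phase(records))
# counts == {"scan": 3, "one": 2, "two": 1}
```

In Hadoop, the framework handles the shuffle between the two phases and runs many map and reduce tasks in parallel across the cluster; the sketch above collapses that into two sequential function calls.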

Keywords: Cloud computing, big data, HDFS, MapReduce, DHRF algorithm.

Received September 13, 2013; accepted March 20, 2014
