
An Automatic Localization of Optic Disc in Low Resolution Retinal Images by Modified Directional Matched Filter

Murugan Raman1, Reeba Korah2, and Kavitha Tamilselvan3

1College of Engineering Guindy, Anna University, India

2Alliance College of Engineering and Design, Alliance University, India

3New Prince Shri Bhavani College of Engineering and Technology, Anna University, India

Abstract: Automatic optic disc localization in retinal images is used to screen eye-related diseases such as diabetic retinopathy. Many techniques are available to detect the Optic Disc (OD) in high-resolution retinal images; unfortunately, no efficient methods are available to detect the OD in low-resolution retinal images. The objective of this paper is to develop an automated method for localizing the optic disc in low-resolution retinal images. The paper proposes modified directional matched filter parameters of the retinal blood vessels to localize the centre of the optic disc. The proposed method was implemented in MATLAB and evaluated on both normal and abnormal low-resolution retinal images from a subset of the Optic Nerve Head Segmentation Dataset (ONHSD); the average success percentage was 96.96%, with a processing time of 23 seconds.
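
For intuition, the sketch below (a generic directional matched filter, not the paper's modified parameter set) builds Gaussian vessel-profile kernels over a set of orientations and keeps the maximum filter response as a vessel map; the sigma, kernel length, and twelve orientations are assumed values.

```python
# A minimal sketch of a directional matched filter for retinal vessels;
# sigma, length and the 12 orientations are illustrative values only.
import numpy as np
from scipy.ndimage import convolve, rotate

def matched_filter_bank(sigma=2.0, length=9, n_angles=12):
    half = int(3 * sigma)
    x = np.arange(-half, half + 1)
    profile = -np.exp(-x ** 2 / (2 * sigma ** 2))   # Gaussian cross-section
    kernel = np.tile(profile, (length, 1))          # extend along the vessel axis
    kernel -= kernel.mean()                         # zero-mean matched kernel
    return [rotate(kernel, a, reshape=True)
            for a in np.arange(0, 180, 180 / n_angles)]

def vessel_response(img):
    # Maximum response over all orientations approximates a vessel map;
    # the OD centre would then be sought where vessel directions converge.
    return np.max([convolve(img, k) for k in matched_filter_bank()], axis=0)
```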

Keywords: Retinal image processing, diabetic retinopathy, optic disc, blood vessels, modified directional matched filter.

Received May 7, 2015; accepted October 7, 2015 
 

GLCM Based Parallel Texture Segmentation using a Multicore Processor

Shefa Dawwd

Department of Computer Engineering, Mosul University, Iraq

Abstract: This paper investigates the use of the Gray Level Co-occurrence Matrix (GLCM) for supervised texture segmentation. In most texture segmentation methods, the processing algorithm is applied to a window of the original image rather than to the entire image, using a sliding scheme. To attain good segmentation accuracy, especially at boundaries, either an optimal window size is determined or windows of varying sizes are used; both options are very time consuming. Here, a new technique is proposed to build an efficient GLCM-based texture segmentation system. This scheme uses a fixed window of variant apertures, which reduces the computational overhead and resources required to compute the GLCM and improves segmentation accuracy. Image windows are multiplied by a matrix of local operators; the GLCM is then computed, features are extracted and classified, and the segmented image is produced. To reduce segmentation time, two similarity metrics are used to classify texture pixels. The Euclidean metric measures the distance between the current and previous GLCM; only if it exceeds a predefined threshold are the GLCM descriptors recomputed. The Gaussian metric is used as a distance measure between two sets of GLCM descriptors. Furthermore, a median filter is applied to the segmented image. Finally, transition and misclassified regions are refined. The proposed system is parallelized and implemented on a multicore processor.
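
A rough sketch of the skip-recomputation idea is given below, assuming non-overlapping 16x16 windows and scikit-image (version 0.19 or later) GLCM routines; the window size, gray-level count, and skip threshold are placeholders, not the paper's settings.

```python
# Windowed GLCM features, with Haralick descriptors recomputed only when
# the Euclidean distance between successive GLCMs exceeds a threshold.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def window_features(img, win=16, levels=32, skip_thresh=0.05):
    img_q = (img / img.max() * (levels - 1)).astype(np.uint8)
    feats, prev_glcm, prev_feat = [], None, None
    for r in range(0, img_q.shape[0] - win + 1, win):
        for c in range(0, img_q.shape[1] - win + 1, win):
            glcm = graycomatrix(img_q[r:r + win, c:c + win],
                                [1], [0], levels=levels, normed=True)
            if prev_glcm is not None and np.linalg.norm(glcm - prev_glcm) < skip_thresh:
                feat = prev_feat                      # reuse previous descriptors
            else:
                feat = [graycoprops(glcm, p)[0, 0] for p in
                        ('contrast', 'homogeneity', 'energy', 'correlation')]
            feats.append(((r, c), feat))
            prev_glcm, prev_feat = glcm, feat
    return feats
```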

Keywords: GLCM, haralick descriptors, median filter, moving window, texture segmentation.

Received November 13, 2014; accepted September 19, 2016
 

A Real Time Extreme Learning Machine for Software Development Effort Estimation

Kanakasabhapathi Pillai1 and Muthayyan Jeyakumar2

1Department of Electrical and Electronics Engineering, Kalaivanar Nagercoil Sudalaimuthu Krishnan College of Engineering, India

2Department of Computer Applications, Noorul Islam University, India

Abstract: Software development effort estimation remains a challenging task for project managers in the software industry. New techniques are continually applied to estimate effort, and evaluating their accuracy is a major activity given the many methods proposed in the literature. Here, we develop a new algorithm called Real Time Extreme Learning Machine (RT-ELM), based on the online sequential learning algorithm. The online sequential learning algorithm is modified so that the extreme learning machine learns continuously as new projects are developed in a software development organization. Performance of the real-time extreme learning machine is compared with a conventional train-and-test methodology. Studies were also conducted using radial basis function and additive hidden nodes. The accuracy of the RT-ELM with continuous learning is better than that of the conventional training-and-testing method, and the results indicate that the relative performance of radial basis function and additive hidden nodes is data dependent. The results are validated using data from an academic setting and from industry.
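
A minimal numpy sketch of the online sequential ELM update that RT-ELM builds on follows: after an initial batch, each newly completed project refines the output weights recursively instead of retraining from scratch. The sigmoid hidden node and layer size are assumptions, and this is not the authors' exact algorithm.

```python
# Online sequential ELM: recursive least-squares update of output weights.
import numpy as np

class OSELM:
    def __init__(self, n_in, n_hidden=20, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_in, n_hidden))   # random input weights
        self.b = rng.standard_normal(n_hidden)           # random biases

    def _h(self, X):                                     # sigmoid hidden layer
        return 1.0 / (1.0 + np.exp(-(np.atleast_2d(X) @ self.W + self.b)))

    def fit_initial(self, X, y):                         # needs >= n_hidden rows
        H = self._h(X)
        self.P = np.linalg.inv(H.T @ H)
        self.beta = self.P @ H.T @ np.reshape(y, (-1, 1))

    def update(self, x, t):                              # one new project
        h = self._h(x)
        self.P -= (self.P @ h.T @ h @ self.P) / (1.0 + (h @ self.P @ h.T).item())
        self.beta += self.P @ h.T @ (t - h @ self.beta)

    def predict(self, X):
        return (self._h(X) @ self.beta).ravel()
```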

Keywords: Software effort estimation, extreme learning machine, real time, radial basis function.

Received October 5, 2014; accepted March 30, 2016 
 

Contactless Palmprint Verification System using 2-D Gabor Filter and Principal Component Analysis

Satya Verma and Saravanan Chandran

Computer Centre, National Institute of Technology, India

Abstract: Palmprint verification systems are gaining popularity in biometrics research. The palmprint offers many advantages over other biometric modalities, such as a low-cost acquisition device, high verification accuracy, fast feature extraction, stability, and unique characteristics. In this article, a new palmprint verification model is proposed using Sobel edge detection, a 2-D Gabor filter, and Principal Component Analysis (PCA). The proposed model is tested on the Indian Institute of Technology Delhi (IITD) palmprint database and achieves a 99.5% Total Success Rate and a 0.5% Equal Error Rate. The experimental results confirm that the proposed model compares favourably with other existing biometric techniques.
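
The feature side of such a pipeline can be sketched with scikit-image's Gabor filter and scikit-learn's PCA, as below; ROI localization via Sobel edge detection is omitted, and the frequencies, orientations, and component count are assumptions rather than the paper's settings.

```python
# Gabor filter bank features followed by PCA projection (illustrative values).
import numpy as np
from skimage.filters import gabor
from sklearn.decomposition import PCA

def gabor_features(roi, freqs=(0.1, 0.2), n_theta=4):
    feats = []
    for f in freqs:
        for t in np.arange(0, np.pi, np.pi / n_theta):
            real, _ = gabor(roi, frequency=f, theta=t)   # 2-D Gabor response
            feats.append(real.ravel())
    return np.concatenate(feats)

# Enrolment: one feature vector per palm ROI, then dimensionality reduction;
# probes are matched against the nearest template in PCA space.
# X = np.vstack([gabor_features(r) for r in rois])
# templates = PCA(n_components=50).fit_transform(X)   # 50 is an assumption
```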

Keywords: Biometric, palmprint, 2-D gabor filter, PCA.

Received August 19, 2015; accepted January 13, 2016

 


Human Facial Image Age Group Classification Based on Third Order Four Pixel Pattern (TOFP) of Wavelet Image

Rajendra Chikkala1, Sreenivasa Edara2, and Prabhakara Bhima3

1Department of Computer Science and Engineering, Research Scholar, India

2Department of Computer Science and Engineering, Dean Acharya Nagarjuna University College of Engineering and Technology, India

3Department of Electronics and Communication Engineering, Jawaharlal Nehru Technological University Kakinada, India

Abstract: This paper proposes a novel scheme for age group classification based on the Third Order Four Pixel Pattern (TOFP). TOFP patterns are identified in two forms of diamond pattern containing four pixels, i.e., outer diamond and inner diamond patterns, in the third-order neighbourhood. The paper derives the Grey-Level Co-occurrence Matrix (GLCM) of a wavelet image from the values of the Outer Diamond Corner Pixels (ODCP) and Inner Diamond Corner Pixels (IDCP) of the TOFP, where the wavelet image is generated from the original image, without using the standard method for generating the co-occurrence matrix. Four GLCM features are extracted from the generated matrix, and based on these feature values the age group of the human facial image is categorized. Human age is classified into six groups: child (0-9 years), adolescent (10-19 years), young adult (20-35 years), middle-aged adult (36-45 years), senior adult (46-60 years), and senior citizen (over 60). The proposed method is tested on different databases and comparative results are given.
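
One plausible reading of the matrix construction can be sketched under loud assumptions: a single-level Haar approximation as the wavelet image, a 7x7 third-order neighbourhood with outer diamond corners at offset 3 and inner corners at offset 1, and co-occurrence accumulated between the two corner sets. None of these choices is confirmed by the abstract.

```python
# Co-occurrence between outer (offset 3) and inner (offset 1) diamond
# corner pixels on a Haar wavelet approximation image (assumed reading).
import numpy as np
import pywt

def tofp_glcm(img, levels=32):
    cA, _ = pywt.dwt2(img.astype(float), 'haar')      # wavelet image
    q = (cA / cA.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for r in range(3, h - 3):
        for c in range(3, w - 3):
            outer = [q[r - 3, c], q[r + 3, c], q[r, c - 3], q[r, c + 3]]
            inner = [q[r - 1, c], q[r + 1, c], q[r, c - 1], q[r, c + 1]]
            for o, i in zip(outer, inner):            # pair corner values
                glcm[o, i] += 1
    return glcm / glcm.sum()    # the four Haralick features follow from this
```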

Keywords: GLCM, pixel pattern, age group classification, four pixel pattern, outer diamond, inner diamond.

Received July 23, 2015; accepted March 24, 2016 
 

Enhancement of Human Visual Perception-Based Image Quality Analyzer for Assessment of Contrast Enhancement Methods

Soong Chen1, Tiagrajah Janahiraman2, and Azizah Suliman1

1College of Computer Science and Information Technology, Universiti Tenaga Nasional, Malaysia

2College of Engineering, Universiti Tenaga Nasional, Malaysia

Abstract: Prior to this work, a Human Visual Perception (HVP)-based Image Quality Analyzer (IQA) was proposed. The HVP-based IQA correlates with human judgment better than the existing IQAs commonly used for assessing contrast enhancement techniques. This paper highlights the shortcomings of the HVP-based IQA: high computational complexity, excessive (six) threshold parameters to tune, and high sensitivity of performance to changes in the threshold parameters' values. To overcome these problems, this paper proposes several enhancements: replacing local entropy with edge magnitude in sub-image texture analysis, down-sampling the image spatial resolution, removing luminance masking, and incorporating the well-known Weber-Fechner law of human perception. The enhanced HVP-based IQA requires far less computation (more than 189 times less) while still correlating excellently with human judgment (Pearson Correlation Coefficient (PCC) > 0.90, Root Mean Square Error (RMSE) < 0.3410). It also requires fewer (two) threshold parameters to tune while maintaining consistent performance across a wide range of threshold values, making it feasible for real-time video processing.
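
Two of the listed enhancements are simple enough to sketch, assuming a grayscale input: Sobel edge magnitude replacing local entropy, and a logarithmic Weber-Fechner response; the down-sampling factor below is a placeholder.

```python
# Edge-magnitude "perceived contrast" with Weber-Fechner log compression.
import numpy as np
from scipy.ndimage import sobel, zoom

def perceived_contrast(img, downsample=0.5):
    small = zoom(img.astype(float), downsample)          # cut spatial resolution
    mag = np.hypot(sobel(small, axis=0), sobel(small, axis=1))
    return np.log1p(mag).mean()                          # Weber-Fechner response
```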

Keywords: Contrast enhancement, histogram equalization, image quality, noise, weber fechner.

Received October 4, 2015; accepted March 30, 2016
 

Shamir’s Key Based Confidentiality on Cloud Data Storage

Kamalraj Durai

Department of Computer Science, Bharathiar University, India

Abstract: Cloud computing is a flexible, cost-effective, and proven delivery platform for providing business or consumer services over the Internet. It supports distributed services as a service-oriented, multi-user, multi-domain administrative infrastructure, and is therefore readily exposed to security threats and vulnerabilities. Cloud computing acts as a new paradigm that provides a dynamic environment for end users and guarantees Quality of Service (QoS) on data confidentiality. A trusted third party can ensure the authentication, integrity, and confidentiality of the data and communications involved, but fails to maintain a high confidentiality rate at the horizontal level of privacy-preserving cloud services; TrustedDB-based cloud privacy preservation likewise fails to secure the query parser's results for generating efficient query plans. To generate efficient privacy-preserving query plans on cloud data, we propose the Shamir's Key Distribution based Confidentiality (SKDC) scheme, which achieves a higher confidentiality rate by encoding the cloud data with polynomial interpolation. The SKDC scheme creates a polynomial of a given degree with the secret as the first coefficient and the remaining coefficients picked at random, improving the privacy-preservation level of the cloud infrastructure. The experimental evaluation of SKDC considers system execution time, confidentiality rate, and query processing rate, and shows improved confidentiality and query processing efficiency while storing and retrieving data in the cloud.
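
The primitive behind SKDC is Shamir's (k, n) secret sharing; a minimal sketch follows, with the secret as the constant term of a random degree k-1 polynomial and reconstruction by Lagrange interpolation at zero. The prime field and integer-valued secret are assumptions.

```python
# Shamir (k, n) secret sharing over a prime field.
import random

P = 2 ** 127 - 1   # a Mersenne prime large enough for the secret

def make_shares(secret, k, n):
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def eval_at(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, eval_at(x)) for x in range(1, n + 1)]

def reconstruct(shares):             # any k shares recover the secret
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P   # modular inverse
    return secret

shares = make_shares(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789
```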

Keywords: Confidentiality, privacy, cloud computing, SKDC, privacy preserving and polynomial interpolation.

Received April 25, 2015; accepted January 28, 2016
 

Towards Automated Testing of Multi-Agent Systems Using Prometheus Design Models


Shafiq Ur Rehman1, Aamer Nadeem1, and Muddassar Sindhu2

1Center for Software Dependability, Capital University of Science and Technology, Pakistan

2Department of Computer Science, Quaid i Azam University, Pakistan


Abstract: Multi-Agent Systems (MAS) are used for a wide range of applications. Goals and plans are the key premise for achieving MAS targets, so correct execution and coverage of plans and achievement of goals builds confidence in a MAS, and proper identification of all possible faults in MAS operation plays its part in gaining that confidence. In this paper, we devise a model-based approach that ensures goal and plan coverage. A fault model is defined covering MAS faults related to goal and plan execution and interactions. We create a test model from Prometheus design artifacts, i.e., the goal overview, scenario overview, and agent and capability overview diagrams. New coverage criteria are defined for fault identification, test paths are identified from the test model, and test cases are generated from the test paths. Our technique is then evaluated on an actual MAS implementation in JACK Intelligent Agents (JACK), a Java framework for multi-agent system development, by executing more than 100 different test cases. The code was instrumented for coverage analysis and faults were injected into the MAS. The approach successfully finds the injected faults by applying test cases for the coverage-criteria paths during MAS execution. The goal-plan coverage criterion proved the most effective for fault detection, while the scenario, capability, and agent coverage criteria have relatively less scope for fault identification.
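
A generic sketch of the "test paths from the test model" step is shown below, assuming the Prometheus overview diagrams have already been abstracted into a directed goal/plan graph; the node names are hypothetical, and simple-path enumeration stands in for the paper's coverage criteria.

```python
# Enumerate simple paths from the root goal to terminal plans;
# each path becomes one candidate test case.
def all_paths(graph, node, path=None):
    path = (path or []) + [node]
    if not graph.get(node):                  # terminal plan reached
        return [path]
    paths = []
    for nxt in graph[node]:
        if nxt not in path:                  # avoid cycles
            paths += all_paths(graph, nxt, path)
    return paths

model = {'GoalRoot': ['SubGoalA', 'SubGoalB'],        # hypothetical test model
         'SubGoalA': ['Plan1'], 'SubGoalB': ['Plan1', 'Plan2'],
         'Plan1': [], 'Plan2': []}
print(all_paths(model, 'GoalRoot'))
```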

Keywords: Goal sub goals coverage, MAS faults identification, model based goal plan coverage.

Received April 18, 2016; accepted September 19, 2016
 

New Fool Proof Examination System through Color Visual Cryptography and Signature Authentication

Mohamed Fathimal1 and Arockia Jansirani2

1Department of Computer Science and Engineering, SRM Institute of Science and Technology, India

2Department of Computer Science and Engineering, Manonmaniam Sundaranar University, India

Abstract: There have been widespread allegations of question-paper leakage for a number of subjects in the recently held Secondary School Leaving Certificate examinations. The leakage is due to the practice of using printed question papers, and such incidents and the subsequent cancellation of examinations happen frequently, creating political and social embarrassment and costing money and time. This paper proposes a new foolproof examination system based on tamperproof e-question-paper preparation and secure transmission using a secret sharing scheme. The application is perfectly secure because the proposed method automatically embeds the corresponding institute's seal in the form of the key, making it easy to trace the culprit behind any leakage of question papers. The scheme has reduced reconstruction time because the reconstruction process involves only Exclusive-OR (XOR) operations apart from authentication, and it recovers the original secret image without any loss. The existing visual cryptographic scheme recovers a half-toned secret image with an average Peak Signal-to-Noise Ratio (PSNR) of 24 dB, whereas the proposed method with authentication recovers the image with a PSNR of 64.7 dB, which is greater than that of the existing method. In addition, this method does not suffer from pixel expansion.
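
A tiny sketch of the lossless XOR reconstruction the scheme relies on is given below, for an illustrative (2, 2) sharing of a random stand-in image; because XOR recovery is exact, the recovered image is bit-identical to the secret (hence the high PSNR).

```python
# XOR-based (2, 2) secret sharing: one random share, one computed share.
import numpy as np

rng = np.random.default_rng(1)
secret = rng.integers(0, 256, (64, 64), dtype=np.uint8)    # stand-in image

share1 = rng.integers(0, 256, secret.shape, dtype=np.uint8)
share2 = secret ^ share1                  # second share completes the pair

recovered = share1 ^ share2
assert np.array_equal(recovered, secret)  # lossless reconstruction
```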

Keywords: Visual cryptography, secret sharing scheme, examination system, information security, authentication.

Received August 20, 2014; accepted March 9, 2016
 

A Reliable Peer-to-Peer Protocol for Multi-Robot Operating in Mobile Ad-Hoc Wireless Networks

Tarek Dandashy1, Mayez Al-Mouhamed2, and Irfan Khan2

1Department of Computer Science, Balamand University, Lebanon

2Department of Computer Engineering, King Fahd University of Petroleum and Minerals, KSA

Abstract: Cooperative behaviour in multi-robot systems is based on distributed negotiation mechanisms; a set of autonomous robots playing soccer may cooperate in deciding a suitable game strategy or role assignment. Degradation of broadcast and multicast services is widely observed due to the lack of reliable broadcast in current IEEE 802.11. A reliable, Peer-to-Peer (P2P), fast auction-based broadcast is proposed for a team of soccer-playing robots interconnected over an ad-hoc wireless mobile network. The auction broadcast includes a sequence order that determines the reply order of all nodes, which helps minimize potential Medium Access Control (MAC) conflicts, since repeated back-offs are undesirable, especially at low load. Uncoordinated negotiation leads to multiple outstanding auctions originated by distinct nodes, in which case the sequence order becomes useless as auction times are interleaved; an adaptive MAC is therefore proposed to adjust the reply order dynamically. The protocols are implemented as symmetric multi-threaded software on an experimental Wireless Local Area Network (WLAN) embedded system. The evaluation reports the distribution of auction completion times for peer-to-peer operations for both static and mobile nodes, and protocol trade-offs with respect to auction response time, symmetry and fairness, and power consumption are discussed. The proposed protocols are packaged as a library for multi-robot Cooperative Behaviours (CBs), and the evaluation shows the proposed protocol preferences versus behavioural primitives with specific communication patterns.
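
Only the sequencing idea lends itself to a short sketch: below is a toy, single-process simulation (not the WLAN implementation) in which the auctioneer attaches a reply order to the call so that bidders answer one at a time; the robot names and costs are hypothetical.

```python
# Toy sequenced-reply auction: replies follow the announced order,
# avoiding simultaneous transmissions; the lowest-cost bidder wins.
def run_auction(task_cost, reply_order):
    bids = []
    for robot in reply_order:          # sequence order prevents MAC clashes
        bids.append((task_cost[robot], robot))
    return min(bids)[1]

costs = {'R1': 4.0, 'R2': 2.5, 'R3': 3.1}   # hypothetical role costs
print(run_auction(costs, ['R2', 'R1', 'R3']))   # R2
```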

Keywords: Auction communication, cooperative multi-robot, distributed intelligence, peer-to-peer, wireless protocol.

Received December 6, 2015; accepted May 5, 2016
 

Flexible Fair and Collusion Resistant Pseudonym Providing System

Belal Amro1, Albert Levi2, and Yucel Saygin2

1College of IT, Hebron University, Palestine

2Faculty of Engineering and Natural Sciences, Sabanci University, Turkey

Abstract: In service-providing systems, user authentication is required for purposes such as billing and restricting unauthorized access. To protect users' privacy, their real identities should not be linkable to the services they use during authentication; a good solution is to use pseudonyms as temporary identities. On the other hand, a pseudonym system may also need a backdoor for identity revealing, to be used by law enforcement agencies for legal reasons. Existing systems that retain a backdoor are either punitive (full user anonymity is revealed) or restrictive (only the identity behind the current pseudonym is revealed). In addition, existing systems are designed for a particular service and may not fit others. In this paper, we address this gap and propose a novel pseudonym providing and management system. Our system is flexible and can be tuned to fit the services of different service providers. It is privacy-preserving and guarantees a level of anonymity for a particular number of users. Trust is distributed among all system entities instead of being centralized in a single trusted third party; more importantly, the system is highly resistant to collusion among the trusted entities, and it can reveal a user's identity fairly upon a law-enforcement request. Analytical and simulation-based performance evaluation shows that the Collusion Resistant Pseudonym Providing System (CoRPPS) provides a high level of anonymity with strong resistance against collusion attacks.

Keywords: Security, privacy, pseudonym, anonymity, access control.

Received August 7, 2015; accepted October 24, 2016
 

Security Enhancement and Certificate Revocation in MANET using Position and Energy based Monitoring

Karpura Dheepan

Department of Computer Science and Engineering, Vel Tech Rangarajan Dr.Sagunthala R&D Institute of Science and Technology, India

Abstract: Mobile Ad-hoc Networks (MANETs) have the advantages of mobility and ease of deployment, but they are vulnerable to various attacks that degrade network security. Using cluster-based certificate revocation with a combination of voting and non-voting mechanisms, an attacker's certificate can be revoked; however, this mechanism is vulnerable with respect to the quick detection of false accusations and to attacks with high energy consumption, such as stretch and carousel attacks. To overcome these issues and enhance security, a cluster-based scheme with position- and energy-based monitoring is proposed, in which the Cluster Authority (CA) node revokes the certificate of the attacker node. The scheme provides guaranteed secure network services, and reductions in energy consumption of 9% and 13% are obtained by avoiding stretch and carousel attacks, respectively. It increases the Quality of Service (QoS) and reduces packet loss in the network.

Keywords: MANET, cluster formation, certificate revocation, false accusation, position monitoring, energy monitoring.

Received June 7, 2015; accepted September 20, 2015
 

Assessing Impact of Class Change by Mining Class Associations

Anshu Parashar and Jitender Chhabra

Department of Computer Engineering, National Institute of Technology, India

Abstract: Data mining plays a vital role in data analysis and also holds immense potential for mining software engineering data to manage design and maintenance issues. Change impact assessment is one of the crucial issues in software maintenance. In Object-Oriented (OO) software systems, classes are the core components and changes to them are inevitable, so an OO software system must support the expected changes. In this paper, to assess the impact of a change to a class, we propose changeability measures derived by mining associations among the classes. These measures estimate: a) change propagation, by identifying its ripple effect; b) the change impact set of the classes; c) the changeability rank of the classes; and d) the class change cost. Further, we perform an empirical study and evaluation to analyse our results. Our results indicate that by mining associations among classes, a development team can effectively estimate the probable impact of a class change. These measures can be very helpful when making changes to classes while maintaining a software system.
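
A sketch of ripple-effect estimation follows, assuming the class associations have already been mined into a reverse-dependency map; the impact set of a changed class is then the transitive closure over that map, and the class names are hypothetical.

```python
# Change impact set as a transitive closure over mined class associations.
from collections import deque

def impact_set(rev_deps, changed):
    seen, queue = set(), deque([changed])
    while queue:
        cls = queue.popleft()
        for dependent in rev_deps.get(cls, ()):   # classes associated with cls
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

rev_deps = {'Account': ['Billing', 'Report'], 'Billing': ['Report']}
print(impact_set(rev_deps, 'Account'))    # {'Billing', 'Report'}
# change cost / changeability rank can be derived from the set's size
```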

Keywords: Mining software engineering data, object oriented system development, change propagation, change impact.

Received September 7, 2015; accepted February 21, 2016
 

The Shuffle on Trajectories of Infinite Arrays

Devi Velayutham

Department of Mathematics, Hindustan College of Arts and Science, India

Abstract: In this paper the authors study and investigate the shuffle on trajectories over infinite array languages. As with finite array languages, this approach is applicable to concurrency, providing a method to define the parallel composition of processes; it is also applicable to parallel computation. The operations are introduced using a uniform method based on the notion of an ωω-trajectory. The authors introduce an Array Grammar with Shuffle on Trajectories (AGST), compare its generative power with that of other array grammars, and prove closure properties for different classes of array languages with respect to the shuffle on trajectories.
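
For intuition, the finite one-dimensional analogue of the operation is easy to state: a trajectory over {0, 1} dictates, symbol by symbol, whether the next letter of the shuffle is taken from the first or the second word. The sketch below implements only this finite case, not the paper's lift to ωω-trajectories on infinite arrays.

```python
# Shuffle of u and v on trajectory t: '0' consumes from u, '1' from v.
def shuffle_on_trajectory(u, v, t):
    assert len(t) == len(u) + len(v) and t.count('0') == len(u)
    i = j = 0
    out = []
    for step in t:
        if step == '0':
            out.append(u[i]); i += 1
        else:
            out.append(v[j]); j += 1
    return ''.join(out)

print(shuffle_on_trajectory('ab', 'xy', '0101'))   # axby
```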

Keywords: Büchi two-dimensional online tessellation automaton, ωω-trajectory, ωω-recognizable array language, column shuffle ωω-recognizable array language.

Received April 16, 2015; accepted May 22, 2016 
 

A Steganography Scheme on JPEG Compressed Cover Image with High Embedding Capacity

 Arup Kumar Pal1, Kshiramani Naik1, and Rohit Agarwal2

1Department of Computer Science and Engineering, Indian Institute of Technology(ISM), India

2Department of Computer Science and Engineering, JSS Academy of Technical Education, India

Abstract: Joint Photographic Experts Group (JPEG) is one of the most widely used lossy image compression standards, and JPEG-compressed images are commonly transmitted over public channels such as the Internet. In this paper, the authors propose a steganography scheme in which the secret message is embedded into the JPEG version of a cover image. The scheme first applies block-based Discrete Cosine Transformation (DCT), followed by a suitable quantization process, to the cover image to produce the transformed coefficients, which are then considered for embedding the secret message bits. Most earlier works hide one message bit in each selected coefficient, either by modifying the coefficients directly (e.g., the LSB method) or by modifying their magnitude indirectly (e.g., flipping the sign bit). In the proposed scheme, instead of embedding the secret message bits directly into the coefficients, a suitable indirect approach is adopted to hide two bits of the secret message in each selected DCT coefficient. As in the conventional approach, the modified coefficients are further compressed by entropy encoding. The scheme has been tested on several standard grayscale images, and the experimental results show its performance compares well with some existing related works.
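
The embedding pipeline can be sketched under stated assumptions: 8x8 block DCT, a uniform quantization step, and a direct LSB variant shown for clarity; the paper's own method hides two bits per selected coefficient indirectly, which is not reproduced here.

```python
# Block DCT, quantize, set LSBs of low-frequency AC coefficients, inverse DCT.
import numpy as np
from scipy.fft import dctn, idctn

Q = 16   # illustrative quantization step

def embed_block(block, bits):
    coef = np.round(dctn(block, norm='ortho') / Q).astype(int)
    flat = coef.ravel()                       # view into coef
    for k, bit in enumerate(bits, start=1):   # skip the DC term at index 0
        flat[k] = (flat[k] & ~1) | bit
    return idctn(coef * Q, norm='ortho')      # de-quantize and invert

block = np.random.default_rng(0).integers(0, 256, (8, 8)).astype(float)
stego = embed_block(block, [1, 0, 1, 1])
```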

Keywords: Chi-square attack, DCT, histogram, JPEG, statistical steganalysis, steganography.

Received May 27, 2015; accepted October 19, 2015
 

A Model for English to Urdu and Hindi Machine Translation System using Translation Rules and Artificial Neural Network

Shahnawaz Khan1 and Imran Usman2

1Department of Information Technology, University College of Bahrain, Bahrain

2College of Computing and Informatics, Saudi Electronic University, Saudi Arabia


Abstract: This paper illustrates the architecture and working of a proposed multilingual machine translation system that translates from English to Urdu and Hindi. The system applies a translation-rules-based approach together with an artificial neural network; efficient pattern matching and the ability to learn from examples make neural networks suitable for implementing a translation-rule-based machine translation system. The paper also describes the importance of machine translation systems and the status of these languages in a multilingual country like India. Machine translation evaluation scores for the system's output were calculated using various methods, such as n-gram BLEU score, F-measure, METEOR, and precision and recall. For around 500 Hindi test sentences the system achieved an n-gram BLEU score of 0.5903, a Metric for Evaluation of Translation with Explicit ORdering (METEOR) score of 0.7956, and an F-score of 0.7916; for Urdu the system achieved an n-gram BLEU score of 0.6054, a METEOR score of 0.8083, and an F-score of 0.8250.
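
The reported scores can be computed roughly as below, assuming tokenized sentences: BLEU via NLTK, and an F-score from unigram precision and recall; the example tokens are hypothetical and METEOR is omitted.

```python
# Sentence-level BLEU (bigram weights) and a unigram F-score.
from nltk.translate.bleu_score import sentence_bleu

reference = ['yah', 'ek', 'kitab', 'hai']    # hypothetical Hindi tokens
hypothesis = ['yah', 'kitab', 'hai']

bleu = sentence_bleu([reference], hypothesis, weights=(0.5, 0.5))

common = len(set(reference) & set(hypothesis))
precision, recall = common / len(hypothesis), common / len(reference)
f_score = 2 * precision * recall / (precision + recall)
print(bleu, f_score)
```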

Keywords: Machine translation, artificial neural network, english, hindi, urdu.

Received September 19, 2015; accepted June 8, 2016
 

A Novel Approach for Segmentation of Human Metaphase Chromosome Images Using Region Based Active Contours

Tanvi Arora

Department of Computer Science and Engineering, Dr. B.R Ambedkar National Institute of Technology, India

Abstract: Chromosomes are the carriers of genetic information. A healthy human being has 46 chromosomes, and any alteration in either their number or their structure is diagnosed as a genetic defect. To uncover genetic defects, metaphase chromosomes are imaged and analysed. Metaphase chromosome images often contain intensity inhomogeneity, which makes the segmentation task difficult. The difficulties caused by intensity inhomogeneity can be resolved using region-based active contour techniques, which use the local intensity values of regions near the objects and find approximate intensity values along both sides of the contour. In the proposed work, a technique is presented for segmenting the objects present in human metaphase chromosome images using region-based active contours. The proposed technique is quite efficient in terms of the number of objects segmented. The method has been tested on the Advanced Digital Imaging Research (ADIR) dataset, and the experimental results show quite good performance.
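
A standard region-based active-contour model (Chan-Vese, as implemented in scikit-image) can stand in for the paper's formulation in a sketch of the pipeline; the file name and the mu value below are placeholders.

```python
# Region-based active contour segmentation and object counting.
from skimage import img_as_float, io
from skimage.measure import label
from skimage.segmentation import chan_vese

img = img_as_float(io.imread('metaphase.png', as_gray=True))  # hypothetical file
mask = chan_vese(img, mu=0.25)        # binary foreground/background partition
n_objects = label(mask).max()         # count the segmented objects
print(n_objects)
```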

Keywords: Chromosomes, segmentation, active contours, intensity inhomogeneity.

Received December 8, 2015; accepted April 17, 2016
 

Comprehensive Stemmer for Morphologically Rich Urdu Language

Mubashir Ali1, Shehzad Khalid2, and Muhammad Saleemi2

1Department of Computer Science & IT, University of Lahore, Gujrat Campus, Pakistan

2Department of Computer Engineering, Bahria University Islamabad, Pakistan

Abstract: The Urdu language is used by approximately 200 million people for spoken and written communication, and a large body of unstructured Urdu textual data is available worldwide. Data mining techniques can be employed to extract useful information from such a large potential information base. Many text processing systems exist, but they are mostly language-specific, with a large proportion applicable only to English text; this is primarily due to language-dependent pre-processing requirements, chiefly stemming. Stemming is a vital pre-processing step in text mining whose core aim is to reduce the many grammatical forms of a word, e.g., for part of speech, gender, or tense, to its root form. In the proposed work, we develop a rule-based comprehensive stemming method for Urdu text. The proposed Urdu stemmer can generate the stems of Urdu words as well as loan words (words borrowed from languages such as Arabic, Persian, and Turkish) by removing prefixes, infixes, and suffixes. The proposed technique introduces six novel Urdu infix word classes and a minimum-word-length rule. To cope with the challenge of Urdu infix stemming, we develop infix-stripping rules for the introduced infix word classes and generic rules for prefix and suffix stemming. The experimental results show the superiority of our proposed stemming approach compared with an existing technique.
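
A toy rendering of the prefix/suffix side of such a stemmer, with the minimum-word-length rule, is shown below; the affix lists and the length threshold are illustrative placeholders, not the paper's rules, and the six infix classes are not modelled.

```python
# Rule-based affix stripping with a minimum-word-length guard.
MIN_LEN = 3                     # assumed minimum stem length
PREFIXES = ['بے', 'با']         # illustrative Urdu prefixes
SUFFIXES = ['وں', 'یں', 'ات']   # illustrative Urdu suffixes

def stem(word):
    for p in PREFIXES:
        if word.startswith(p) and len(word) - len(p) >= MIN_LEN:
            word = word[len(p):]
            break
    for s in SUFFIXES:
        if word.endswith(s) and len(word) - len(s) >= MIN_LEN:
            word = word[:-len(s)]
            break
    return word
```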

Keywords: Urdu stemmer, infix classes, infix rules, stemming rules, stemming lists.

Received September 5, 2015; accepted June 1, 2016

  

A High Capacity Data Hiding Scheme Using Modified AMBTC Compression Technique

Aruna Malik, Geeta Sikka, and Harsh Verma

Department of Computer Science and Engineering, Dr B R Ambedkar National Institute of Technology, India

Abstract: In this paper, a data hiding scheme is proposed that modifies the Absolute Moment Block Truncation Coding (AMBTC) technique to embed a large amount of secret data. The scheme employs a user-defined threshold value to classify the AMBTC-compressed blocks as complex or smooth. For smooth blocks, the bit plane is replaced with the secret data bits, and the quantization levels are then re-calculated so that distortion is minimized. For complex blocks, the bit plane is reconstructed so that every pixel is represented by two bits instead of one, the secret data is embedded into the first LSB of the bit plane, and finally four new quantization levels are calculated to preserve the closeness of the resultant block to the original. Thus, the proposed scheme utilizes every pixel of the cover image to hide secret data while maintaining image quality, achieving a data hiding capacity of 1 bit per pixel for every image. Experimental results show that the scheme is superior to other existing schemes in terms of both hiding capacity and image quality.
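
Plain AMBTC block coding, the baseline this scheme modifies, is sketched below: each block reduces to a bit plane plus two quantization levels (the means of the pixels at or above, and below, the block mean); the scheme then overwrites smooth-block bit planes with secret bits and recomputes the levels.

```python
# AMBTC: bit plane + high/low quantization levels per block.
import numpy as np

def ambtc_block(block):
    mean = block.mean()
    plane = block >= mean                         # 1 = "high" pixels
    high = block[plane].mean() if plane.any() else mean
    low = block[~plane].mean() if (~plane).any() else mean
    return plane, high, low

def ambtc_decode(plane, high, low):
    return np.where(plane, high, low)

block = np.array([[10, 200], [12, 190]], float)   # tiny demo (scheme uses larger blocks)
plane, hi, lo = ambtc_block(block)
print(ambtc_decode(plane, hi, lo))
```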

Keywords: Data hiding, quantization level, secret data, stego-image, absolute moment block truncation coding.

Received September 17, 2015; accepted January 11, 2016

 

New Algorithm for Speech Compression Based on Discrete Hartley Transform

Noureddine Aloui1, Souha Bousselmi2, and Adnane Cherif 2

1Centre for Research on Microelectronics and Nanotechnology, Sousse Technology Park, Tunisia

2Innov’Com Laboratory, Sciences Faculty of Tunis, University of Tunis El-Manar, Tunisia

Abstract: This paper presents an algorithm for speech signal compression based on the Discrete Hartley Transform (DHT). The developed algorithm has the advantages of ensuring a low bit rate and achieving high speech compression efficiency while preserving the quality of the reconstructed signal. The numerical results included in this paper show that the developed algorithm is more effective than the discrete wavelet transform for speech signal compression.
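
A compact sketch of DHT-based compression follows, computing the Hartley transform from the FFT (Hartley = Re - Im) and retaining only the strongest coefficients; the frame and the 10% retention ratio are placeholders, not the paper's bit-rate settings.

```python
# DHT compression: transform, zero weak coefficients, inverse transform.
import numpy as np

def dht(x):
    X = np.fft.fft(x)
    return X.real - X.imag        # cas-kernel transform via the FFT

def idht(H):
    return dht(H) / len(H)        # the DHT is (up to 1/N) its own inverse

signal = np.sin(np.linspace(0, 8 * np.pi, 256))   # stand-in speech frame
H = dht(signal)
keep = int(0.1 * len(H))                          # keep the strongest 10%
H[np.argsort(np.abs(H))[:-keep]] = 0.0
reconstructed = idht(H)
```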

Keywords: Speech signal compression, discrete Hartley transform, discrete wavelet transform.

Received April 13, 2015; accepted May 2, 2016 