November 2018, No. 6


Accelerated Parallel Training of Logistic Regression

using OpenCL

Hamada M. Zahera and Ashraf El-Sisi

Faculty of Computers and Information, Menoufia University, Egypt

Abstract: This paper presents an accelerated approach for training logistic regression in parallel on Graphics Processing Units (GPUs). Many prediction applications employ logistic regression to build their prediction models, and training an accurate model can take a long time. Many researchers have worked on boosting the performance of logistic regression using different techniques. Our study demonstrates the capabilities of GPU processing and the OpenCL framework, which together offer a low-cost, high-performance solution for scaling logistic regression up to large datasets. We implemented the proposed approach in OpenCL C/C++ and tested it on different datasets. All results showed a significant reduction in execution time on large datasets when run on the available GPU devices.
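The training loop behind this abstract can be sketched in NumPy to show the data-parallel structure that OpenCL exploits; the function name and hyperparameters below are illustrative, not taken from the paper:

```python
import numpy as np

def train_logistic_regression(X, y, lr=0.5, epochs=2000):
    """Batch gradient descent for logistic regression. Each per-sample
    gradient term is independent, which is what a GPU kernel would
    compute in parallel (one work-item per sample) before a reduction."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))     # sigmoid of the linear scores
        w -= lr * X.T @ (p - y) / len(y)     # averaged gradient step
    return w
```

On a GPU, the sigmoid line maps to one work-item per sample and the gradient sum to a parallel reduction, which is where the speed-up on large datasets comes from.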

Keywords: Logistic Regression, Parallel Computing, GPU, OpenCL.

Received August 13, 2015; accepted March 24, 2016

 

Image Quality Assessment Employing RMS Contrast and Histogram Similarity

Al-Amin Bhuiyan1 and Abdul Raouf Khan2

1Department of Computer Engineering, King Faisal University, KSA

2Department of Computer Science, King Faisal University, KSA

Abstract: This paper presents a new approach for evaluating image quality. The method is based on computing the histogram similarity between images and assembles a quality index from the contributions of the correlation coefficient, average luminance distortion and RMS contrast measurement. The effectiveness of this proposed RMS Contrast and Histogram Similarity (RCHS) based hybrid quality index has been validated on Lena images under different well-known distortions and on standard image databases. Experimental results demonstrate that this image quality assessment method performs better than the widely used image distortion metrics Mean Squared Error (MSE), Structural SIMilarity (SSIM) and Histogram-based Image Quality (HIQ).
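A minimal sketch of how such a hybrid index could be assembled from the factors the abstract names; the product combination below is only an assumption, not the paper's RCHS formula:

```python
import numpy as np

def rms_contrast(img):
    # RMS contrast: standard deviation of normalised pixel intensities
    return (img.astype(float) / 255.0).std()

def histogram_similarity(a, b, bins=256):
    # L1 (Minkowski r = 1) distance between normalised grey-level
    # histograms, mapped to a similarity score in [0, 1]
    ha, _ = np.histogram(a, bins=bins, range=(0, 255))
    hb, _ = np.histogram(b, bins=bins, range=(0, 255))
    ha = ha / ha.sum()
    hb = hb / hb.sum()
    return 1.0 - 0.5 * np.abs(ha - hb).sum()   # L1 distance is at most 2

def hybrid_quality_index(ref, dist):
    # Illustrative product of correlation, mean-luminance closeness,
    # contrast ratio, and histogram similarity; equals 1 for identical images
    c = np.corrcoef(ref.ravel(), dist.ravel())[0, 1]
    l = 2 * ref.mean() * dist.mean() / (ref.mean() ** 2 + dist.mean() ** 2)
    cr, cd = rms_contrast(ref), rms_contrast(dist)
    k = min(cr, cd) / max(cr, cd)
    return c * l * k * histogram_similarity(ref, dist)
```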

Keywords: Image quality measures, RMS contrast, histogram similarity, SSIM, HIQ, Minkowski distance metric.

Received July 25, 2015; accepted November 29, 2015

 

Development of a Hindi Named Entity Recognition System without Using Manually Annotated Training Corpus

Sujan Saha1 and Mukta Majumder2

1Department of Computer Science and Engineering, Birla Institute of Technology, India

2Department of Computer Centre, Vidyasagar University, India

Abstract: Machine learning based approaches for named entity recognition (NER) require a sufficient annotated corpus to train the classifier. Other NER resources, like gazetteers, are also required to make the classifier more accurate. But in many languages and domains the relevant NER resources are still not available. Creation of adequate and relevant resources is costly and time consuming. However, a large amount of resources and several NER systems are available in resource-rich languages, like English. Suitable language adaptation techniques, the NER resources of a resource-rich language and minimally supervised learning can help to overcome such scenarios. In this paper we have studied a few such techniques in order to develop a Hindi NER system. Without using any Hindi NE-annotated corpus, the developed system achieves a reasonable accuracy of 73.87 F-measure.

Keywords: Natural Language Processing, Machine Learning, Named Entity Recognition, Resource Scarcity, Language Transfer, Semi-supervised Learning.

Received July 22, 2015; accepted October 7, 2015


 

Security Mechanism against Sybil Attacks for High-Throughput Multicast Routing in Wireless Mesh Networks

Anitha Periasamy1 and Periasamy Pappampalayam2

1MCA Department, KSR College of Engineering, India

2ECE Department, KSR College of Engineering, India

 

Abstract: Wireless Mesh Networks (WMNs) have become one of the important domains in wireless communications. They comprise a number of static wireless routers which form an access network that delivers IP-based services to end users. Unlike conventional WLAN deployments, wireless mesh networks offer multihop routing, facilitating an easy and cost-effective deployment. This paper concentrates on efficient and secure multicast routing in such wireless mesh networks, and identifies novel attacks against high-throughput multicast protocols through the S-ODMRP protocol. The Sybil attack, in which a node illegitimately claims multiple identities, has recently been observed to be one of the most harmful attacks in WMNs. This paper systematically analyzes the threat posed by the Sybil attack to WMNs. The attack is countered by a defense mechanism called the Random Key Predistribution (RKP) technique. The performance of the proposed approach, which integrates S-ODMRP and RKP, is evaluated using the throughput performance metric. The experimental results show that the proposed approach provides good security against the Sybil attack with very high throughput.
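The RKP defence mentioned here can be sketched as follows; the pool and ring sizes are illustrative values, not the paper's parameters:

```python
import random

def assign_key_ring(pool_size, ring_size, rng):
    # Each node is preloaded with a random subset of the global key pool
    return set(rng.sample(range(pool_size), ring_size))

def can_establish_link(ring_a, ring_b):
    # Two neighbours can set up a secure link only if their key rings
    # intersect; identities fabricated by a Sybil attacker cannot present
    # fresh, independently assigned rings, which is what the defence exploits
    return len(ring_a & ring_b) > 0
```

When the ring size is more than half the pool size, any two legitimately provisioned nodes are guaranteed a shared key by the pigeonhole principle.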

Keywords: Secure multicast routing, Sybil attack, Random key predistribution.

Received June 29, 2015; accepted December 14, 2015

 

 

Impulse Noise Reduction for Texture Images Using Real Word Spelling Correction Algorithm and Local Binary Patterns

Shervan Fekriershad, Seyed Fakhrahmad, and Farshad Tajeripour

Department of Computer Science, Shiraz University, Iran

Abstract: Noise reduction is one of the most important steps in a very broad range of image processing applications such as face identification, motion tracking and visual pattern recognition. Texture images make up a huge share of the images collected as databases in these applications. In this paper an approach is proposed for noise reduction in texture images which is based on real-word spelling correction theory from natural language processing. The proposed approach includes two main steps. In the first step, the pixels most similar to the noisy pixel in terms of textural features are generated as candidates using local binary patterns. Next, the best of the candidates is selected based on a two-gram algorithm. The quality of the proposed approach is compared with some state-of-the-art noise reduction filters in the results section. High accuracy, low blurring effect and low computational complexity are some advantages of the proposed approach.
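The local binary pattern operator used for candidate generation can be sketched as plain 8-neighbour LBP; the paper's exact variant may differ:

```python
def lbp_code(img, r, c):
    """8-neighbour local binary pattern at pixel (r, c): each neighbour
    at least as bright as the centre contributes one set bit, giving a
    code in 0..255 that summarises the local texture."""
    centre = img[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code
```

Pixels with similar codes can then serve as replacement candidates for a noisy pixel, analogous to dictionary candidates in spelling correction.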

Keywords: Image Noise Reduction, Local Binary Pattern, Real Word Spelling Correction, Texture Analysis

Received June 22, 2015; accepted March 9, 2016

 


 

 

Recognition of Handwritten Characters Based on Wavelet Transform and SVM Classifier

Malika Ait Aider1, Kamal Hammouche1, and Djamel Gaceb2

1Laboratoire Vision Artificielle et Automatique des Systèmes, Université Mouloud Mammeri, Algérie

2Laboratoire LIRIS, INSA de Lyon, Bât. Jules Verne 20, France

Abstract: This paper is devoted to off-line handwritten character recognition based on the two-dimensional wavelet transform and a single support vector machine classifier. The wavelet transform provides a representation of the image in independent frequency bands and performs a local analysis to characterize character images in time and scale space. At each level of decomposition it yields four sub-images: a smooth or approximation sub-image and three detail sub-images. In handwritten character recognition, the performance of the wavelet transform is related not only to the type of wavelet used but also to the type of sub-image used to provide features. Our objective here is to study these two points by conducting several tests using several wavelet families and several feature combinations derived from the sub-images. The tests show that the symlet wavelet of order 8 is the most efficient and that the features derived from the approximation sub-image allow the best discrimination between the handwritten digits.
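A dependency-free sketch of one decomposition level: the paper selects symlet-8 among several wavelet families, while simple Haar averaging is used here only to show where the approximation and detail sub-images come from:

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar transform (averaging form, not orthonormal).
    Returns (approximation, (horizontal, vertical, diagonal)) sub-images,
    each half the size of the input in both dimensions."""
    a = img.astype(float)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0      # row-wise low-pass
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0      # row-wise high-pass
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0    # approximation sub-image
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0    # horizontal details
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0    # vertical details
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0    # diagonal details
    return ll, (lh, hl, hh)

def approximation_features(img):
    # Flattened approximation sub-image as a feature vector for the SVM,
    # the sub-image the paper found most discriminative
    ll, _ = haar_dwt2(img)
    return ll.ravel()
```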

Keywords: Feature extraction, wavelet transform, handwritten character recognition, support vector machine, OCR.

Received June 10, 2015; accepted May 16, 2016

 

 

Security Enhancement and Certificate Revocation in MANET using Position and Energy based Monitoring

Karpura Dheepan, Anusuya Vaithiyanathan, and Karthikeyan Venugopal

Department of Electronics and Communication Engineering, PSG College of Technology, India

Abstract: Mobile Ad-hoc Networks (MANETs) have the advantages of mobility and ease of deployment, but they are vulnerable to various attacks that degrade security in the network. Using cluster based certificate revocation with a combination of voting and non-voting based mechanisms, the attacker's certificate is revoked. But this mechanism is weak at detecting false accusations quickly and at handling attacks with high energy consumption, such as the stretch and carousel attacks. To overcome these issues and to enhance security, a cluster based scheme with position and energy based monitoring is proposed, in which the Cluster Authority (CA) node revokes the certificate of the attacker node. Guaranteed secure network services and energy consumption reductions of 9% and 13% are obtained after avoiding the stretch and carousel attacks respectively. The scheme increases the Quality of Service (QoS) and reduces packet loss in the network.

 Keywords: MANET, Cluster formation, Certificate revocation, False accusation, Position monitoring, Energy monitoring.

Received June 7, 2015; accepted September 20, 2015

                                                                                    

 

 
 

A Novel Handwriting Grading System Using Gurmukhi Characters


Munish Kumar1, Manish Jindal2, and Rajendra Sharma3

1Department of Computer Science, Panjab University Rural Centre, India

2Department of Computer Science and Applications, Panjab University Regional Centre, India

3Department of Computer Science and Engineering, Thapar University, India

Abstract: This paper presents a new technique for grading writers based on their handwriting. This grading process can be helpful in organizing handwriting competitions and then deciding the winners through an automated process. For the testing data set, we have collected samples from one hundred different writers. In order to establish the correctness of our approach, we have also included in the testing data set these characters taken from one printed Gurmukhi font (Anandpur Sahib). For the training data set, we have considered these characters taken from four printed Gurmukhi fonts, namely LMP Taran, Maharaja, Granthi and Gurmukhi_Lys. A Nearest Neighbour classifier has been used to obtain a classification score for each writer. Finally, the writers are graded based on their classification scores.
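The scoring-and-grading idea can be sketched under our own assumptions about the data layout (feature vectors paired with character labels, prototypes built from the printed fonts); the feature extraction itself is the paper's and is not reproduced:

```python
def nn_classification_score(writer_samples, reference_prototypes):
    """Fraction of a writer's character samples whose nearest prototype
    (squared Euclidean distance) carries the correct character label.
    writer_samples and reference_prototypes are (features, label) pairs."""
    correct = 0
    for features, label in writer_samples:
        nearest = min(reference_prototypes,
                      key=lambda p: sum((a - b) ** 2
                                        for a, b in zip(features, p[0])))
        correct += (nearest[1] == label)
    return correct / len(writer_samples)

def grade_writers(scores):
    # Writers ranked by classification score: the best-scoring writer first
    return sorted(scores, key=lambda ws: ws[1], reverse=True)
```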

Keywords: Gradation; Feature extraction; Peak extent based features; Modified division point based features; NN.

Received June 7, 2015; accepted January 13, 2016

 

Semi Fragile Watermarking for Content based Image Authentication and Recovery in the DWT-DCT Domains

Jayashree S. Pillai1 and Padma Theagarajan2

1Department of Computer Science, Mother Teresa Women’s University, India

2Department of Computer Applications, Sona College of Technology, India

Abstract: Content authentication requires that image watermarks highlight malicious attacks while tolerating incidental modifications that do not alter the image contents beyond a certain tolerance limit. This paper proposes an authentication scheme that uses content-invariant features of the image as a self-authenticating watermark and a quantized, down-sampled approximation of the original image as a recovery watermark, both embedded securely using a pseudorandom sequence into multiple sub-bands in the DWT domain. The scheme is blind, as it does not require the original image during the authentication stage, and is highly tolerant to JPEG2000 compression. It also ensures highly imperceptible watermarked images and is suitable for applications with low tolerance to image quality degradation after watermarking. Both the DCT and DWT transform domains are used in the watermark generation and embedding process.

Keywords: Content authentication, self authentication, recovery watermark, DWT, PQ Sequence

Received May 29, 2015; accepted February 22, 2016


 

An Empirical Study to Evaluate the Relationship of Object-Oriented Metrics and Change Proneness

Ruchika Malhotra and Megha Khanna

Department of Software Engineering, Delhi Technological University, India

Abstract: Software maintenance deals with the changes or modifications which software goes through. Change prediction models help in identifying the classes/modules which are prone to change in future releases of a software product. As change-prone classes are probable sources of defects and modifications, they represent the weak areas of a product. Thus, change prediction models aid software developers in delivering an effective, quality software product by allocating more resources to change-prone classes/modules, as these need greater attention and resources for verification and meticulous testing. This reduces the probability of defects in future releases and yields a better quality product and satisfied customers. This study deals with the identification of change-prone classes in Object-Oriented (OO) software in order to evaluate whether a relationship exists between OO metrics and the change-proneness attribute of a class. The study also compares the effectiveness of two sets of methods for change prediction tasks, i.e., traditional statistical methods (logistic regression) and widely used machine learning methods like Bagging, the Multi-layer Perceptron, etc.

Keywords: Change Proneness, Empirical Validation, Machine Learning, Object-Oriented, Software Quality.

Received May 29, 2015; accepted September 20, 2015

                                                                                    

 

Detection of Neovascularization in Proliferative Diabetic Retinopathy Fundus Images

Suma Gandhimathi and Kavitha Pillai

Department of CSE, University College of Engineering, Kanchipuram, India

Abstract: Neovascularization is a serious sight-threatening condition arising from Proliferative Diabetic Retinopathy (PDR). It causes progressive retinal damage in persons suffering from diabetes mellitus and is characterized by the abnormal growth of new blood vessels from the normal vasculature, which hampers proper blood flow into the retina because of oxygen insufficiency in the retinal capillaries. The present paper aims at detecting PDR neovascularization with the help of the Adaptive Histogram Equalization technique, which enhances the green plane of the fundus image and thereby enriches the detail it presents. Both the neovascularization blood vessels and the normal blood vessels were segmented from the equalized image using the Fuzzy C-means clustering technique. The neovascularization region was marked with a function matrix box based on a compactness classifier, which applied morphological and threshold techniques to the segmented image. Subsequently, a Feed Forward Back-propagation Neural Network operated on the extracted features (e.g., number of segments, gradient variation, mean, variance, standard deviation, contrast, correlation, entropy, energy, homogeneity and cluster shade of the detected neovascularization region) to achieve accurate identification. The method was tested on images from three online datasets as well as two hospital eye clinics. Evaluated on these five image sources, the detection technique showed an overall accuracy of 94.5%, with a sensitivity of 95.4% and a specificity of 49.3%, reiterating that the method can play a vital role in the study and analysis of Diabetic Retinopathy.
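The green-plane enhancement step can be sketched with global histogram equalisation; the paper uses the adaptive, tile-based variant, and the global form is shown here only because it fits in a few lines:

```python
import numpy as np

def equalize_green_channel(rgb):
    """Global histogram equalisation of the green plane of an RGB fundus
    image (uint8). Spreads the concentrated grey levels of the green
    channel across the full range, enriching vessel detail."""
    g = rgb[:, :, 1].astype(np.int64)
    hist = np.bincount(g.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()                    # first non-empty bin
    lut = np.clip((cdf - cdf_min) * 255.0
                  / max(cdf[-1] - cdf_min, 1), 0, 255)
    return lut[g].astype(np.uint8)                  # remap through the CDF
```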

 

Keywords: Diabetic Retinopathy, Neovascularization, Fuzzy C-means clustering, Compactness classifier, Feature extraction, Neural network.

 

Received May 28, 2015; accepted December 27, 2015


 

 

SynchroState: A SPEM-based Solution for Synchronizing Activities and Products through State Transitions

 

Amal Rochd1, Maria Zrikem1, Thierry Millan2, Christian Percebois2, Claude Baron3, and Abderrahmane Ayadi1

1 Laboratory of Modeling and Information Technologies, University of Cadi Ayyad, Morocco

2 Institut de Recherche en Informatique de Toulouse, Université de Toulouse, France

3 Laboratoire d’Analyse et d’Architecture des Systèmes, Université de Toulouse, France

Abstract: Software engineering research has always focused on the efficiency of software development processes. Recently, we have noticed an increasing interest in model-driven approaches in this context. Models that were once merely descriptive nowadays play a productive role in defining engineering processes and managing their lifecycles. However, one problem has not been considered enough: sustaining consistency between products and the implicated activities during the process lifecycle. This issue, identified in this paper as the synchronization problem, needs to be resolved in order to guarantee a flawless execution of a software process. In this paper, we present a SPEM-based solution named SynchroState that highlights the relationship between process activities and products. SynchroState's goal is to ensure synchronization between activities and products so that if one of these two entities undergoes a change, the dependent entities are notified and evolved to sustain consistency. In order to evaluate SynchroState, we have implemented the solution using the AspectJ language and validated it through a case study inspired by the ISPW-6 software process example. The results of this study demonstrate the automatic synchronization of the product state following a change in the activity state during the evolution of the process execution.

Keywords: SynchroState, SPEM, metamodeling, process model, synchronization, AspectJ.

 

Received April 17, 2015; accepted June 9, 2016



 

Machine Learning based Intelligent Framework for Data Preprocessing

Sohail Sarwar1, Zia Ul Qayyum2, Muhammad Safyan1, and Abdul Kaleem1

1Department of Computing, Iqra University, Islamabad, Pakistan

2National University of Computing and Emerging Sciences, Islamabad, Pakistan

Abstract: Data preprocessing, which has a pivotal role in data mining, reduces cost by handling inconsistent, incomplete and irrelevant data through data cleansing, assisting knowledge workers in making effective decisions through knowledge extraction. Prevalent techniques are not very effective, suffering from greater manual effort, increased processing time and lower accuracy, even with constrained data volumes. In this research, a comprehensive, semi-automatic preprocessing framework based on a hybrid of two machine learning techniques, namely Conditional Random Fields (CRF) and Hidden Markov Models (HMM), is devised for data cleansing. The proposed framework is envisaged to be effective and flexible enough to manipulate data sets of any size. A bucket of inconsistent data (comprising a customer address directory) from the Pakistan Telecommunication Company (PTCL) is used to conduct different experiments for training and validation of the proposed approach. A small percentage of semi-cleansed data (the output of preprocessing) is passed to the hybrid of HMM and CRF for learning, and the rest of the data is used for testing the model. Experiments show a superior average accuracy of 95.50% for the proposed hybrid approach compared to CRF (84.5%) and HMM (88.6%) applied separately.

Keywords: Machine Learning, Hidden Markov Model, Conditional Random Fields, Preprocessing.

Received March 14, 2015; accepted June 14, 2016

 


 


 

Explicitly Symplectic Algorithm for Long-time Simulation of Ultra-flexible Cloth

Xiaohui Tan1, Yachun Fan2, 3, and Kang Wang1, 3

1College of Information and Engineering, Capital Normal University, China

2College of Information Science and Technology, Beijing Normal University, China

3College of Information Science and Technology, Beijing Normal University, China

Abstract: In this paper, a symplectic structure-preserving algorithm is presented to solve the Hamiltonian dynamic model of ultra-flexible cloth simulation with high computational stability. Our method preserves the conserved quantity of a Hamiltonian, which enables long-time stable simulation of ultra-flexible cloth. Firstly, the dynamic equation of ultra-flexible cloth simulation is transferred into a Hamiltonian system which is slightly perturbed from the original one but has generalized structure preservability. Secondly, semi-implicit symplectic Runge-Kutta and Euler algorithms are constructed, which can be converted into explicit algorithms for separable dynamic models. Thirdly, to show their advantages, the presented algorithms are used to solve a conservative system which is the primary ultra-flexible cloth model unit. The results show that the presented algorithms preserve the system energy and give accurate results even at large time-steps, whereas ordinary non-symplectic explicit methods exhibit large errors as the time-step increases. Finally, the presented algorithms are adopted to simulate a large-area ultra-flexible cloth to validate their computational capability and stability. The method employs the symplectic features and analytically integrates the force for better stability and accuracy while keeping the integration scheme explicit. Experimental results show that our symplectic schemes are more powerful for integrating Hamiltonian systems than non-symplectic methods. Our method is a common scheme for physically based systems that simultaneously maintains real-time and long-time simulation. It has been implemented in the scene building platform World Max Studio.
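The energy-preserving behaviour claimed for symplectic integrators can be demonstrated on the simplest separable Hamiltonian, a unit harmonic oscillator; this toy system stands in for the cloth model unit, which is not specified in the abstract:

```python
def symplectic_euler(q, p, dt, steps, k=1.0, m=1.0):
    """Semi-implicit (symplectic) Euler for H = p^2/(2m) + k q^2/2.
    Momentum is updated first; the position update then uses the NEW
    momentum, which is what makes the scheme symplectic yet explicit."""
    for _ in range(steps):
        p -= dt * k * q        # kick: force from the current position
        q += dt * p / m        # drift: uses the updated momentum
    return q, p

def energy(q, p, k=1.0, m=1.0):
    return 0.5 * p * p / m + 0.5 * k * q * q
```

The same kick-drift splitting is what keeps the scheme explicit for separable models: each update uses only quantities already computed, yet the energy stays bounded over long runs instead of drifting as with the ordinary explicit Euler method.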

Keywords: Flexible cloth simulation, numerical integration, symplectic method, scene building system.

Received March 13, 2015; accepted June 14, 2016

 


 

Modified Binary Bat Algorithm for Feature Selection in Unsupervised Learning  

Rajalaxmi Ramasamy and Sylvia Rani

Department of Computer Science and Engineering, Kongu Engineering College, India

Abstract: Feature selection is the process of selecting a subset of optimal features by removing redundant and irrelevant features. In supervised learning, the feature selection process uses the class label. But feature selection is difficult in unsupervised learning, since class labels are not present. In this paper, we present a wrapper based unsupervised feature selection method that combines a modified binary bat approach with the k-means clustering algorithm. To ensure diversification in the search space, a mutation operator is introduced into the proposed algorithm. To validate the features selected by our method, classification algorithms like decision tree induction, the Support Vector Machine and the Naïve Bayesian classifier are used. The results show that the proposed method identifies a minimal number of features with improved accuracy when compared with the other methods.
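Two building blocks of a binary bat search as described here, sketched under our own naming; the paper's full update rules and parameters are not reproduced:

```python
import math
import random

def binary_position_update(velocity, rng):
    # Sigmoid transfer function maps a real-valued bat velocity to the
    # probability that the corresponding feature bit is set to 1
    s = 1.0 / (1.0 + math.exp(-velocity))
    return 1 if rng.random() < s else 0

def mutate(position, pmut, rng):
    # Bit-flip mutation over a binary position (1 = feature selected);
    # this is the diversification operator the abstract introduces
    return [bit ^ 1 if rng.random() < pmut else bit for bit in position]
```

In the wrapper setting, each candidate bit-vector would be scored by clustering the selected features with k-means and evaluating the resulting partition.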

Keywords: Feature selection, Unsupervised learning, Binary Bat algorithm, Mutation

Received March 10, 2015; accepted December 21, 2015

 

 
 
Copyright 2006-2009 Zarqa Private University. All rights reserved.
Print ISSN: 1683-3198.
 
 