September 2014, No.5

Hardening the ElGamal Cryptosystem in the Setting of the Second Group of Units

 

Ramzi Haraty, AbdulNasser ElKassar, and Suzan Fanous

Department of Computer Science and Mathematics, Lebanese American University, Lebanon

 

Abstract: The ElGamal encryption scheme is best described in the setting of any finite cyclic group. Its classical case is typically presented in the multiplicative group of the ring of integers modulo a prime p and in the multiplicative groups of finite fields of characteristic two. The ElGamal cryptosystem was modified to deal with Gaussian integers, and extended to work with the group of units of Zp[x]/<x2>. In this paper, we consider yet another extension of the ElGamal cryptosystem, employing the second group of units of Zn and the second group of units of Z2[x]/<h(x)>, where h(x) is an irreducible polynomial. We describe the arithmetic needed in the new setting, and present examples, proofs and algorithms to illustrate the applicability of the proposed scheme. We implement our algorithms and conduct testing to evaluate the accuracy, efficiency and security of the modified cryptographic scheme.
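As a point of reference, the classical ElGamal scheme over the multiplicative group Z_p* — the setting the paper generalizes — can be sketched as follows. This is a minimal illustrative sketch only; the second-group-of-units variant requires the extended arithmetic described in the paper, and all names and parameters here are ours.

```python
import random

def elgamal_keygen(p, g):
    """Key generation in the multiplicative group Z_p* (p prime, g a generator)."""
    x = random.randrange(2, p - 1)        # private key
    h = pow(g, x, p)                      # public key component h = g^x mod p
    return (p, g, h), x

def elgamal_encrypt(pub, m):
    """Encrypt a message m (0 < m < p) under the public key."""
    p, g, h = pub
    k = random.randrange(2, p - 1)        # fresh ephemeral key per message
    c1 = pow(g, k, p)
    c2 = (m * pow(h, k, p)) % p
    return c1, c2

def elgamal_decrypt(priv, p, c1, c2):
    """Recover m = c2 * s^(-1) mod p, where s = c1^x is the shared secret."""
    s = pow(c1, priv, p)
    return (c2 * pow(s, p - 2, p)) % p    # s^(-1) via Fermat's little theorem
```

A round trip with the toy parameters p = 467, g = 2 recovers the plaintext; security, of course, requires a much larger prime.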

Keywords: Second group of units of Zn and Zn[x]/<h(x)>, ElGamal cryptosystem, baby-step giant-step attack algorithm.

Received April 24, 2012; accepted October 25, 2012; published online February 26, 2014

 

Full Text

 

 

 

Universal Forgery Attack on a Strong Designated Verifier Signature Scheme

Chien-Lung Hsu1 and Han-Yu Lin2
1Department of Information Management, Chang Gung University, Taiwan
2Department of Computer Science and Engineering, National Taiwan Ocean University, Taiwan

 
Abstract: Based on the bilinear Diffie-Hellman assumption, in 2009, Kang et al. proposed an identity-based strong designated verifier signature scheme which allows only the intended verifier to verify the signature. Furthermore, the designated verifier is not capable of transferring this conviction to any third party. Their scheme was proved secure in the random oracle model. In this paper, however, we demonstrate that their scheme is vulnerable to a universal forgery attack for arbitrarily chosen messages. Moreover, an efficient and provably secure improvement that eliminates the security weakness is presented.

Keywords: Universal forgery, identity-based, designated verifier, digital signature, bilinear pairing.
 
 
  Received September 15, 2012; accepted February 27, 2013
 

Full Text

 

Anticipatory Bound Selection Procedure (ABSP) for Vertex K-Center Problem

Rattan Ran and Deepak Garg
Computer Science and Engineering Department, Thapar University, India

 
Abstract: The vertex k-center problem seeks to identify k locations as centers in a given network of n connected nodes whose distances satisfy the triangle inequality. This paper presents an efficient algorithm that provides a better solution to the vertex k-center problem. An Anticipatory Bound Selection Procedure (ABSP) is deployed to find the initial threshold distance (or radius), eradicating the gap between the minimum distance (lower bound) and the maximum distance (upper bound). A jump-based scheme is then applied to find the optimal coverage distance. The main feature of this algorithm is that it provides an optimal solution with fewer iterations, which has not been achieved so far.
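For orientation, the classical greedy farthest-first heuristic for vertex k-center — a well-known 2-approximation under the triangle inequality, shown here as a baseline and not as the ABSP algorithm itself — can be sketched as:

```python
def greedy_k_center(dist, k):
    """Farthest-first traversal for vertex k-center.
    dist is a full n x n distance matrix satisfying the triangle inequality."""
    n = len(dist)
    centers = [0]                                 # start from an arbitrary vertex
    d = list(dist[0])                             # distance of each vertex to its nearest center
    while len(centers) < k:
        far = max(range(n), key=lambda v: d[v])   # farthest vertex becomes the next center
        centers.append(far)
        d = [min(d[v], dist[far][v]) for v in range(n)]
    return centers, max(d)                        # chosen centers and the coverage radius
```

On four collinear points at 0, 1, 10, 11 with k = 2, the sketch picks one center in each pair and covers the graph with radius 1.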


Keywords: Facility planning and design, facility location allocation, k-center, set covering problem.
 
 
  Received February 21, 2012; accepted March 19, 2013
 

Full Text

 

Grey Relational Effort Analysis Technique using Regression Methods for Software Estimation

Nagpal Geeta1, Uddin Moin2, and Kaur Arvinder3
1Department of Computer Science and Engineering, National Institute of Technology, India
2Faculty of Engineering and Technology, Delhi Technological University, India
3Department of Information Technology, University School of Information Technology, India

 
Abstract: Software project planning and estimation is the most important challenge for software developers and researchers. It incorporates estimating the size of the software project to be produced, estimating the effort required, developing initial project schedules, and ultimately, estimating the overall cost of the project. Numerous empirical explorations have been performed on the existing methods, but they lack convergence in choosing the best prediction methodology. Analogy-based estimation is still one of the most extensively used methods in industry; it is based on finding effort from similar projects in the project repository. Two alternative approaches using analogy for estimation are proposed in this study. First, a precise and comprehensible predictive model based on the integration of Grey Relational Analysis (GRA) and regression is discussed. The second approach deals with the uncertainty in software projects, and with how fuzzy set theory in fusion with grey relational analysis can minimize this uncertainty. The empirical results attained are remarkable, indicating that the methodologies have great potential and can be used as candidate approaches for software effort estimation. The results obtained using both methods are subjected to rigorous statistical testing using the Wilcoxon signed rank test.
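To illustrate the GRA component, here is a minimal sketch of the standard grey relational grade computation in its generic textbook form; the variable names, the distinguishing coefficient rho = 0.5, and the assumption that all series are pre-normalized are ours, not the paper's:

```python
def grey_relational_grades(reference, candidates, rho=0.5):
    """Score each candidate series by its grey relational grade to a reference.
    Assumes all series are normalized to a comparable scale and that at least
    one candidate differs from the reference (so dmax > 0)."""
    deltas = [[abs(r - c) for r, c in zip(reference, cand)] for cand in candidates]
    dmin = min(min(row) for row in deltas)
    dmax = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        # grey relational coefficient per attribute, averaged into one grade
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades
```

An identical candidate gets grade 1.0; more distant candidates get lower grades, which is what makes the measure usable for analogy-based project retrieval.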


Keywords: Software estimation, estimation by analogy, fuzzy clustering, robust regression, GRA.
 
 
  Received April 9, 2012; accepted March 19, 2013
 

Full Text

 

An Optimal Feature Subset Selection using GA for Leaf Classification

Valliammal Narayan and Geethalakshmi Subbarayan
Department of Computer Science, Avinashilingam Institute for Home Science and Higher Education for Women (Deemed University), India

 
Abstract: This paper describes an optimal approach, based on a Genetic Algorithm (GA), for feature extraction and selection for the classification of leaves. The selection of an optimal feature subset and the subsequent classification have become an important methodology in the field of leaf classification. A deterministic feature sequence is extracted from the leaf images using the GA technique, and these extracted features are then used to train a Support Vector Machine (SVM). The GA is applied to optimize the color and boundary sequence features, and to improve the overall generalization performance based on the matching accuracy. The SVM is applied to produce the false positive and false negative features. Our experimental results indicate that the application of GA for feature subset selection, with SVM as the classifier, is computationally effective and improves the accuracy of classifying leaf patterns compared to k-NN.
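A generic sketch of GA-driven feature-subset search over bitmasks follows. The toy fitness function and all parameters below are illustrative assumptions; in the paper's setup the fitness would be the SVM matching accuracy on the candidate subset:

```python
import random

def ga_feature_select(fitness, n_features, pop_size=20, generations=30, seed=0):
    """Minimal GA over feature-subset bitmasks: truncation selection,
    one-point crossover, and single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 2]            # keep the best half unchanged
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_features)     # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_features)] ^= 1  # single-bit mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy fitness: reward selecting the first three "informative" features,
# lightly penalize every extra feature selected.
def toy_fitness(mask):
    return sum(mask[:3]) - 0.1 * sum(mask[3:])
```

Because the elite are carried over unchanged, the best fitness in the population never decreases across generations.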


Keywords: Feature extraction, feature selection, classification, GA, SVM, geometric, color, boundary and ripple features.
 
 
  Received January 5, 2012; accepted March 21, 2013
 

Full Text

 

Mining Frequent User Query Patterns from XML Query Streams

Tsui-Ping Chang
Department of Information Technology, Ling Tung University, Taiwan

 
Abstract: An XML query stream is a massive and unbounded sequence of queries that are continuously generated at a fast speed by users over the Internet. Compared with traditional approaches of mining frequent user query patterns in static XML query databases, pattern mining in XML query streams is more challenging, since several extra requirements need to be satisfied. In this paper, a mining algorithm, XSM, is proposed to discover frequent user query patterns over an XML query stream. Unlike most existing algorithms, the proposed algorithm works on a novel encoding scheme. Through this scheme, only the leaf nodes of XML query trees are considered, resulting in higher mining performance. The performance of the proposed algorithm is tested and analyzed through a series of experiments. The experimental results show that XSM outperforms other algorithms in execution time.


Keywords: Frequent XML query pattern, XML query stream mining, encoding scheme, database.
 
 
  Received May 25, 2012; accepted March 23, 2013
 

Full Text

 

An Information Theoretic Scoring Function in Belief Network

Muhammed Naeem1 and Sohail Asghar2
1Department of Computer Science, Mohammad Ali Jinnah University Islamabad, Pakistan
2PMAS-Arid Agriculture University Institute of Information Technology, Rawalpindi, Pakistan

 
Abstract: We propose a novel measure of mutual information, known as Integration to Segregation (I2S), explaining the relationship between two features. We investigate its nontrivial characteristics while comparing its performance in terms of class imbalance measures. We show that I2S possesses characteristics useful for identifying the sink and its source (parent) in a conventional directed acyclic graph in structure learning techniques such as the Bayesian Belief Network. We empirically show that identifying the sink and its parent using a conventional scoring function is not effective in maximizing the discriminant function, because it is unable to identify the best topology. However, I2S is capable of significantly maximizing the discriminant function, with the potential of identifying the network topology in structure learning.
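For reference, the standard mutual information that I2S builds upon can be computed from paired samples as follows; this is the baseline notion only, and the I2S measure itself is defined in the paper:

```python
from math import log2
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information I(X;Y) in bits, from empirical joint frequencies
    of paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts for X
    py = Counter(ys)             # marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        pj = c / n
        mi += pj * log2(pj / ((px[x] / n) * (py[y] / n)))
    return mi
```

Perfectly dependent features yield 1 bit on balanced binary data, while independent features yield 0 — the contrast a DAG scoring function exploits when ranking candidate parents.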


Keywords: Mutual dependence, information theory, structure learning, scoring function.
 
 
  Received September 9, 2012; accepted March 21, 2013
 

Full Text

 

Blind Restoration of Radiological Images using Hybrid Swarm Optimized Model Implemented on FPGA

Slami Saadi1, Abderrezak Guessoum2, Maamar Bettayeb3, and Kamel Abdelhafidi4
1Department of Technology, University of Djelfa, Algeria
2Department of Electronics, University of Blida, Algeria
3Department of Electrical and Computer Engineering, University of Sharjah, UAE
4LMetallic and Semiconducting Materials, University of Biskra, Algeria

 
Abstract: Image restoration is an important step in many image processing applications. In this work, we attempt to restore radiological images degraded during acquisition and processing. This paper presents details of the work carried out to optimize a neural network (NN) for identifying an autoregressive moving average (ARMA) model used to restore nonlinearly degraded images. The degraded image is expressed as an ARMA process. To improve the learning performance, the NN is trained quickly using a hybrid swarm intelligence optimization approach based on the synergy of the Particle Swarm Optimization (PSO) and Bacterial Foraging Optimization (BFO) algorithms, which is compared with other training techniques such as the back propagation, quasi-Newton and Levenberg-Marquardt algorithms. Both the original image and the blur function are identified through this model. The optimized ARMA-NN model is implemented on a Xilinx reconfigurable field-programmable gate array (FPGA) using the hardware description language VHDL. The VHDL code is tested on the ML505 rapid prototyping platform, based on a Xilinx Virtex5-LXT FPGA chip. Simulation results using some test and real images are presented to support the applicability of this approach compared to the standard blind deconvolution method, which maximizes the likelihood using an iterative process. The comparison is based on performance evaluation using some recent image quality metrics.
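As background, plain PSO — one half of the hybrid; the BFO component and the NN-training application are omitted — can be sketched as a generic function minimizer. All parameter values are conventional illustrative choices, not the paper's:

```python
import random

def pso_minimize(f, dim, n_particles=15, iters=60, seed=1):
    """Minimal particle swarm optimization of f over R^dim."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                     # each particle's best position
    gbest = min(pbest, key=f)[:]                   # swarm's best position
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - x[d])
                            + c2 * rng.random() * (gbest[d] - x[d]))
                x[d] += vs[i][d]
            if f(x) < f(pbest[i]):
                pbest[i] = x[:]
                if f(x) < f(gbest):
                    gbest = x[:]
    return gbest
```

In NN training, f would be the network's restoration error as a function of its weight vector, rather than a benchmark function.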


Keywords: Radiological image restoration, ARMA, NN, PSO, BFO, FPGA.
 
 
  Received January 25, 2012; accepted March 28, 2013
 

Full Text

 

An Optimized Method for B-Mode Echocardiographic Video Compression Based on Motion Estimation and Wavelet

Nima Sahba1, Keivan Maghooli1, and Vahid Tavakoli2
1Department of Biomedical Engineering, Science and Research Branch, Islamic Azad University, Iran
2Department of Electrical and Computer Engineering, University of Louisville, USA

 
Abstract: In this paper, a new approach for echocardiography image compression is developed. To achieve a high rate of image compression while preserving the image information, motion detection and the wavelet transform are combined. In the first step, a Region of Interest (ROI) is determined and the image is divided into several (8×8 pixel) blocks. Thereafter, the motion vectors of each block are estimated to predict the subsequent frame (the predicted model frame). Additionally, the wavelet component is created by applying the wavelet transform to the main image (the one to be predicted), and the extracted wavelet component is utilized as a predicted-frame error compensator. Subsequently, the entropy of the motion vectors of each block is extracted as a criterion to determine the level of quantization used for the wavelet frame, which is quantized using Lloyd's algorithm. Finally, the correlation of each block of the predicted model frame with the corresponding block of the main image is calculated to evaluate the accuracy of the result. If the calculated correlation exceeds 0.5, an optimized combination of the predicted model and the corresponding block of the wavelet frame is utilized as the final block; otherwise, the block of the wavelet frame is taken as the final result. The results were analyzed using PSNR, MSE, GLCM and expert-based image quality validation. The proposed algorithm was compared to the MPEG, H.264 and VC-1 standards, which demonstrated the outperformance of the proposed algorithm.
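The motion-estimation step can be illustrated by a minimal full-search block-matching sketch under a sum-of-absolute-differences criterion; the tiny block size, search range, and list-of-lists frame layout here are illustrative assumptions, not the paper's 8×8 setup:

```python
def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def block(frame, r, c, size):
    """Extract a size x size block with its top-left corner at (r, c)."""
    return [row[c:c + size] for row in frame[r:r + size]]

def motion_vector(prev, cur, r, c, size=2, search=2):
    """Full search: find the displacement into `prev` that best matches
    the block of `cur` at (r, c), minimizing SAD."""
    target = block(cur, r, c, size)
    best, best_cost = (0, 0), float("inf")
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr <= len(prev) - size and 0 <= cc <= len(prev[0]) - size:
                cost = sad(block(prev, rr, cc, size), target)
                if cost < best_cost:
                    best, best_cost = (dr, dc), cost
    return best, best_cost
```

A bright patch that moves down one row and right two columns between frames is recovered as the displacement (-1, -2) back into the previous frame, with zero residual.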


Keywords: B-mode echocardiography, video compression, motion estimation.
 
 
  Received October 6, 2012; accepted March 28, 2013
  

Full Text

 

A Reference Comments Crawler for Assisting Research Paper Writing

Hocheol Jeon
Agency for Defense Development, Korea

 
Abstract: When writing a research paper, significant effort is spent comparing the current work to other related studies. In general, these comparisons comprise the 'Related Work' section of the paper, with the relevant reference papers cited and analyzed. An automatic method for gathering and managing information about other researchers' reviews of reference papers would reduce the time and effort needed for such comparisons. Thus, in this paper, we propose a crawler that gathers the comments made by other researchers on the papers that are cited in the 'Related Work' section and listed in the 'Bibliography' or 'References' section of a research paper. The Reference Comments Crawler (RCC) system collects the text pertaining to the reference papers, providing useful information to researchers by extracting relevant data from the comments. The RCC considers different types of reference identifiers, and the comment sentences are extracted based on these reference identifiers and user-defined extraction rules. The RCC system also extracts and provides the previous and subsequent sentences, labeled as PreSentences and PostSentences, together with the comment sentences containing the reference identifier. A series of experiments was performed to evaluate precision and recall, and the results showed that the RCC system can provide useful information to the user with a high degree of precision and recall. Furthermore, these experiments show that our system can assist researchers by reducing the time and effort spent comparing and analyzing related work.
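The extraction of comment sentences together with their PreSentences and PostSentences can be sketched roughly as follows; the naive period-based sentence splitter and the bracketed-identifier matching are simplifying assumptions, not the RCC's actual extraction rules:

```python
import re

def comment_sentences(text, ref_id):
    """Find sentences citing a bracketed reference identifier (e.g. "[3]"),
    returning each with its previous and subsequent sentence as context."""
    # Naive sentence splitter: break after ., ! or ? followed by whitespace.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    results = []
    for i, s in enumerate(sentences):
        if ref_id in s:
            results.append({
                "PreSentence": sentences[i - 1] if i > 0 else "",
                "Comment": s,
                "PostSentence": sentences[i + 1] if i + 1 < len(sentences) else "",
            })
    return results
```

On a toy 'Related Work' paragraph, each hit yields the citing sentence plus its surrounding context, which is the unit of information the crawler presents to the user.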


Keywords: RCC, comment sentences, reference identifiers, research paper writing.

  Received March 13, 2012; accepted April 11, 2013
  

Full Text

 

Strategy to Reduce False Alarms in Intrusion Detection and Prevention Systems

Qais Qassim, Ahmed Patel, and Abdullah Mohd Zin
Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, Malaysia

 
Abstract: Pervasive and sustained cyber attacks against information systems continue to pose a potentially devastating impact. The security of information systems, and of the networks that connect them, is becoming more significant than ever as the number of security incidents steadily climbs. Traditional means of protection, such as firewalls and encryption software, are no longer sufficient or effective. In the struggle to secure data and the systems on which it is stored, an Intrusion Detection and Prevention System (IDPS) can prove to be an invaluable tool. An IDPS can also be a very useful tool for recording forensic evidence that may be used in legal proceedings. Intrusion detection and prevention systems provide a high detection rate for attack attempts. However, IDPS performance is hindered by the high false alarm rates they produce. This is a serious concern in information security, because every false alarm can have a severe impact on the system, such as the disruption of information availability when the IDPS blocks traffic it suspects to be an attack attempt. The aim of this paper is to propose a strategy that reduces these false alarm rates to an acceptable level, while maintaining total security against serious attacks, by implementing a fuzzy logic risk analysis technique for analyzing the generated alarms.
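A toy sketch of fuzzy-logic risk scoring for alarms follows; the triangular membership functions, the 0-10 input scale, and the single Mamdani-style rule are illustrative assumptions, not the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to b, falling from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def alarm_risk(severity, asset_value):
    """Fuzzy risk score in [0, 1] for an IDPS alarm, on 0-10 input scales.
    Single rule: IF severity is high AND asset value is high THEN risk is high;
    low-scoring alarms are candidates for suppression rather than blocking."""
    sev_hi = tri(severity, 4, 10, 16)      # "high severity" membership
    val_hi = tri(asset_value, 4, 10, 16)   # "high asset value" membership
    return min(sev_hi, val_hi)             # Mamdani AND (minimum)
```

Scoring alarms this way lets the operator set a single risk threshold, below which alarms are logged but not acted upon, instead of treating every alarm as a confirmed attack.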


Keywords: Information security, intrusion detection, intrusion prevention, anomaly detection, risk analysis.

  Received January 30, 2012; accepted April 15, 2013
  

Full Text

 

Speech to Text Engine for Jawi Language

Zaini Arifah Othman1, Nor Aniza Abdullah2, Zaidi Razak3, and Mohd Yakub Mohd Yusoff4
1,2,3Faculty of Computer Science and Information Technology, University of Malaya, Malaysia
4Academy of Islamic Studies, University of Malaya, Malaysia

 
Abstract: This paper focuses on the development of a speech-to-special-character translation engine, namely a Malay speech to Jawi text engine. Jawi is a unique script derived from Arabic but read in the Malay language. Little research can be found on speech technology developed for Jawi, and this research would be useful to researchers who wish to bring its benefits to many related ICT applications. The use of the Zero Crossing Rate (ZCR) as a robust algorithm for accurate automatic detection of syllable boundaries in the speech signal is discussed. The combination of LPC and an ANN is used in this research to extract and classify the speech signals, with the backpropagation training method. This paper also discusses the use of Jawi Unicode in the final character tagging process to represent each Jawi character present in the spoken word. As there is no published standard list of Jawi Unicode, the Jawi Unicode table produced by previous research is further investigated and enhanced in this work, in order to achieve better accuracy in the Jawi character-phoneme representation. This list is based on the combination of Traditional Arabic and other scripts. A prototype educational learning tool was also developed to enable school children to recognize and read Jawi text, check their pronunciation, and learn from their mistakes independently.
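The ZCR feature underlying the syllable-boundary detection can be sketched as follows (a minimal per-frame computation; frame sizes and thresholds used on top of it are the system's own choices):

```python
def zero_crossing_rate(frame):
    """Fraction of consecutive sample pairs whose sign changes.
    High ZCR is typical of unvoiced/fricative segments and silence noise,
    low ZCR of voiced segments, which is what makes the rate usable as a
    cue for syllable boundaries."""
    crossings = sum(
        1 for a, b in zip(frame, frame[1:])
        if (a >= 0) != (b >= 0)          # sign change between neighbors
    )
    return crossings / (len(frame) - 1)
```

An alternating frame gives a rate of 1.0 and a monotone positive frame gives 0.0; a boundary detector would track how this rate changes from frame to frame.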


Keywords: Speech-to-text, jawi unicode, linear predictive coding, artificial neural network, ZCR.

  Received May 24, 2012; accepted April 11, 2013
  

Full Text

 

Analysis of Visual Features in Local Descriptor for Multi-Modality Medical Image

Hizmawati Madzin1, Roziati Zainuddin2, and Nur Sabirin Mohamed3
1Multimedia Department, Universiti Putra Malaysia, Malaysia
2Artificial Intelligence Department, University of Malaya, Malaysia
3Centre for Foundation Studies in Science, University of Malaya, Malaysia

 
Abstract: In medical applications, the use of multiple medical images generated by modalities such as X-ray, Magnetic Resonance Imaging (MRI) and CT scans is a standard tool of medical procedure for physicians. The major problems in analyzing various modalities of medical images are the inconsistent orientation and position of the body parts of interest. In this research, local descriptors of texture, shape and color are used to extract features from multi-modality medical images, in the form of patches and interest-point descriptors. The main advantage of using local descriptors is that these features require no segmentation preprocessing and are also robust to local changes. These features are then classified by modality using Support Vector Machine (SVM) and k-Nearest Neighbor (k-NN) classifiers. The results show that different modalities have different characteristics, and highlight the importance of selecting significant features.


Keywords: Multi-modality medical images, texture, shape and color features analysis, local patches, local interest points.
 
Received November 7, 2011; accepted December 27, 2012
 
 
Copyright 2006-2009 Zarqa Private University. All rights reserved.
Print ISSN: 1683-3198.
 
 