Securing RSA Algorithm against Timing Attack
Amuthan Arjunan, Praveena Narayanan, and Kaviarasan Ramu
Department of Computer Science and Engineering, Pondicherry Engineering College, India
Department of Information Technology, Alpha College of Engineering and Technology, India
Department of Computer Science and Engineering, Alpha College of Engineering and Technology, India
Abstract: Security plays an important role in many embedded systems. All security algorithms are implemented in hardware or software on physical devices that interact with, and are influenced by, their environments. An attacker can monitor and analyze these physical interactions to extract side channel information, which is then used in cryptanalysis. This type of cryptanalysis is known as side channel cryptanalysis, and attacks performed using it are known as side channel attacks. Side channel attacks are categorized by the kind of side channel information they exploit, such as timing, power consumption, electromagnetic emissions, or faulty outputs emitted by a cryptographic device during operation. The attack that recovers the secret key from the run-time characteristics of a cryptosystem is known as the timing attack. Both symmetric and asymmetric algorithms are vulnerable to side channel attacks. RSA is an asymmetric algorithm that plays an important role in many applications, but it is vulnerable to the timing attack. We therefore propose a new technique, called the "Randomness Algorithm", together with the Optimal Asymmetric Encryption Padding (OAEP) technique, to improve the robustness of the RSA algorithm against timing attacks by introducing randomness into the decryption computation, making the timing information unusable to the attacker.
Keywords: Cryptanalysis, side channel attacks, timing attack, RSA, OAEP.
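The decryption-randomization idea can be illustrated with classic base blinding, a standard countermeasure in this family; the toy key sizes and the blinding scheme below are a generic sketch, not the paper's exact Randomness Algorithm:

```python
import math
import random

# Toy RSA parameters for illustration only; real keys are 2048+ bits.
p, q = 61, 53
n = p * q        # 3233
e = 17
d = 2753         # satisfies e*d ≡ 1 (mod (p-1)*(q-1))

def decrypt_blinded(c):
    """Decrypt with base blinding: the modular exponentiation operates on
    a randomized ciphertext, so its running time is decorrelated from c."""
    while True:
        r = random.randrange(2, n - 1)
        if math.gcd(r, n) == 1:          # r must be invertible mod n
            break
    c_blind = (c * pow(r, e, n)) % n     # blind: c' = c * r^e mod n
    m_blind = pow(c_blind, d, n)         # m' = m * r mod n
    return (m_blind * pow(r, -1, n)) % n # unblind: m = m' * r^-1 mod n

m = 65
assert decrypt_blinded(pow(m, e, n)) == m
```

Because a fresh `r` is drawn on every call, repeated timing measurements of the same ciphertext no longer expose the exponentiation's data-dependent behaviour.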
Threshold-based Steganography: A Novel Technique for Improved Payload and SNR
Zakir Khan1, Mohsin Shah2, Muhammad Naeem3, Toqeer Mahmood4, Shah Nawaz Ali Khan5,
Noor Ul Amin6, and Danish Shahzad7
1, 2, 3, 5, 6Department of Information Technology, Hazara University Mansehra, Pakistan
4Department of Computer Engineering, University of Engineering and Technology, Pakistan
7Department of Computer Engineering, Kadir Has University, Turkey
Abstract: Steganography is the art of hiding user information in various file types, including image, audio and video. The security of steganography lies in the imperceptibility of the secret information in the cover image. The Human Visual System (HVS) cannot detect changes in the low color values of an image. To the best of our knowledge, none of the available steganographic techniques has exploited this weakness of the HVS. In this paper, a new Least Significant Bit (LSB) technique is presented which hides information in the cover image taking into account the color or grey-level value of every pixel. Our experiments show that the proposed technique has a higher payload and lower perceptibility of the secret information hidden in the cover image than existing LSB-based algorithms. We used MATLAB to implement the proposed algorithm.
Keywords: LSB, cover image, stego image, HVS, data hiding.
Received April 29, 2014; accepted July 9, 2014
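The pixel-value-aware LSB idea can be sketched as follows on a grey-level cover; the threshold of 128 is an assumed illustrative parameter, not the paper's value:

```python
def embed_lsb(pixels, bits, threshold=128):
    """Embed bits into the LSB of pixels whose grey value is below
    `threshold` (low intensities, where the HVS is least sensitive).
    `threshold` is a hypothetical parameter for illustration."""
    out, i = list(pixels), 0
    for idx, p in enumerate(out):
        if i >= len(bits):
            break
        if p < threshold:
            out[idx] = (p & ~1) | bits[i]   # overwrite the LSB
            i += 1
    if i < len(bits):
        raise ValueError("cover too small for payload")
    return out

def extract_lsb(pixels, nbits, threshold=128):
    """Read LSBs back from the same sub-threshold pixels, in order."""
    return [p & 1 for p in pixels if p < threshold][:nbits]
```

With an even threshold, flipping an LSB never moves a pixel across the threshold, so the embedding and extraction passes visit the same pixels.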
An Approach to Automatic Reconstruction of Apictorial Hand Torn Paper Document
Rayappan Lotus1, Justin Varghese2, and Subash Saudia1
1Centre for Information Technology and Engineering, Manonmaniam Sundaranar University, India
2College of Computer Science, King Khalid University, Saudi Arabia
Abstract: Digital automation of the reconstruction of apictorial hand-torn paper documents increases efficacy and reduces human effort. Reconstruction of torn documents is important in fields such as archaeology, art conservation and forensic science. The devised novel technique for hand-torn paper documents consists of pre-processing, feature extraction and reconstruction phases. Torn fragments' boundaries are simplified into polygons using the Douglas-Peucker polyline simplification algorithm. Features such as the Euclidean distance and the number of sudden changes in contour orientation are extracted. Our matching criteria identify the matching counterparts. The proposed features curtail ambiguity and enrich the efficacy of the reconstruction. The reconstruction results on hand-torn paper documents favour the proposed methodology.
Keywords: Reconstruction, apictorial, torn documents.
Received October 17, 2013; accepted May 29, 2014
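The Douglas-Peucker simplification used in the pre-processing phase can be sketched as the usual recursion: keep the point farthest from the chord between the endpoints whenever it exceeds a tolerance, and recurse on both halves.

```python
import math

def perp_dist(pt, a, b):
    """Perpendicular distance from pt to the line through a and b."""
    (x, y), (x1, y1), (x2, y2) = pt, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, eps):
    """Simplify a polyline: keep the farthest point from the chord if
    its distance exceeds eps, then recurse on the two halves."""
    if len(points) < 3:
        return list(points)
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= eps:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:idx + 1], eps)
    right = douglas_peucker(points[idx:], eps)
    return left[:-1] + right   # drop the duplicated split point
```

A fragment boundary traced as a dense pixel contour reduces to a polygon whose vertices mark the orientation changes the matching criteria then count.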
Selectivity Estimation of Range Queries in Data Streams using Micro-Clustering
Sudhanshu Gupta and Deepak Garg
Computer Science and Engineering Department, Thapar University, India
Abstract: Selectivity estimation is an important task in query optimization. Common data mining techniques are not applicable to large, fast and continuous data streams, which permit only one-pass processing of the data. These constraints make range query estimation a challenging task. We propose a technique for range query estimation using micro-clustering. The technique maintains cluster statistics in the form of micro-clusters, which also capture the distribution of the cluster values using cosine coefficients. These cosine coefficients are used to estimate range queries, which may span data values spread over several clusters. The technique is compared with the cosine series technique for selectivity estimation. Experiments on both synthetic and real datasets of varying sizes confirm that our technique offers substantial improvements in accuracy over other methods.
Keywords: Selectivity estimation, range query, data streams, micro-clustering.
Received September 22, 2012; accepted December 24, 2013
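The cosine-coefficient idea can be sketched on a plain (non-streaming) value-frequency histogram: keep the first K DCT coefficients of the distribution and answer range counts from the truncated series. The streaming micro-cluster bookkeeping of the paper is omitted; this is only the estimation step.

```python
import math

def cosine_coeffs(hist, K):
    """First K DCT-II coefficients of a value-frequency histogram."""
    D = len(hist)
    return [sum(h * math.cos(math.pi * k * (n + 0.5) / D)
                for n, h in enumerate(hist)) for k in range(K)]

def estimate_range(coeffs, D, lo, hi):
    """Estimate how many values fall in [lo, hi] by evaluating the
    truncated inverse cosine series at each bucket."""
    total = 0.0
    for n in range(lo, hi + 1):
        v = coeffs[0] / D
        for k in range(1, len(coeffs)):
            v += 2.0 / D * coeffs[k] * math.cos(math.pi * k * (n + 0.5) / D)
        total += v
    return total
```

With K equal to the domain size the estimate is exact; smaller K trades accuracy for the compact per-cluster summary the paper maintains.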
Hyperspectral Image Compression Based on DWT and TD with ALS Method
Kala Sundar Rajan1 and Vasuki Siva Murugesan2
1Department of Electronics and Communication Engineering, Sri Subramanya College of Engineering and Technology, India
2Department of Electronics and Communication Engineering, Velammal College of Engineering and Technology, India
Abstract: Compression of Hyperspectral Images (HSI) is an important issue in remote sensing applications due to their huge data size. An efficient technique for HSI compression is proposed, based on the Discrete Wavelet Transform (DWT) and Tucker Decomposition (TD) with the Adaptive Least Squares (ALS) method. This technique exploits both the spatial and spectral information in the images. The ALS method is used to compute the TD, which is applied to the DWT coefficients of the HSI spectral bands. The DWT segments the HSIs into various sub-images, while the TD conserves the energy of the sub-images. Run Length Encoding (RLE) performs quantization of the component matrices and encoding of the core tensors. Experiments compare HSI compression based on DWT and TD with the ALS method against HSI compression methods based on lossless JPEG (JPEG-LS), JPEG2000, Set Partitioning Embedded Block (SPECK), Object-Based (OB)-SPECK and 3D-SPECK, and the results of our work are found to be good in terms of Compression Ratio (CR) and Peak Signal-to-Noise Ratio (PSNR).
Keywords: ALS, CR, DWT, HSI, PSNR, RLE, TD.
Received June 17, 2013; accepted September 26, 2013
Task Scheduling Using Probabilistic Ant Colony Heuristics
Umarani Srikanth1, Uma Maheswari2, Shanthi Palaniswami3, and Arul Siromoney3
1Department of PG Studies in Engineering, S.A. Engineering College, India
2Department of Information Science and Technology, Anna University, India
3Department of Computer Science and Engineering, Anna University, India
Abstract: The problem of determining whether a set of tasks can be assigned to a set of heterogeneous processors is, in general, NP-hard. Generating an efficient schedule of tasks for a given application is critical for achieving high performance in a heterogeneous computing environment. This paper presents a novel algorithm based on Ant Colony Optimization (ACO) for the scheduling problem. An attempt is made to arrive at a feasible schedule for a task set on heterogeneous processors, ensuring fair load balancing across the processors within a reasonable amount of time. Three parameters are computed: the average waiting time of tasks, the utilization of individual processors, and the scheduling time of tasks. The results are compared with those of the First Come First Served (FCFS) algorithm, and ACO is found to perform better than FCFS with respect to waiting time and individual processor utilization. Compared with the FCFS approach, the ACO method balances the load fairly among the processors, with the standard deviation of processor utilization being 88.7% lower than that of FCFS. The average waiting time of the tasks is also 34.3% lower than that of the FCFS algorithm. However, there is a 35.5% increase in scheduling time for the ACO algorithm.
Keywords: ACO, tasks scheduling problem, processor utilization, heterogeneous processors.
Received July 6, 2013; accepted September 19, 2013
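An ant-based construction of a task-to-processor assignment can be sketched as below. The pheromone/heuristic weighting and the parameter values (`alpha`, `beta`, `rho`, colony size) are illustrative choices, not the paper's; costs are assumed positive.

```python
import random

def aco_schedule(cost, n_ants=20, n_iter=50, alpha=1.0, beta=2.0, rho=0.1, seed=0):
    """cost[t][p] = run time of task t on processor p (all > 0).
    Each ant assigns tasks to processors probabilistically, guided by
    pheromone and a load-aware heuristic; returns the best assignment
    found and its makespan."""
    rng = random.Random(seed)
    T, P = len(cost), len(cost[0])
    tau = [[1.0] * P for _ in range(T)]            # pheromone trails
    best, best_mk = None, float('inf')
    for _ in range(n_iter):
        for _ in range(n_ants):
            load, assign = [0.0] * P, []
            for t in range(T):
                # desirability: pheromone^alpha * (1 / (load + cost))^beta
                w = [tau[t][p] ** alpha * (1.0 / (load[p] + cost[t][p])) ** beta
                     for p in range(P)]
                r, acc, choice = rng.random() * sum(w), 0.0, P - 1
                for p in range(P):                  # roulette-wheel pick
                    acc += w[p]
                    if r <= acc:
                        choice = p
                        break
                assign.append(choice)
                load[choice] += cost[t][choice]
            mk = max(load)                          # makespan of this ant
            if mk < best_mk:
                best, best_mk = assign, mk
            for t in range(T):                      # evaporate, then deposit
                for p in range(P):
                    tau[t][p] *= (1 - rho)
                tau[t][assign[t]] += 1.0 / mk
    return best, best_mk
```

On a small instance, e.g. `aco_schedule([[3, 5], [2, 4], [4, 1], [6, 2]])`, the colony reliably finds the optimal makespan of 5 (tasks 0 and 1 on processor 0, tasks 2 and 3 on processor 1).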
Parameters Affecting the Judgment Task While Summarizing Documents
Syed Sabir Mohammed1 and Shanmugasundaram Hariharan2
2 Department of Computer Science and Engineering, TRP Engineering College, India
Abstract: Due to instability in user agreement, the need for improvements in document summary generation has received considerable attention. This paper presents improvements to document summarization by analyzing sentence position and the recommendations of sentences by other sentences, in order to build an efficient document summarizer. Since humans are good at summarizing content, we conducted several studies to bring the output of our system closer to user selections (manual summaries). This article focuses only on news articles. We attempted to obtain summaries close or equal to manually generated summaries, and the results obtained were promising. We also show that a term frequency approach combined with position weight gives better results, while adding node weight to these features yields results significantly better than the former approach. The paper also presents studies of some common evaluation criteria for generating a unique summary by various users. The results were compared with the commercially available Microsoft summarizer, and the summaries we produced were better than those of the existing summarizers.
Keywords: Single document summarization, stemming, stop words, term frequency, sentence extraction.
Received July 4, 2013; accepted March 17, 2014
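The "term frequency combined with position weight" scoring can be sketched as follows; the stop-word list, the `1/(i+1)` position weight, and the equal mixing of the two scores are illustrative assumptions, not the paper's tuned parameters (its node-weight feature is omitted):

```python
import re
from collections import Counter

def summarize(text, n=2):
    """Score sentences by average term frequency plus a position weight
    (earlier sentences score higher); return the top n in document order."""
    sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    words = re.findall(r'[a-z]+', text.lower())
    stop = {'the', 'a', 'an', 'is', 'are', 'of', 'in', 'to', 'and', 'on'}
    tf = Counter(w for w in words if w not in stop)
    scores = []
    for i, s in enumerate(sentences):
        terms = [w for w in re.findall(r'[a-z]+', s.lower()) if w not in stop]
        freq = sum(tf[w] for w in terms) / (len(terms) or 1)  # avg term freq
        pos = 1.0 / (i + 1)                                   # position weight
        scores.append((freq + pos, i, s))
    # keep the n highest-scoring sentences, re-sorted into document order
    top = sorted(sorted(scores, reverse=True)[:n], key=lambda t: t[1])
    return ' '.join(s for _, _, s in top)
```

On a short news-style passage, the off-topic sentence is dropped while the lead sentence survives thanks to its position weight.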
An Innovative Two-Stage Fuzzy kNN-DST Classifier for Unknown Intrusion Detection
Xueyan Jing1, Yingtao Bi2, and Hai Deng1
1Department of Electrical and Computer Engineering, Florida International University, USA
2Feinberg School of Medicine, Northwestern University, USA
Abstract: Intrusion detection is an essential part of network security in combating illegal network access and malicious attacks. Due to the constantly evolving nature of network attacks, it has been a technical challenge for an Intrusion Detection System (IDS) to recognize unknown attacks, or known attacks with inadequate training data. In this work, an innovative fuzzy classifier is proposed for effectively detecting both unknown attacks and known attacks with insufficient or inaccurate training information. A Fuzzy C-Means (FCM) algorithm is first employed to softly compute and optimize the clustering centers of the training datasets, with some degree of fuzziness accounting for inaccuracy and ambiguity in the training data. Subsequently, a distance-weighted k-Nearest Neighbors (k-NN) classifier combined with the Dempster-Shafer Theory (DST) is introduced to assess the belief functions and pignistic probabilities of the incoming data for each of the known classes. Finally, a two-stage intrusion detection scheme is implemented, based on the obtained pignistic probabilities and their entropy function, to determine whether the input data are normal, one of the known attacks, or an unknown attack. The proposed intrusion detection algorithm is evaluated on the KDD'99 datasets and their variants containing known and unknown attacks. The experimental results show that the new algorithm outperforms other intrusion detection algorithms and is especially effective in detecting unknown attacks.
Keywords: Network security, intrusion detection, FCM, classifiers, DST.
Received June 10, 2014; accepted February 10, 2015
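The FCM stage of the pipeline can be sketched on scalar data; the deterministic spread initialization and the fuzzifier value m=2 are illustrative choices, and the k-NN/DST stages are not reproduced here:

```python
def fuzzy_c_means(points, c=2, m=2.0, iters=50):
    """Fuzzy C-Means on scalar data: each point gets a membership in
    [0, 1] for every cluster (rows sum to 1); centers are the
    membership-weighted means, with fuzzifier m > 1."""
    step = max(1, len(points) // c)
    centers = sorted(points)[::step][:c]       # spread initial centers
    u = []
    for _ in range(iters):
        # Update memberships from distances to the current centers:
        # u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        u = []
        for x in points:
            d = [abs(x - v) + 1e-12 for v in centers]
            u.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(c)) for j in range(c)])
        # Update centers as fuzzy weighted means
        centers = [sum(u[i][j] ** m * x for i, x in enumerate(points)) /
                   sum(u[i][j] ** m for i in range(len(points)))
                   for j in range(c)]
    return centers, u
```

The soft memberships, rather than hard cluster labels, are what later allow the classifier to express ambiguity in noisy training records.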
On Demand Ciphering Engine Using Artificial Neural Network
Qutaiba Ali and Shefa Dawwd
Department of Computer Engineering, Mosul University, Iraq
Abstract: In this paper, a new lightweight, highly secure ciphering engine creation methodology, called the On Demand Ciphering Engine (ODCE), is suggested. Its main feature is that both the ciphering engine and the secret key are kept secret; they are created by the user or the system administrator when initiating a transmission, and the involved sides then exchange these secrets. The design methodology and structure of the system are described and prototyped using an Artificial Neural Network (ANN) as the main building block. Software and hardware implementations of the suggested system were developed to prove its practicability, and different experimental tests were performed to determine the impact of the suggested method on both network delay and system performance.
Keywords: Network security, cryptography, neural network parallel implementation, ODCE, normalized ciphering engine.
Received November 22, 2013; accepted October 26, 2014
HMM/GMM Classification for Articulation Disorder Correction Among Algerian Children
Abed Ahcène1, 2 and Guerti Mhania1
1Laboratoire Signal et Communications, Ecole Nationale Polytechnique, Algeria
2Scientific and Technical Research Center for the Development of the Arabic Language, Algeria
Abstract: In this paper, we propose an automatic classification of Arabic phonemic substitutions using Hidden Markov Model/Gaussian Mixture Model (HMM/GMM) systems. The main objective is to help Algerian children correct articulation problems. Five cases are analyzed in the experiments: 20 Arabic words are recorded by 20 Algerian children aged between 4 and 6 years. Signals are recorded and stored in wave format at a 16 kHz sampling rate; 12 Mel Frequency Cepstral Coefficients (MFCC), with their first and second derivatives (Δ and ΔΔ, respectively), are extracted from each signal and used in the training and recognition phases. The proposed system achieves its best recognition accuracy, 85.73%, with a 5-state HMM whose output function is modelled by a GMM with 8 Gaussian components.
Keywords: Phonemic substitution, HMM/GMM, Algerian dialect, speech recognition, MFCC.
Received August 29, 2014; accepted October 26, 2014
Arabic/Farsi Handwritten Digit Recognition using Histogram of Oriented Gradient and Chain Code Histogram
Seyyed Khorashadizadeh and Ali Latif
Abstract: The aim of this paper is to propose a novel technique for Arabic/Farsi handwritten digit recognition. We construct an invariant and efficient feature set by combining the four-directional Chain Code Histogram (CCH) and the Histogram of Oriented Gradient (HOG). To achieve a higher recognition rate, we extract local features at two levels, with 2×2 and 1×1 grids, which causes partial overlapping of zones. Our proposed feature set has 164 dimensions. For the classification phase, a Support Vector Machine (SVM) with a radial basis function kernel is used. The system was evaluated on the HODA handwritten digit dataset, which consists of 60000 training and 20000 test samples. The experimental results show a 99.31% classification rate. Furthermore, 5-fold cross-validation applied to all 80000 samples yielded 99.58% accuracy.
Keywords: Arabic/Farsi handwritten digit recognition, CCH, HOG, SVM.
Received November 30, 2014; accepted February 4, 2015
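Of the two descriptors, the chain code histogram is easy to sketch in isolation. The 8-direction Freeman convention below is an assumption for illustration; the paper itself uses a four-directional variant.

```python
def chain_code_histogram(contour):
    """8-direction Freeman chain code histogram of a pixel contour
    (consecutive points must be 8-connected)."""
    # Map (dx, dy) steps to Freeman directions 0..7 (0 = east, CCW)
    dirs = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
            (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}
    hist = [0] * 8
    for (x0, y0), (x1, y1) in zip(contour, contour[1:]):
        hist[dirs[(x1 - x0, y1 - y0)]] += 1
    # Normalize so the descriptor is invariant to contour length
    n = sum(hist) or 1
    return [h / n for h in hist]
```

For a closed unit square, each of the four axis-aligned directions accounts for a quarter of the steps, so the histogram is uniform over codes 0, 2, 4 and 6.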
Efficient Parallel Compression and Decompression for Large XML Files
Mohammad Ali and Minhaj Khan
Department of Computer Science, Bahauddin Zakariya University, Pakistan
Abstract: eXtensible Markup Language (XML) is gaining popularity and is widely used on the internet for storing and exchanging data. Large XML files transferred over a network create a bottleneck and also degrade query performance. Therefore, efficient compression and decompression mechanisms are applied to XML files. In this paper, an algorithm for XML compression and decompression is suggested. The suggested approach reads an XML file, removes the tags, divides the file into parts, and compresses each part on a separate core for efficiency. We compare the performance of the proposed algorithm with parallel compression and decompression of XML files using GZIP. The performance results show that the suggested algorithm performs 24%, 53% and 72% better than parallel GZIP compression and decompression on Intel Xeon, Intel Core i7 and Intel Core i3 based architectures, respectively.
Keywords: XML, distributed computing, XML compression, GZIP, performance.
Received May 26, 2014; accepted January 27, 2015
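Setting the paper's tag-removal step aside, the split-compress-concatenate pattern it builds on can be sketched with Python's gzip module. Concatenated gzip members form a valid multi-member stream, and threads suffice here because zlib releases the GIL during compression; this is a generic illustration, not the authors' implementation.

```python
import gzip
from concurrent.futures import ThreadPoolExecutor

def parallel_gzip(data: bytes, n_parts: int = 4) -> bytes:
    """Split data into n_parts, gzip each part concurrently, and
    concatenate the members into one multi-member gzip stream."""
    step = max(1, -(-len(data) // n_parts))   # ceiling division
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    with ThreadPoolExecutor(max_workers=n_parts) as pool:
        return b"".join(pool.map(gzip.compress, chunks))

data = b"<row><name>example</name></row>" * 1000
assert gzip.decompress(parallel_gzip(data)) == data
```

Decompression needs no special handling: `gzip.decompress` reads every member in sequence, so the receiver is oblivious to the chunking.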
A Novel Fast Otsu Digital Image Segmentation Method
Duaa AlSaeed1, 2, Ahmed Bouridane1, 2, and Ali El-Zaart3
1Department of Computer Science and Digital Technologies, Northumbria University at Newcastle, UK
2King Saud University, Saudi Arabia
3Beirut Arab University, Lebanon
Abstract: Digital image segmentation based on Otsu's method is one of the most widely used techniques for threshold selection. With Otsu's method, an optimal threshold is found by maximizing the between-class variance; the algorithm assumes that the image contains two classes of pixels, i.e., a bi-modal histogram (e.g., foreground and background), and calculates the optimal threshold separating the two classes so that their between-class variance is maximal. The optimal threshold is found by an exhaustive search over the full range of grey levels (e.g., 256 intensity levels). The objective of this paper is to develop a fast algorithm for the Otsu method that reduces the number of search iterations. A new search technique is developed and compared with the original Otsu method. Experiments on several images show that the proposed Otsu-Checkpoints fast method gives the same estimated threshold with fewer iterations, and thus much lower computational complexity.
Keywords: Image thresholding, Otsu method, optimized search technique.
Received April 23, 2013; accepted June 23, 2014
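For reference, the exhaustive search that the Otsu-Checkpoints method accelerates can be sketched from an intensity histogram (the checkpoint-based reduced search itself is not reproduced here):

```python
def otsu_threshold(hist):
    """Exhaustive Otsu: pick the threshold t that maximizes the
    between-class variance w0 * w1 * (mu0 - mu1)^2, scanning every
    grey level once with running sums."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w0 += h                        # pixels in class 0 (<= t)
        if w0 == 0:
            continue
        w1 = total - w0                # pixels in class 1 (> t)
        if w1 == 0:
            break
        sum0 += t * h
        mu0 = sum0 / w0                # class means
        mu1 = (sum_all - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

The running sums make each candidate threshold O(1) to evaluate, so even the exhaustive variant is linear in the number of grey levels; the paper's contribution is reducing how many candidates are visited.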
Illicit Material Detection using Dual-Energy X-Ray Images
Reza Hassanpour
Computer Engineering Department, Cankaya University, Turkey
Abstract: Dual-energy X-ray inspection systems are widely used in security and control systems. The performance of these systems, however, is limited by the performance of their human operators. Computer vision based systems are therefore of vital importance in improving the detection rate of illicit materials while keeping false alarms at a reasonably low level. In this study, a novel method is proposed for detecting material overlaps and reconstructing multiple images by alleviating these overlaps. Evaluation tests were conducted on images taken from the luggage-inspection X-ray screening devices used in shopping centres. The experimental results indicate that the reconstructed images are much easier for human operators to inspect than the unprocessed originals.
Keywords: X-ray inspection, segmentation, security systems.
Received March 18, 2013; accepted October 26, 2014
Control and Management of Coal Mines with Control Information Systems
Miroljub Grozdanovic1, Dobrivoje Marjanovic2, and Goran Janackovic3
1Engineering Academy of Serbia, Serbia
2Ei R and D Institute Nis, Serbia
3Faculty of Occupational Safety, University of Nis, Serbia
Keywords: Control centre, control panel, information filtering, O-computer dialogue, ergonomic design.
Received January 12, 2014; accepted September 9, 2014
Biometric Cryptosystems based Fuzzy Commitment Scheme: A Security Evaluation
Maryam Lafkih1, Mounia Mikram1, 2, Sanaa Ghouzali1, 3, Mohamed El Haziti1, 4, and Driss Aboutajdine1
1LRIT (associated unit with CNRST, URAC 29), Mohammed V-Agdal University, Morocco
2The School of Information Sciences, Morocco
4Higher School of Technology, Morocco
Keywords: Biometric systems vulnerabilities, biometric cryptosystems, fuzzy commitment, security analysis.
Received August 18, 2013; accepted May 10, 2014