Wednesday, 09 June 2010 19:00

A Markovian Approach for Arabic
Root Extraction

Abderrahim Boudlal1, Rachid Belahbib2, Abdelhak Lakhouaja3, Azzeddine Mazroui3,
Abdelouafi Meziane3, and Mohamed Bebah3
1Faculty of Letters and Human Sciences, University Mohamed I, Morocco
2College of Arts and Sciences, Qatar University, Qatar
3Department of Mathematics and Computer Sciences, University Mohamed I, Morocco

Abstract: In this paper, we present an Arabic morphological analysis system that assigns to each word of an unvoweled Arabic sentence a unique root depending on the context. The proposed system is composed of two modules. The first performs a context-free analysis: each word of the sentence is segmented into its elementary morphological units in order to identify its possible roots, adopting a segmentation of the word into three parts (prefix, stem, suffix). The second module uses the context to identify the correct root among all the possible roots of the word. For this purpose, we use a Hidden Markov Model approach in which the observations are the words and the possible roots are the hidden states. We validate the approach using the NEMLAR Arabic written corpus of 500,000 words. The system gives the correct root for more than 98% of the words in the training set and for almost 94% of the words in the testing set.
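
A minimal sketch of the decoding step in the second module, assuming a first-order HMM: each word's candidate roots form its set of hidden states, and Viterbi decoding picks the most probable root sequence. The probability tables, words and roots passed in are illustrative placeholders, not the authors' trained model.

```python
# Minimal Viterbi sketch: words are observations, candidate roots are hidden states.
# All probability tables and candidate roots are illustrative placeholders.

def viterbi(words, candidates, trans, emit, prior):
    """Return the most probable root sequence for a sentence.

    words      : list of surface words (observations)
    candidates : dict word -> list of possible roots (hidden states)
    trans      : dict (prev_root, root) -> P(root | prev_root)
    emit       : dict (root, word) -> P(word | root)
    prior      : dict root -> P(root) for the first word
    """
    # best[i][r] = (score of best path ending in root r at position i, back-pointer)
    best = [{r: (prior.get(r, 1e-9) * emit.get((r, words[0]), 1e-9), None)
             for r in candidates[words[0]]}]
    for i in range(1, len(words)):
        column = {}
        for r in candidates[words[i]]:
            score, prev = max(
                (best[i - 1][p][0] * trans.get((p, r), 1e-9) *
                 emit.get((r, words[i]), 1e-9), p)
                for p in best[i - 1])
            column[r] = (score, prev)
        best.append(column)
    # Backtrack from the best final state.
    last = max(best[-1], key=lambda r: best[-1][r][0])
    path = [last]
    for i in range(len(words) - 1, 0, -1):
        path.append(best[i][path[-1]][1])
    return list(reversed(path))
```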

Keywords: Arabic NLP, morphological analysis, root extraction, hidden Markov models, and Viterbi algorithm.

Received February 21, 2009; accepted August 3, 2009

 

Full Text

 

Wednesday, 09 June 2010 19:00

 Novel Robust Multilevel 3D Visualization Technique for Web Based GIS

Mohamed Sherif1 and Hatem Abdul-Kader2
1Faculty of Computers and Informatics, Suez Canal University, Egypt
2Faculty of Computers and Information, Menofiya University, Egypt


Abstract: A number of recent technologies take Geographic Information Systems (GIS) to new levels of power and usability. One of the most promising technologies empowering GIS is 3D GIS modeling. In this paper, a novel robust multilevel data structure called the EBOT (Enhanced Block Octree Tetrahedron) model is presented, built on the basis of the BOT visualization model. This model combines octree and tetrahedral network structures. A performance simulation of the EBOT algorithm running in the browser using X3D visualization is also presented. Simulation results are given to demonstrate the robustness of the proposed algorithm.
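
A minimal, hypothetical sketch of the kind of hybrid structure the abstract describes: an octree whose leaf cells can hold a tetrahedral decomposition. The class names and the subdivision criterion are illustrative assumptions, not the published EBOT definition.

```python
# Hypothetical sketch of an octree whose leaves store tetrahedra (a block
# octree / tetrahedral-network hybrid); not the published EBOT structure.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Point = Tuple[float, float, float]

@dataclass
class Tetrahedron:
    vertices: Tuple[Point, Point, Point, Point]

@dataclass
class OctreeNode:
    center: Point
    half_size: float
    children: Optional[List["OctreeNode"]] = None   # 8 children when subdivided
    tetrahedra: List[Tetrahedron] = field(default_factory=list)

    def subdivide(self, depth: int, max_depth: int) -> None:
        """Recursively split the cube into 8 octants up to max_depth."""
        if depth >= max_depth:
            return
        h = self.half_size / 2.0
        cx, cy, cz = self.center
        self.children = [
            OctreeNode((cx + dx * h, cy + dy * h, cz + dz * h), h)
            for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)
        ]
        for child in self.children:
            child.subdivide(depth + 1, max_depth)

# Example: a root cell of half-size 1 centred at the origin, refined two levels.
root = OctreeNode((0.0, 0.0, 0.0), 1.0)
root.subdivide(0, 2)
```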

Keywords: Geographic information systems, geometric modeling, octrees, and visualization.


Received December 13, 2008; accepted August 3, 2009

Wednesday, 09 June 2010 19:00

AMI: An Advanced Endurance Management Technique for Flash Memory Storage Systems

     Sanam Shahla Rizvi and Tae-Sun Chung
School of Information and Computer Engineering, Ajou University, Korea 


Abstract: Flash memory is small, lightweight, shock-resistant, nonvolatile, and consumes little power. It therefore shows promise for use in storage devices for consumer electronics, mobile computers, wireless devices and embedded systems. However, flash memory cannot be overwritten unless erased in advance. Erase operations are slow, usually decrease system performance, and consume power. The number of erase cycles is also limited, and a single worn-out block affects the usefulness of the entire flash memory device. Therefore, for power conservation, better system performance and a longer flash memory lifetime, system support for erasure management is necessary. In this paper, we propose a novel system software scheme for garbage collection and wear-leveling, called Allocation of Memory Intellectually (AMI), for NAND flash memories. The proposed scheme classifies data blocks according to their write access frequencies and improves space utilization by allocating a separate, limited number of log blocks to each nature of data block, hot and cold, within a new system architecture. The proposed cleaning scheme selects a block to erase with optimal space utilization and minimal data migration overhead. A hybrid wear-leveling approach is also proposed to wear down the flash memory evenly; it enhances the system lifetime by managing blocks according to their degree of wear. We compared our proposal with two previous schemes: it improved system performance by 95% for garbage collection and by 36% for wear-leveling. The evaluation results show that the proposed scheme, AMI, outperforms both previous schemes, particularly in flash bandwidth utilization and the number of attempted erase operations.
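
The following is a minimal sketch of the two ideas the abstract mentions: hot/cold block classification by write frequency, and garbage-collection victim selection that trades reclaimed space against valid-page migration. The threshold, cost formula and data layout are illustrative assumptions, not the actual AMI algorithm.

```python
# Illustrative sketch (not the actual AMI algorithm): classify blocks as hot or
# cold by write frequency, and pick a garbage-collection victim that maximises
# reclaimed space while minimising valid-page copy-out.
from dataclasses import dataclass
from typing import List

HOT_WRITE_THRESHOLD = 100   # assumed cut-off between hot and cold blocks

@dataclass
class Block:
    block_id: int
    valid_pages: int        # pages that must be migrated before erasing
    invalid_pages: int      # pages reclaimed by erasing this block
    write_count: int        # how often data in this block is rewritten
    erase_count: int        # wear indicator used for wear-leveling

def is_hot(block: Block) -> bool:
    """Blocks rewritten frequently are treated as hot, the rest as cold."""
    return block.write_count >= HOT_WRITE_THRESHOLD

def select_victim(blocks: List[Block]) -> Block:
    """Greedy victim choice: most reclaimable space per page migrated,
    with a penalty for already-worn blocks (a simple wear-leveling bias)."""
    def score(b: Block) -> float:
        migration_cost = b.valid_pages + 1           # avoid division by zero
        wear_penalty = 1.0 + b.erase_count / 1000.0  # assumed scaling factor
        return b.invalid_pages / (migration_cost * wear_penalty)
    return max(blocks, key=score)

# Example usage with a few hypothetical blocks.
blocks = [Block(0, 10, 54, 300, 12), Block(1, 40, 24, 20, 3), Block(2, 5, 59, 150, 40)]
victim = select_victim(blocks)
print(f"erase block {victim.block_id}; hot={is_hot(victim)}")
```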

Keywords: Consumer electronics, embedded systems, data organization, endurance management, memory management, and system performance.


Received May 11, 2008; accepted November 5, 2008

Wednesday, 09 June 2010 19:00

Social Issues in Wireless Sensor Networks
with Healthcare Perspective


Moshaddique Al Ameen and Kyung Kwak
Graduate School of IT and Telecommunications, Inha University, Korea

Abstract: Recent advances in wireless sensor networks have given rise to many application areas in healthcare and have produced the new field of wireless body area networks. Using wearable and non-wearable sensor devices, humans can be tracked and monitored. Monitoring from the healthcare perspective can take place with or without the consent of the person concerned, and even when consent is given, certain social issues arise from this type of application scenario, including privacy, security, legal and other related issues. Healthcare sensor network applications have a bright future, so these issues must be taken up early; they should be carefully studied and understood, or they can pose serious problems. In this paper we raise and discuss these issues and try to find some answers to them.

Keywords: Wireless sensor networks, healthcare systems, social issues, privacy, security, and legal issues.

Received September 27, 2008; accepted February 25, 2009

 

Full Text

 

Wednesday, 09 June 2010 19:00

 e-Learning Systems in Virtual Environment

       Hatem Mohamed Abdul-Kader
 Faculty of Computers and Information, Menofiya University, Egypt


Abstract: E-learning is one of the emerging needs of the information age, and a lot of potential is therefore seen in the development of distance learning. Virtual environment interfaces to e-learning systems have recently appeared on the Internet. Using a virtual reality environment, applications promise to make e-learning tasks more natural and interactive. This technology also makes it possible to convey a sense of three-dimensional environments and a degree of user immersion. Extensible 3D (X3D) is the most common tool for building 3D viewing and browsing of e-learning systems. In this paper the benefits of a virtual reality environment using X3D in e-learning applications are demonstrated by implementing two web-enabled virtual environment e-learning systems. The first is an online virtual chemistry lab that gives students the ability to perform all the experiments of a given curriculum. The second is an online English language education system that lets students learn the language aurally and visually via an online interactive system. X3D is used as the main implementation tool, giving system users full visualization of, and interactivity with, all learning steps.

Keywords: e-Learning, virtual reality, virtual lab, distance learning, and X3D.


Received October 6, 2008; accepted May 17, 2009

Wednesday, 09 June 2010 19:00

Stochastic Bounds for Microprocessor
 Systems Availability

Ihab Sbeity1, Mohamed Dbouk1, and Brigitte Plateau2
 1Faculty of Sciences, Lebanese University, Lebanon
2Laboratoire Informatique de Grenoble, France

Abstract: The computation of stochastic bounds has become an efficient technique for obtaining performance predictions for computer systems by means of Markovian models. However, the quality of these bounds may be affected by several properties related not only to how the technique is used but also to the Markovian model itself. On the other hand, multiprocessor systems have become an efficient and widely used infrastructure for running several life-critical applications. In this paper, we describe how to compute a stochastic bound on multiprocessor system availability. We focus on the irreducibility of the model to show how it can influence the quality of the bounds.
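
As a small worked illustration of the underlying quantity (not the bounding technique itself), the sketch below computes the steady-state availability of a toy two-processor failure/repair Markov chain as the stationary probability of the "up" states; the failure and repair rates are arbitrary placeholders.

```python
# Toy example: steady-state availability of a 2-processor system modelled as a
# continuous-time Markov chain whose states count the failed processors
# (0, 1, 2); the system is available while at least one processor is up.
# Failure/repair rates below are arbitrary placeholders.
import numpy as np

lam, mu = 0.01, 0.5          # per-processor failure rate, repair rate

# Infinitesimal generator Q (rows sum to zero), single repair facility.
Q = np.array([
    [-2 * lam,       2 * lam,   0.0],
    [      mu, -(mu + lam),     lam],
    [     0.0,          mu,     -mu],
])

# Stationary distribution pi solves pi Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0] + pi[1]   # probability that at least one processor is up
print(f"steady-state availability = {availability:.6f}")
```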

Keywords: Stochastic bounds, multiprocessor system, availability, irreducibility, and Markov chain.

Received January 15, 2009; accepted August 20, 2009

 

Full Text

 

Tuesday, 09 February 2010 19:00

Performance Analysis of Two-Hop Cooperative MIMO Transmission with Best Relay Selection
in Rayleigh Fading Channel

Ahasanun Nessa, Qinghai Yang, and Kyung-sup Kwak
Graduate School of IT and Telecommunications, Inha University, South Korea
State Key Laboratory of ISN, School of Telecommunications Engineering, Xidian University, China


Abstract: Wireless relaying is a promising solution for overcoming channel impairments and providing the high-data-rate coverage expected of beyond-3G mobile communications. In this paper we present the end-to-end BER performance of dual-hop wireless communication systems equipped with multiple decode-and-forward relays over a Rayleigh fading channel with best relay selection. We compare the BER performance of the best relay with that of a single relay, selecting the best relay based on the end-to-end channel conditions. We further derive the outage probability of the best relay and show that it is equivalent to the outage probability obtained when all relays take part in the transmission. Orthogonal space-time block coding is applied at the source terminal. Numerical and simulation results are presented to verify our analysis.
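
A minimal Monte Carlo sketch of the selection rule described above: for decode-and-forward, the end-to-end SNR of each relay is the minimum of its two hop SNRs, and the selected relay is the one maximising that minimum. The average SNR, threshold and relay count are arbitrary placeholders, and the sketch is not the authors' analytical derivation.

```python
# Monte Carlo sketch of best-relay selection for dual-hop decode-and-forward
# over Rayleigh fading: each hop SNR is exponentially distributed, the
# end-to-end SNR of relay i is min(hop1_i, hop2_i), and the best relay is the
# one with the largest end-to-end SNR. Parameters are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(0)
num_relays, avg_snr, snr_threshold, trials = 4, 10.0, 2.0, 200_000

hop1 = rng.exponential(avg_snr, size=(trials, num_relays))
hop2 = rng.exponential(avg_snr, size=(trials, num_relays))
end_to_end = np.minimum(hop1, hop2)          # DF end-to-end SNR per relay
best = end_to_end.max(axis=1)                # best relay selection

outage_sim = np.mean(best < snr_threshold)
# Closed-form check for i.i.d. Rayleigh hops:
# P_out = (1 - exp(-2*threshold/avg_snr)) ** num_relays
outage_theory = (1 - np.exp(-2 * snr_threshold / avg_snr)) ** num_relays
print(f"simulated outage {outage_sim:.4f}  vs  closed form {outage_theory:.4f}")
```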

Keywords: Bit error rate, amplify and forward, multiple input multiple output, decode-and-forward, and probability density function (pdf).

Received September 14, 2008; accepted May 17, 2009

Tuesday, 09 February 2010 19:00

Binary Phoneme Classification Using Fixed and Adaptive Segment-Based Neural Network Approach

Lotfi Messikh1 and Mouldi Bedda2
1Electronic Department, Annaba University, Algeria
 2College of Engineering, ALJouf University, KSA


Abstract: This paper addresses the problem of binary phoneme classification via a segment-based neural network approach. Phoneme groups are categorized based on articulatory information. To capture segmental acoustic properties efficiently, the phoneme associated with a speech segment is represented using MFCC features extracted from different portions of that segment, together with its duration. These portions are obtained with fixed-size or variable-size analysis. Classification is done with a multi-layer perceptron trained using MacKay's Bayesian approach. Experimental results obtained on the Otago speech corpus favour fixed segmentation strategies over adaptive ones for the consonant/vowel, fricative/non-fricative, nasal/non-nasal and stop/non-stop binary classification problems.
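
A rough sketch of the fixed-portion feature representation described above, fed to a plain MLP (ordinary training here rather than MacKay's Bayesian scheme). The portion count, MFCC order and network size are illustrative assumptions, not the paper's configuration.

```python
# Rough sketch of a fixed-segmentation feature vector for one phoneme segment:
# mean MFCCs over three equal portions of the segment plus its duration, fed to
# a plain MLP (ordinary training here, not MacKay's Bayesian scheme).
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def segment_features(samples: np.ndarray, sr: int, n_portions: int = 3) -> np.ndarray:
    """Mean MFCCs over equal portions of the segment, plus the segment duration."""
    mfcc = librosa.feature.mfcc(y=samples, sr=sr, n_mfcc=13)      # shape (13, frames)
    edges = np.linspace(0, mfcc.shape[1], n_portions + 1, dtype=int)
    portion_means = []
    for i in range(n_portions):
        chunk = mfcc[:, edges[i]:edges[i + 1]]
        if chunk.size == 0:          # very short segment: fall back to the whole segment
            chunk = mfcc
        portion_means.append(chunk.mean(axis=1))
    duration = np.array([len(samples) / sr])
    return np.concatenate(portion_means + [duration])

# Hypothetical training data: a list of (waveform, sample_rate, binary_label) tuples.
def train_binary_classifier(segments):
    X = np.stack([segment_features(s, sr) for s, sr, _ in segments])
    y = np.array([label for _, _, label in segments])
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000)
    return clf.fit(X, y)
```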

Keywords: Signal segmentation, binary phoneme classification, segment-based processing, and neural network.

Received November 11, 2008; accepted May 17, 2009

Tuesday, 09 February 2010 19:00

A Strategy to Reduce the Control Packet Load of AODV Using Weighted Rough Set Model for MANET

Nagaraju Aitha1 and Ramachandram Srinadas2
1Kamala Institute of Technology & Science, Singapur, India
2Department of CSE, Osmania University, Hyderabad, India

Abstract: A mobile ad-hoc network consists of wireless hosts that may move often, and host movement results in path changes. The well-known Ad-hoc On-demand Distance Vector (AODV) routing protocol determines a route when no route exists or an existing route breaks. To establish a new path from source to destination, it broadcasts control packets (route request packets), which increases network bandwidth consumption. As mobile ad-hoc networks have limited bandwidth, it is important to reduce the number of control packets. We propose a protocol that uses the weighted rough set model to control the route request packets of the existing AODV routing protocol. Weighted rough set theory is a mathematical tool for dealing with vagueness and uncertainty that also takes the importance of objects into account.

Keywords: Weighted rough set, AODV, and route request.

Received May 14, 2009; accepted August 10, 2009

Saturday, 14 February 2009 19:00

Network Load and Packet Loss Optimization During Handoff Using Multi-Scan Approach

Anwar Saif and Mohamed Othman
Department of Communication Technology and Network, Universiti Putra Malaysia, Malaysia


Abstract: Handoff is a critical function that enables mobile nodes to stay connected to the wireless network by switching the data connection from one WLAN to another. During handoff the communication may be degraded or interrupted due to high packet loss. To prevent packet loss during handoff, a handoff management scheme that employs a transport protocol has been proposed. It supports multiple connections for Voice over IP (VoIP) communication and makes the handoff decision based on the number of frame retransmissions at the MAC layer. Moreover, the handoff scheme uses a multi-scan technique that enables mobile nodes to use two WLAN interfaces for channel scanning and multi-path transmission rather than a single WLAN interface. This technique introduces extra network overhead during multi-path transmission. This work optimizes the network overhead and packet loss while keeping VoIP communication at an acceptable level.
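
A toy sketch of the kind of trigger the abstract describes: monitor MAC-layer frame retransmissions and start scanning on the secondary WLAN interface once retransmissions in a recent window exceed a threshold. The threshold, window size and interface handling are assumptions, not the proposed scheme's actual parameters.

```python
# Toy illustration of a retransmission-triggered handoff decision: once the
# number of MAC-layer frame retransmissions in a sliding window exceeds a
# threshold, the secondary WLAN interface is told to start scanning while the
# primary keeps transmitting. Threshold and window length are assumptions.
from collections import deque

class HandoffTrigger:
    def __init__(self, window: int = 50, retry_threshold: int = 8):
        self.recent = deque(maxlen=window)   # 1 = frame needed a retransmission
        self.retry_threshold = retry_threshold
        self.scanning = False

    def on_frame_sent(self, retransmitted: bool) -> None:
        self.recent.append(1 if retransmitted else 0)
        if not self.scanning and sum(self.recent) >= self.retry_threshold:
            self.scanning = True
            self.start_secondary_scan()      # hypothetical hook

    def start_secondary_scan(self) -> None:
        # In a real driver this would bring up the second interface and scan;
        # here it is only a placeholder.
        print("link degraded: scanning on secondary WLAN interface")
```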

Keywords: Handoff, network overhead, packet loss, multi-scan, VoIP, and WLAN.

Received October 5, 2008; accepted May 17, 2010

Full Text
Saturday, 14 February 2009 19:00

A Constraint Programming Based Approach to Detect Ontology Inconsistencies

Moussa Benaissa and Yahia Lebbah
Faculté des Sciences, Université d’Oran Es-Senia, Algeria


Abstract: This paper proposes a constraint programming based approach to handling ontology consistency, and more precisely user-defined consistency constraints. In practice, ontology consistency is still not well handled in current software environments, owing to the algorithmic and language limitations of the tools developed to enforce consistency constraints. The idea of this paper is to tackle the problem by exploiting constraint programming, which has proved efficient and provides various techniques for handling many types of constraints over different domains.
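
As a generic illustration of how a user-defined consistency rule can be expressed as a constraint satisfaction problem (independently of the authors' encoding), the sketch below uses the python-constraint library to check a toy cardinality rule over ontology individuals; the ontology content and the rule are invented for the example.

```python
# Generic illustration (not the authors' encoding): a user-defined consistency
# rule checked as a constraint satisfaction problem with python-constraint.
# Toy ontology: each Project individual must have exactly one manager, and a
# Person may manage at most two projects.
from constraint import Problem

projects = ["p1", "p2", "p3"]
people = ["alice", "bob"]

problem = Problem()
for p in projects:
    # Variable = the manager assigned to project p; domain = known Person individuals.
    problem.addVariable(p, people)

def at_most_two_each(*managers):
    """Consistency rule: no person manages more than two projects."""
    return all(managers.count(m) <= 2 for m in set(managers))

problem.addConstraint(at_most_two_each, projects)

solutions = problem.getSolutions()
# An empty solution set would mean the asserted facts violate the rule,
# i.e., the ontology is inconsistent with respect to this user-defined constraint.
print(f"{len(solutions)} consistent manager assignments found")
```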



Keywords: Constraint programming, filtering, ontologies, consistency checking, ontology evolution.

Received January 8, 2008; accepted May 17, 2008

Full Text
Saturday, 14 February 2009 19:00

A New Grid Resource Discovery Framework

Mahamat Hassan and Azween Abdullah
 Department of Computer and Information Sciences, Universiti Teknologi PETRONAS, Malaysia


Abstract: Resource discovery (RD) is a key issue in grid systems, since resource reservation and task scheduling are based on it. This paper proposes a novel semantic-based, scalable, decentralized grid RD framework that integrates ontologies, peer-to-peer (P2P) networks and intelligent agents. The framework consists of an ontology model, an agent model, and a set of algorithms for implementing the P2P architecture and searching the shared resources. The paper shows how the framework satisfies grid RD requirements such as scalability, decentralization, dynamism and interoperability.

Keywords: Grid computing, peer-to-peer networks, ontology, and intelligent agent.

Received May 7, 2009; accepted August 4, 2009

Full Text
Saturday, 14 February 2009 19:00

Rank-Order Weighting of Web Attributes
for Website Evaluation

Mehri Saeid, Abdul Azim Abd Ghani, and Hasan Selamat
Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, Malaysia


Abstract: The rapid growth of web applications increases the need to evaluate them objectively. In the past few years, some valuable works such as WebQEM have tried to evaluate web applications objectively; however, weighting web attributes, one step of the evaluation, remains completely subjective and depends mostly on experts' judgments. In this paper a two-step weighting approach is proposed, which divides weighting into ranking the attributes and then weighting them with a rank-order weighting formula. A simulation was conducted to compare different rank-order weighting methods (RR, RS, and ROC) with the TRUE weights (simulated experts' judgments without prior ranking). Two kinds of comparison were made: on the weights themselves and on the resulting quality scores. The results suggest Rank-Sum as a good surrogate for experts' weights when evaluating web applications.
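
For reference, the three rank-order weighting methods mentioned (Rank Sum, Rank Reciprocal and Rank Order Centroid) have standard closed forms; the sketch below computes them from a 1-based attribute ranking, with an invented four-attribute example rather than the paper's data.

```python
# Standard rank-order weighting formulas (Rank Sum, Rank Reciprocal and
# Rank Order Centroid) computed from a 1-based ranking of n attributes.
def rank_sum(ranks):
    n = len(ranks)
    return [2 * (n + 1 - r) / (n * (n + 1)) for r in ranks]

def rank_reciprocal(ranks):
    n = len(ranks)
    norm = sum(1 / j for j in range(1, n + 1))
    return [(1 / r) / norm for r in ranks]

def rank_order_centroid(ranks):
    n = len(ranks)
    return [sum(1 / j for j in range(r, n + 1)) / n for r in ranks]

# Invented example: four web attributes ranked 1 (most important) to 4.
ranks = [1, 2, 3, 4]
for name, fn in [("RS", rank_sum), ("RR", rank_reciprocal), ("ROC", rank_order_centroid)]:
    weights = fn(ranks)
    print(name, [round(w, 3) for w in weights], "sum =", round(sum(weights), 3))
```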

Keywords: Web attribute, Web quality, and attribute weighting.

Received October 8, 2008; accepted May 17, 2009

Full Text
Saturday, 14 February 2009 19:00

Lossless Text Compression Technique Using Syllable Based Morphology

Ibrahim Akman1, Hakan Bayindir1, Serkan Ozleme2, Zehra Akin3, and Sanjay Misra1
1Computer Engineering Department, Atilim University, Turkey
2Parana Vision Image Processing Technologies and Solutions Consultancy Corporation, Turkey
3Meteksan Systems and Computer Technologies Corporation, Turkey


Abstract: In this paper, we present a new lossless text compression technique that exploits the syllable-based morphology of multi-syllabic languages. The proposed algorithm partitions each word into its syllables and then produces shorter bit representations of them for compression. The method has six main components, namely the source file, filtering unit, syllable unit, compression unit, dictionary file and target file. The number of bits used to code a syllable depends on the number of entries in the dictionary file. The algorithm was implemented and tested on 20 texts of different lengths collected from different fields. The results indicate compression ratios of up to 43%.
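
A minimal sketch of the dictionary-coding idea: syllables are collected into a dictionary and each occurrence is replaced by a fixed-length index whose width in bits grows with the dictionary size. The syllabification routine is a placeholder, since real syllable splitting is language specific, and the coding details are assumptions rather than the published algorithm.

```python
# Minimal dictionary-coding sketch: each syllable becomes a fixed-length index
# of ceil(log2(dictionary size)) bits. The syllabify() function below is a
# placeholder; real syllabification is language-specific.
import math

def syllabify(word: str) -> list:
    """Placeholder syllabifier: naive split into chunks of up to 3 letters."""
    return [word[i:i + 3] for i in range(0, len(word), 3)]

def compress(words):
    syllables = [s for w in words for s in syllabify(w)]
    dictionary = sorted(set(syllables))
    index = {s: i for i, s in enumerate(dictionary)}
    bits_per_syllable = max(1, math.ceil(math.log2(len(dictionary))))
    bitstream = "".join(format(index[s], f"0{bits_per_syllable}b") for s in syllables)
    return bitstream, dictionary, bits_per_syllable

words = "bilgisayar programlama dilleri".split()   # illustrative Turkish words
bitstream, dictionary, width = compress(words)
print(f"{len(dictionary)} syllables, {width} bits each, {len(bitstream)} bits total")
```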

Keywords: Algorithm, text compression technique, syllable, multi-syllabic languages.

Received December 15, 2008; accepted August 3, 2010

Full Text
Saturday, 14 February 2009 19:00

Entropy as a Measure of Quality of
XML Schema Document

Dilek Basci1 and Sanjay Misra2
 1BILGI Geographic Information Conversion & Management System, LTD, Turkey
2Department of Computer Engineering, Atilim University, Turkey


Abstract: In this paper, a metric for assessing the structural complexity of eXtensible Markup Language (XML) schema documents is formulated. The proposed metric, Schema Entropy (SE), is based on the entropy concept and is intended to measure the complexity of schema documents written in the W3C XML Schema Language that arises from the diversity in the structures of their elements. The SE is useful in evaluating the efficiency of schema designs: a good design reduces maintenance effort, so the metric provides valuable information about the reliability and maintainability of systems. In this respect, the metric is believed to be a valuable contribution to improving the quality of XML-based systems. It is demonstrated with examples and validated empirically on actual test cases.
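
As a generic illustration of an entropy-style structural metric (the exact Schema Entropy definition is given in the paper and not reproduced here), the sketch below groups the element declarations of a schema into structural classes and computes Shannon entropy over the class frequencies; the grouping key is an assumed simplification for the example.

```python
# Generic illustration of an entropy-style structural metric: group the element
# declarations of an XML schema by a structural signature and compute Shannon
# entropy over the class frequencies. The signature used here (the element's
# type attribute) is an assumed simplification, not the paper's SE definition.
import math
from collections import Counter
from xml.etree import ElementTree as ET

XSD_NS = "{http://www.w3.org/2001/XMLSchema}"

def structural_entropy(xsd_text: str) -> float:
    root = ET.fromstring(xsd_text)
    signatures = [el.get("type", "anonymous")
                  for el in root.iter(f"{XSD_NS}element")]
    counts = Counter(signatures)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

schema = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="title" type="xs:string"/>
  <xs:element name="author" type="xs:string"/>
  <xs:element name="year" type="xs:integer"/>
</xs:schema>"""
print(f"entropy = {structural_entropy(schema):.3f} bits")   # 2 classes: string, integer
```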

Keywords: XML Schema documents, W3C XML schema language, complexity metric, entropy, maintainability, and validation.

Received December 22, 2008; accepted August 3, 2009

Full Text