Improving Energy Efficiency and Impairing Environmental Impacts on Cloud Centers by Transforming Virtual Machine into Self-Adaptive Resource Container

Siva Shanmugam1 and Sriman Iyengar2

1Assistant Professor, School of Computer Science, India

2Information Technology, Sreenidhi Institute of Science and Technology, India

Abstract: Enterprises are seeking on-demand computing models that offer better utilization and reduced operational cost by charging according to users' actual needs; since user demands vary drastically over time, zero demand should incur zero charges. To support dynamic resource provisioning, service providers have to maintain more computational resources than are actually needed. Meanwhile, the IT sector has growing apprehensions about its environmental impact, owing to increasing carbon dioxide emissions, higher electricity consumption and the growth of electronic waste from electronic components. Most research works focus primarily on providing the needed resources and pay little attention to how those resources are utilized, which leads to unsustainability. To achieve sustainable computing, unnecessary installation of additional computational resources should be rolled back and better sharing options should be made available. This paper proposes a new virtualization technique that engages cloud services exclusively between the host and guest operating environments. By doing so, the mechanism interoperates well with other execution engines and provides an open service for running any type of application. The combination of cloud service and virtualization enables container features with an efficient utilization factor; a proper combination of these two resource mechanisms can address most computational issues, which keeps them at the forefront of change. The experimental analysis compares the performance of a container with that of a virtual-machine-based container in an adopted infrastructure via a cloud simulator, and the better efficiency metrics attained by the virtual-machine-based container are explored and plotted.

Keywords: Container as a service, green cloud computing, energy efficiency, resource utilization.

Received December 2, 2016; accepted March 26, 2017

Wavelet Tree based Dual Indexing Technique for Geographical Search

Arun Kumar Yadav1 and Divakar Yadav2

1Department of Computer Science and Engineering, Ajay Kumar Garg Engineering College, India

 2Department of Computer Science and Engineering, Madan Mohan Malaviya University of Technology, India

Abstract: Today's information retrieval systems face new challenges in indexing the massive geographical information available on the internet. Although solutions based on the R-tree family and the B-tree have been given in the past, the increased size of the index makes them less efficient and time consuming. This paper presents a dual indexing technique for Geographical Information Retrieval. It uses the wavelet tree data structure for both textual and spatial indexing, and allows dynamic insertion of Minimum Bounding Rectangles (MBRs) into the wavelet tree during index construction. The proposed technique has been evaluated in terms of efficiency and time complexity. For pure spatial indexing, the technique reduces the search time to less than one third of that of spatial indexing performed using the R-tree or R*-tree. Even for dual indexing (textual and spatial) using the wavelet tree, the search time is reduced by half in comparison to techniques such as B/R and B/R* when the search query is longer than 2 keywords. For queries of 1 or 2 keywords, the search time remains approximately the same as that of the other techniques.
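As a concrete illustration of the underlying data structure, here is a minimal wavelet-tree sketch in Python supporting the rank queries such indexes are built on; the dual textual/spatial layout and MBR insertion from the paper are not reproduced, and the sample sequence is invented.

```python
# Minimal wavelet tree over a small integer alphabet with rank queries.
class WaveletTree:
    def __init__(self, seq, lo=None, hi=None):
        if lo is None:
            lo, hi = min(seq), max(seq)
        self.lo, self.hi = lo, hi
        self.left = self.right = self.prefix = None
        if lo == hi or not seq:
            return
        mid = (lo + hi) // 2
        # prefix[i] = how many of the first i symbols go to the left child
        self.prefix = [0]
        for s in seq:
            self.prefix.append(self.prefix[-1] + (s <= mid))
        self.left = WaveletTree([s for s in seq if s <= mid], lo, mid)
        self.right = WaveletTree([s for s in seq if s > mid], mid + 1, hi)

    def rank(self, symbol, i):
        """Number of occurrences of `symbol` in seq[0:i]."""
        if self.prefix is None:                  # leaf or empty node
            return i if self.lo == self.hi else 0
        mid = (self.lo + self.hi) // 2
        if symbol <= mid:
            return self.left.rank(symbol, self.prefix[i])
        return self.right.rank(symbol, i - self.prefix[i])

seq = [3, 1, 4, 1, 5, 2, 6, 1, 3]
wt = WaveletTree(seq)
print(wt.rank(1, 8))  # -> 3: symbol 1 occurs three times in the first 8 items
```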

Keywords: Information retrieval, wavelet tree, spatial search, indexing, minimum bounding rectangles.

Received May 28, 2016; accepted May 1, 2017

A Dynamic Scheduling Method for Collaborated Cloud with Thick Clients

Pham Phuoc Hung1, Golam Alam2, Nguyen Hai3, Quan Tho3, and Eui-Nam Huh4

1Department of Computer Science, Kent State University, USA

2Department of Computer Science and Engineering, BRAC University, Bangladesh

3Ho Chi Minh City University of Technology, Vietnam National University, Vietnam

4Department of Computer Engineering, Kyung Hee University, Korea

Abstract: Nowadays, the emergence of computation-intensive applications brings benefits to individuals and commercial organizations, but it still faces many challenges due to the limited processing capacity of local computing resources, which also demand substantial financial and human investment. This problem, fortunately, has been made less severe thanks to the recent adoption of the Cloud Computing (CC) platform. CC enables offloading heavy processing tasks to the "cloud", leaving only simple jobs to the capacity-limited user-end clients. However, as CC is a pay-as-you-go model, it is necessary to find an approach that guarantees highly efficient execution time of cloud systems as well as a reasonable monetary cost for cloud resource use. Many research studies have been carried out on these problems, but the results so far remain limited. In this paper, we present a novel architecture that combines the computing resources on the cloud provider side with the local computing resources (thick clients) on the client side. The core of this framework is a dynamic genetic task scheduler that globally minimizes the completion time of the cloud service while taking into account network conditions and the cloud cost paid by customers. Our simulation and comparison with other scheduling approaches show that the proposal delivers reasonable performance together with noteworthy cost savings for cloud customers.
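To make the scheduling idea concrete, the following sketch evolves task-to-resource assignments with a simple genetic algorithm whose fitness blends completion time and monetary cost; the task lengths, resource speeds, prices and objective weights are assumed values, not the paper's model.

```python
# GA sketch: chromosome[i] = resource assigned to task i; lower fitness is better.
import random

tasks = [4, 7, 2, 9, 5, 3]          # task lengths (assumed units of work)
speed = [1.0, 2.0, 4.0]             # resource speeds (work units per second)
price = [0.0, 0.5, 2.0]             # cost per second (thick client assumed free)
W_TIME, W_COST = 1.0, 0.3           # assumed weights for the two objectives

def fitness(chromosome):
    busy = [0.0] * len(speed)
    cost = 0.0
    for t, r in zip(tasks, chromosome):
        runtime = t / speed[r]
        busy[r] += runtime
        cost += runtime * price[r]
    return W_TIME * max(busy) + W_COST * cost   # makespan + weighted cost

def evolve(pop_size=30, generations=100, mutation=0.1):
    pop = [[random.randrange(len(speed)) for _ in tasks] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(tasks))          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:                 # random reassignment
                child[random.randrange(len(tasks))] = random.randrange(len(speed))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```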

Keywords: Genetic, cloud computing, task scheduling, thick client, distributed system.

Received September 10, 2014; accepted January 20, 2016

Modelling and Analysis of a Semantic Sensor Service Provider Ontology

Faiza Tila and Do Kim

Computer Engineering Department, Jeju National University, South Korea

Abstract: The Internet of Things (IoT) has gained a huge amount of momentum in the past few years. Its vision is to interconnect devices from all over the world. These devices are heterogeneous and produce data that is multimodal and diverse in nature; this heterogeneity of devices and data makes interoperability an issue in IoT. In this paper we present the modelling of a semantic sensor service provider and its ontology, the Sensor Service Provider (SSP) ontology. The semantic sensor service provider is a module within a larger system: a semantic IoT system based on context aggregation for an indoor environment. To provide interoperability between the devices used by the system, we have developed ontologies for each domain of the system. The ontology modelled in this paper reuses the SSN ontology to define the basic concepts and observations of a sensor, and extends it to define concepts related to the module itself. Simple Protocol and Resource Description Framework (RDF) Query Language (SPARQL) queries are used to retrieve data from the ontology as well as to manipulate the data stored in it.
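A small sketch of the query side using the rdflib Python library; the ssp: namespace and property names are illustrative stand-ins, not the actual terms of the SSP ontology.

```python
# Build a tiny in-memory sensor graph and query it with SPARQL.
from rdflib import Graph, Literal, Namespace, RDF

SSP = Namespace("http://example.org/ssp#")   # hypothetical namespace
g = Graph()
g.add((SSP.tempSensor1, RDF.type, SSP.Sensor))
g.add((SSP.obs1, RDF.type, SSP.Observation))
g.add((SSP.obs1, SSP.observedBy, SSP.tempSensor1))
g.add((SSP.obs1, SSP.hasValue, Literal(22.5)))

q = """
PREFIX ssp: <http://example.org/ssp#>
SELECT ?sensor ?value WHERE {
    ?obs a ssp:Observation ;
         ssp:observedBy ?sensor ;
         ssp:hasValue ?value .
}
"""
for sensor, value in g.query(q):
    print(sensor, value)
```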

Keywords: RDF, web ontology language, internet of things, SPARQL, semantic sensor service provider, semantic sensor service provider ontology.

Received May 4, 2015; accepted January 13, 2016

A Black-Box and Contract-Based Verification of Model Transformations

Meriem Lahrouni1,2, Eric Cariou2, and Abdelaziz El Fazziki1

1Computer Science Department, University Cadi Ayyad, Morocco

2Computer Science Laboratory, University of Pau and Pays de l’Adour, France

Abstract: The main goal of Model-Driven Engineering (MDE) is to manipulate productive models to build software. In this context, model transformation is a common way to manipulate models automatically. It is then required to ensure that a transformation has been correctly processed. In this paper, we propose a contract-based method to verify that a target model is a valid result of transforming a source model with respect to the transformation specification. The verification is made in a black-box mode, independently of the implementation and the execution of the transformation. The method allows the contract to be written in any constraint language. In association with this method, we have implemented a tool that partially generates contracts written in OCL and manages their evaluation for both endogenous and exogenous transformations.
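A toy illustration of black-box, contract-based checking: the transformation itself is treated as opaque and only a predicate over the (source, target) pair is evaluated, written here in plain Python rather than OCL; the model shapes are invented for the example.

```python
# The contract never inspects the transformation, only its input/output models.
def contract(source, target):
    # every class in the source model must yield a same-named table in the target
    tables = {t["name"] for t in target["tables"]}
    return all(c["name"] in tables for c in source["classes"])

source = {"classes": [{"name": "Order"}, {"name": "Customer"}]}
target = {"tables": [{"name": "Order"}, {"name": "Customer"}]}
print(contract(source, target))  # True: target is a valid transformation result
```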

Keywords: MDE, model transformation, contract, verification.

Received October 26, 2015; accepted July 19, 2017

QoS Based Multi-Constraints Bin Packing Job Scheduling Heuristic for Heterogeneous Volunteer Grid Resources

Saddaf Rubab, Mohd Fadzil Hassan, Ahmad Mahmood, and Nasir Mehmood

Department of Computer and Information Sciences, University Technology Petronas, Malaysia

Abstract: A volunteer grid is a kind of distributed network consisting of contributed resources that are heterogeneous and geographically distributed. The heterogeneity of resources can be in terms of time of availability and resource characteristics, among others. Jobs submitted to a volunteer grid usually require different heterogeneous resources depending on their requirements. Submitted jobs can be scheduled efficiently if they are divided into a small number of tasks fulfilling multiple requirements, which requires a multi-resource scheduling policy that considers the constraints of both resource and job before scheduling. Traditional scheduling policies consider only a single scheduling or optimization constraint, either completing jobs within a specific deadline or maximizing resource usage. Therefore, a scheduling policy is required that serves multiple constraints, optimizing resource usage while completing jobs within specified deadlines. The work presented in this paper proposes a Quality of Service (QoS) based multi-constraint job scheduling heuristic for volunteer grid resources. The bin packing problem is also incorporated within the proposed heuristic for reordering and assigning jobs. The performance of the proposed scheduling heuristic is measured by comparing it with other scheduling algorithms used in grid environments. The results suggest a reasonable improvement in waiting time, turnaround time, slowdown time and job failure rate.
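To illustrate the bin-packing component, the sketch below packs jobs into resource "bins" with the classic first-fit-decreasing rule; the single demand value per job is a simplification, and the paper's QoS, deadline and multi-constraint handling are not shown.

```python
# First-fit decreasing: sort jobs by demand, place each in the first bin that fits.
def first_fit_decreasing(job_demands, capacity):
    bins = []  # each bin is a list of job demands whose sum <= capacity
    for demand in sorted(job_demands, reverse=True):
        for b in bins:
            if sum(b) + demand <= capacity:
                b.append(demand)
                break
        else:
            bins.append([demand])   # open a new resource bin
    return bins

print(first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10))
# -> [[8, 2], [4, 4, 1, 1]]
```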

Keywords: Volunteer grid computing, volunteer resources, QoS, SLA, multi-constraints, rescheduling, bin-packing, back-filling.

Received October 30, 2015; accepted April 13, 2017

A Novel Face Recognition System by the Combination of Multiple Feature Descriptors

Nageswara Reddy1, Mohan Rao2, and Chittipothula Satyanarayana1

1Department of Computer Science and Engineering, Jawaharlal Nehru Technological University, India

2Department of Computer Science and Engineering, Avanthi Institute of Engineering and Technology, India

Abstract: Face recognition systems suit several security-based applications such as access control and identity verification. A robust, feature-based system to recognise human faces is proposed in this work. Initially, a reference face is created and features are extracted from it by feature descriptors such as the Local Binary Pattern (LBP), Local Vector Pattern (LVP) and Gabor Local Vector Pattern (GLVP). The extracted features are combined and clustered by employing the cuckoo search algorithm. Finally, in the testing phase, the face is recognised by an Extreme Learning Machine (ELM), which differentiates faces by considering facial features. The public database 'Faces 95' is used to analyse the performance of the system. The proposed work is evaluated against existing algorithms such as Principal Component Analysis (PCA), Canonical Correlation Analysis (CCA), the combination of CCA and k-Nearest Neighbour (kNN), and the combination of CCA and Support Vector Machine (SVM), and the experimental results are satisfactory in terms of accuracy, misclassification rate, sensitivity and specificity.
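A basic sketch of the LBP descriptor named above, computed with NumPy; the LVP/GLVP descriptors and the cuckoo-search clustering are beyond this fragment, and the random patch stands in for a face image.

```python
# 8-neighbour LBP: each interior pixel gets a byte encoding which neighbours
# are at least as bright as it; the code histogram is the texture feature.
import numpy as np

def lbp(image):
    img = np.asarray(image, dtype=np.int32)
    center = img[1:-1, 1:-1]
    codes = np.zeros_like(center)
    # clockwise neighbour offsets starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= (neighbour >= center).astype(np.int32) << bit
    return codes

face = np.random.randint(0, 256, (8, 8))
hist = np.bincount(lbp(face).ravel(), minlength=256)  # 256-bin LBP histogram
print(hist.sum())  # 36 interior pixels for an 8x8 patch
```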

Keywords: Face recognition system, LBP, LVP, GLVP, ELM.

Received January 8, 2016; accepted November 17, 2016

A Novel Age Classification Method Using Morph-Based Models

Asuman Günay Yılmaz1 and Vasif Nabiyev2

1Department of Computer Technologies, Karadeniz Technical University, Turkey

2Department of Computer Engineering, Karadeniz Technical University, Turkey

Abstract: Automatic facial age classification and estimation is an interesting and challenging problem with many real-world applications. The performance of classification methods may differ depending on the selected training samples, and using a large number of training samples makes classification systems more complex and time consuming. In this paper, a novel and simple age classification method using morph-based age models is presented. The age models, representing the common characteristics of the age groups, are produced using an image morphing method. Age-related facial features are then extracted with Local Binary Patterns. In the classification phase, an ensemble of distance metrics is used to determine the closeness of the test sample to the age groups, and the results of these metrics are combined with the Borda count voting method to improve classification performance. Experimental results on the Face and Gesture Recognition Research Network (FGNET) and Park Aging Mind Laboratory (PAL) aging databases show that the proposed method achieves better age classification accuracy than several previous methods.
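The Borda-count fusion step can be illustrated in a few lines; the metric names, rankings and age-group labels below are invented for the example.

```python
# Borda count: each metric's ranking awards n-1, n-2, ... points; the age group
# with the highest total wins.
def borda_count(rankings):
    """rankings: list of candidate lists, each ordered best-first."""
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for position, candidate in enumerate(ranking):
            scores[candidate] = scores.get(candidate, 0) + (n - 1 - position)
    return max(scores, key=scores.get)

# Each metric ranks the age groups by closeness of the test face to the models.
euclidean  = ["20-29", "30-39", "10-19"]
chi_square = ["20-29", "10-19", "30-39"]
cosine     = ["30-39", "20-29", "10-19"]
print(borda_count([euclidean, chi_square, cosine]))  # -> "20-29"
```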

Keywords: Age classification, image morphing, local binary patterns, borda count voting.

Received January 27, 2016; accepted June 13, 2016

Colour Histogram and Modified Multi-layer Perceptron Neural Network based Video Shot Boundary Detection

Dalton Thounaojam1, Thongam Khelchandra2, Thokchom Jayshree2, Sudipta Roy3, and Khumanthem Singh2

1Department of Computer Science and Engineering, National Institute of Technology Silchar, India

2Department of Computer Science and Engineering, National Institute of Technology Manipur, India

3Department of Computer Science and Engineering, Assam University Silchar, India        

Abstract: The paper proposes a shot boundary detection technique using colour histogram differences and a modified Multi-Layer Perceptron (MLP). The learning process of the MLP is reformulated as an evolutionary learning process using a Genetic Algorithm (GA), in which the weights of the hidden and output layers of the MLP are updated by the GA. Colour Histogram Differences (HD) between consecutive frames are used for feature extraction. The values HDi-1, HDi and HDi+1 are used as input to the modified MLP neural network, where HDi is the colour histogram difference between frames fi and fi+1, HDi-1 is the colour histogram difference between frames fi-1 and fi, and HDi+1 is the colour histogram difference between frames fi+1 and fi+2. The proposed system is tested on the TRECVid 2001 and 2007 test data, is compared with recent algorithms, and yields better results.
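A sketch of the HD feature described above; the random arrays stand in for video frames, and the per-channel histogram binning is an assumed choice.

```python
# HD_i compares the colour histograms of consecutive frames; the triple
# (HD_{i-1}, HD_i, HD_{i+1}) forms the MLP input vector.
import numpy as np

def colour_histogram(frame, bins=16):
    # one histogram per RGB channel, concatenated and normalised
    h = [np.histogram(frame[..., c], bins=bins, range=(0, 256))[0]
         for c in range(3)]
    h = np.concatenate(h).astype(float)
    return h / h.sum()

def histogram_difference(f1, f2):
    return np.abs(colour_histogram(f1) - colour_histogram(f2)).sum()

frames = [np.random.randint(0, 256, (120, 160, 3)) for _ in range(4)]
hd = [histogram_difference(frames[i], frames[i + 1]) for i in range(3)]
mlp_input = hd  # [HD_{i-1}, HD_i, HD_{i+1}] for the frame under test
print(mlp_input)
```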

Keywords: Abrupt, fade-in, fade-out, dissolve, shot boundary detection, neural network, genetic algorithm.

Received February 11, 2016; accepted March 26, 2017

Securely Publishing Social Network Data

Emad Elabd1, Hatem AbdulKader1, and Waleed Ead2

1Faculty of computers and information, Menoufia University, Egypt

2Faculty of Computers and Information, Beni-Suef University, Egypt

Abstract: Online Social Network (OSN) data are published for analysis in scientific research. Yet offering such data in its raw form raises serious privacy concerns. An adversary may easily attack the privacy of victims by collecting local background knowledge about individuals in a social network, such as information about their neighbours. A subgraph attack based on frequent pattern mining and members' background information may be used to breach privacy in published social networks. Most current anonymization approaches do not guarantee that identities are protected from attackers who use frequent pattern mining and background knowledge. In this paper, a secure k-anonymity algorithm is proposed that protects published social network data against subgraph attacks using background information and frequent pattern mining. The proposed approach has been implemented and tested on real datasets. The experimental results show that the anonymized OSNs preserve the major characteristics of the original OSNs, as a trade-off between privacy and utility.
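As a toy illustration of the k-anonymity idea on graphs, the check below verifies that every vertex degree is shared by at least k vertices, so an adversary knowing only a victim's degree cannot single them out; the paper's algorithm defends against richer subgraph and frequent-pattern attacks, which this fragment does not attempt.

```python
# Degree-based k-anonymity check on a tiny undirected graph.
from collections import Counter

def is_degree_k_anonymous(adjacency, k):
    degrees = Counter(len(neigh) for neigh in adjacency.values())
    return all(count >= k for count in degrees.values())

social_net = {
    "a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
    "d": {"c", "e"}, "e": {"d"},
}
# False: degrees 3 and 1 each occur only once, so those members are identifiable.
print(is_degree_k_anonymous(social_net, k=2))
```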

Keywords: Data publishing, privacy preserving, online social networks, background knowledge, anonymization, frequent pattern mining.

Received May 7, 2016; accepted June 12, 2017

An Efficient Steganographic Approach to Hide Information in Digital Audio using Modulus Operation

Krishna Bhowal, Debasree Chanda, Susanta Biswas, and Partha Sarkar

Department of Engineering and Technological Studies, University of Kalyani, India

Abstract: This paper presents an efficient data hiding technique in which an encrypted secret message is hidden in digital audio using a modified Exploiting Modification Direction (mEMD) technique. We strive to minimize the bit alterations introduced into the host audio signal during the data hiding process: the proposed scheme ensures that the maximum change is less than 6.25% of the affected audio sample and the average sample-level error is less than 3%. The experimental results show that the method achieves a high embedding capacity (88.2 kbps) while maintaining imperceptibility (Objective Difference Grades between -0.10 and -0.31) and offering robustness against detection of intentional or unintentional audio signal attacks. Performance has been evaluated in terms of imperceptibility, security, robustness and embedding capacity.
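The classical EMD embedding that mEMD builds on can be sketched briefly: a group of n samples carries one base-(2n+1) digit, and at most one sample changes by plus or minus one, which is what keeps host alterations small. The sample values are invented, and the paper's modifications and encryption layer are not reproduced.

```python
# Classical EMD: f(x) = sum(i * x_i) mod (2n+1); adjust one weighted sample
# by +/-1 so that f equals the secret digit.
def emd_embed(samples, digit, n):
    base = 2 * n + 1
    f = sum((i + 1) * s for i, s in enumerate(samples)) % base
    d = (digit - f) % base
    out = list(samples)
    if d == 0:
        return out                       # digit already encoded
    if d <= n:
        out[d - 1] += 1                  # +1 on the sample with weight d
    else:
        out[base - d - 1] -= 1           # -1 on the sample with weight base-d
    return out

def emd_extract(samples, n):
    return sum((i + 1) * s for i, s in enumerate(samples)) % (2 * n + 1)

group = [130, 94, 77, 210]               # four 8-bit audio samples (n = 4)
stego = emd_embed(group, digit=6, n=4)
print(stego, emd_extract(stego, n=4))    # digit 6 recovered, one sample off by 1
```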

Keywords: Information security, audio steganography, watermarking, secret communication.

Received June 5, 2016; accepted August 21, 2017

Digital Signature Protocol for Visual Authentication

Anirban Goswami1, Ritesh Mukherjee2, Soumit Chowdhury3, and Nabin Ghoshal4

1Department of Information Technology, Techno India, India

2Department of Advanced Signal Processing, Centre for Development of Advanced Computing, India

3Department of Computer Science and Engineering, Government College of Engineering and Ceramic Technology, India

4Department of Engineering and Technological Studies, University of Kalyani, India

Abstract: Information security in the digital domain is all about assuring Confidentiality, Integrity and Availability (CIA), extending to authenticity and non-repudiation. Major concerns in implementing information security are computational overhead, implementation complexity and the robustness of the protocol. In this paper, we propose a solution that achieves these targets in line with state-of-the-art information security protocols. The computational overhead is significantly reduced without compromising the uncertainty in key pair generation found in existing digital signature schemes. The first section deals with the collection of a digitized signature from an authentic user, generation of shares from the signature, conversion of a cover image to quantized frequency form, and casting of a share into appropriate coefficients. In the second section, the share is detected and data security is confirmed by overlapping the detected share with the other share. Specific constraints are applied to recreate a clean digitized signature, reform the cover image using the Discrete Cosine Transform (DCT) and quantization, select frequency coefficients for share casting, and manipulate the casting intensity. Considerable effort is made to ensure resistance to some common image processing attacks. Undesired white noise is reduced considerably by choosing a suitable threshold value. The selection of pseudorandom hiding positions also increases robustness, and the experimental results support the efficacy of the algorithm.
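A sketch of the frequency-domain casting step: one share bit is embedded by quantizing a mid-frequency DCT coefficient of an 8x8 cover block to an even or odd multiple of a step; the coefficient position and step size are illustrative choices, not the paper's parameters.

```python
# Embed/extract one bit via the parity of a quantized DCT coefficient.
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    return dct(dct(block.T, norm='ortho').T, norm='ortho')

def idct2(coeffs):
    return idct(idct(coeffs.T, norm='ortho').T, norm='ortho')

def embed_bit(block, bit, pos=(3, 4), step=8.0):
    coeffs = dct2(block.astype(float))
    q = np.round(coeffs[pos] / step)
    if int(q) % 2 != bit:                # force parity to match the share bit
        q += 1
    coeffs[pos] = q * step
    return idct2(coeffs)

def extract_bit(block, pos=(3, 4), step=8.0):
    return int(np.round(dct2(block.astype(float))[pos] / step)) % 2

cover = np.random.randint(0, 256, (8, 8))
stego = embed_bit(cover, bit=1)
print(extract_bit(stego))                # -> 1
```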

Keywords: Share, DCT and IDCT, image compression, data hiding, SSIM, collusion attack.

Received July 6, 2016; accepted August 21, 2017

Change Management Framework to Support UML Diagrams Changes

Bassam Rajabi and Sai Peck Lee

Faculty of Computer Science and Information Technology, University of Malaya, Malaysia

Abstract: An effective change management technique is essential to keep track of changes and to ensure that software projects are implemented in the most effective way. Unified Modeling Language (UML) diagrams are widely adopted in software analysis and design, and are divided into different perspectives for modelling a problem domain. Preserving consistency among these diagrams is crucial so that they can be updated continuously to reflect software changes. In this research, a change management framework is proposed to trace dependencies and to determine the effect of changes in UML diagrams incrementally after each update operation. A set of 45 change impact and traceability analysis templates, covering all types of change to UML diagram elements, is proposed to detect the affected elements and to maintain the consistency and integrity of the diagrams. The proposed framework is modelled and simulated using the Coloured Petri Nets (CPNs) formal language. UML is powerful in describing the static and dynamic aspects of systems, but it remains semi-formal and lacks techniques for model validation and verification, especially when diagrams are updated continuously. Formal specifications and mathematical foundations such as CPNs are used to automatically validate and verify the behavior of the model. A new structure is proposed for the mutual integration of the UML and CPN modelling languages to support model changes.
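A toy sketch of the incremental change-impact idea: when an element changes, every element reachable through inter-diagram dependency links is flagged for re-checking; the dependency edges below are invented, and the paper's 45 templates and CPN model are not shown.

```python
# Breadth-first propagation of a change over assumed traceability links.
from collections import deque

# element -> elements that depend on it
depends_on = {
    "ClassDiagram.Order": ["SequenceDiagram.placeOrder", "StateDiagram.OrderState"],
    "SequenceDiagram.placeOrder": ["ActivityDiagram.checkout"],
}

def impact_set(changed):
    affected, queue = set(), deque([changed])
    while queue:
        element = queue.popleft()
        for dependent in depends_on.get(element, []):
            if dependent not in affected:
                affected.add(dependent)
                queue.append(dependent)
    return affected

print(impact_set("ClassDiagram.Order"))
# -> the three dependent elements that must be re-checked for consistency
```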

Keywords: Change impact, change management, traceability analysis, unified modeling language, coloured petri nets.

Received April 16, 2016; accepted September 24, 2017

Comparative Analysis of PSO and ACO Based Feature Selection Techniques for Medical Data Preservation

Dhanalakshmi Selvarajan1, Abdul Samath Abdul Jabar2, and Irfan Ahmed3

1Department of Computer Applications and Software Systems, Sri Krishna Arts and Science College, India

2Department of Computer Science, Government Arts College, India

3Department of Computer Applications, Nehru Institute of Engineering and Technology, India

Abstract: Sensitive medical datasets consist of a large number of disease attributes or features, and not all of these features are used for diagnosis. To preserve the privacy of a medical dataset, it is therefore not essential to perturb all features before the data is shared for mining. To reduce computational cost and increase efficiency, this work uses Ant Colony Optimization (ACO) for feature subset selection to reduce the dimension, and compares it with feature subset selection using Particle Swarm Optimization (PSO). Both techniques are explored to reduce the dimension before the preservation technique is applied. Using a randomization method, noise from a known distribution is added to the reduced sensitive data before it is sent to the miner. The approach is analysed using standard UCI medical datasets. The results are analysed in terms of the classification accuracy of machine learning algorithms (Naïve Bayes, Decision Tree) built on the randomized dataset. The experimental results show that accuracy is maintained on the reduced, perturbed datasets, and that ACO-based feature selection yields higher accuracy than PSO-based selection.
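The randomization step can be sketched as adding zero-mean noise from a known distribution to the already dimension-reduced features; the noise scale and the data are assumed for illustration.

```python
# Additive perturbation: release data + noise drawn from a known distribution.
import numpy as np

rng = np.random.default_rng(0)

def randomize(data, noise_std=0.5):
    return data + rng.normal(0.0, noise_std, size=data.shape)

selected = rng.random((5, 3))        # rows: patients, cols: ACO/PSO-selected features
perturbed = randomize(selected)
print(np.round(perturbed - selected, 3))  # the added known-distribution noise
```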

Keywords: Randomization, particle swarm optimization, ant colony optimization, feature selection.

Received October 9, 2015; accepted November 9, 2016

Texture Segmentation from Non-Textural Background Using Enhanced MTC

Mudassir Rafi and Susanta Mukhopadhyay

Department of Computer Science and Engineering, Indian Institute of Technology, India

Abstract: In image processing, the segmentation of textural regions from a non-textural background has not been given significant attention, although it is an important problem in texture analysis and segmentation. In this paper, we propose a new method within the framework of mathematical morphology. The procedure is based on a recently developed textural descriptor termed the Morphological Texture Contrast (MTC). We employ the bright and dark top-hat transformations to handle bright and dark features separately. The extracted bright and dark features are subjected to the MTC operator to identify texture components, which in turn are used to enhance the textured parts of the original input image. Subsequently, the method segments bright and dark textured regions separately from the two enhanced versions of the input image, and the partial segmentation results are combined to constitute the final result. The method has been formulated, implemented and tested on benchmark textured images. The experimental results, along with the performance measures, establish the efficacy of the proposed method.
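The bright/dark split via top-hat transformations can be sketched with SciPy; the MTC operator applied afterwards is not reproduced, and the structuring-element size is an assumed choice.

```python
# White top-hat keeps features brighter than their surroundings,
# black top-hat keeps features darker than theirs.
import numpy as np
from scipy import ndimage

image = np.random.randint(0, 256, (64, 64)).astype(float)

bright = ndimage.white_tophat(image, size=5)  # bright features: image - opening
dark = ndimage.black_tophat(image, size=5)    # dark features: closing - image

# Each map would then be fed to the MTC descriptor to locate texture components.
print(bright.max(), dark.max())
```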

Keywords: Texture segmentation, top-hat transformation, bottom-hat transformation, MTC.

Received October 25, 2015; accepted January 12, 2017

Intrusion Detection System using Fuzzy Rough Set Feature Selection and Modified KNN Classifier

Balakrishnan Senthilnayaki1, Krishnan Venkatalakshmi2, and Arputharaj Kannan1

1Department of Information Science and Technology, College of Engineering, Anna University, Chennai, India

2Department of Electronics and Communication Engineering, University College of Engineering Tindivanam, Anna University, Tindivanam, India

Abstract: Intrusion detection systems are used to detect and prevent attacks on networks and databases. However, the increasing dimensionality of network datasets has become a major problem. Feature selection is used to reduce the number of attributes present in such huge datasets. Classical feature selection algorithms are based on rough set theory, neighbourhood rough set theory and fuzzy sets. The rough set attribute reduction algorithm is one of the major approaches for reducing attributes by removing redundancies; it selects significant features and extracts the corresponding data. In this paper, a new feature selection algorithm is proposed using the maximum dependency, maximum significance criterion, and is used to select a minimal number of attributes from the Knowledge Discovery and Data mining (KDD) dataset. Moreover, a new k-Nearest Neighbour based algorithm is proposed for classifying the dataset. The proposed feature selection algorithm considerably reduces the unwanted attributes or features, and the classification algorithm finds the type of intrusion effectively. Together, the proposed algorithms are very efficient in detecting attacks and effectively reduce the false alarm rate.
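The dependency degree behind maximum-dependency feature selection can be sketched as the fraction of records whose equivalence class under an attribute subset maps to a single decision; the toy records below stand in for KDD connection features.

```python
# gamma_B(D) = |positive region of B w.r.t. decision D| / |records|
from collections import defaultdict

def dependency(records, decisions, subset):
    classes = defaultdict(set)
    for rec, dec in zip(records, decisions):
        classes[tuple(rec[a] for a in subset)].add(dec)
    members = defaultdict(int)
    for rec in records:
        members[tuple(rec[a] for a in subset)] += 1
    positive = sum(n for key, n in members.items() if len(classes[key]) == 1)
    return positive / len(records)

records = [
    {"proto": "tcp", "flag": "SF"}, {"proto": "tcp", "flag": "S0"},
    {"proto": "udp", "flag": "SF"}, {"proto": "tcp", "flag": "S0"},
]
decisions = ["normal", "attack", "normal", "attack"]
print(dependency(records, decisions, ["flag"]))    # 1.0: flag alone decides
print(dependency(records, decisions, ["proto"]))   # 0.25: proto mostly ambiguous
```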

Keywords: Rough set, fuzzy set, feature selection, classifications and intrusion detection.

Received June 9, 2015; accepted March 9, 2016

A Certificate-Based AKA Protocol Secure Against Public Key Replacement Attacks

Yang Lu, Quanling Zhang, and Jiguo Li

College of Computer and Information, Hohai University, China

Abstract: Certificate-based cryptography is a new public key cryptographic paradigm with many appealing features, since it simultaneously solves the certificate revocation problem in conventional public key cryptography and the key escrow problem in identity-based cryptography. To date, three certificate-based Authenticated Key Agreement (AKA) protocols have been proposed. However, our cryptanalysis shows that none of them is secure under a public key replacement attack. To overcome the security weaknesses of these protocols, we develop a new certificate-based AKA protocol. In the random oracle model, we formally prove its security under the hardness of the discrete logarithm problem, the computational Diffie-Hellman problem and the bilinear Diffie-Hellman problem. Compared with the previous proposals, it enjoys lower computation overhead while providing stronger security assurance. To the best of our knowledge, it is the first certificate-based AKA protocol in the literature that resists the public key replacement attack.
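For readers unfamiliar with the key-agreement primitive, here is a textbook (unauthenticated) Diffie-Hellman exchange; it is emphatically not the paper's certificate-based protocol, which adds certificate- and pairing-based authentication, and the tiny modulus is insecure by design.

```python
# Plain Diffie-Hellman: both parties derive g^(ab) mod p from public values.
import secrets

p = 0xFFFFFFFB  # a small prime modulus (2**32 - 5), far too small in practice
g = 5

a = secrets.randbelow(p - 2) + 1         # Alice's ephemeral secret
b = secrets.randbelow(p - 2) + 1         # Bob's ephemeral secret
A = pow(g, a, p)                         # Alice -> Bob
B = pow(g, b, p)                         # Bob -> Alice

assert pow(B, a, p) == pow(A, b, p)      # both sides derive the same key
print(hex(pow(B, a, p)))
```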

Keywords: Key agreement, certificate-based cryptography, public key replacement attack, random oracle model.

Received September 15, 2015; accepted March 12, 2017 

Automatic Screening of Retinal Lesions for Grading Diabetic Retinopathy

Muhammad Sharif1 and Jamal Hussain Shah1,2

1Department of Computer Science, COMSATS University Islamabad, Pakistan

2University of Science and Technology of China, China

Abstract: Diabetic Retinopathy (DR) is a diabetic retinal syndrome, and it causes blindness in a larger number of patients than other diseases. Early detection of DR is a critical task in medical image processing. Retinal biomarkers termed Microaneurysms (MAs), Haemorrhages (HMAs) and Exudates (EXs) are helpful for grading Non-Proliferative DR (NPDR) at different stages. This work contributes an automatic design for screening retinal lesions to grade DR. The system comprises preprocessing, determination of biomarkers and formulation of a profile set for classification. During preprocessing, Contrast Limited Adaptive Histogram Equalization (CLAHE) is utilized, and Independent Component Analysis (ICA) is extended with a Curve Fitting Technique (CFT) to eliminate the blood vessels and optic disc as well as to detect biomarkers in the digital retinal image. Subsequently, eleven distinct features based on NPDR lesions are derived for classification. Experiments are performed using a fundus image database, and the proposed method proves appropriate for the initial grading of DR.
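The CLAHE preprocessing step can be sketched with OpenCV on the green channel of a fundus image, where lesions contrast best; the file name is a placeholder, and the later ICA and curve-fitting stages are not shown.

```python
# CLAHE on the green channel of a fundus photograph.
import cv2

fundus = cv2.imread("fundus.jpg")               # placeholder path to a retinal image
green = fundus[:, :, 1]                         # green channel shows lesions best
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(green)
cv2.imwrite("fundus_clahe.jpg", enhanced)
```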

Keywords: DR, CLAHE, ICA, CFT, biomarkers.

Received December 10, 2015; accepted March 24, 2016

A New Framework for Elderly Fall Detection Using Coupled Hidden Markov Models

Mabrouka Hagui1, Mohamed Mahjoub1, and Faycel ElAyeb2

1National Engineering School of Sousse, University of Sousse, Laboratory of Advanced Technology and Intelligent Systems, Tunisia

2Preparatory Institute for Engineering Studies, University of Monastir, Tunisia

Abstract: Falls are among the most common problems for elderly people, and can result in dangerous consequences, even death. Many recent works have presented different approaches to detect falls and prevent dangerous outcomes. In this paper, human fall detection from video streams based on a Coupled Hidden Markov Model (CHMM) is proposed. The CHMM is used to model the motion and static spatial characteristics of the human silhouette. The validity of the proposed method is demonstrated with experiments on the Le2i database, the Weizmann database and YouTube videos simulating falls and normal activities. Experimental results show the superiority of the CHMM for video fall detection.

Keywords: Fall detection, feature extraction, shape deformation, motion history of image, coupled hidden markov models.

Received June 17, 2016; accepted July 28, 2016

Cockroach Swarm Optimization Using A Neighborhood-Based Strategy

Le Cheng1,2, Yanhong Song1, and Yuetang Bian3

1College of Computer and Communication Engineering, Huaian Vocational College of Information Technology, China

2College of Computer and Information, Hohai University, China

3School of Business, Nanjing Normal University, China

Abstract: The original Cockroach Swarm Optimization (CSO) algorithm suffers from slow or premature convergence. This paper describes a new cockroach-inspired algorithm, called CSO with Global and Local neighborhoods (CSOGL). In CSOGL, two kinds of neighborhood models are designed in order to increase the diversity of promising solutions. Based on these two neighborhood models, two novel chase-swarming behaviors are proposed and applied to CSOGL. Moreover, this paper provides a formal convergence proof for the CSOGL algorithm. The comparison results show that CSOGL outperforms the existing cockroach-inspired algorithms.
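The chase-swarming step toward a neighbourhood best can be sketched on a one-dimensional objective; the ring-shaped local neighbourhood is an illustrative choice rather than the exact CSOGL model, and the dispersing and ruthless behaviours of CSO are omitted.

```python
# Each cockroach moves toward the best individual in its local neighbourhood.
import random

def objective(x):
    return (x - 3.0) ** 2          # toy function, minimum at x = 3

def chase_swarm(positions, step=0.7, radius=1):
    n = len(positions)
    new = []
    for i, x in enumerate(positions):
        # local neighbourhood: the cockroach and its ring neighbours
        neigh = [positions[(i + d) % n] for d in range(-radius, radius + 1)]
        best = min(neigh, key=objective)
        new.append(x + step * random.random() * (best - x))
    return new

swarm = [random.uniform(-10, 10) for _ in range(10)]
for _ in range(100):
    swarm = chase_swarm(swarm)
print(min(swarm, key=objective))   # close to 3
```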

Keywords: Cockroach swarm optimization, cockroach-inspired algorithm, CSO with global and local neighborhoods, premature convergence.

Received January 24, 2016; accepted June 13, 2017
