Improvement of Imperialist Competitive Algorithm based on the Cosine Similarity Criterion of Neighboring Objects
Maryam Houtinezhad and Hamid Reza Ghaffary
Department of Computer Engineering, Ferdows Branch, Islamic Azad University, Ferdows, Iran
Abstract: The goal of optimization is to find the best acceptable answer subject to the limitations and needs of the problem. A problem may have several different answers; the function defined to compare them and select an optimal one is called the objective function, and the choice of this function depends on the nature of the problem. Sometimes several goals are optimized together; such optimization problems are called multi-objective problems. One way to deal with such problems is to form a new objective function as a linear combination of the main objective functions. In the proposed approach, in order to increase the ability to discover new positions, the operators of the Imperialist Competitive Algorithm (ICA) are combined with Particle Swarm Optimization (PSO). The imperialist competitive algorithm has a strong global search ability and a fast convergence rate, and the added particle swarm component increases the accuracy of the search. In this approach, the cosine similarity of neighboring countries is measured between the nearest colonies of an imperialist and the closest competitor country. By balancing global and local search, the proposed method improves the performance of both algorithms. The simulation results of the combined algorithm have been evaluated on several benchmark functions and compared with metaheuristic algorithms such as Differential Evolution (DE), Ant Lion Optimizer (ALO), ICA, PSO, and Genetic Algorithm (GA).
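As a hedged illustration of the cosine-similarity criterion described in this abstract, the sketch below compares a colony's position vector against its imperialist and the closest competitor country; the function names and the neighbor-selection rule are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two position vectors (countries)."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom > 0 else 0.0

def most_similar_neighbor(colony, candidates):
    """Return the candidate country whose position is most similar to the colony."""
    sims = [cosine_similarity(colony, c) for c in candidates]
    return candidates[int(np.argmax(sims))]

# Toy usage: a colony compared against its imperialist and the closest competitor.
colony = np.array([0.8, 1.9, -0.5])
imperialist = np.array([1.0, 2.0, -0.4])
competitor = np.array([-1.2, 0.3, 2.0])
print(most_similar_neighbor(colony, [imperialist, competitor]))
```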
Keywords: Imperialist competitive algorithm, particle swarm optimization, optimization problem.
Received March 24, 2018; accepted November 17, 2019
Automatic Topics Segmentation for News Video by Clustering of Histogram of Orientation Gradients Faces
Mounira Hmayda, Ridha Ejbali, and Mourad Zaied
RTIM: Research Team in Intelligent Machines, University of Gabes, National Engineering School of Gabes (ENIG), Tunisia
Abstract: The TV stream is a major source of multimedia data. The proposed method aims to enable good exploitation of this source of video by multimedia services, social communities, and video-sharing platforms. In this work, we propose an approach for the automatic topic segmentation of news video. The originality of the approach is the use of clustering of Histogram of Orientation Gradients (HOG) faces as prior knowledge. This knowledge is modeled as images that govern the structuring of the TV stream content. This structuring is carried out on two levels. The first consists in identifying the anchorperson by single-linkage clustering of HOG faces. The second level aims to identify the topics of the news program, which attracts a large audience because of the pertinent information it contains. Experiments comparing the proposed technique to similar works were carried out on the TREC Video Retrieval Evaluation (TRECVID) 2003 database. The results show significant improvements in TV news structuring, exceeding 96%.
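A minimal sketch of the HOG-plus-single-linkage idea above, using scikit-image and SciPy; the descriptor parameters and the clustering threshold are assumptions for illustration, not the paper's settings.

```python
import numpy as np
from skimage.feature import hog
from scipy.cluster.hierarchy import linkage, fcluster

def hog_descriptor(face_img):
    """HOG feature vector for a grayscale face crop (values in [0, 1])."""
    return hog(face_img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm='L2-Hys')

def cluster_faces(face_imgs, distance_threshold=1.0):
    """Single-linkage clustering of HOG descriptors; the largest cluster
    is a plausible anchorperson candidate (the most recurrent face)."""
    feats = np.vstack([hog_descriptor(f) for f in face_imgs])
    tree = linkage(feats, method='single', metric='euclidean')
    return fcluster(tree, t=distance_threshold, criterion='distance')

# Toy usage with random 64x64 "face" crops (real input would be detected faces).
faces = [np.random.rand(64, 64) for _ in range(10)]
print(cluster_faces(faces))
```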
Keywords: Anchorperson, clustering, face detection, features extraction, news program.
Received December 28, 2018; accepted April 10, 2020
Detection of Bundle Branch Block using Higher Order Statistics and Temporal Features
Yasin Kaya
Department of Computer Engineering, Adana Alparslan Türkeş Science and Technology University, Turkey
Abstract: Bundle Branch Block (BBB) beats are the most common Electrocardiogram (ECG) arrhythmias and can be indicators of significant heart disease. This study aimed to provide an effective machine-learning method for the detection of BBB beats. To this purpose, statistical and temporal features were calculated and the most valuable ones were selected using feature selection algorithms. Forward search, backward elimination, and genetic algorithms were used for feature selection. Three different classifiers, K-Nearest Neighbors (KNN), neural networks, and support vector machines, were used comparatively in this study. Accuracy, specificity, and sensitivity performance metrics were calculated in order to compare the results. Normal sinus rhythm (N), Right Bundle Branch Block (RBBB), and Left Bundle Branch Block (LBBB) ECG beat types were used in the study. All beats of these three types in the MIT-BIH arrhythmia database were used in the experiments. All of the feature sets achieved promising classification accuracy for BBB classification. The KNN classifier using backward elimination-selected features achieved the highest classification accuracy in the study, 99.82%. The results showed the proposed approach to be successful in the detection of BBB beats.
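The backward-elimination-plus-KNN combination reported above can be sketched with scikit-learn as follows; the synthetic data and the number of selected features are placeholders for the statistical and temporal ECG features described in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Stand-in data: rows = beats, columns = statistical/temporal features,
# labels = {0: N, 1: LBBB, 2: RBBB}. Real features would come from ECG beats.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
selector = SequentialFeatureSelector(knn, n_features_to_select=8,
                                     direction='backward', cv=5)
X_sel = selector.fit_transform(X, y)

# Accuracy of KNN on the backward-elimination-selected feature subset.
print(cross_val_score(knn, X_sel, y, cv=5).mean())
```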
Keywords: ECG, arrhythmia detection, bundle branch block, genetic algorithms, neural networks, k-nearest neighbors, support vector machines, backward elimination, forward selection.
Received August 20, 2019; accepted October 5, 2020
Emotion Recognition based on EEG Signals in Response to Bilingual Music Tracks
Rida Zainab and Muhammad Majid
Department of Computer Engineering,
Abstract: Emotions are vital for communication in daily life, and their recognition is important in the field of artificial intelligence. Music helps evoke human emotions, and brain signals can effectively describe human emotions. This study utilized Electroencephalography (EEG) signals to recognize four different emotions, namely happy, sad, anger, and relax, in response to bilingual (English and Urdu) music. Five genres of English music (rap, rock, hip-hop, metal, and electronic) and five genres of Urdu music (ghazal, qawwali, famous, melodious, and patriotic) were used as external stimuli. Twenty-seven participants consensually took part in this experiment, listened to three songs of two minutes each, and also recorded self-assessments. The commercially available four-channel Muse headband was used for EEG data recording. Frequency- and time-domain features are fused to construct a hybrid feature vector that is further used by classifiers to recognize the emotional response. It has been observed that hybrid features gave better results than individual domains, while the most common and easily recognizable emotion is happy. Three classifiers, namely Multilayer Perceptron (MLP),
Keywords: Emotion recognition, electroencephalography, feature extraction, classification, bilingual music.
Received September 16, 2019; accepted July 26, 2020
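A hedged sketch of the hybrid time/frequency feature construction described in the abstract above: band powers from the Welch PSD are fused with simple time-domain statistics and fed to an MLP. The 256 Hz sampling rate, band choices, and feature set are assumptions for illustration, not the paper's exact configuration.

```python
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier

FS = 256  # assumed sampling rate for the four-channel headband
BANDS = {'theta': (4, 8), 'alpha': (8, 13), 'beta': (13, 30)}

def time_features(x):
    """Simple time-domain statistics of one EEG channel."""
    return [x.mean(), x.std(), np.abs(np.diff(x)).mean()]

def band_powers(x):
    """Frequency-domain band powers estimated from the Welch PSD."""
    f, pxx = welch(x, fs=FS, nperseg=FS)
    return [np.trapz(pxx[(f >= lo) & (f < hi)]) for lo, hi in BANDS.values()]

def hybrid_vector(epoch):
    """Fuse time- and frequency-domain features over all channels."""
    return np.hstack([time_features(ch) + band_powers(ch) for ch in epoch])

# Toy usage: 40 four-channel epochs with random labels from the four emotions.
epochs = np.random.randn(40, 4, 4 * FS)
labels = np.random.choice(['happy', 'sad', 'anger', 'relax'], size=40)
X = np.vstack([hybrid_vector(e) for e in epochs])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, labels)
print(clf.score(X, labels))
```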
A Novel Machine-Learning Framework-based on LBP and GLCM Approaches for CBIR System
Meenakshi Garg1, Manisha Malhotra1, and Harpal Singh2
1University Institute of Computing, Chandigarh University, India
2Department of Electronics and Communication Engineering, CEC Landran, India
Abstract: This paper presents a multiple-feature extraction and reduction-based approach for Content-Based Image Retrieval (CBIR). The Discrete Wavelet Transform (DWT) on color channels is used to decompose the image at multiple stages. The Gray Level Co-occurrence Matrix (GLCM) concept is used to extract statistical characteristics for texture image classification. The definition of shared knowledge is used to classify the most common features for all COREL dataset groups. These are also fed into a feature selector based on particle swarm optimization, which reduces the number of features used during the classification stage. Three classifiers, namely the Support Vector Machine (SVM), K-Nearest Neighbor (KNN), and Decision Tree (DT), are trained and tested, of which SVM gives the highest classification accuracy and precision rates. On several of the COREL dataset types, the experimental findings have demonstrated above 94 percent accuracy and precision values of 0.80 to 0.90.
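A rough sketch of the DWT-then-GLCM feature stage described above, using PyWavelets and scikit-image; the wavelet, decomposition depth, and GLCM property set are illustrative assumptions rather than the paper's exact choices.

```python
import numpy as np
import pywt
from skimage.feature import graycomatrix, graycoprops

def dwt_glcm_features(gray_img):
    """Two-stage Haar DWT decomposition followed by GLCM texture statistics
    computed on the approximation sub-band."""
    approx = gray_img.astype(float)
    for _ in range(2):                      # multi-stage decomposition
        approx, _details = pywt.dwt2(approx, 'haar')
    # Quantize the sub-band to 8 bits for the co-occurrence matrix.
    q = np.uint8(255 * (approx - approx.min()) / (np.ptp(approx) + 1e-9))
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ['contrast', 'homogeneity', 'energy', 'correlation']
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Toy usage on a random grayscale image; real input would be a COREL image channel.
print(dwt_glcm_features(np.random.randint(0, 256, (128, 128))))
```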
Keywords: CBIR, DWT, SHO, feature selection, classification.
Received November 18, 2019; accepted July 20, 2020
Maximizing the Area Spanned by the Optical SNR of the 5G Using Digital Modulators and Filters
Guruviah Karpagarajesh and Helen Anita
Department of Electronics and Communication, Government College of Engineering Tirunelveli, India
Abstract: High-security data link channels with greater immunity against channel noise are the need of the century. Free Space Optical (FSO) communication is a modern technology that is taking applications such as inter-satellite communication, underwater communication, and mobile communication to the next level of data transmission through full utilisation of the allocated frequency spectrum. In Europe and Asian countries, 5G optical communication is expected to expand its usage to nearly 50% in the coming years, so bandwidth and power efficiency have to be enhanced as much as possible, since the consumption rate of users is increasing exponentially. However, increasing the link distance increases the attenuation under severe atmospheric weather conditions. In this paper, a 5G data rate of 50 Gbps is ensured for better signal reception with the maximum possible link distance between the sender and the receiver under a variable-attenuation environment. The operating wavelength is 1550 nm throughout the processes. In this work, several digital modulation techniques and optical filters for the receiver are designed and simulated. The modulator and filter designs with the best results in terms of high quality factor and low bit error rate are selected and integrated with each other. The Signal to Noise Ratio (SNR) and optical SNR are calculated theoretically for the integrated design. The higher the SNR, the lower the Bit Error Rate (BER), and hence the signal connectivity can be improved in high-speed free space optical communication systems.
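The final claim, that a higher SNR (and hence a higher quality factor) yields a lower BER, can be illustrated with the standard Gaussian-noise approximation BER = 0.5 * erfc(Q / sqrt(2)); this relation is textbook material, not the paper's own derivation.

```python
from math import erfc, sqrt

def ber_from_q(q_factor):
    """Standard Gaussian-noise approximation: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * erfc(q_factor / sqrt(2))

# Higher Q (and hence higher SNR) gives a sharply lower bit error rate.
for q in (3, 6, 7):
    print(q, ber_from_q(q))
```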
Keywords: Free space optics, 5G data transmission, signal to noise ratio, bit error rate, quality factor.
Received December 17, 2019; accepted June 15, 2020
Software Defined Network: Load Balancing Algorithm Design and Analysis
Senthil Prabakaran1 and Ramalakshmi Ramar2
1Department of Electronics and Communication Engineering, Kalasalingam Academy of Research and Education, India
2Department of Computer Science and Engineering, Kalasalingam Academy of Research and Education, India
Abstract: Software Defined Networking (SDN) cuts down the monopolies of producing network devices and their applications. It allows the use of an omniscient controller that manages the overall network and promises to simplify the configuration and management burden of the traditional Internet Protocol (IP) network. The use of a hardware load balancer is a critical issue in conventional IP networks, with negative impacts such as cost, limited feature customization, and availability. Also, existing load balancing algorithms do not consider the flow size generated by the client nodes, and flows are not classified based on a dynamic flow-size threshold. This paper compares the performance of two load balancing algorithms, a flow-based load balancing algorithm and a traffic pattern-based load balancing algorithm, with a distributed controllers architecture. The results show that the flow-based load balancing algorithm minimizes response time by 94% and enhances the transaction rate by 14%, while the traffic pattern-based load balancing algorithm improves availability by 2.69%.
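A hypothetical sketch of the flow-size-aware idea described above: flows above a size threshold go to the least-loaded server while small flows are spread round-robin. The class, threshold rule, and server bookkeeping are illustrative assumptions, not the controller logic from the paper.

```python
from collections import defaultdict

class FlowBalancer:
    def __init__(self, servers, size_threshold):
        self.servers = list(servers)
        self.size_threshold = size_threshold
        self.load = defaultdict(int)   # bytes currently assigned per server
        self._rr = 0

    def assign(self, flow_id, flow_size):
        """Route large ("elephant") flows to the least-loaded server,
        spread small flows round-robin, and update the load table."""
        if flow_size >= self.size_threshold:
            server = min(self.servers, key=lambda s: self.load[s])
        else:
            server = self.servers[self._rr % len(self.servers)]
            self._rr += 1
        self.load[server] += flow_size
        return server

# Toy usage with three servers and a mix of small and large flows.
lb = FlowBalancer(['s1', 's2', 's3'], size_threshold=10_000)
for fid, size in [(1, 500), (2, 50_000), (3, 800), (4, 120_000)]:
    print(fid, lb.assign(fid, size))
```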
Keywords: SDN, distributed controller, flow-based algorithm, traffic pattern based algorithm, failover.
Received January 10, 2020; accepted November 24, 2020
https://doi.org/10.34028/iajit/18/3/7
Software Metrics for Reusability of Component Based Software System: A Review
Jyoti Aggarwal1 and Manoj Kumar2
1Department of Computer Science (ASET), Amity University, Noida
2School of Computer Science, University of Petroleum and Energy Studies, Dehradun
Abstract: Component Based Software Systems (CBSS) have become the most generalized and popular approach for developing reusable software applications. A software component has different important factors, but reusability is the most cited factor of any software component. Software components can be reused in the development of other software applications, which reduces the time and effort of the software development process. With the increase in the number of software components, the need to identify software metrics for the quantitative analysis of different aspects of components has also increased. Reusability depends on different factors, and these factors have different impacts on the reusability of software components. In this paper, a study has been performed to identify the major reusability factors and the software metrics for measuring those factors. From this research work, it will become easier to measure the reusability of software components, and software developers will be able to measure the degree to which various features of an application can be reused for developing other software applications. In this way, it will be easy and convenient to identify and compare reusable software components so that they can be reused in an effective and efficient manner.
Keywords: Reusability metrics, software components, factors, software.
Received January 21, 2020; accepted December 10, 2020
MCA-MAC: Modified Cooperative Access MAC Protocol in Wireless Sensor Networks
Aya Hossam1, Tarek Salem2, Anar Abdel Hady2, and Sherine Abd El-Kader2
1Electrical Engineering Department, Faculty of Engineering (Shoubra), Benha University, Egypt
2Computers and Systems Department, Electronics Research Institute, Egypt
Abstract: Throughput, energy efficiency, and average packet delivery delay are some of the most crucial metrics that should be considered in Wireless Sensor Networks (WSNs). This paper proposes a modified Medium Access Control (MAC) protocol for WSNs, called MCA-MAC. MCA-MAC aims to improve these metrics, and thus the overall performance of WSNs, through cooperative communication. It enables source nodes to use intermediate nodes as relays for sending their data to the access point. The MCA-MAC protocol also acts as a cross-layer protocol, where the best end-to-end path between the source and destination is found through an efficient algorithm. Mathematical analysis demonstrates that the MCA-MAC protocol can determine the optimal relay node, the one with the minimum transmission time for a given source-destination pair. Using the MATLAB simulation environment, this paper evaluates the MCA-MAC protocol performance in terms of system throughput, energy efficiency, and delay. The results show that the MCA-MAC protocol outperforms the existing Throughput and Energy aware Cooperative MAC (TEC-MAC) protocol under ideal and dynamic channel conditions. Under ideal conditions, MCA-MAC achieved throughput and energy efficiency improvements of 12% and 50%, respectively, over the TEC-MAC protocol, while the packet delay using MCA-MAC decreased by about 48% compared to the TEC-MAC protocol.
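A minimal sketch of relay selection by minimum transmission time, as claimed for MCA-MAC above; treating the two hops as sequential fixed-rate transmissions is a simplifying assumption, not the paper's mathematical model.

```python
def best_relay(direct_rate, relay_rates, packet_bits):
    """Pick the relay whose two-hop path minimizes total transmission time,
    falling back to the direct link when no relay is faster.
    relay_rates maps relay id -> (source->relay rate, relay->destination rate) in bit/s."""
    best, best_time = None, packet_bits / direct_rate
    for relay, (r_sr, r_rd) in relay_rates.items():
        t = packet_bits / r_sr + packet_bits / r_rd   # two sequential hops
        if t < best_time:
            best, best_time = relay, t
    return best, best_time

# Toy usage: one slow direct link and two candidate relays.
relay, t = best_relay(direct_rate=1e6,
                      relay_rates={'A': (4e6, 5e6), 'B': (2e6, 2e6)},
                      packet_bits=8_000)
print(relay, t)
```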
Keywords: WSN, MAC protocol, cooperative communication, energy efficient.
Received April 12, 2020; accepted September 7, 2020
A New Approach for Textual Password Hardening Using Keystroke Latency Times
Khalid Mansour1 and Khaled Mahmoud2
1Faculty of Information Technology, Zarqa University, Jordan
2King Hussein School of Computing Sciences, Princess Sumaya University for Technology, Jordan
Abstract: Textual passwords are still widely used as an authentication mechanism. This paper addresses the problem of textual password hardening and proposes a mechanism that makes textual passwords harder to use by unauthorized persons. The mechanism introduces time gaps between keystrokes (latency times) that add a second line of protection to the password. Latency times are converted into a discrete representation (symbols), and the sequence of these symbols is added to the password. To access the system, an authorized person needs to type his/her password with a certain rhythm, which is recorded at sign-up time. This work extends a previous work by elaborating on the local approach to discretizing the time gaps between every two consecutive keystrokes; in addition, more experimental settings and results are provided and analyzed. The local approach considers the keying pattern of each user to discretize latency times. The average, median, and min-max methods are tested thoroughly. Two experimental settings are considered here: laboratory and real-world. The laboratory setting includes students studying information technology as well as students who are not, while information technology professionals participated in the real-world experiment. The results recommend using the local threshold approach over the global one. In addition, the average method performs better than the other methods. Finally, the experimental results of the real-world setting support using the proposed password hardening mechanism.
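A small sketch of the local discretization step described above, assuming a two-symbol alphabet ('S' for short gaps, 'L' for long ones) and the average, median, and min-max thresholds named in the abstract; the symbol alphabet and the way the rhythm string is combined with the password are illustrative assumptions.

```python
def discretize(latencies, method='average'):
    """Convert latency times (seconds) into a symbol string using a local,
    per-user threshold: 'S' for gaps below the threshold, 'L' otherwise."""
    if method == 'average':
        threshold = sum(latencies) / len(latencies)
    elif method == 'median':
        ordered = sorted(latencies)
        threshold = ordered[len(ordered) // 2]
    else:  # min-max midpoint
        threshold = (min(latencies) + max(latencies)) / 2
    return ''.join('S' if t < threshold else 'L' for t in latencies)

def harden(password, latencies, method='average'):
    """Append the rhythm symbols to the textual password."""
    return password + '#' + discretize(latencies, method)

# Toy usage: latencies as would be captured between consecutive keystrokes.
print(harden('secret', [0.12, 0.35, 0.10, 0.42, 0.15]))
```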
Keywords: Textual password, password hardening, latency time, keying pattern, discretization.
Received April 26, 2020; accepted November 18, 2020
Encryption Based on Cellular Automata for Wireless Devices in IoT Environment
Harinee Shanmuganathan and Anand Mahendran
School of Computer Science and Engineering, Vellore Institute of Technology, Vellore, Tamil Nadu, India
Abstract: A large number of physical objects are interconnected and accessed through the internet; together with existing technologies like cloud computing, mobile computing, wireless sensor networks, and big data, they form a big paradigm called the Internet of Things (IoT). Data from remote and neglected areas are collected via wireless sensors and stored in the cloud. Growth in the number of sensing devices and the servers to which they are connected leads to many security issues and malicious attacks; incidents such as the Mirai botnet, the Jeep hack, and the theft of about 150 million users' information from the MyFitnessPal nutrition app illustrate the risk. This paper mainly focuses on the security challenges of transmitting data in the wireless sensor network in the IoT environment. To avoid malicious attacks, eavesdropping, algebraic attacks, and other attacks, a new security algorithm is proposed based on Cellular Automata (CA), with a key length of 80 bits and a block size of 64 bits. Reversible CA rules are used in this algorithm to achieve reversibility, parallelism, stability, randomness, and uniformity. The process of encryption and decryption is performed for 15 rounds to avoid dependency between the ciphertext and the plaintext. Finally, we compare the execution time, throughput, and avalanche effect of the proposed algorithm with existing algorithms such as the Advanced Encryption Standard (AES), HIGHT, and PRESENT. The proposed algorithm is verified to be a better choice for low-cost and resource-restricted devices.
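To illustrate how reversible CA rules allow exact decryption, here is a toy second-order (reversible) cellular automaton on a 64-bit state, iterated for 15 rounds; it is only a didactic sketch and not the authors' 80-bit-key cipher, whose rules and key schedule differ.

```python
def ca_step(state, width=64):
    """One elementary-CA-style update (rule 90: each cell becomes the XOR of
    its two neighbours) on a `width`-bit integer state with wrap-around."""
    mask = (1 << width) - 1
    left = ((state << 1) | (state >> (width - 1))) & mask
    right = ((state >> 1) | (state << (width - 1))) & mask
    return left ^ right

def encrypt(block, prev, rounds=15):
    """Toy second-order (reversible) CA iteration: s_{t+1} = F(s_t) XOR s_{t-1}.
    Returns the last two states, from which all earlier states can be recovered."""
    cur = block
    for _ in range(rounds):
        cur, prev = ca_step(cur) ^ prev, cur
    return cur, prev

def decrypt(cur, prev, rounds=15):
    """Run the same recurrence backwards: s_{t-1} = F(s_t) XOR s_{t+1}."""
    for _ in range(rounds):
        cur, prev = prev, ca_step(prev) ^ cur
    return cur

block, key_state = 0x0123456789ABCDEF, 0x0F0F0F0F0F0F0F0F  # 64-bit block, toy key state
c, p = encrypt(block, key_state)
assert decrypt(c, p) == block
print(hex(c))
```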
Keywords: Cellular automata, secure data transmission, security algorithm, IoT.
Received May 25, 2020; accepted September 27, 2020
https://doi.org/10.34028/iajit/18/3/11
A New Approach to Automatically Find and Fix Erroneous Labels in Dependency Parsing Treebanks
Metin Bilgin
Department of Computer Engineering, Bursa Uludağ University, Turkey
Abstract: Dependency Parsing (DP) is the identification, for each sentence in a text, of the sub-term/upper-term relations between the words that make up that sentence. DP serves to produce meaningful information for high-level applications. Correct labeling of the text corpus used in DP studies is very important: studies performed with a wrongly-labeled text corpus will contain mistakes in their results. Whether a text corpus is labeled manually by human beings or automatically, faulty cases will occur. As a result of errors arising from human factors or from the annotation tools used for labeling, faulty labels end up in treebanks. To prevent these errors, the detection and correction of possible faulty labels is very important for increasing the accuracy of the studies to be carried out, and manual correction of possible faulty labels requires great effort and time. The purpose of this study is to create a model that automatically finds possible faulty labels and offers new label suggestions for them. With the help of the proposed model, the aim is to detect and correct possible faulty labels in a text corpus and to increase consistency among text corpora of the same language. The model's suggestion of new labels for faulty labels will be a great convenience for the language expert, and another advantage is that the developed model provides a language-independent structure. It succeeded in finding and correcting potentially faulty labels in experimental studies for Turkish, and an increase in accuracy was also observed in studies carried out for languages other than Turkish. In investigating the accuracy of the results obtained by the system, the results were analyzed with the help of 10 different language experts.
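One common way to realize the "find a possibly faulty label and suggest a replacement" idea described above is to train a classifier on the treebank's own labels and flag gold labels the model contradicts; the toy features, data, and reporting below are assumptions for illustration, not the author's model.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy treebank: each token gets simple features and a gold dependency relation.
tokens = [
    {'form': 'dog', 'upos': 'NOUN', 'head_upos': 'VERB'},
    {'form': 'barked', 'upos': 'VERB', 'head_upos': 'ROOT'},
    {'form': 'the', 'upos': 'DET', 'head_upos': 'NOUN'},
    {'form': 'loud', 'upos': 'ADJ', 'head_upos': 'NOUN'},
] * 50
gold = ['nsubj', 'root', 'det', 'amod'] * 50
gold[3] = 'nsubj'                      # one deliberately faulty label

# Learn labels from the corpus itself, then report disagreements as candidates
# for correction, with the predicted label as the suggestion.
model = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
model.fit(tokens, gold)

for i, (tok, label) in enumerate(zip(tokens, gold)):
    probs = model.predict_proba([tok])[0]
    pred = model.classes_[probs.argmax()]
    if pred != label:
        print(f'token {i}: gold={label!r}, suggestion={pred!r} (p={probs.max():.2f})')
```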
Keywords: Natural language processing, dependency parsing, universal dependency, error detection, treebank consistency.
Received July 27, 2020; accepted January 19, 2021
Discretization Based Framework to Improve the Recommendation Quality
Bilal Ahmed and Wang Li
Department of Information and Computer, Taiyuan University of Technology, China
Abstract: Recommendation systems are information filtering software that deliver suggestions about relevant items from a massive collection of data. Collaborative filtering approaches are the most popular in recommendation. The primary concern of any recommender system is to provide favorable recommendations based on the rating prediction of user preferences. In this article, we propose a novel discretization-based framework for collaborative filtering to improve rating prediction. Our framework includes discretization-based preprocessing, chi-square based attribute selection, and K-Nearest Neighbors (KNN) based similarity computation. Rating prediction provides the basis for deciding whether recommendations are generated or not, and is a key measure of the performance of any recommendation system. Experiments on two datasets, MovieLens and BookCrossing, demonstrate the effectiveness of our method.
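A compact scikit-learn sketch of the three pipeline stages named above (discretization-based preprocessing, chi-square attribute selection, KNN similarity); the toy rating matrix, bin count, and target definition are assumptions for illustration, not the paper's experimental setup.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import KBinsDiscretizer

# Toy user-item rating matrix (rows = users, columns = items, 0 = unrated).
rng = np.random.default_rng(0)
ratings = rng.integers(0, 6, size=(50, 20)).astype(float)
liked = (ratings[:, 0] >= 3).astype(int)   # stand-in prediction target

# 1) Discretization-based preprocessing of the rating columns.
bins = KBinsDiscretizer(n_bins=3, encode='ordinal', strategy='uniform')
X = bins.fit_transform(ratings[:, 1:])

# 2) Chi-square based attribute selection (keeps the most informative items).
X_sel = SelectKBest(chi2, k=8).fit_transform(X, liked)

# 3) KNN-based similarity computation over the selected attributes.
nn = NearestNeighbors(n_neighbors=5, metric='cosine').fit(X_sel)
dist, idx = nn.kneighbors(X_sel[:1])
print('neighbours of user 0:', idx[0], dist[0].round(3))
```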
Keywords: Recommender systems, collaborative filtering, prediction, discretization, chi-square.
Received October 21, 2019; accepted July 20, 2020
https://doi.org/10.34028/iajit/18/3/13
Algebraic Supports and New Forms of the Hidden Discrete Logarithm Problem for Post-quantum Public-key Cryptoschemes
Dmitriy Moldovyan1, Nashwan Al-Majmar2, and Alexander Moldovyan1
1St. Petersburg Institute for Informatics and Automation of Russian Academy of Sciences, Russia
2Computer Sciences and Information Technology Department, Ibb University, Yemen
Abstract: This paper introduces two new forms of the hidden discrete logarithm problem defined over finite non-commutative associative algebras containing a large set of global single-sided units. The proposed forms are promising for developing practical post-quantum public key-agreement schemes on their basis and are characterized by performing two different masking operations on the output value of the base exponentiation operation executed in the framework of the public key computation. The masking operations represent homomorphisms, and each of them is mutually commutative with the exponentiation operation. The parameters of the masking operations are used as private key elements. A 6-dimensional algebra containing a set of p^3 global left-sided units is used as the algebraic support of one form of the hidden logarithm problem, and a 4-dimensional algebra with p^2 global right-sided units is used to implement the other form of the said problem. The results of this paper are two proposed methods for strengthened masking of the exponentiation operation and two new post-quantum public key-agreement cryptoschemes.
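The masking structure described above can be written generically as follows; this is illustrative notation only, and the paper's concrete algebras, homomorphisms, and key-agreement steps are not reproduced here.

```latex
% Public key from a secret exponent x and two secret masking homomorphisms \varphi, \psi
Y \;=\; \varphi\bigl(\psi\bigl(G^{x}\bigr)\bigr),
\qquad
\varphi\bigl(A^{k}\bigr) = \varphi(A)^{k},
\quad
\psi\bigl(A^{k}\bigr) = \psi(A)^{k}.
% Both masking operations commute with exponentiation; their parameters,
% together with x, serve as the private key elements.
```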
Mathematics subject classification: 94A60, 16Z05, 14G50, 11T71, 16S50.
Keywords: Finite associative algebra, non-commutative algebra, right-sided unit, left-sided unit, global units, discrete logarithm problem, post-quantum cryptography, public key-agreement.
Received December 23, 2019; accepted November 24, 2020