1) Multiple Fault Localization of Software Programs: State-of-the-Art, Issues and
Challenges
Abu Bakar Zakaria, Sai Peck Lee, Raja Sehrab Bashir and Girish Bekaroo
Software fault localization is one of the most tedious and costly activities in program debugging; it aids in identifying fault locations in a software program. Most existing techniques localize faults based on the presumption that a program has only one fault, whereas in reality a program failure is normally caused by many faults. Therefore, the effectiveness of existing techniques is reduced when they are used on programs with multiple faults, due to the fault interference phenomenon. Currently, the one-fault-at-a-time method is normally used to localize multiple faults. Also, some multiple fault localization techniques suffer from computational complexity and scalability problems which limit their application in practice. In this paper, the existing state-of-the-art fault localization techniques that use different methods such as one-fault-at-a-time, simultaneous, failure clustering, and parallelization in localizing multiple faults are classified and critically analysed. Furthermore, the current research trends, issues, and challenges in the field are extensively discussed. The results strongly suggest that existing fault localization techniques suffer from reduced effectiveness, limited scalability, and computational complexity issues that limit their applicability in the software industry.
Keywords: Failure clustering; fault interference; automated fault localization; program debugging; multiple fault localization
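As a concrete illustration of how spectrum-based localization techniques rank program statements, the sketch below computes the classic Tarantula suspiciousness score over a toy coverage matrix. The formula and all data here are a generic, hypothetical example; the abstract above does not prescribe any particular scoring function.

```python
# Hypothetical illustration: ranking statements by Tarantula suspiciousness.
# The coverage matrix and test outcomes below are invented example data.

def tarantula_suspiciousness(coverage, outcomes):
    """coverage[t][s] is True if test t executes statement s;
    outcomes[t] is True if test t failed."""
    total_failed = sum(outcomes)
    total_passed = len(outcomes) - total_failed
    scores = []
    num_stmts = len(coverage[0])
    for s in range(num_stmts):
        failed = sum(1 for t, out in enumerate(outcomes) if out and coverage[t][s])
        passed = sum(1 for t, out in enumerate(outcomes) if not out and coverage[t][s])
        fail_ratio = failed / total_failed if total_failed else 0.0
        pass_ratio = passed / total_passed if total_passed else 0.0
        denom = fail_ratio + pass_ratio
        scores.append(fail_ratio / denom if denom else 0.0)
    return scores

# Three statements, four tests (the last two fail); statement 2 is covered
# only by failing tests, so it ranks as most suspicious.
coverage = [
    [True, True, False],
    [True, False, False],
    [True, True, True],
    [True, False, True],
]
outcomes = [False, False, True, True]
print(tarantula_suspiciousness(coverage, outcomes))
```

A multiple-fault setting breaks exactly this single-fault assumption: failing tests triggered by different faults pull the scores of each fault's statements down, which is the interference phenomenon the abstract describes.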
2) Bibliometric Study of Trends in Demand Driven Acquisition Publications
Aliyu Olugbenga Yusuf and Noorhidawati Abdullah
This study provides an overview of the progression and development of the literature on the theme of Demand Driven Acquisition (DDA) for the years 1998-2016. The study employed text mining and bibliometric analysis using the ScopusTM online database. The researchers adopted bibliometric analysis to identify the trends, progress and growth rate, authorship and collaboration patterns, author productivity, and the most cited journal articles and journal outlets on the subject as found in the database. It was discovered that publication outlets have a significant relationship with journal citations. Contributions to the DDA literature are mainly from authors who are librarians from various departments of academic libraries. The study revealed that authors from the United States contributed the most publications on Demand Driven Acquisition, followed by authors from the United Kingdom and a few researchers from Asia. There were few publications on Demand Driven Acquisition from Asia (including Malaysia) and none from any country in Africa.
Keywords: Demand Driven Acquisition (DDA), Bibliometric, journal citation.
3) One Touch Intelligent System (OTIS) for maximizing cropping intensity of rice in Sindh-
Pakistan
Bushra Ikram, Sameem Abdul Kareem, Ikram Ullah
Existing systems in the rice domain are highly input dependent and require a series of user interactions. Millions of rice growers in Sindh, Pakistan, are barely literate and cannot handle this problem independently, especially when retrieving prognostic-level information about which rice seed variety is better to plant across two seasons. In this paper, we present the One Touch Intelligent System (OTIS), which provides decision-level information quickly on a single screen. OTIS accepts only one input and performs two tasks within the main system: 1) identification and 2) recommendation. The first part analyses the information and finds a good quality seed (GQS), which requires optimal temperature, less water, and the shortest time to mature. The second part provides a suggestion in two colours (red and green) for the decision of whether the selected seed is best to plant in two seasons. Green text indicates a higher level of prognosis, i.e. a positive result in selecting a GQS; otherwise, red text appears on the user interface, indicating a lower level, i.e. a negative prognosis. As a result, the barely literate rice-farming community can understand the system easily and handle it independently. OTIS is expected to become a highly valuable assistive tool for millions of rice growers in Sindh, Pakistan, in extending rice cropping intensity from one season to two seasons. We hope the system will directly support improvements in rice productivity and leave a significant impact on food security; in particular, the districts of Larkana and Qamber Shahdadkot could produce 66.8% of rice instead of 33.4% (a 100% increase) annually after employing OTIS, without engaging more land or other resources. We discuss earlier studies conducted by researchers worldwide in Section 2.2, a prototype of OTIS in Section 3.2, the methodology in Section 3.4, and the conclusion in Section 4.1.
Keywords: Intelligent system, AI in rice productivity, Expert systems in agriculture, Decision support
system, good quality seed, rice seed varieties of Sindh.
4) Classifier Performance Evaluation on Imbalanced Data
Ebubeogu Amarachukwu Felix, Lee Sai Peck
Class imbalance remains an important issue in binary defect classification studies, as it can affect the
performance of defect classification models. In this study, we investigate the average prediction
performance of six selected state-of-the-art classifiers across multiple projects with data imbalances
between the defective and defect-free classes. We use suitable performance measures to assess the
average performance of each classifier. For both the defective and defect-free classes, the logistic
regression classifier achieved an average classification accuracy of 89.78%; the neural network classifier
achieved an area under the receiver operating characteristic curve of 83.99%; the naive Bayes classifier
achieved a Brier score and a J-coefficient of 79.43% and 31.64%, respectively; and the K-nearest neighbors classifier achieved an average information score of 4.56%. We hypothesized that not
all classifiers may show degradation in their average performance when applied to imbalanced data. To
verify this hypothesis, we carefully identified the minority class when performing data cleaning to avoid
misleading results.
Keywords: Machine learning; data pre-processing; classification algorithm; binary classification; imbalanced data.
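Two of the measures named above, the Brier score and the J-coefficient (Youden's J), can be computed directly from predictions. The sketch below uses invented toy data with a defect-free/defective imbalance; it illustrates the measures only, not the study's actual experiments.

```python
# Minimal sketch of two measures named in the abstract, computed on
# invented example data: the Brier score and Youden's J-coefficient.

def brier_score(y_true, y_prob):
    """Mean squared difference between predicted probability and outcome."""
    return sum((p - y) ** 2 for y, p in zip(y_true, y_prob)) / len(y_true)

def j_coefficient(y_true, y_pred):
    """Youden's J = sensitivity + specificity - 1."""
    tp = sum(1 for y, p in zip(y_true, y_pred) if y == 1 and p == 1)
    fn = sum(1 for y, p in zip(y_true, y_pred) if y == 1 and p == 0)
    tn = sum(1 for y, p in zip(y_true, y_pred) if y == 0 and p == 0)
    fp = sum(1 for y, p in zip(y_true, y_pred) if y == 0 and p == 1)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity + specificity - 1.0

# Imbalanced toy data: six defect-free (0) vs two defective (1) instances.
y_true = [0, 0, 0, 0, 0, 0, 1, 1]
y_prob = [0.1, 0.2, 0.1, 0.3, 0.2, 0.4, 0.8, 0.6]
y_pred = [1 if p >= 0.5 else 0 for p in y_prob]
print(round(brier_score(y_true, y_prob), 4))
print(j_coefficient(y_true, y_pred))
```

Both measures take the minority class into account, unlike plain accuracy, which a classifier could inflate here simply by always predicting the defect-free majority class.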
5) Authentication and Capability-Based Framework for Communication Security in IoT
Networks
Fadele Ayotunde Alaba and Mazliza Othman
The Internet of Things (IoT) is expected to improve human lives through the rapid development of resource-constrained devices and the increased connectivity of physical embedded devices that use the current Internet infrastructure to communicate. The major challenges in such an interconnected world of resource-constrained devices and sensors are security and privacy. IoT demands new approaches to security, such as a secure lightweight operating system, scalable approaches to continuous monitoring and threat mitigation, and new ways of detecting and blocking active threats. In this research, a novel security framework and authentication protocol are proposed. The proposed security framework is developed using a cost-effective design approach, a security design approach that makes use of both hardware and software components to achieve security goals and build a cost-effective system. The proposed protocol makes use of the AES-GCM technique to provide authentication and encryption for resource-constrained devices, with capability-based access control as a second line of defense, together with a random number generator. Finally, the proposed framework is evaluated in terms of mutual authentication, resistance to replay attacks, computational cost, and data traffic cost.
Keywords: Internet of Things (IoT), Security, Framework
6) A Dynamic Game Theoretic Modeling of EDoS Eye to Effectively Mitigate Economic
Denial of Sustainability (EDoS)
Fahad Zaman Chowdhury, Mohd Yamani Idna Bin Idris, Miss Laiha Binti Mat Kiah
Economic denial of sustainability (EDoS) is a new threat to cloud computing. This attack is a type of DDoS attack that targets the vulnerable pricing model of cloud consumers. In EDoS, attackers consume cloud resources in a stealthy manner, which results in the creation of multiple cloud instances due to the elastic nature of the cloud. As a result, an unexpected and unbearable burden of excessive billing is imposed on the cloud consumers. Moreover, valid cloud users experience slow but steady service degradation over time. In this research, we introduce a novel dynamic-game-based analytical approach and incorporate it into our existing EDoS Eye model as an extension of our previous work. Our proposed Dynamic Game Based Decision Module (D-GBDM) in EDoS Eye is able to determine an optimal strategic threshold value through the Nash equilibrium. This dynamic threshold value can block EDoS traffic efficiently, and the simulation shows promising results.
Keywords: Economic Denial of Sustainability; EDoS; Cloud Computing; Game Theory; Honeypot;
Threshold; Traffic Distribution; Nash Equilibrium; QoS; Dynamic Game.
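To illustrate how a strategic choice can emerge from a Nash equilibrium, the following sketch enumerates pure-strategy equilibria of a small two-player game by best-response checking. The payoff matrices are invented for illustration and are not the paper's actual D-GBDM model, which works with a dynamic game.

```python
# Generic illustration (not the paper's actual model): finding pure-strategy
# Nash equilibria of a small two-player game by best-response enumeration.
# Rows = defender threshold choices, columns = attacker traffic choices;
# all payoffs below are invented.

def pure_nash(defender_payoff, attacker_payoff):
    rows = len(defender_payoff)
    cols = len(defender_payoff[0])
    equilibria = []
    for i in range(rows):
        for j in range(cols):
            # (i, j) is an equilibrium if neither player gains by deviating.
            best_row = all(defender_payoff[i][j] >= defender_payoff[k][j]
                           for k in range(rows))
            best_col = all(attacker_payoff[i][j] >= attacker_payoff[i][l]
                           for l in range(cols))
            if best_row and best_col:
                equilibria.append((i, j))
    return equilibria

# Invented payoffs: defender rows are "low" / "high" detection threshold,
# attacker columns are "stealthy" / "aggressive" traffic.
defender = [[3, 0],
            [5, 1]]
attacker = [[3, 5],
            [0, 1]]
print(pure_nash(defender, attacker))
```

In this toy game the unique equilibrium is (1, 1): the high threshold against aggressive traffic. A dynamic game, as in D-GBDM, would instead solve such a stage game repeatedly as traffic observations evolve.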
7) IoT Based Energy Efficient Smart Wireless Automatic Traffic Light Controller
Fawad Ali Khan, Rafidah Md Noor, Miss Laiha Binti Mat Kiah, Fadele Ayotunde
The exponential increase in car ownership is one of the indicators of a country's economic growth. Traffic management at intersections is a challenging issue that needs to be addressed due to limited infrastructure, which results in problems like high fuel consumption, high energy consumption and increased carbon dioxide (CO2) emissions from vehicles due to delays and waiting times at roadside traffic signals. In this research, we propose an IoT-based eco-friendly Smart Wireless Automatic Traffic Light Controller that uses ultrasonic sensors, RFID tags and other IoT components. The proposed technique is capable of handling traffic congestion for both normal and special vehicles to ensure flexibility and smooth traffic. In addition, the controller reduces vehicle fuel consumption and the CO2 emissions that arise while vehicles are waiting in the queue. The performance of the proposed system was tested and evaluated against the existing traffic light controller with respect to traffic delays and waiting times at a signal junction, the average distance traveled by cars and the minimization of the CO2 footprint. The experimental results show that, compared with existing techniques, the proposed controller provides real-time traffic information, subsystem information management that guides and protects both cars and roads, an intelligent traffic management subsystem and better handling of emergency vehicles. Finally, the simulation results indicate that the proposed controller is effective in controlling traffic congestion.
Keywords: IoT, RFID, Intelligent Traffic Management, Traffic Light Controller
8) CloudProcMon: A Non-Intrusive Cloud Monitoring Framework
Hassan Jamil Syed and Abdullah Gani
Cloud computing has provided new dimensions to businesses. The cloud gives businesses an opportunity to focus on their core activities and leave computation services to the cloud. Cloud services are getting popular, and enormous growth in demand has been observed in recent years. As with any other system, performance monitoring and fault and failure detection are essential for the cloud. Clouds are large-scale distributed systems, and the services offered by the cloud depend on multiple software and hardware components; because these services run in a virtualized environment and can be moved from one physical machine to another, the dependency map is updated dynamically. In the case of sub-optimal performance or failure, non-intrusively finding the root cause becomes more difficult in such a scenario. Non-intrusive collection of monitoring data from a VM is also a challenging job. In this research, we propose a performance monitoring framework that provides a solution to the challenges mentioned above. The proposed framework is lightweight, incurring negligible overhead on the host operating systems of the physical servers. In addition, the framework is horizontally as well as vertically scalable. Vertical scalability is achieved through efficient monitoring metric collection from the operating system. Horizontal scalability is achieved through a push mechanism whereby physical compute nodes push the collected information to the central controller node of the cloud and induce minimal overhead on the controller.
Keywords: Cloud Monitoring, IaaS Monitoring, VM Monitoring, Cloud Infrastructure Monitoring, Non-Intrusive VM Monitoring
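The push-based horizontal scaling described in the abstract can be sketched as follows; the class names and the metric set are illustrative assumptions, not taken from CloudProcMon itself.

```python
# Sketch of a push mechanism: each compute node collects its own metrics
# and pushes them to the central controller, so the controller only
# aggregates and carries minimal overhead. All names/values are invented.

class Controller:
    def __init__(self):
        self.metrics = {}

    def receive(self, node_id, sample):
        # The controller does no polling; it only stores what nodes push.
        self.metrics.setdefault(node_id, []).append(sample)

class ComputeNode:
    def __init__(self, node_id, controller):
        self.node_id = node_id
        self.controller = controller

    def collect(self):
        # Stand-in for reading counters from the host OS (e.g. /proc).
        return {"cpu_percent": 12.5, "mem_mb": 2048}

    def push(self):
        self.controller.receive(self.node_id, self.collect())

controller = Controller()
for node_id in ["compute-1", "compute-2"]:
    ComputeNode(node_id, controller).push()
print(sorted(controller.metrics))
```

Because the per-node collection cost stays on the nodes, adding more compute nodes scales horizontally without the controller having to poll each one.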
9) Deep Learning Framework for Human Activity Recognition using Mobile and Wearable
Sensor Networks
Henry Friday Nweke and Ying Wah Teh
Human activity recognition systems are implemented to enable continuous monitoring of human
behaviours with extensive application in ambient assisted living, sports injury detection, elderly care,
rehabilitation, and entertainment and surveillance in smart home environments. Extraction of relevant
and discriminative features is the most important aspect of human activity recognition and yet very
challenging. Feature extraction influences the algorithm performance and reduces computation time
and complexity. However, current human activity recognition relies on handcrafted features that are
incapable of handling complex activities especially with the current influx of multimodal and high
dimensional sensor data. With the emergence of deep learning and increased computational power, deep learning methods are being adopted for automatic feature learning and classification of simple and complex human activities in mobile and wearable sensors. In this paper, a comprehensive framework for human activity recognition using mobile and wearable sensor data is proposed. The different components of the framework, and the tasks involved in achieving high recognition performance and efficient training, are explained. These components include data collection, data preparation, feature extraction with deep learning, training, activity classification and evaluation of the deep-learning-based human activity recognition.
Keywords: Deep Learning, Human Activity Recognition, Sensor Network, Feature Representation
10) Delay-aware Framework for Enhancing Data Collection in Mobile Wireless Sensor
Networks using Cloud Computing
Ihsan Ali, Hossien Anisi, Abdullah Ghani and Ismail
Traditional Wireless Sensor Networks (WSNs) use a Static Sink (SS) for data collection, which causes the hotspot problem. The introduction of a mobile sink solves the hotspot problem; moreover, in some application scenarios, nodes in a WSN may not be connected due to dead nodes or obstacles, and appointing a Mobile Sink (MS) is a feasible solution for data collection in such areas. The trajectory of the MS has been addressed by different researchers, but most of the work addresses the static scenario, i.e., the trajectory is predefined and the MS follows it, which creates the problems of high data latency and a tendency for node buffer overflow. In this paper, we use a sensor-cloud architecture with a static gateway to collect data from the mobile sink and upload it to the cloud to solve the aforementioned problems.
Keywords: Traditional Wireless Sensor Networks (WSNs), Static Sink (SS)
11) Managing Architecture Decisions through Traceability of Design Concerns and
Rationale
Md Abdullah Al Imran and Lee Sai Peck
The decision-making process during the architecture design phase has a high impact on achieving the overall system qualities. One of the key options for ensuring that quality is to capture and reason about design decisions through the traceability of design concerns and rationale. Significant research has been conducted over the years to support software architecture design and documentation, and many methods and tools have been proposed to assist in capturing, using, managing and documenting architecture decisions. Some approaches provide a reasoning mechanism for design decisions as well as the capture of rationale to stop knowledge vaporization. However, regardless of the advancements in architecture knowledge management, support for the architecture decision-making process is still lacking. This research focuses on providing assistance in the architecture decision-making process not only through the manual capture of design decisions but also through the reuse of past design decisions via a recommendation and retrieval approach. The research will contribute a software tool for architecture decision management that stores new decisions in a Decisions Repository and reuses stored decisions based on the design concerns. The developed approach and software tool will be evaluated for effectiveness and usefulness in decision making through user reviews.
Keywords: Architecture decision, Traceability, Architecture decisions management
12) Towards Utilizing Paper-Citation Relations for Research Paper Recommendation
Khalid Haruna & Maizatul Akmar Ismail
Research paper recommenders emerged over the last decade to ease the finding of publications related to researchers' areas of interest. However, existing approaches assume that the whole contents of the recommended papers are freely accessible, which is not always true due to factors such as copyright restrictions. By leveraging the advantages of the collaborative filtering approach, we mine the hidden association between a target paper and its citations to provide a unique and useful list of research papers as recommendations. Using a publicly available dataset, our proposed approach records a significant improvement over the baseline in providing relevant and useful recommendations at the top of the recommendation list.
Keywords: Research Paper, Paper-Citation Relation, Publicly Available Contextual Metadata,
Recommender System.
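One simple way to mine the hidden association between a paper and its citations, in the spirit of collaborative filtering, is co-citation counting: papers frequently cited alongside the target paper's own citations become candidates. The sketch below uses an invented citation graph and is a generic illustration, not the paper's exact algorithm.

```python
# Illustrative sketch: treating citing papers as "users" and cited papers
# as "items", recommend papers frequently co-cited alongside the target
# paper's own citations. The citation lists below are invented.
from collections import Counter

citations = {
    "target": ["A", "B"],
    "p1": ["A", "B", "C"],
    "p2": ["A", "C", "D"],
    "p3": ["B", "C"],
    "p4": ["D", "E"],
}

def recommend(target, citations, top_n=2):
    cited_by_target = set(citations[target])
    scores = Counter()
    for paper, cited in citations.items():
        if paper == target:
            continue
        overlap = cited_by_target & set(cited)
        if not overlap:
            continue
        # Each citation shared with the target votes for the other
        # papers that this neighbour also cites.
        for c in cited:
            if c not in cited_by_target:
                scores[c] += len(overlap)
    return [c for c, _ in scores.most_common(top_n)]

print(recommend("target", citations))
```

Note that the approach needs only citation metadata, never the full text of the recommended papers, which is exactly what makes it attractive under copyright restrictions.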
13) Day-to-day information needs and use of the elderly in Songkhla province, Thailand
Kongkidakorn Boonchuay and Kiran Kaur
The need for, seeking, and use of information is very important to people of all ages, especially the elderly, who need information that is accurate, complete, and reliable. The study of information seeking in the daily life of the elderly is essential: research results can be used to improve the development of information services and information resources for the elderly, especially to reduce social and health problems, thus resulting in an improved quality of life for the elderly. This study investigated the information needs and seeking of elderly people in Thailand. The following research objectives guided the study: to explore information seeking of the elderly in Songkhla Province, Thailand; and to understand the information seeking behavior of the elderly in their daily life. The elderly club of Boromarajonani College of Nursing, Songkhla, Thailand, was chosen for this study. Data collection was conducted using several techniques, including survey, interview, observation, and casual conversation. A total of 61 respondents provided data for the first phase of the study involving a survey, of which six participants became the informants for the qualitative phase to understand the information seeking behavior of the elderly. This study found that the most highly required information needs are nutrition and exercise, followed by general health, current events/news and weather. These information needs are fulfilled by the following sources, in order of priority: television, other club members and same-age-group meetings. The elderly mainly use the information to improve their health care, to make better life decisions and to communicate better with people. Upon close contact with the six participants, it was found that their information seeking behavior for the daily life routine can be depicted in several main themes, namely healthy lifestyle, social networks, psychological, knowledge/cognitive and economy. The elderly revealed that their information seeking was affected by their lack of time to search, their knowledge about information resources, and their lack of internet access and skills. They pointed out that they would like to gain skills to use the internet and have greater access to healthcare information. Their social needs are satisfied through active interaction with friends at the club. The elderly, however, do not use the library and did not express the need to use the library in the near future. It is recommended that the authorities assess the information needs of the elderly more often and repackage the information, especially on television, to be more relevant to the needs of the elderly for a better life ahead.
Keywords: Information needs and use; Elderly
14) An uncertainty-aware hybrid MCDM model for Cloud Service Evaluation and Selection
based on Non-functional requirements
Khubaib Amjad Alam and Rodina Ahmad
With the recent paradigm shift towards Cloud computing and Service Oriented Architecture (SOA),
Service evaluation and selection have emerged as significant challenges. The competitive cloud computing market has resulted in a large number of functionally similar cloud services available over the internet. Selecting the best candidate service among them is not a straightforward task due to the influence of several non-functional requirements (NFRs). The service evaluation and selection process becomes even more complex due to the inherent imprecision of the evaluation criteria and the uncertainty involved in the decision-making process. Several evaluation models have been proposed in the past few years. However, the existing literature lacks a systematic procedure for aggregating user feedback and real-world performance assessment data while incorporating uncertainty and vagueness at multiple levels. In addition, the existing literature reports constrained evaluation criteria while neglecting several
decisive factors. In this study, we formulate this problem as a Multi-Criteria Decision making (MCDM)
problem, and propose an integrated MCDM model based on the Fuzzy Delphi methodology (FDM), Fuzzy
Analytic Hierarchy Process (FAHP), and four MCDM methods for alternative evaluation. The use of a single MCDM method may lead to divergent results, which implies the need for careful selection of these methods. Several recent studies have emphasized that, in order to cope with method uncertainty and
to validate the results, two or more MCDM methods must be used for a single evaluation problem. This
study employs Fuzzy TOPSIS, Fuzzy VIKOR, Fuzzy Multi-MOORA and Fuzzy WASPAS for alternative
evaluation. In addition, we use a comprehensive and multi-dimensional list of evaluation factors to fill the gap left by the non-consideration of important evaluation criteria. The proposed model outperforms existing solutions in multiple aspects, as it incorporates both user feedback and real-world
performance assessment data through third party benchmarking, and deals with the vagueness in the
decision making process. FDM is used to determine critical subjective and objective factors for service
evaluation and selection. Then, Fuzzy AHP is used to determine the relative importance of the decision
criteria. In the next step, we employ four MCDM methods, including Fuzzy TOPSIS (with two normalization procedures), Fuzzy VIKOR, Fuzzy Multi-MOORA and Fuzzy WASPAS, for the ranking of
alternative cloud services. Additionally, a sensitivity analysis for FAHP-FTOPSIS, FAHP-FVIKOR, FAHP-FMMOORA, and FAHP-FWASPAS is performed to ensure the robustness of these modules. This is the first study of its kind in services computing that incorporates three types of uncertainty: uncertainty in human judgment, vagueness of data, and method uncertainty. Finally, a case study on
real-world cloud services is performed to demonstrate the advantages of the proposed model.
Considering the promising results and the multi-faceted significance of this model, it can be exploited as a decision-aid tool when an optimal decision has to be taken on the most suitable service among a set of functionally similar cloud services.
Keywords: Cloud Computing, Multi-Criteria Decision making (MCDM)
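The ranking step shared by these methods can be illustrated with crisp (non-fuzzy) TOPSIS: normalize and weight the decision matrix, then score each alternative by its closeness to the ideal solution. The service scores and weights below are invented; the paper's model works with fuzzy numbers, but the ranking logic is analogous.

```python
# Crisp TOPSIS sketch with invented scores. Pick the alternative closest
# to the ideal solution and farthest from the anti-ideal.
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j;
    benefit[j]: True if higher is better for criterion j."""
    n_alt, n_crit = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    ideal = [max(v[i][j] for i in range(n_alt)) if benefit[j]
             else min(v[i][j] for i in range(n_alt)) for j in range(n_crit)]
    anti = [min(v[i][j] for i in range(n_alt)) if benefit[j]
            else max(v[i][j] for i in range(n_alt)) for j in range(n_crit)]
    scores = []
    for i in range(n_alt):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n_crit)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n_crit)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical cloud services scored on availability (benefit),
# response time (cost), and price (cost).
matrix = [[0.99, 120, 30], [0.95, 80, 20], [0.97, 100, 25]]
scores = topsis(matrix, weights=[0.5, 0.3, 0.2], benefit=[True, False, False])
print(max(range(len(scores)), key=scores.__getitem__))
```

Running a second method such as VIKOR on the same matrix and comparing the two rankings is exactly the kind of cross-validation against method uncertainty that the abstract argues for.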
15) Privacy Preserving and Forensics Enabled Cloud Log Using Encryption with Individual
User’s Key and Key Sharing
M A Manazir Ahsan, Ainuddin Wahid Bin Abdul Wahab & Mohd Yamani Idna Bin Idris
The activity log of users in cloud computing plays a crucial role in forensic investigations, but its reliability can come into question if it is not preserved in a secure way. Most existing solutions for secure logging consider only traditional computing systems and hence are unable to address threats posed by the cloud's inherent properties. The first comprehensive scheme for secure cloud logging was proposed by Zawoad et al. in their SecLaaS (Secure Logging as a Service) scheme. However, SecLaaS does not guarantee privacy in the case of collusion between a malicious cloud employee and a forensic investigator. Moreover, the user has no way to know whether the cloud server writes his log correctly, which in turn opens a way for a dishonest cloud user to repudiate. In this paper, we propose a scheme to secure the cloud log that mitigates the aforementioned problems by using the cloud user's public key and sending the encrypted log back to the user for verification. In order to prevent future modification of the log (without detection), we generate a proof of past log (PPL) using Rabin's fingerprint and a Bloom filter, which also reduces verification time significantly. Our experiment is conducted on a testbed, which suggests that the scheme can be deployed in a real cloud environment.
Keywords: Cloud computing, cloud forensics, cloud security, Information security, cloud log, privacy
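The proof-of-past-log idea can be sketched with a plain Bloom filter: each log entry is inserted, and later verification of an entry becomes a fast membership check. This toy version substitutes SHA-256-derived indices for Rabin's fingerprint, and all sizes and log entries are invented.

```python
# Sketch of the PPL idea: insert each log entry into a Bloom filter so that
# verifying an entry is a fast membership check rather than a log scan.
# The paper uses Rabin's fingerprint; this sketch substitutes SHA-256.
import hashlib

class BloomFilter:
    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _indices(self, item):
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for idx in self._indices(item):
            self.bits[idx] = True

    def may_contain(self, item):
        # False means "definitely not inserted"; True means "probably inserted".
        return all(self.bits[idx] for idx in self._indices(item))

# Build a PPL over some invented log entries, then verify one of them.
ppl = BloomFilter()
for entry in ["user1 login 09:00", "user1 upload 09:05", "user1 logout 09:30"]:
    ppl.add(entry)
print(ppl.may_contain("user1 upload 09:05"))
# An entry that was never logged returns False, barring a rare false positive:
print(ppl.may_contain("user1 delete 09:10"))
```

Because the filter is one-way, publishing it commits the provider to the day's log without revealing the entries themselves, which is what makes the structure useful for tamper-evidence.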
16) A Multi-Layered Framework for Building Multilingual Sentiment Lexicons
Mohammed Kaity and Vimala Balakrishnan
Sentiment analysis is a field of science that extracts and analyses public mood and views. However, it mainly focuses on building systems and resources in English, which has delayed the exploitation of the more than 73% of data written in languages other than English. In this regard, this paper proposes a multi-layered framework for building multilingual sentiment lexicons. The layers are divided, based on the input resource, into i) a lexicon-based layer, ii) a corpus-based layer, and iii) a human-based layer. Each layer sends its output as input to the next layer. Ultimately, a new lexicon for the target language is built and extended for use in sentiment analysis tasks.
Keywords: Sentiment Analysis; Sentiment Lexicons; Multilingual; lexicon-based; corpus-based.
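The first (lexicon-based) layer can be sketched as projecting polarity labels from an English seed lexicon through a bilingual dictionary; words the dictionary misses are left for the corpus- and human-based layers. The lexicon and dictionary entries below are invented toy data, not resources from the paper.

```python
# Toy sketch of a lexicon-based layer: project sentiment polarities from an
# English seed lexicon into a target language via a bilingual dictionary.
# All entries are invented; later layers would extend and clean the result.

english_lexicon = {"good": 1, "bad": -1, "excellent": 1, "terrible": -1}
# Invented English -> Spanish dictionary (one sense per word for simplicity).
bilingual = {"good": "bueno", "bad": "malo", "excellent": "excelente"}

def lexicon_layer(seed, dictionary):
    target = {}
    for word, polarity in seed.items():
        translation = dictionary.get(word)
        if translation is not None:
            # Words without a dictionary entry ("terrible" here) are
            # left for the corpus- and human-based layers to cover.
            target[translation] = polarity
    return target

print(lexicon_layer(english_lexicon, bilingual))
```

The gap left by the missing entry is precisely why the framework pipes each layer's output into the next: coverage grows as corpus statistics and human review fill in what translation alone cannot.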
17) A quality model for Academic Management Systems in Higher Institutions
Nur Razia Mohd Suradi, Saliyah Kahar and Nor Azliana Akmal Jamaludin
An Academic Management System consists of several systems used by academicians to perform daily tasks. Basically, the system is available as a web-based application, which allows users access anywhere and anytime. The system is considered a critical system because it is the backbone of the institution, where important tasks are performed using the system. In software engineering, the quality of a system is a vital element in ensuring the system is accepted by its users. Various models are available for achieving a good quality model. This research combines five basic models and web-based application characteristics to propose a new quality model for academic management systems. The objective of this paper is to review existing quality models, study the characteristics of web-based applications, and propose a generic quality model for academic applications of higher education institutions. Therefore, this paper aims to develop a quality model that provides a framework for assessing the quality of education institutions' academic management systems.
Keywords: Software quality; Software quality model; web-based application; quality model; academic
management system
18) System for Breast Cancer Diagnosis: A Survey
Rasha Atallah, Maizatul Akmar Ismail, Amirrudin Kamsin, Saqib Hakak
One of the leading causes of death in the world is breast cancer, which increases the rate of women's deaths. Cancer occurs when uncontrollable cells start to appear in the human body and spread, and to reduce breast cancer fatality the disease must be detected and diagnosed early. Accurate classification of breast tumors is an important task in medical diagnosis. New technical computing is starting to help medicine in diagnosing diseases and to improve specialist doctors' performance. The aim of this survey paper is to define the current state of research in breast cancer and to extract the limitations of existing systems. Much software based on neural networks, support vector machines, fuzzy logic, deep learning and many other techniques is being used in medicine. In this paper, two types of techniques have been studied: deep learning and neural networks. The comparison between existing approaches was done based on three main evaluation parameters, i.e., accuracy, sensitivity and specificity, along with the data sets used.
Keywords: Breast Cancer, Deep learning, Neural network
19) A Proposed Model of Individual Resistance of Change in Big Data Implementation
R.Renugah, Suraya Hamid & Abdullah Gani
Big data has gained massive attention from diverse domains due to its data-driven decision-making capabilities. Any new IT-enabled innovation will raise various new challenges in an organization and stimulate two sides of change: technology and human. Thus, this paper defines cognitive, emotional and behavioural individual change by focusing on the resistance-to-change aspect. From the review, we identify the relationship between three entities: the individual's resistance to change, personality traits, and the work-related outcome. This paper proposes a conceptual framework to investigate individual resistance-to-change factors from the big data perspective by adapting Oreg's work of 2006. The objective of this study is to investigate and evaluate individual resistance to change. The proposed model will be assessed using a quantitative (survey) approach in four big data pilot agencies. The outcomes are expected to provide some insights for a further understanding of big data and individual resistance to change in an organization.
Keywords: Big Data; Change Management; Organization; Individual; Resistance of Change
20) An efficient detection of selective forwarding attacks in heterogeneous IoT Networks
Shapla Khanam, Ismail Bin Ahmedy, Mohd Yamani Idna Bin Idris
The Internet of Things (IoT) is a revolutionary technology that connects resource-constrained smart objects to the untrustworthy Internet using compressed Internet Protocol version 6 (IPv6) over Low-Power Wireless Personal Area Networks (6LoWPAN). IoT networks are heterogeneous in nature, and the connected things are exposed to various security attacks from inside and outside the networks. Therefore, there is a need for a unified security framework to analyze and detect selective forwarding attacks and ensure secure routing. In this paper, we propose a game-theory-based attack model to analyze the malicious behavior of attackers in IoT networks. In this model, two players are involved in the game: player_1 plays to maximize the throughput of the network, and player_2 plays to minimize it. Additionally, a hop-by-hop acknowledgement (ACK) algorithm is presented to detect malicious attackers in order to defend networks against selective forwarding attacks in the IoT. We illustrate and analyze the proposed mathematical model to verify the effectiveness of the presented approach.
Keywords: IoT, Selective forwarding attacks, game theory
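As a rough illustration of the hop-by-hop ACK idea in the abstract above (a minimal sketch, not the authors' actual algorithm: the path layout, drop probability, and detection threshold are all assumptions for demonstration), a forwarding node that selectively drops packets stops producing ACKs for them, so the hop where the ACK count falls sharply localizes the attacker:

```python
import random

def simulate_route(path_len, malicious, drop_prob, n_packets, rng):
    """Send packets along a linear path; the malicious node selectively drops.

    Each honest forwarder returns a hop-by-hop ACK, so the source learns, per
    hop, how many packets made it past that node. Returns acks[i] = number of
    packets acknowledged by node i.
    """
    acks = [0] * path_len
    for _ in range(n_packets):
        for node in range(path_len):
            if node == malicious and rng.random() < drop_prob:
                break  # packet silently dropped: no ACKs from this hop onward
            acks[node] += 1
    return acks

def locate_attacker(acks, n_packets, threshold=0.2):
    """Flag the first hop whose ACK count falls sharply below its predecessor's."""
    prev = n_packets
    for node, a in enumerate(acks):
        if prev - a > threshold * n_packets:
            return node
        prev = a
    return None

rng = random.Random(42)
acks = simulate_route(path_len=6, malicious=3, drop_prob=0.6, n_packets=1000, rng=rng)
print(locate_attacker(acks, 1000))  # the sharp drop in ACKs pinpoints node 3
```

In a real deployment the ACK counts would be collected over a monitoring window and the threshold tuned against the link's natural loss rate, so that benign packet loss is not misattributed to an attacker.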
21) Vehicular Ad Hoc Network (VANET) Handover Schemes Based on Long Term Evolution-Advanced
(LTE-A) Using a Decision Technique
Siti Sabariah Salihin, Rafidah Md Noor, Liyth A. Nissirat And Ismail Ahmedy
This paper presents a framework for Vehicular Ad Hoc Network (VANET) handover schemes based on
Long Term Evolution-Advanced (LTE-A) using a decision technique in hybrid communication mode for
Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication networks. An Optimum
Handover Decision Technique (OHDT) for VANET-LTE is proposed to provide a comprehensive
understanding of the parameters influencing the VANET and their consequences, by first developing and
optimizing a VANET model and a handover decision model. The optimization of both models will be
carried out in MATLAB by developing a mathematical model that considers the effect of the statistical
properties of vehicle availability and velocities on the highway on the handover probability in the
network. The optimization results are expected to show that these statistical properties induce time
dependence in the handover probabilities and throughput. The handover and throughput optimization
conditions will be derived from the mathematical model by proposing, through the OHDT, handover
control parameters as functions of signal quality and vehicle velocities. Furthermore, a comprehensive
simulation tool will be developed in ns-3 to evaluate the performance of the proposed OHDT VANET-LTE
system under different conditions. The simulation results, obtained for several cases, scenarios,
conditions and velocities, will be graphically compared with other methods. They are expected to show
that the proposed LTE-A-based VANET handover scheme, using the proposed handover decision
technique, can serve as an optimum solution for seamless Quality of Service (QoS) performance in all
scenarios, conditions and velocities.
Keywords: Vehicular Ad-hoc Network (VANET); Long Term Evolution Advanced (LTE-A); Variation;
Handover; Quality of Service (QoS)
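The kind of velocity-aware handover rule described in the abstract above can be sketched as a simple hysteresis check; the margin values, the velocity threshold, and the rule itself are illustrative assumptions, not the OHDT:

```python
def handover_decision(serving_rss, target_rss, velocity,
                      hysteresis_db=3.0, v_threshold=25.0):
    """Hysteresis-based handover rule (RSS values in dBm, velocity in m/s).

    Hand over only when the target cell's signal exceeds the serving cell's
    by a margin; the margin shrinks at high speed so that fast vehicles hand
    over earlier and avoid dropping the connection mid-handover.
    """
    margin = hysteresis_db if velocity < v_threshold else hysteresis_db / 2
    return target_rss > serving_rss + margin

# A slow vehicle tolerates the full 3 dB margin and stays on its cell ...
print(handover_decision(-80.0, -78.0, velocity=10.0))  # False
# ... while a fast vehicle hands over on the same 2 dB gain.
print(handover_decision(-80.0, -78.0, velocity=30.0))  # True
```

A full scheme would also average the RSS over a time window and add a time-to-trigger, both of which this sketch omits for brevity.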
22) Continuous Auditing (CA) Integration with Big Data Analytics to improve FFB Yield
performance for Plantation Company
Suhaimi Misran, Dr. Azah Anir Norman and Dr. Suraya Hamid
Most companies have not progressed with Continuous Auditing (CA) because the value of CA is
invariably hidden. In many cases, the potential risks have already been mitigated by process owners
taking the cue from the flags raised by the CA itself. Therefore, there is a need to prove the value of CA
in order to accelerate its implementation. Moreover, in the era of big data, assurance functions are
facing their greatest challenge in remaining relevant as assurance providers in the company. The
purpose of this study is to integrate Big Data Analytics with Continuous Auditing to improve fresh fruit
bunch (FFB) yield performance for plantation companies (in the context of improving the value of CA);
the context is the oil palm industry in Malaysia, with the ultimate aim of using CA to improve yield
performance. Hence, the study will add value to the assurance function in the company. The study uses
a literature review of Continuous Auditing to analyse the CA criteria that are important to implement in
order to improve the existing framework. It then uses a case study and proof of concept (POC) to
integrate Big Data Analytics with Continuous Auditing and develop a CA framework/model for a
plantation company. Finally, it uses another case study/POC to evaluate the framework/model for
improving yield performance through the integration of Big Data Analytics with Continuous Auditing in a
plantation company. By developing this CA framework/model, the study is expected to improve FFB
yield and prove the value of CA to management.
Keywords: Continuous Auditing; Big Data Analytics; Yield Performance; Plantation; FFB
23) Image Splicing Forgery Detection and Localization Using Frequency-Based Features
Thamarai Subramaniam and Hamid Abdullah Jalab
Digital images are frequently used in various fields to disseminate information or serve as a source of
evidence to make certain facts clear and concise. The explosion of advances in digital technology and
the sophistication of image editing software have paved many ways for image forgery. Images are
digitally tampered with and modified to deceive the receivers of the information. Image splicing is one
of the most notorious techniques used to forge images: a new tampered image is created using
fragments from one or more other images. Image forgeries are becoming increasingly difficult to detect
by both humans and machines. Thus, it is important to develop an effective and efficient detection
method to authenticate the originality of an image. The proposed system will adopt an image transform
together with texture features to extract features from images, and will train the system on three
datasets, DVMM v1, DVMM v2, and CASIA, to detect image splicing forgery.
Keywords: Image forensics, forgery detection, image splicing, localization techniques
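The abstract does not name the exact transform; a block DCT is one common frequency-based choice, so the following sketch (an illustrative assumption, not the authors' system) shows how low-frequency coefficients of a pixel block can serve as features whose statistics splicing tends to disturb across block boundaries:

```python
import math

def dct2(block):
    """Naive 2-D DCT-II of a square pixel block (O(n^4); fine for 8x8)."""
    n = len(block)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * n)))
            cu = math.sqrt(1 / n) if u == 0 else math.sqrt(2 / n)
            cv = math.sqrt(1 / n) if v == 0 else math.sqrt(2 / n)
            out[u][v] = cu * cv * s
    return out

def block_features(block, k=3):
    """Keep the k*k low-frequency DCT coefficients as the feature vector."""
    coeffs = dct2(block)
    return [coeffs[u][v] for u in range(k) for v in range(k)]

flat = [[100] * 8 for _ in range(8)]  # a uniform 8x8 block ...
feats = block_features(flat)
# ... carries all its energy in the DC term: feats[0] == 800.0, the rest ~ 0
```

Feature vectors like these, computed per block over the whole image and combined with texture descriptors, are what a classifier would be trained on to separate authentic from spliced regions.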
24) Practical eventual consistency on decentralized consensus for edge-centric Internet-of-Things
Yeow Kim Chai
With the exponential rise in the number of devices, the Internet of Things (IoT) is gearing towards
edge-centric computing to offer high bandwidth, low latency, and improved connectivity. In contrast,
legacy cloud-centric platforms offer limited bandwidth and connectivity, which affects QoS. Hosting
services on nearby edge gateways, or even on the devices themselves, creates an opportunity to resolve
the issues of cloud-centric services. Edge-centric IoT technologies such as fog computing and mist
computing offer distributed and decentralized solutions to the problems of the cloud-centric model.
However, to foster distributed edge-centric models, a decentralized consensus system is needed to
incentivize all participants to share their edge resources. Decentralized consensus systems adopt either
blockchain or blockchainless directed-acyclic-graph (DAG) technologies, which serve as the immutable
public ledger for transactions. Research has found that blockchainless DAGs such as the IOTA Tangle
fulfil the edge-centric IoT requirements, especially excellent scalability with billions of fee-less
machine-to-machine (M2M) nano-transactions. However, IOTA may have flaws in its probabilistic
security model and may need a centralized check-point to coordinate and overcome a "Tragedy of the
Commons". The reason is that the IOTA DAG cannot rely on Satoshi's security model, which is bounded
by the longest-chain rule; its security risk is instead unbounded, and mathematical modelling cannot
fully cover it in game-theoretic analysis. The IOTA DAG has excellent node scalability via partition
tolerance, but at the cost of this generality the semantics of its consistency cannot be unequivocally and
provably extracted. Thus IOTA makes a trade-off in the CAP theorem: it guarantees only eventual
consistency, not strong consistency. The objective of this research is to prove the practicality of IOTA's
eventual consistency in an edge-centric IoT application, using the weaker Requirements from
Distributed Ledgers (RDL) formal framework, where the safety and liveness properties of the system
remain intact.
Keywords: Internet of Things (IoT), Decentralization, Directed-acyclic-graph
25) HMM-based Arabic Handwritten word recognition via zone segmentation
Aznul Qalid Md Sabri, Amirrudin Kamsin, Saqib Hakak, Faiz Alotaibi
This paper presents a novel approach to Arabic handwritten word recognition using zone-wise
information. Due to the complex nature of Arabic characters, with issues such as overlapping and
touching, segmentation and recognition is a tedious task in Arabic cursive scripts (e.g. Naskha, Riqaa
and other comparable scripts used for the Holy Quran). To avoid explicit character segmentation in such
cursive scripts, an HMM based on sequence modelling is applied in a holistic way. This paper proposes
an efficient word recognition framework that segments the handwritten word horizontally into three
zones (upper, middle and lower) and then recognises the corresponding zones. The aim of this zoning is
to minimise the number of distinct component classes relative to the total number of classes in Arabic
cursive script, and thereby to enhance the recognition performance of the system. The components of
the segmented zones, especially the middle (baseline) zone, where characters frequently touch, are
recognised using an HMM. After recognition of the middle zone, HMM-based Viterbi forced alignment is
performed to mark the left and right character boundaries in the conjoint zones. Next, the residual
components, if any, in the upper and lower zones are assigned to character boundaries, and the
components are combined according to the morphology of the characters to achieve whole-word
recognition. Water-reservoir-based properties are integrated into the framework to improve zone
segmentation, especially in the upper zone, by correcting boundary-detection imperfections from the
segmentation stage. A new sliding-window-based feature, the pyramid (hierarchical) Histogram of
Oriented Gradients (PHOG), is proposed for upper- and lower-zone recognition. A comparative study
with other similar features found PHOG robust for Arabic handwritten script recognition. An exhaustive
experiment is performed on handwriting datasets such as IFN/ENIT to evaluate the recognition rate and
performance. From the outcome of these experiments, it has been observed that the proposed
zone-wise recognition improves accuracy with respect to the traditional whole-word recognition
approach.
Keywords: Handwritten word recognition, Hidden Markov Model, Arabic script recognition
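One simple way to obtain the three zones described in the abstract above is a horizontal projection profile: the dense band of rows around the baseline forms the middle zone, and the sparse rows above and below form the upper and lower zones. The sketch below is illustrative only (the threshold is an assumption, and the paper's water-reservoir refinement is not modelled):

```python
def zone_boundaries(image):
    """Split a binary word image (list of 0/1 rows) into three zones.

    The middle (baseline) zone is the band of rows whose ink density is at
    least half of the peak row density; rows above it form the upper zone
    and rows below it the lower zone. Returns (top_row, bottom_row) of the
    middle zone.
    """
    profile = [sum(row) for row in image]        # horizontal projection
    peak = max(profile)
    dense = [i for i, p in enumerate(profile) if p >= peak / 2]
    return dense[0], dense[-1]

# A toy 7-row "word": sparse ascenders, a dense middle band, sparse descenders.
img = [
    [0, 0, 1, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1, 0],
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 0, 0],
    [0, 1, 0, 0, 0, 0],
    [0, 1, 0, 0, 0, 0],
]
top, bottom = zone_boundaries(img)
print(top, bottom)  # 2 4 -> rows 0-1 are the upper zone, rows 5-6 the lower
```

Each zone would then be fed to its own recogniser (the HMM for the middle zone, PHOG-based classification for the upper and lower zones), as the abstract outlines.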
26) Searchable Encryption Using Polynomial Function and Modular Arithmetic
Md Shamim Hossain and M A Manazir Ahsan
With the advent of cloud computing, more and more IT and business organizations are migrating to the
cloud for its enticing computing and storage infrastructure. At the same time, the security and privacy
of cloud computing are not at a satisfactory level. To make things worse, the cloud's inherent nature
has brought a new set of security issues. In particular, cloud storage creates a dilemma between two
important factors: confidentiality and search facility. To preserve the confidentiality of data, users
encrypt it before outsourcing it to the cloud, which prevents them from searching over the encrypted
data. An appropriate solution to this problem is searchable encryption (SE). In this paper, we study
existing SE schemes and identify two security requirements: secure tag (or trapdoor) irreversibility and
dynamic update of the index. Lastly, we propose a searchable encryption scheme based on polynomial
functions and modular arithmetic. Our security analysis shows that the proposed scheme holds the
necessary properties for SE.
Keywords: Information security, searchable encryption, modular arithmetic, polynomial, trapdoor
indistinguishability.
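One textbook way to combine polynomial functions with modular arithmetic for searchable encryption, sketched below as an illustration (the abstract does not disclose the authors' actual construction, and the keyed hash and prime modulus here are assumptions), is to encode a document's keyword set as a polynomial whose roots are the keyword trapdoors: the server stores only the coefficients and tests membership by evaluating the polynomial, never learning the keywords themselves:

```python
import hashlib

P = (1 << 61) - 1  # a Mersenne prime; all arithmetic is done mod P

def trapdoor(word, key):
    """One-way search tag: keyed hash of the word, reduced mod P."""
    digest = hashlib.sha256((key + ":" + word).encode()).digest()
    return int.from_bytes(digest, "big") % P

def build_index(words, key):
    """Encode a keyword set as f(x) = prod(x - trapdoor(w)) mod P.

    The server stores only the coefficient list; f(t) == 0 iff tag t is a
    root, i.e. iff the queried word was indexed.
    """
    coeffs = [1]  # coeffs[i] is the coefficient of x**i
    for w in words:
        t, new = trapdoor(w, key), [0] * (len(coeffs) + 1)
        for i, c in enumerate(coeffs):   # multiply f(x) by (x - t)
            new[i + 1] = (new[i + 1] + c) % P
            new[i] = (new[i] - t * c) % P
        coeffs = new
    return coeffs

def search(coeffs, t):
    """Server-side test: evaluate f(t) by Horner's rule; match iff zero."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * t + c) % P
    return acc == 0

key = "secret"
index = build_index(["cloud", "privacy", "encryption"], key)
print(search(index, trapdoor("privacy", key)))     # True: "privacy" is a root
print(search(index, trapdoor("blockchain", key)))  # a non-member's tag is not
```

Note how the two requirements named in the abstract map onto this sketch: the keyed hash makes the tag hard to invert, and adding a keyword is a single multiplication by a new linear factor, so the index supports dynamic update without rebuilding from scratch.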