Submit Paper / Call for Papers
The journal receives papers in continuous flow and will consider articles from a wide range of Information Technology disciplines, from the most basic research to the most innovative technologies. Please submit your papers electronically through our submission system at http://jatit.org/submit_paper.php in MS Word, PDF, or a compatible format so that they may be evaluated for publication in the upcoming issue. This journal uses a blinded review process; please remember to include all your personally identifiable information in the manuscript before submitting it for review, and we will edit out the necessary information on our side. Submissions to JATIT should be full research / review papers (properly indicated below the main title).
Journal of Theoretical and Applied Information Technology
December 2014 -- Vol. 70 No. 3
Title:
ENSEMBLE OF CLUSTERING ALGORITHMS FOR ANOMALY INTRUSION DETECTION SYSTEM
Author:
SALIMA BENQDARA, MD ASRI NGADI, JOHAN MOHAMAD SHARIF, SAQIB ALI
Abstract:
Maximizing detection accuracy and minimizing the false alarm rate are two major challenges in the design of an anomaly Intrusion Detection System (IDS). These challenges can be handled by designing an ensemble classifier that detects all classes of attacks, because a single-classifier technique fails to achieve an acceptable false alarm rate and detection accuracy across all attack classes. In an ensemble classifier, the outputs of several algorithms used as predictors for a particular problem are combined to improve the detection accuracy and minimize the false alarm rate of the overall system. This paper therefore proposes a new ensemble classifier based on clustering methods to address the network intrusion detection problem. The clustering techniques combined in the proposed ensemble classifier are KM-GSA, KM-PSO and Fuzzy C-Means (FCM). Experimental results showed an improvement in detection accuracy for all classes of network traffic, i.e., Normal, Probe, DoS, U2R and R2L, which validates the proposed ensemble classifier.
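A minimal sketch of the voting-combination step such an ensemble relies on, assuming hard cluster labels from each base method. KM-GSA and KM-PSO are not available in standard libraries, so three stock scikit-learn clusterers stand in for the paper's base methods:

```python
# Minimal sketch of a majority-voting ensemble of clusterers.
# KM-GSA and KM-PSO are not in standard libraries, so three stock
# scikit-learn clusterers stand in for the paper's base methods.
import numpy as np
from scipy.stats import mode
from sklearn.cluster import AgglomerativeClustering, Birch, KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
base_labels = [
    KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X),
    Birch(n_clusters=3).fit_predict(X),
    AgglomerativeClustering(n_clusters=3).fit_predict(X),
]

# Cluster IDs are arbitrary per clusterer, so first align each labelling
# to the first one (greedy relabelling by majority overlap), then vote.
def align(ref, lab):
    out = np.empty_like(lab)
    for c in np.unique(lab):
        mask = lab == c
        out[mask] = mode(ref[mask], keepdims=False).mode
    return out

aligned = np.vstack([base_labels[0]] +
                    [align(base_labels[0], l) for l in base_labels[1:]])
ensemble = mode(aligned, axis=0, keepdims=False).mode   # majority vote
```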
Keywords:
Intrusion Detection, Ensemble Learning, Voting Ensemble
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
NODE LOCATION ANALYSIS USING RSSI FOR AUTOMATIC POSITION REPORTING SYSTEM
Author:
S.V. Tresa Sangeetha, Dr. S. Ravi, Dr. Uma RajaRam, Dr. S. Radha Rammohan
Abstract:
This paper analyses the connectivity issues encountered when a mobile node identifies an access point with greater signal strength and finds a proper node location in the network. Dynamic networks require adaptive policies for optimal performance and guaranteed QoS to support real-time environments. The network connectivity issues of different wireless protocols and standards are discussed with respect to bit rate and the number of users accessing the network. The study is demonstrated with an experimental setup of multiple wireless nodes performing push and pop functions on a common server through the network. The experimental work is extended to a specific application, the Automatic Position Reporting System (APRS) for remote data logging, and a protocol implementation of packet flow and data structure is presented.
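The abstract does not state which propagation model is used; as a hedged illustration, the sketch below estimates the distance to the strongest access point with the common log-distance path-loss model. The reference RSSI, path-loss exponent and scan values are hypothetical:

```python
# Hedged sketch: estimating distance to an access point from RSSI with
# the common log-distance path-loss model (not taken from the paper):
# rssi(d) = rssi_d0 - 10*n*log10(d/d0)  =>  d = d0 * 10**((rssi_d0 - rssi) / (10*n))
def distance_from_rssi(rssi_dbm, rssi_d0=-40.0, d0=1.0, n=2.7):
    """rssi_d0: RSSI at reference distance d0 (metres); n: path-loss exponent."""
    return d0 * 10 ** ((rssi_d0 - rssi_dbm) / (10 * n))

scan = {"AP-1": -62.0, "AP-2": -55.0, "AP-3": -71.0}  # hypothetical scan results
best_ap = max(scan, key=scan.get)                     # strongest signal wins
print(best_ap, f"~{distance_from_rssi(scan[best_ap]):.1f} m away")
```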
Keywords:
Wireless Network, Access Point (AP), Self Learning Network, Fairness Policy
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
AUTOMATIC CLASSIFICATION OF AUDIO DATA USING GRADIENT DESCENT NEURAL NETWORK BASED ALGORITHM
Author:
LAKSHMI DEVASENA C, M HEMA LATHA
Abstract:
Audio mining is a technique by which the content of an audio signal can be automatically searched and analyzed. This work addresses feature extraction from audio and audio similarity measures, and proposes an algorithm to mine any type of audio data. The Hybrid Algorithm for Audio Mining (HAAM) consists of two phases. The first, the Training Phase, takes training audio data as input and computes the Sonogram, Spectrum Histogram, Periodicity Histogram and Fluctuation Pattern of the audio. Features are then extracted using Mel Frequency Cepstral Coefficients (MFCC) and reduced to the essential features for further processing. Classification is performed with a supervised technique, and the features of the audio files are trained using a Gradient Descent Adaptive Learning Backpropagation Network. The second, the Testing Phase, takes testing audio data as input, preprocesses it and extracts its features; classification then uses the findings from the trained data by matching the features of the test audio with the equivalent or closest audio classes. A working prototype of this algorithm has been implemented and tested. In the experiments, instrumental music data were used as input; the system performed well, classified the input music accurately, and the results obtained are presented. The system is suitable for classifying different types of audio data and can be used in many applications, including speech recognition, audio classification in scientific research and engineering, audio data comparison, and detection in audio surveillance applications.
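As a hedged sketch of the pipeline's core (not the authors' implementation), the snippet below extracts MFCC features with librosa and trains a small scikit-learn MLP with SGD and an adaptive learning rate, standing in for the Gradient Descent Adaptive Learning Backpropagation Network. File names and labels are hypothetical:

```python
# Hedged sketch: MFCC features + a gradient-descent (SGD) network.
# The audio files and labels below are hypothetical placeholders.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def mfcc_features(path, n_mfcc=13):
    y, sr = librosa.load(path, sr=None)
    m = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return m.mean(axis=1)                 # one fixed-length vector per clip

train_files = ["flute_01.wav", "drum_01.wav"]   # hypothetical training data
labels = ["flute", "drum"]
X = np.array([mfcc_features(f) for f in train_files])

clf = MLPClassifier(hidden_layer_sizes=(32,), solver="sgd",
                    learning_rate="adaptive", max_iter=2000)
clf.fit(X, labels)
print(clf.predict([mfcc_features("test_clip.wav")]))
```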
Keywords:
Audio Mining, Gradient Descent Neural Network, K-Means Classifier, Mel Frequency Cepstrum Coefficient, Principal Component Analysis
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
A MIDDLEWARE AND POLICY-BASED ADAPTATION FRAMEWORK TO SIMPLIFY SOFTWARE EVOLUTION: AN IMPLEMENTATION ON UNIT TRUST ENTERPRISE SYSTEM
Author:
N.H. AWANG, S. SHAHIBUDDIN, W.M.N. WAN KADIR
Abstract:
Software evolution needs to be properly controlled to avoid major problems during the maintenance phase, yet software must evolve to keep meeting its development purpose. One promising way to address software evolution is software adaptation, which has four main approaches: architecture-based, component-based, agent-oriented and middleware-based. This research adopts the middleware-based approach. An adaptation framework called MiPAF, which uses middleware and policy-based concepts, is proposed to simplify software evolution. MiPAF comprises six components, namely the Service Manager, Adaptation Manager, Service Infrastructure, Device Controller, Policy Repository and Context Monitor. The use of MiPAF affects four software development phases: requirements, analysis, design and development. A MiPAF runtime is developed to enable adaptation of the device layer of a Unit Trust Enterprise System (UTES), and a simple XML-based policy language is developed to specify the action to be taken when a certain condition occurs. The adaptation requirements of this system are specified and an adaptation policy is developed according to them. In this implementation, the MiPAF runtime is developed in the C language and installed on workstations together with the UTES client. There are two adaptation requirements: first, when a passbook printer fails, the system can continue printing on another passbook printer without interruption; second, when the type of passbook printer is changed, the system should not be impacted. The evaluation is done against six criteria: scalability, context-awareness, performance, usability, heterogeneity and dynamic evolvability. MiPAF meets all of them.
Keywords:
Software Evolution, Software Adaptation, Middleware, Policy, Framework
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
AUTOMATED TEST CASE GENERATION USING UML USE CASE DIAGRAM AND ACTIVITY DIAGRAM
Author:
ARUP ABHINNA ACHARYA, PRATEEVA MAHALI, DURGA PRASAD MOHAPATRA
Abstract:
Testing plays a major role in improving the quality of a software product, and its iterative and incremental nature demands special attention. Test case generation is one of the more complex activities carried out during the testing phase. Generating test cases in the early phases of the development life cycle acts as a catalyst for model-based testing while efficiently managing time and resources. This paper describes a novel approach to test case generation from the UML Activity Diagram (AD) and Use Case Diagram (UCD). First, the UCD and AD are converted into a Use Case Graph (UCG) and an Activity Graph (AG) respectively. The AG and UCG are then integrated into a combined graph called the Activity Use Case Graph (AUCG), which is traversed to generate test cases. Test cases generated using the combined approach can detect more faults than those from the individual models while keeping the total coverage intact. The proposed approach also reveals faults such as execution faults, operational faults and use case dependency faults.
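The abstract does not spell out the traversal; as a minimal sketch, assuming the AUCG is a directed graph with designated start and end nodes, test cases can be enumerated as simple start-to-end paths. The graph below is hypothetical:

```python
# Minimal sketch: enumerate test cases as simple start-to-end paths of
# a (hypothetical) Activity Use Case Graph given as an adjacency list.
def all_paths(graph, node, end, path=None):
    """Depth-first enumeration of every simple path from node to end."""
    path = (path or []) + [node]
    if node == end:
        yield path
        return
    for nxt in graph.get(node, []):
        if nxt not in path:                 # keep paths simple (no loops)
            yield from all_paths(graph, nxt, end, path)

aucg = {                                    # hypothetical AUCG
    "start": ["validate"],
    "validate": ["process", "reject"],
    "process": ["end"],
    "reject": ["end"],
}
for i, p in enumerate(all_paths(aucg, "start", "end"), 1):
    print(f"test case {i}: " + " -> ".join(p))
```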
Keywords:
Testing, AUCG, Test Case Generation, Dependency Fault, Operational Faults
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
INVESTIGATING BIOGEOGRAPHY-BASED OPTIMISATION FOR PATIENT ADMISSION SCHEDULING PROBLEMS
Author:
ABDELAZIZ I. HAMMOURI, BASEM ALRIFAI
Abstract:
This work stems from an interest in developing an automated approach to tackle highly constrained patient admission scheduling problems (PASP): assigning patients to beds in appropriate departments in such a way as to maximise medical treatment effectiveness and patients' comfort. We investigate a recently created meta-heuristic optimisation algorithm, Biogeography-Based Optimization (BBO), which is based on the idea of species migrating between different habitats. To evaluate the performance of the proposed method, six instances of PASP data sets were used, and the performance of the BBO algorithm was compared with other approaches from the literature. Experimental results show that BBO needs further investigation to improve the obtained results.
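For readers unfamiliar with BBO, the sketch below shows its characteristic migration step in generic form; the paper's PASP-specific encoding, constraints and cost function are omitted, and the toy population is hypothetical:

```python
# Hedged sketch of generic BBO migration: habitats are ranked by
# fitness; good habitats emigrate solution features, poor habitats
# immigrate them, with linear immigration/emigration rates.
import random

def bbo_migrate(population, fitness, I=1.0, E=1.0):
    """population: list of solution vectors (lists); fitness: higher is better."""
    n = len(population)
    order = sorted(range(n), key=lambda i: fitness[i])       # worst .. best
    rank = {h: r for r, h in enumerate(order)}
    denom = max(n - 1, 1)
    lam = [I * (1 - rank[i] / denom) for i in range(n)]      # immigration rate
    mu = [E * rank[i] / denom for i in range(n)]             # emigration rate
    new_pop = [sol[:] for sol in population]
    for i in range(n):
        for d in range(len(population[i])):
            if random.random() < lam[i]:
                j = random.choices(range(n), weights=mu)[0]  # roulette on mu
                new_pop[i][d] = population[j][d]
    return new_pop

pop = [[random.randint(0, 9) for _ in range(6)] for _ in range(5)]
pop = bbo_migrate(pop, fitness=[sum(s) for s in pop])   # one generation's migration
```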
Keywords:
Meta-Heuristic, Evolutionary Algorithms, Biogeography-Based Optimization, Patient Admission Scheduling, Healthcare
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title: |
AUTOMATED ARABIC ANTONYM EXTRACTION USING A CORPUS ANALYSIS TOOL |
Author: |
LULUH ALDHUBAYI, MAHA ALYAHYA |
Abstract: |
The automatic extraction of semantic relations between words from textual
corpora is an extremely challenging task. The increasing need for language
resources supporting Natural language processing (NLP) applications has
encouraged the development of automated methods for the extraction of semantic
relations between words. The use of corpus statistical and similarity
distribution methods can help in the task of semantic relation extraction
between pairs of words. In this paper, we present a pattern-based bootstrapping
approach using Arabic language corpora and a corpus analysis tool (Sketch
Engine) to extract the semantic relations (antonyms) between word pairs. The
algorithm uses LogDice and pattern co-occurrence to classify the extracted pairs
into antonyms. Results of evaluation show that our approach is able to extract
the antonym relations with a precision of 76%. |
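As a worked example, logDice is Sketch Engine's standard association score, computed as 14 + log2(2*f_xy / (f_x + f_y)); the counts below are hypothetical:

```python
# Worked example of the logDice association score (Rychly's standard
# formulation, as used by Sketch Engine).
import math

def log_dice(f_xy, f_x, f_y):
    """f_xy: co-occurrence frequency of the word pair;
    f_x, f_y: corpus frequencies of each word."""
    return 14 + math.log2(2 * f_xy / (f_x + f_y))

# Hypothetical counts for a candidate antonym pair:
print(round(log_dice(f_xy=120, f_x=5000, f_y=4000), 2))  # ~8.77 on this data
```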
Keywords:
Antonym Extraction, Sketch Engine, Arabic Lexicon, Semantic Relation, Arabic NLP
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
EVALUATED REPUTATION-BASED TRUST FOR WSN SECURITY
Author:
ABDULLAH SAID ALKALBANI, ABU OSMAN MD. TAP, TEDDY MANTORO
Abstract:
In recent years, Wireless Sensor Networks (WSNs) and their applications have gained considerable momentum, yet the security and power limits of WSNs remain important concerns. Many existing approaches concentrate on cryptography to improve data authentication and integrity, but this addresses only part of the security problem, without consideration for high energy consumption. Monitoring the behavior of neighboring nodes using reputation and trust models improves the security of WSNs and maximizes their lifetime; however, few previous studies consider security threats and energy consumption at the same time. To address these issues, a Modified Reputation-Based Trust (MRT) model is proposed and optimized for security strength. During the evaluation of the model against well-known models, two security threats (oscillating and collusion) were applied in order to measure accuracy, scalability, trustworthiness and energy consumption. As a result, the effects of collusion and oscillating on the proposed model are minimized, and the energy consumption for dynamic networks is reduced. Simulation results also show that MRT achieves better average accuracy and a shorter average path length than other mechanisms, owing to its security and energy awareness.
Keywords:
Wireless Sensor Networks (WSNs), Collusion, Oscillating, Power Consumption, Trust and Reputation Models
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
VOICE ANALYSIS FOR DETECTING PERSONS WITH PARKINSON'S DISEASE USING PLP AND VQ
Author:
ACHRAF BENBA, ABDELILAH JILBAB, AHMED HAMMOUCH
Abstract:
In order to improve the assessment of speech disorders in the context of Parkinson's disease, we used 34 voice recordings of the sustained vowel /a/ from 34 subjects, including 17 patients with Parkinson's disease. We extracted from 1 to 20 Perceptual Linear Prediction (PLP) coefficients from each subject. The PLP frames were compressed using vector quantization with six codebook sizes. We used the Leave-One-Subject-Out (LOSO) validation scheme and the Support Vector Machine (SVM) classifier with different kernel types (i.e., RBF and linear). Given the variety of the obtained results, we ran a bench of 100 trials. The best average result obtained was 75.79%, and the maximum result was 91.17%, using a codebook size of 1.
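The sketch below illustrates the evaluation protocol only, as a hedged stand-in rather than the authors' code: Leave-One-Subject-Out cross-validation of an RBF-kernel SVM via scikit-learn, with random features in place of the PLP/VQ pipeline:

```python
# Hedged sketch of LOSO cross-validation with an SVM. The feature
# matrix is synthetic; `subjects` maps each row to its speaker so that
# each fold holds out every recording of one subject.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(34 * 5, 12))          # hypothetical feature rows
y = np.repeat([0, 1], 34 * 5 // 2)         # 0 = healthy, 1 = Parkinson's
subjects = np.repeat(np.arange(34), 5)     # 34 subjects, 5 rows each

logo = LeaveOneGroupOut()                  # one subject held out per fold
scores = cross_val_score(SVC(kernel="rbf"), X, y, groups=subjects, cv=logo)
print(f"mean LOSO accuracy: {scores.mean():.2%}")
```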
Keywords:
Voice Analysis, Parkinson's Disease, Perceptual Linear Prediction, Vector Quantization, Leave One Subject Out, Support Vector Machines
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
ERROR COMPENSATED FIXED WIDTH MODIFIED BOOTH MULTIPLIER FOR MULTIMEDIA APPLICATIONS
Author:
R. MOHAMED NIYAS, N. PRABHAKARAN, N. SATHYA, P. VAISHNAVI, B. MADHUSUDHANA REDDY
Abstract:
Many multimedia and digital signal processing systems need to maintain a fixed data format while allowing only a small accuracy loss in the output. The objective of this paper is to design a fixed-width modified Booth multiplier with good error performance, which requires deriving an effective error-compensation function that makes the error distribution more symmetric and centered on zero. The compensation circuit is mainly composed of a simplified sorting network, which achieves a small mean and mean-square error compared with other circuits. The odd-even sorting networks used for error compensation are composed of appropriately connected comparators; the simplified sorting network consists of NAND, NOR, AND-OR-INVERTER (AOI) and OR-AND-INVERTER (OAI) gates. In fixed-width modified Booth multiplication, modified Booth encoding is used to reduce the number of partial products by a factor of two. The circuit is simulated and its RTL code generated with Altera Quartus II, and the implementation is done on a DE1 board.
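As a hedged illustration of the encoding step only (the compensation circuit itself is hardware-specific), the sketch below recodes a two's-complement multiplier into radix-4 Booth digits in {-2, -1, 0, 1, 2}, which is what halves the partial-product count:

```python
# Radix-4 modified Booth encoding: overlapping bit triplets of the
# multiplier are recoded into digits in {-2,-1,0,1,2}.
def booth_radix4_digits(y, bits):
    """Recode a two's-complement multiplier y into radix-4 digits."""
    y &= (1 << bits) - 1                       # view as an unsigned bit field
    digits, prev = [], 0                       # implicit y[-1] = 0
    for i in range(0, bits, 2):
        b0 = (y >> i) & 1
        b1 = (y >> (i + 1)) & 1
        digits.append(-2 * b1 + b0 + prev)     # digit = -2*y[i+1] + y[i] + y[i-1]
        prev = b1
    return digits                              # least-significant digit first

# Check: sum(digit_k * 4**k) reconstructs the multiplier.
y, bits = -13, 8
d = booth_radix4_digits(y, bits)
assert sum(dk * 4 ** k for k, dk in enumerate(d)) == y
print(d)                                       # [-1, 1, -1, 0]
```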
Keywords:
Booth Multiplier, Mean Square Error, Partial Products
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
MARKOV MODEL FOR DISCOVERING KNOWLEDGE IN TEXT DOCUMENTS
Author:
I. BERIN JEBA JINGLE, Dr. J. JEYA A. CELIN
Abstract:
Knowledge discovery and data mining have attracted a great deal of attention in recent years owing to the immense growth of digital data. Numerous applications, such as market investigation and the business community, can benefit from the information and facts extracted from large amounts of data. Text mining is the technique that allows users to extract useful information from large collections of digital text documents on the Web or in databases; it is the discovery of previously unknown information by automatically extracting it from different written resources. This paper deals with pattern discovery in text documents using the Hidden Markov Model (HMM). Most existing pattern-mining methods for text suffer from a lack of accuracy and performance. In the proposed system, the first stage is a pre-processing step that removes "noise" words. The next stage applies the HMM, which is used for pattern extraction and classification of the input data: the HMM computes the probability value relating observed events to hidden events. This method can improve the accuracy of evaluating term weights and also improves the performance of pattern discovery in text for large databases.
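The "probability value relating observed events to hidden events" corresponds to standard HMM likelihood computation; below is a minimal forward-algorithm sketch with hypothetical toy matrices, not the paper's trained model:

```python
# Minimal forward algorithm: probability of an observed sequence
# given an HMM. All matrices here are hypothetical toy values.
import numpy as np

def forward(pi, A, B, obs):
    """pi: initial state probs (S,); A: transitions (S,S);
    B: emission probs (S,V); obs: observation indices."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()              # P(observation sequence | model)

pi = np.array([0.6, 0.4])                       # 2 hidden states
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1],                  # 3-symbol vocabulary
              [0.1, 0.3, 0.6]])
print(forward(pi, A, B, obs=[0, 2, 1]))
```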
Keywords:
Data Mining, HMM, Classification, Stemming, Smoothing Technique
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
DETECTION AND TREATMENT OF THE SELF_CROSS IN THE SHRINK PROCESSES OF PARAMETRIC ACTIVE CONTOUR MODEL
Author:
LIU HONGSHEN, WANG NAN, RUAN YUE
Abstract:
The research object of this paper is the self_cross of contours in the parametric active contour model with shrink strategy (PACMS). To study it, the parametric active contour model is improved so as to converge rapidly and to reduce the number of contour self_crosses. The concept of the self_cross in PACMS is introduced, and its influence on the evolution of PACMS is presented. With the improved model, the existence of the self_cross in PACMS is shown, and the treatments of contour self_crosses in the current literature are analyzed. A model describing self_crosses is set up, and an improved method for detecting and treating them in PACMS is put forward. Experimental results of the new method are given, and comparisons are made between the new method and other methods of detecting and treating self_crosses. The results show that the new method is more efficient than the other methods at detecting self_crosses on contours.
Keywords:
Parametric Active Contour Model, Parametric Active Contour Model with Shrink Strategy, Self_Cross of Contour, Detection, Treatment
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
MEASUREMENT STUDY ON PACKET SIZE AND PACKET RATE EFFECTS OVER VEHICULAR AD HOC NETWORK PERFORMANCE
Author:
L. A. HASSNAWI, R.B. AHMAD, MUATAZ H. SALIH, M.N. MOHD WARIP, M. ELSHAIKH
Abstract:
A vehicular ad hoc network (VANET) is a type of mobile ad hoc network developed to increase the physical safety of vehicle drivers. Many parameters affect VANET performance; packet size and packet rate are two important ones that must be considered, since different values of these parameters yield different network performance. Using a large packet size reduces packet-header overhead but may increase the packet loss ratio due to packet collisions, so the values of these parameters must be chosen carefully. This paper presents an investigation and analysis of different packet sizes and packet rates on a VANET consisting of wireless nodes distributed along a highway. The investigation and analysis were performed under varying network environments and for several different evaluation metrics. The study concludes that a small packet size gives better network performance than a large one, and that a slow packet rate can provide better network performance than a faster one in a highway VANET.
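The packet-size trade-off can be made concrete with a little arithmetic; the header figures below are generic illustrative values, not the paper's measured configuration:

```python
# Worked example of the trade-off: larger payloads amortize the fixed
# header cost but raise the collision/loss risk the study measures.
# Header size is illustrative (UDP + IPv4 + approx. 802.11 MAC/FCS).
HEADER_BYTES = 8 + 20 + 30

for payload in (64, 256, 512, 1024):
    efficiency = payload / (payload + HEADER_BYTES)
    print(f"{payload:5d} B payload -> {efficiency:.1%} of bytes sent are data")
```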
Keywords:
MANET, Packet Size, Packet Rate, Throughput, VANET
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
SIMULTANEOUS COORDINATED AND TUNING OF PSS FOR A MULTIMACHINE POWER SYSTEM USING A NEW HYBRIDIZATION (GA-GR) VIA A MULTI-OBJECTIVE FUNCTION
Author:
A. CHOUCHA, L. CHAIB, S. ARIF, L. MOKRANI
Abstract:
This work presents a new coordinated and robust tuning procedure for power system stabilizers (PSS), using a novel hybridization technique to damp out power system oscillations. The hybridization, called GA-GR, combines a stochastic method, the Genetic Algorithm (GA), with a deterministic gradient method; a purely stochastic combination of the genetic algorithm with simulated annealing (GA-SA) is also considered. The proposed approach uses a multi-objective function based on the real part of the eigenvalues and the damping factor to search for optimal stabilizer parameters. To examine the effectiveness and robustness of this tuning approach in enhancing power system stability, modal analysis and nonlinear simulations have been carried out on the New England/New York interconnected 68-bus, 16-machine power system.
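As a worked example of the quantities in such a multi-objective function: for an oscillatory mode lambda = sigma + j*omega, the damping factor is zeta = -sigma / sqrt(sigma^2 + omega^2). The eigenvalues below are hypothetical, not the 68-bus system's actual modes:

```python
# Damping factor of oscillatory modes from their eigenvalues.
import numpy as np

modes = np.array([-0.25 + 6.9j, -0.05 + 4.2j])   # hypothetical eigenvalues
sigma, omega = modes.real, modes.imag
zeta = -sigma / np.sqrt(sigma**2 + omega**2)

for lam, z in zip(modes, zeta):
    print(f"mode {lam:.2f}: damping factor {z:.3f}")
# A typical objective pushes every sigma below a threshold and every
# zeta above a minimum (e.g. 0.1) while the optimizer tunes the PSS.
```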
Keywords:
Genetic Algorithm (GA), Gradient Method, Modal Analysis, Power System Stabilizer, Multimachine Power System, Small Signal Stability
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
DATA PROCESSING METHODS IN MULTIMEDIA HOME PLATFORM APPLICATIONS FOR EARTHQUAKE EARLY WARNING OF POTENTIALLY TSUNAMI SERVICES BASED INTERACTIVE DIGITAL TELEVISION
Author:
SUHARTONO, ALI MAHMUDI
Abstract:
An earthquake early-warning service for tsunamis is important for the public. On the one hand, the service cannot be easily accessed by the public, especially television users; on the other hand, developments in television technology, especially digital television, should make it much easier to access. The purpose of this research is to develop a data-processing method for Multimedia Home Platform (MHP) applications providing early warning of potentially tsunami-generating earthquakes. Earthquake and weather data are taken from the website Application Programming Interface (API) of the Meteorology, Climatology, and Geophysics Agency (BMKG) in Indonesia. The research method parses eXtensible Markup Language (XML) with the Document Object Model (DOM). The time interval for data collection from the BMKG website is determined using systematic sampling, so the MHP application can update earthquake and weather data in real time at a specified interval. The XML parsing results for potentially tsunami-generating earthquakes are recorded by the MHP application into file storage. The findings show that the MHP application can display early warnings of potentially tsunami-generating earthquakes to television users in accordance with the BMKG website in real time.
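Real MHP applications are Java Xlets, and the abstract does not reproduce BMKG's feed schema, so the sketch below shows the DOM-parsing idea in Python with a hypothetical XML fragment:

```python
# Hedged sketch of the DOM-parsing step; the element names are
# hypothetical, not BMKG's actual feed schema.
from xml.dom.minidom import parseString

FEED = """<quakes>
  <quake><magnitude>7.1</magnitude><depth>10</depth>
         <potential>tsunami</potential></quake>
  <quake><magnitude>4.8</magnitude><depth>55</depth>
         <potential>none</potential></quake>
</quakes>"""

def text(el, tag):
    return el.getElementsByTagName(tag)[0].firstChild.data

doc = parseString(FEED)
for q in doc.getElementsByTagName("quake"):
    if text(q, "potential") == "tsunami":     # keep only tsunami alerts
        print(f"ALERT: M{text(q, 'magnitude')} quake, depth {text(q, 'depth')} km")
```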
Keywords:
MHP Application, eXtensible Markup Language, Document Object Model, Systematic Sampling
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
GENERALIZED FREQUENCY DIVISION MULTIPLE ACCESS ALLOCATION SCHEME FOR LTE-A
Author:
MOHAMMED MELOOD A. ABDASED, MAHAMOD ISMAIL, ROSDIADEE NORDIN
Abstract:
Frequency Division Multiple Access (FDMA) is one of the important features of Long Term Evolution Advanced (LTE-A). For the LTE-A downlink, the Orthogonal FDMA (OFDMA) scheme is used to provide frequency orthogonality, whereas for the uplink, Single Carrier FDMA (SC-FDMA) is preferred since it provides better peak-to-average power ratio (PAPR) performance than OFDMA. Interleaved FDMA (IFDMA) and Localized FDMA (LFDMA) are the forms of SC-FDMA usually used in LTE-A, and the user subcarrier allocation method is the main distinction between them: users are allocated distributed frequency carriers in IFDMA and localized carriers in LFDMA. Whereas the IFDMA scheme provides better PAPR performance than LFDMA, the latter has lower complexity requirements. In this paper a new user subcarrier allocation scheme is introduced. The proposed scheme provides a variable interleaved allocation of subcarriers in the bandwidth, hence it is named Generalized FDMA (GFDMA); this variable interleaved allocation makes the transition from IFDMA to LFDMA seamless. Theoretical derivation of the proposed scheme shows that both IFDMA and LFDMA are subclasses of GFDMA. Simulation results show that the calculated signal PAPR varies according to the selected interleaving level. The simulation has been conducted for different QAM modulation schemes and bandwidths; in all cases, the GFDMA PAPR closely matches the calculated IFDMA and LFDMA PAPR. Moreover, further enhancement in GFDMA PAPR is achieved as the number of users in a given bandwidth increases.
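As a hedged sketch of the PAPR comparison behind the proposal, with illustrative parameters rather than the paper's simulation setup: a QPSK block is DFT-precoded, mapped to interleaved (IFDMA) or localized (LFDMA) subcarriers, and the time-domain PAPR is measured:

```python
# PAPR of IFDMA vs LFDMA subcarrier mappings (illustrative parameters).
import numpy as np

rng = np.random.default_rng(1)
N, Q = 512, 8            # total subcarriers, interleave factor
M = N // Q               # subcarriers per user

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def sc_fdma(mapping):
    sym = np.exp(1j * (np.pi / 2 * rng.integers(0, 4, M) + np.pi / 4))  # QPSK
    S = np.fft.fft(sym)                       # DFT precoding
    grid = np.zeros(N, dtype=complex)
    if mapping == "ifdma":
        grid[::Q] = S                         # every Q-th subcarrier
    else:
        grid[:M] = S                          # one contiguous block
    return np.fft.ifft(grid)

for scheme in ("ifdma", "lfdma"):
    trials = [papr_db(sc_fdma(scheme)) for _ in range(200)]
    print(f"{scheme.upper()}: mean PAPR {np.mean(trials):.2f} dB")
```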
Keywords:
SC-FDMA, LTE-A, IFDMA, LFDMA, GFDMA
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
CURRENT STATE OF ANTI-PHISHING APPROACHES AND REVEALING COMPETENCIES
Author:
HIBA ZUHAIR ZEYDAN, ALI SELAMAT, MAZLEENA SALLEH
Abstract:
Phishing has become a substantial threat to Internet users and a major cause of financial losses. In these attacks, cybercriminals harvest user credential information, and users can easily fall victim. Current solutions against phishing attacks are not sufficient to detect and counter novel phishes. This paper presents a systematic review of previous and current research on Internet phishing mitigation across different areas of expertise, highlighting phishing attack types and some existing anti-phishing approaches. It further discusses novel phishes and identifies the key open issues. The review can be a valuable source of information for identifying recent gaps and challenges in addressing the security flaws.
Keywords:
Anti-Phishing, Detection, Novel, Credentials, Client
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
A WEB SITE USABILITY MEASURE USING NAVIGABILITY MAP METRIC
Author:
SANDEEP KUMAR PANDA, SANTOSH KUMAR SWAIN, RAJIB MALL
Abstract:
We propose a suite of website navigability map metrics. For evaluating a large-scale website, navigability map metrics open up the potential of employing intelligent tools. We propose a set of four metrics for navigability rating, based on computing the complexity of the site's navigability map. To validate the proposed metrics, we evaluated the cyclomatic complexity of 10 electronic commerce websites and conducted empirical studies of the metrics' effectiveness.
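As a worked example of the underlying measure (the paper's four metrics are not reproduced here), cyclomatic complexity V(G) = E - N + 2P applied to a small hypothetical site map:

```python
# Cyclomatic complexity of a navigability map: V(G) = E - N + 2P for a
# map with E links, N pages and P connected components.
site_map = {                        # hypothetical page -> outgoing links
    "home": ["catalog", "cart"],
    "catalog": ["product", "home"],
    "product": ["cart", "catalog"],
    "cart": ["checkout", "home"],
    "checkout": [],
}
N = len(site_map)
E = sum(len(links) for links in site_map.values())
P = 1                               # the map is one connected component
print("V(G) =", E - N + 2 * P)      # higher V(G) -> harder site to navigate
```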
Keywords:
Navigability Map Metrics, Cyclomatic Complexity, Intelligent Tool, Electronic Commerce
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
TRUST BASED AD HOC ON DEMAND DISTANCE VECTOR ROUTING PROTOCOL AGAINST WORMHOLE ATTACK
Author:
N SATHEESH, Dr. K. PRASADH
Abstract:
A Mobile Ad hoc Network (MANET) consists of mobile devices that communicate with each other without any predefined infrastructure or centralized administration. Because nodes can join or leave the network freely, MANETs are vulnerable to unauthorized data manipulation, as user identity is not verified before data access is granted. It is therefore challenging to design security mechanisms that protect MANETs from routing attacks in the presence of malicious nodes. This study proposes a trust-based Ad hoc On-demand Distance Vector (AODV) protocol. Experiments were conducted in two scenarios, and the results show that the new method outperforms traditional AODV.
Keywords:
Mobile Adhoc Network (MANET), Adhoc On-demand Distance Vector (AODV), Routing, Trust, Wormhole Attack
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
A NEW APPROACH FOR GENERATING EXERCISES BASED ONTOLOGIES: A CASE STUDY OF COURSES IN MATHEMATICAL ANALYSIS
Author:
IMANE LMATI, HABIB BENLAHMAR, NACEUR ACHTAICH
Abstract:
In this paper we present a methodology for generating an exercise model from the content of mathematical analysis as taught in high school and university. The approach allows a non-computer-specialist designer (teacher, educationalist, ...) to set the parameters of a pedagogical object (PO) such as a theorem or a definition, and to dynamically generate instances of the model by exploiting the semantic relations of the PO with other pedagogical objects in the mathematical corpus.
Keywords:
Automatic Generation of Exercises, Evaluation, Ontology
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
MODELING CREDIBILITY ASSESSMENT AND EXPLANATION FOR TWEETS BASED ON SENTIMENT ANALYSIS
Author:
DWI H. WIDYANTORO, YUDI WIBISONO
Abstract:
Credibility is one of the main issues when dealing with information obtained from Online Social Networks (OSNs). Although a significant number of prior works have addressed many issues in this area, only a few have worked on methods for automatically measuring the credibility of OSN messages, and almost none has addressed the specific problem of explaining the credibility information. This paper proposes a new approach for modeling the credibility of tweets and explaining it to users. We model tweet credibility based on other, independent tweets that support or oppose the issue in question. We also consider the opinions of the tweets' followers, who either confirm or deny them, along with their reputations. The method rests on the assumption that the larger the community that agrees with a claim, the more credible the claim is. Explanation of the credibility measure is based on the tweet content itself: we provide users with representative tweets that can be progressively zoomed in on for more detailed explanation. To achieve this, all tweets are hierarchically structured, and the tweet representatives at each node are selected from the tweets most similar to the centroid. Our evaluation results indicate the feasibility of the proposed methods.
Keywords:
Credibility Assessment, Credibility Explanation, Online Social Network, Tweet, Sentiment Analysis
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
AN EFFECTIVE SCHEDULING OF MULTIPLE SOURCE NODES IN MANET USING DEMPSTER-SHAFER HYPOTHESIS
Author:
R.M. CHAMUNDEESWARI, DR. P. SUMATHI
Abstract:
A Mobile Ad hoc Network (MANET) is built without any central control point such as a base station. The partially observable Markov decision process (POMDP) multi-armed bandit formulation has been used to solve continuous user authentication and intrusion detection (IDS) for large networks with diverse nodes, but POMDP does not handle larger numbers of node states when performing the scheduling process in a MANET. The Maximum-Residual Multicast Protocol (MRMP), based on the independent choice of transitional nodes, provides a loop-free tree and is theoretically optimal with respect to residual energy; however, MRMP does not consider multiple sources simultaneously in the scheduling process. To schedule information from multiple sources effectively in a MANET, this paper proposes a Distributed Scheduling Decision (DSD) scheme. In the DSD scheme, the source node to be broadcast is dynamically selected to provide a higher level of security while transferring data packets with minimal residual-energy-consumption states. The DSD scheme uses the Dempster-Shafer hypothesis for the multiple-source-node scheduling process: it derives the combination rule for multiple sources with a confidence value, discounting contradictory values through the normalization factor. The normalization factor detects intruder nodes with the aid of the contradictory values, improves scheduling efficiency in the distributed mobile network, and increases decision accuracy in the MANET. Simulations analyze the DSD scheme's performance in terms of false positive rate, packet scheduling time and decision control overhead.
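For concreteness, the sketch below implements Dempster's rule of combination, the operation that underlies the scheme's confidence values and its normalization-factor handling of contradictory evidence; the mass assignments are hypothetical:

```python
# Dempster's rule of combination: two belief-mass assignments over the
# same frame are fused; the normalization factor (1 - K) discounts
# conflicting (contradictory) evidence.
from itertools import product

def combine(m1, m2):
    """m1, m2: dicts mapping frozenset hypotheses to belief mass."""
    fused, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + x * y
        else:
            conflict += x * y                    # mass on the empty set -> K
    return {h: v / (1 - conflict) for h, v in fused.items()}, conflict

T, M = frozenset({"trusted"}), frozenset({"malicious"})
m1 = {T: 0.7, M: 0.1, T | M: 0.2}                # evidence from node A
m2 = {T: 0.6, M: 0.3, T | M: 0.1}                # evidence from node B
fused, K = combine(m1, m2)
print(f"conflict K = {K:.2f};", {tuple(h): round(v, 3) for h, v in fused.items()})
```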
Keywords:
Distributed Scheduling Decision, Confidence Value, Dempster-Shafer Hypothesis, Mobile Ad-Hoc Network, Normalization Factor, Residual Energy
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
A PIXEL INTERPOLATION TECHNIQUE FOR CURVED HAIR REMOVAL IN SKIN IMAGES TO SUPPORT MELANOMA DETECTION
Author:
T Y SATHEESHA, D SATYANARAYANA, M N GIRIPRASAD
Abstract:
Melanoma in situ is one of the earliest and most perilous forms of skin cancer. These cancerous growths develop when unrepaired DNA damage to skin cells (most often caused by ultraviolet radiation) triggers mutations (genetic defects) that lead the skin cells to multiply rapidly and form malignant tumors. As there is no effective treatment for advanced melanoma, recognizing the lesion at an early stage is crucial for successful treatment. This has led to the development of several computer-aided methods to assist dermatologists. During diagnosis, hair occlusion can cause algorithms to fail to identify the correct skin lesion, producing erroneous results, so removing hairs without altering the lesion in skin images is important for applying detection algorithms effectively.
The challenge is to develop fast, precise and robust algorithms for removing hairs without altering the lesion in skin images. This leads to image-processing techniques that identify hair pixels within a binary image mask using the Pixel Interpolation Technique, which was adapted to fit a quadratic curve that detects curved hairs in the image mask for removal and replacement through pixel interpolation. MATLAB [12] provides the platform for rapidly testing the implementation on both simulated and actual images. Overall, the quadratic Radon formulation for pixel interpolation performs well at detecting curves in the image while ignoring the majority of image spots, which are considered noise.
Keywords:
Median Filter, Pixel Interpolation Techniques, Melanocyte
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
THE OPTIMAL QUANTIZATION MATRICES FOR JPEG IMAGE COMPRESSION FROM PSYCHOVISUAL THRESHOLD
Author:
FERDA ERNAWAN, SITI HADIATI NUGRAINI
Abstract:
The JPEG image compression method has been widely implemented in digital camera devices, and the quantization process plays a primary role in it. Quantization exploits the visibility thresholds of the human visual system; existing quantization tables were generated from a series of psychovisual experiments conducted from several experimental viewpoints. This paper proposes a psychovisual threshold obtained through quantitative experiments for JPEG image compression. The experiments investigate the psychovisual threshold based on the contribution of the DCT coefficients at each frequency order to the reconstruction error: the average reconstruction error from incrementing each DCT coefficient is measured to produce a primitive psychovisual threshold. The psychovisual threshold is designed to give an optimal balance between image reconstruction quality and compression rate, and is used to generate new quantization tables for JPEG image compression. The performance of the new quantization tables is analyzed and compared with the existing default JPEG quantization tables. The experimental results show that the new tables produce higher-quality image reconstruction at a lower average Huffman-code bit length than the default JPEG quantization tables.
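As a hedged sketch of the step being optimized (using the standard JPEG luminance table, which the paper's psychovisually derived tables would replace): an 8x8 block is DCT-transformed, divided elementwise by the quantization table, and rounded:

```python
# JPEG-style quantization of one 8x8 block with the standard luminance
# table (JPEG Annex K); the block data is random for illustration.
import numpy as np
from scipy.fft import dctn, idctn

Q_LUMA = np.array([
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
])

block = np.random.default_rng(0).integers(0, 256, (8, 8)) - 128.0
coeffs = dctn(block, norm="ortho")
quantized = np.round(coeffs / Q_LUMA)            # the lossy step
recon = idctn(quantized * Q_LUMA, norm="ortho")  # decoder side
print("nonzero coefficients kept:", int(np.count_nonzero(quantized)))
print("RMS reconstruction error:", float(np.sqrt(np.mean((recon - block) ** 2))))
```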
Keywords:
JPEG Compression, Psychovisual Error Threshold, Quantization Matrices Generation
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
WEB STRUCTURE MINING FOR USERS BASED ON A HYBRID GA/PSO APPROACH
Author:
B. RAJDEEPA, DR. P. SUMATHI
Abstract:
Web mining is a demanding task that examines Web access patterns, Web structure, and the reliability and dynamics of Web content. It enables Web personalization, system development, site modification, business intelligence and usage characterization. A new approach is offered for evaluating a website's hyperlink structure, in which a Web-user utility function is designed based on known assumptions. The aim of this work is to improve the Web structure according to an effectiveness criterion. The frequency of use and the time taken to click on a link are used to construct the Web-user utility. Studying probabilistic user behaviour has permitted quality improvement of the hyperlink structure using a hybrid GA/PSO. The proposed methodology is tested on a real website, and the results are interpreted in terms of the Web-user benefit of each link.
Keywords:
Web Mining, Web Structure Mining, Web Content Mining, Hybrid GA/PSO
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text

Title:
APPLICATION OF FEAST (FEATURE SELECTION TOOLBOX) IN IDS (INTRUSION DETECTION SYSTEMS)
Author:
MENDOZA PALECHOR FABIO, DE LA HOZ CORREA EDUARDO, DE LA HOZ MANOTAS ALEXIS
Abstract:
Security in computer networks has become a critical issue for many organizations; maintaining data integrity demands time and large economic investments, and although several hardware and software solutions have been proposed, they are sometimes inefficient at detecting attacks. This paper presents research results obtained by implementing algorithms from FEAST, a MATLAB toolbox, with the purpose of selecting the method with the best precision for detecting different attacks using the smallest number of features. The NSL-KDD data set was taken as the reference. The Relief method obtained the best precision levels for attack detection: 86.20% (Normal), 85.71% (DoS), 88.42% (Probe), 93.11% (U2R) and 90.07% (R2L), which makes it a promising technique for feature selection in network intrusion detection.
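FEAST itself is a MATLAB toolbox; as a hedged stand-in, the snippet below implements the basic binary-class Relief update on synthetic data, showing why features that separate the classes receive large weights (the data and sampling scheme are illustrative, not the paper's NSL-KDD setup):

```python
# Basic binary-class Relief: features that differ on the nearest miss
# gain weight; features that differ on the nearest hit lose weight.
import numpy as np

def relief(X, y, n_samples=100, seed=0):
    rng = np.random.default_rng(seed)
    X = (X - X.min(0)) / (np.ptp(X, axis=0) + 1e-12)   # scale diffs to [0, 1]
    w = np.zeros(X.shape[1])
    for i in rng.integers(0, len(X), n_samples):
        d = np.abs(X - X[i]).sum(axis=1)
        d[i] = np.inf                                  # exclude the instance itself
        hit = np.where(y == y[i], d, np.inf).argmin()  # nearest same-class row
        miss = np.where(y != y[i], d, np.inf).argmin() # nearest other-class row
        w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / n_samples
    return w

X = np.random.default_rng(1).normal(size=(200, 5))
y = (X[:, 0] + 0.1 * X[:, 1] > 0).astype(int)          # feature 0 is informative
print(relief(X, y).round(3))                           # weight 0 should dominate
```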
Keywords:
Feature Selection Toolbox (FEAST), Data Set, Security, Attacks, Networks
Source:
Journal of Theoretical and Applied Information Technology
31st December 2014 -- Vol. 70. No. 3 -- 2014
Full Text