
ISECURE Journal

In recent years, technology and management information systems have been an excellent response to many global challenges. Technological innovation has expanded into almost all sectors and has made many processes more accurate and much faster than before. Technology systems now play a big role in election processes in many democratic countries. The Independent High Electoral Commission (IHEC) in Iraq suffers from many problems, such as fraud, time-consuming procedures, and delays in election processes that take a long time and are slow to reveal results. This research paper focuses on adopting a biometric system in Iraq; several different perspectives are used to characterize the attitudes of IHEC's employees and managers towards technology in general and the biometric system specifically. Most of the staff members feel confident about transitioning to a technology system, and in their responses to the questionnaires most of them emphasized getting trained before they start using the system. In this research, the data is collected using a survey of the Independent High Electoral Commission's managers and staff members, and the data is analyzed using SPSS.

https://www.isecure-journal.com/article_110976.html
This paper explores intelligent decision support systems (IDSS) and decision support systems (DSS). The inception and development of systems and technological advances such as data warehouses, enterprise resource planning, and advanced planning systems, along with top trends like the Internet of Things, big data, the Internet, and business intelligence, have brought further advancement to the operation of decision support systems. This paper gives a systematic review of the various applications of IDSS based on knowledge, communication, documents, etc., and goes on to describe and differentiate two DSS methods: the Analytical Network Process (ANP) and the Decision-Making Trial and Evaluation Laboratory (DEMATEL).

https://www.isecure-journal.com/article_104857.html
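
The DEMATEL method reviewed above derives a total-relation matrix T from a matrix of direct influences between factors. As an illustration only (the example matrix, the normalization choice, and the pure-Python linear algebra below are our own sketch, not the paper's material), the core computation is T = X(I − X)⁻¹ after normalizing the direct-influence matrix:

```python
def mat_mul(A, B):
    # Plain dense matrix product on lists of lists.
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)] for i in range(n)]

def mat_inv(M):
    # Gauss-Jordan elimination on the augmented matrix [M | I].
    n = len(M)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)] for i, row in enumerate(M)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        p = aug[col][col]
        aug[col] = [v / p for v in aug[col]]
        for r in range(n):
            if r != col and aug[r][col]:
                f = aug[r][col]
                aug[r] = [v - f * w for v, w in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

def dematel_total_relation(direct):
    # Normalize by the largest row sum (a common choice), then T = X (I - X)^-1.
    n = len(direct)
    s = max(sum(row) for row in direct)
    X = [[v / s for v in row] for row in direct]
    I_minus_X = [[(1.0 if i == j else 0.0) - X[i][j] for j in range(n)] for i in range(n)]
    return mat_mul(X, mat_inv(I_minus_X))
```

Row and column sums of T then give each factor's prominence (D + R) and net cause/effect role (D − R).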
In this paper, we implement a Virtualized Network Management Laboratory (VNML) linked to the college campus network for educational purposes. The laboratory is created using the VirtualBox virtualizer and GNS3 on a single HP DL380 G7 server running Ubuntu Linux. A total of 35 virtual devices (routers, switches, and virtual machines) are created and distributed over the virtualized campus network, with seven network management tools configured and running. The proposed laboratory aims to overcome the lack of physical network hardware in educational facilities that teach network management in their curricula. Other advantages include ease of managing the laboratory and independence from the physical location of the setup within the same geographical area.

https://www.isecure-journal.com/article_96191.html
The Internet of Things (IoT) is becoming the future of a global data field in which embedded devices communicate with each other, exchange data, and make decisions through the Internet. IoT could improve the quality of life in smart cities, but a massive amount of data from different smart devices could slow down or crash database systems. In addition, transferring IoT data to the Cloud for monitoring and feedback generation leads to high delay at the infrastructure level. Fog Computing can help by offering services closer to edge devices. In this paper, we propose an efficient system architecture to mitigate the problem of delay. We provide a performance analysis of response time, throughput, and packet loss for the MQTT (Message Queue Telemetry Transport) and HTTP (Hypertext Transfer Protocol) protocols on Cloud- and Fog-based servers, using a large volume of data from an emulated traffic generator working alongside one real sensor. We implement both protocols in the same architecture, with low-cost embedded devices connecting to local and Cloud servers on different platforms. The results show that HTTP response time is 12.1 and 4.76 times higher than MQTT on Fog-based and Cloud-based servers, respectively, located in the same geographical area as the sensors. The worst performance is observed when the Cloud is public and outside the country region. The throughput results show that MQTT can carry the data within the available bandwidth and with the lowest percentage of packet loss. We also show that the proposed Fog architecture is an efficient way to reduce latency and enhance performance in Cloud-based IoT.

https://www.isecure-journal.com/article_96187.html
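
The three metrics compared above (response time, throughput, packet loss) reduce to simple bookkeeping over send/receive logs, whatever the transport protocol. A minimal sketch (the log format and function name are our own assumptions, not the paper's code):

```python
def link_metrics(sent, received, payload_bytes):
    """Per-run metrics from message logs.

    sent:     {msg_id: send timestamp in seconds}
    received: {msg_id: receive timestamp in seconds}
    """
    delivered = [m for m in sent if m in received]
    if not delivered:
        return float("nan"), 0.0, 100.0
    # Average response time: mean send-to-receive latency over delivered messages.
    rtts = [received[m] - sent[m] for m in delivered]
    avg_response = sum(rtts) / len(rtts)
    # Throughput: delivered payload bits over the run's wall-clock span.
    span = max(received[m] for m in delivered) - min(sent[m] for m in delivered)
    throughput_bps = len(delivered) * payload_bytes * 8 / span if span > 0 else float("inf")
    # Packet loss: percentage of sent messages never received.
    loss_pct = 100.0 * (len(sent) - len(delivered)) / len(sent)
    return avg_response, throughput_bps, loss_pct
```

Running the same function over MQTT and HTTP logs gives directly comparable numbers of the kind reported in the abstract.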
Project management is an important factor in carrying out the decision to implement large-scale software systems (LSS) successfully. Effective project management comes into play to plan, coordinate, and control such a complex project. The project management factor has been argued to be one of the important Critical Success Factors (CSFs) that need to be measured and monitored carefully during the implementation of Enterprise Resource Planning (ERP) systems. The goal of this article is to develop "CSF-Live!", a method for measuring, monitoring, and controlling critical success factors of large-scale software systems. To achieve this goal, we apply CSF-Live to the project management CSF. CSF-Live uses the Goal/Question/Metric (GQM) paradigm to yield a flexible framework containing several metrics, which we used to develop a formulation enabling the measurement of the project management CSF. The formulation we developed for the project management CSF implies the significance of having proper project management when conducting an ERP system implementation, since it is positively associated with the success of the ERP.

https://www.isecure-journal.com/article_90895.html
Internet of Things (IoT) and cloud computing technologies have connected the infrastructure of the city to make it context-aware and more intelligent in utilizing its major resources. These technologies have much potential to solve the challenges of urban areas around the globe and to serve citizens. A framework model is proposed that enables the integration of sensor data and its analysis in the context of smart parking. Sensors and devices deployed around the city's parking areas send real-time data through edge computers to the main cloud servers. Mobile apps are developed that use real-time data sets from the servers of the parking facilities in the city. Fuzzification is shown to be a capable mathematical approach for modeling city parking issues. To solve parking problems in cities, a detailed analysis of the proposed fuzzy logic system is developed. This paper presents the results achieved using a Mamdani Fuzzy Inference System to model a complex smart parking system. These results are verified using MATLAB simulation.

https://www.isecure-journal.com/article_90886.html
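
A Mamdani system of the kind described fuzzifies crisp inputs, fires IF-THEN rules with min implication, max-aggregates the clipped output sets, and defuzzifies by centroid. The toy rule base and membership functions below are invented for illustration (the paper's actual MATLAB system is richer); they map slot occupancy to an availability score:

```python
def tri(x, a, b, c):
    # Triangular membership function with feet at a and c and peak at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani_availability(occupancy_pct):
    """Tiny one-input Mamdani system: occupancy (%) -> availability score (0-100).

    Rules (hypothetical): IF occupancy LOW  THEN availability HIGH
                          IF occupancy MED  THEN availability MED
                          IF occupancy HIGH THEN availability LOW
    """
    # Fuzzification: rule firing strengths for a single-antecedent rule base.
    fire_high = tri(occupancy_pct, -1, 0, 50)     # occupancy LOW
    fire_med  = tri(occupancy_pct, 25, 50, 75)    # occupancy MED
    fire_low  = tri(occupancy_pct, 50, 100, 101)  # occupancy HIGH
    # Min implication, max aggregation, sampled-centroid defuzzification.
    num = den = 0.0
    for y10 in range(0, 1001):
        y = y10 / 10.0
        mu = max(min(fire_low,  tri(y, -1, 0, 50)),
                 min(fire_med,  tri(y, 25, 50, 75)),
                 min(fire_high, tri(y, 50, 100, 101)))
        num += y * mu
        den += mu
    return num / den if den else 50.0
```

Empty parking (0% occupancy) yields a high availability score, full parking a low one, which is the qualitative behavior a Mamdani controller for this problem should show.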
With the innovation of the cloud computing industry, many services have been provided based on different deployment criteria. Nowadays everyone tries to remain connected and demands maximum utilization of resources with minimum time and effort, making optimum resource utilization an important challenge in cloud computing. To overcome this issue, many techniques have been proposed, but still no comprehensive results have been achieved. Cloud computing offers elastic and scalable resource-sharing services by using resource management. In this article, a hybrid approach is proposed with the objective of achieving maximum resource utilization. In the proposed method, an adaptive back-propagation neural network and multi-level priority-based scheduling are combined for optimum resource utilization. This hybrid technique improves the utilization of resources in cloud computing. Simulation results are reported as MSE and regression values on a job dataset, comparing three training algorithms: Scaled Conjugate Gradient (SCG), Levenberg-Marquardt (LM), and Bayesian Regularization (BR). BR gives better results than the other algorithms with 60 hidden-layer neurons: BR achieves 2.05 MSE and 95.8 regression in validation, while LM gives 2.91 MSE and 94.06 regression, and SCG gives 3.92 MSE and 91.85 regression.

https://www.isecure-journal.com/article_90883.html
The increasing volatility in pricing and growing potential for profit in digital currency have made predicting the price of cryptocurrency a very attractive research topic. Several studies have already been conducted using various machine-learning models to predict cryptocurrency prices. The study presented in this paper applies a classic Autoregressive Integrated Moving Average (ARIMA) model to predict the prices of the three major cryptocurrencies (Bitcoin, XRP, and Ethereum) using daily, weekly, and monthly time series. The results demonstrate that ARIMA outperforms most other methods in predicting cryptocurrency prices on a daily time series basis in terms of mean absolute error (MAE), mean squared error (MSE), and root mean squared error (RMSE).

https://www.isecure-journal.com/article_90865.html
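
The three evaluation metrics named in the abstract are standard, and a full ARIMA fit is usually delegated to a library such as statsmodels. As a self-contained sketch we show the metrics plus a toy AR(1) forecaster (one member of the ARIMA family, fit by least squares); this is our own illustration, not the paper's model:

```python
import math

def mae(actual, pred):
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def mse(actual, pred):
    return sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    return math.sqrt(mse(actual, pred))

def ar1_forecast(series, steps):
    """AR(1) stand-in: fit y_t = c + phi * y_(t-1) by least squares,
    then roll forecasts forward from the last observed value."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var if var else 0.0
    c = my - phi * mx
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```

In a study like the one above, each model's forecasts would be scored against held-out prices with `mae`, `mse`, and `rmse`.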
Data Virtualization (DV) has become an important method for storing and handling data cost-efficiently. However, it is unclear what kind of data should be virtualized, and when. We applied a design science approach in the first stage to obtain a state of the art of DV regarding data integration and to present a concept matrix. We extend the knowledge base with a systematic literature review resulting in 15 critical success factors for DV. Practitioners can use these critical success factors to decide between DV and Extract, Transform, Load (ETL) as the data integration approach.

https://www.isecure-journal.com/article_90858.html
This paper presents an investigation of the performance of Non-Orthogonal Multiple Access (NOMA) in the power domain. A Power Allocation (PA) method is proposed based on an analysis of the NOMA throughput expression. This method aims to provide fair opportunities for users to improve their performance. Thus, NOMA users can achieve rates higher than, or equal to, the rates obtained with conventional Orthogonal Multiple Access (OMA) in the frequency domain. The proposed method is evaluated and compared with other PA techniques through system-level computer simulations. The results obtained indicate that the proposed method increases the average cell spectral efficiency and maintains a good fairness level with regard to resource allocation among the users within a cell.

https://www.isecure-journal.com/article_90856.html
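
To make the NOMA-versus-OMA comparison concrete, here is the textbook two-user downlink power-domain rate computation with successive interference cancellation (SIC), not the paper's specific PA method; the parameter values in the usage are arbitrary:

```python
import math

def noma_rates(p_total, alpha, g_near, g_far, noise=1.0):
    """Two-user downlink power-domain NOMA rates (bits/s/Hz).

    alpha: fraction of power given to the far (weak) user; g_*: channel gains.
    The far user decodes its own signal treating the near user's as interference;
    the near user cancels the far user's signal first (SIC).
    """
    p_far, p_near = alpha * p_total, (1 - alpha) * p_total
    r_far = math.log2(1 + p_far * g_far / (p_near * g_far + noise))
    r_near = math.log2(1 + p_near * g_near / noise)  # interference-free after SIC
    return r_near, r_far

def oma_rates(p_total, g_near, g_far, noise=1.0):
    # Orthogonal baseline: each user gets half the band at full power.
    return (0.5 * math.log2(1 + p_total * g_near / noise),
            0.5 * math.log2(1 + p_total * g_far / noise))
```

With distinct channel gains and most of the power allocated to the far user, the NOMA sum rate exceeds the OMA sum rate, which is the effect a throughput-driven PA method exploits.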
Standard TCP is the de facto reliable transfer protocol for the Internet. It is designed to establish a reliable connection using only a single network interface. However, standard TCP with a single interface performs poorly under intermittent node connectivity, which requires re-establishing connections as IP addresses change. Multipath TCP (MPTCP) has emerged to utilize multiple network interfaces in order to deliver higher throughput. Resilience to link failures is better supported in MPTCP because segment transmission can be maintained via alternative interfaces. In this paper, the resilience of MPTCP to link failures is evaluated against several challenges. Several link-failure scenarios are applied to examine all aspects of MPTCP, including congestion algorithms, path management, and subflow scheduling. In each scenario, the behavior of MPTCP is studied by observing and analyzing the throughput and delay. The evaluation indicates that MPTCP is resilient to a low number of failed links. However, as the number of failed links increases, MPTCP can only recover full throughput if the link failure occurs on the server side. In addition, in the presence of link failures, the lowest-RTT MPTCP scheduler yields the shortest delivery time while providing the minimum application jitter.

https://www.isecure-journal.com/article_90849.html
The fifth-generation (5G) system architecture is defined as service-based, and the core network functions are described as sets of services accessible through application programming interfaces (APIs). One of the components of 5G is Multi-access Edge Computing (MEC), which provides open access to radio network functions through an API. Using a mobile edge API, third-party analytics applications can provide intelligence in the vicinity of end users, which improves network performance and enhances user experience. In this paper, we propose a new mobile edge API to access and control mobility at the network edge. The application logic for provisioning access and mobility policies may be based on considerations such as load-level information per radio network slice instance, user location, accumulated usage, and local policy. We describe the basic API functionality through typical use cases and provide the respective data model, which represents the resource structure and data types. Some implementation aspects, related to modeling the resource states as seen by a mobile edge application and by the network, are discussed.

https://www.isecure-journal.com/article_90843.html
With the emerging concept of model transformation, information can be extracted from one or more source models to produce target models. The conversion of these models can be done automatically with specific transformation languages. This conversion requires a mapping between both models, conventionally maintained with dynamic hash tables that store reference links between the elements of the source and target models: whenever a target element must be accessed, the hash table is queried. In contrast, this paper presents an approach that directly creates aspects, with traces, in the source meta-model. These traces hold references to target elements during execution. Illustrating the idea of model-driven engineering (MDE), this paper proposes a method that transforms UML class models into EMF Ecore models.

https://www.isecure-journal.com/article_90823.html
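
The idea of attaching traces to source elements, instead of keeping an external hash table, can be sketched in a few lines. The class names and attributes below are hypothetical stand-ins for UML and Ecore elements, not the paper's implementation:

```python
class SourceClass:
    """A UML-like source element; `trace` is the aspect woven into the source
    meta-model that will hold the reference to the generated target element
    (replacing an external source->target hash table)."""
    def __init__(self, name, attributes):
        self.name, self.attributes = name, attributes
        self.trace = None  # filled in during the transformation

class EClass:
    # A minimal stand-in for an EMF Ecore EClass.
    def __init__(self, name, structural_features):
        self.name, self.structural_features = name, structural_features

def transform(source_classes):
    # Create each target element and record it directly on its source element.
    for sc in source_classes:
        sc.trace = EClass(sc.name, list(sc.attributes))
    return [sc.trace for sc in source_classes]
```

Resolving a cross-reference later is then just `sc.trace`, with no lookup in a separate mapping structure.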
Classification of gesture types is very helpful in several applications. This paper addresses fast classification of hand gestures using Dynamic Time Warping (DTW) over multiple cores of simple processors. We present a methodology that distributes templates over the cores and then allows parallel execution of the classification. The per-core results are fed to a voting algorithm in which the majority vote determines the final classification. Processing speed increased dramatically thanks to the combination of multi-core processors and DTW.

https://www.isecure-journal.com/article_90822.html
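
The classifier's core is the standard DTW distance plus nearest-template matching. A serial sketch is below; in the paper's setting the template set would be partitioned across cores, each core scoring its own subset, with a majority vote combining the per-core winners:

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of match, insertion, and deletion predecessors.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def classify(sample, templates):
    """Nearest-template classification; `templates` maps label -> list of sequences."""
    return min(((dtw_distance(sample, t), label)
                for label, seqs in templates.items() for t in seqs))[1]
```

Because each template's distance is independent, distributing templates over cores parallelizes trivially, which is the source of the speed-up the abstract reports.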
This paper presents a data mining application in metabolomics. It aims at building an enhanced machine learning classifier that can be used for diagnosing cachexia syndrome and identifying its involved biomarkers. To achieve this goal, a data-driven analysis is carried out using a public dataset consisting of 1H-NMR metabolite profiles. This dataset suffers from the problem of imbalanced classes, which is known to deteriorate the performance of classifiers and to affect their validity and generalizability. The classification models in this study were built using five machine learning algorithms: PLS-DA, MLP, SVM, C4.5, and ID3. The models were built after carrying out a number of intensive data preprocessing procedures to tackle the problem of imbalanced classes and improve the performance of the constructed classifiers. These procedures involve applying data transformation, normalization, standardization, re-sampling, and data reduction using a number of variable-importance scorers. The best performance was achieved by an MLP model trained and tested with five-fold cross-validation on datasets that were re-sampled using the SMOTE method and then reduced using the SVM variable-importance scorer. This model was successful in classifying samples with excellent accuracy and in identifying the potential disease biomarkers. The results confirm the validity of metabolomics data mining for the diagnosis of cachexia. They also emphasize the importance of data preprocessing procedures, such as re-sampling and data reduction, for improving data mining results, particularly when the data suffers from imbalanced classes.

https://www.isecure-journal.com/article_90821.html
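
SMOTE, the re-sampling step named above, synthesizes minority-class samples by interpolating between real minority samples and their nearest minority neighbours. A minimal pure-Python sketch (in practice one would use a library implementation such as imbalanced-learn's `SMOTE`; parameter names here are our own):

```python
import random

def smote(minority, n_new, k=3, seed=0):
    """Generate n_new synthetic minority samples (tuples of floats)."""
    rng = random.Random(seed)
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        # k nearest minority neighbours of the chosen base sample.
        neighbours = sorted((s for s in minority if s is not base),
                            key=lambda s: dist2(base, s))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # random point on the segment base -> neighbour
        synthetic.append(tuple(b + gap * (n - b) for b, n in zip(base, nb)))
    return synthetic
```

Each synthetic point lies on a segment between two real minority samples, so the oversampled class occupies the same region of feature space rather than duplicating points.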
Port organizations have focused their efforts on physical or tangible assets, generating profitability and value. However, it is recognized that the greatest sustainable competitive advantage is the creation of knowledge using the intangible assets of the organization. The Balanced Scorecard, as a performance tool, has incorporated intangible assets such as intellectual, structural, and social capital into management. In this way, the port community can count on new ways of managing innovation, strengthening organizational practices, and growing collaborative work teams. In this study, concepts from cognitive SWOT analysis are applied to diagnose port activity and its community. In workshops with experts, and starting from the vision, mission, cognitive SWOT, and strategies, a cognitive strategic map with strategic objectives and indicators is designed along the customer, processes, and learning-and-growth axes for the port and the port community. Causal relationships between objectives, associated indicators, and incidence factors are established in a forward direction, from the learning-and-growth axis to the customer axis. Then the incidence matrix is developed and the direct and indirect effects between factors are analyzed, which allows recommending the future course of the port and its community.

https://www.isecure-journal.com/article_90817.html
Time saving and energy consumption have become vital issues that attract the attention of researchers in the field of Underwater Wireless Sensor Networks (UWSNs). Accordingly, there is a strong need to improve the performance of MAC protocols in UWSNs, particularly to enhance the effectiveness of the ALOHA protocol. In this paper, a time-saving ALOHA protocol with slotted carrier sense is proposed, which we call the ST-Slotted-CS-ALOHA protocol. The simulation results demonstrate that our proposed protocol can save time and decrease the average delay when compared with the other protocols. Moreover, it decreases energy consumption and raises throughput. However, in terms of the number of dropped nodes it does not give better results than the other protocols.

https://www.isecure-journal.com/article_90788.html
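
For context on the baseline being improved: textbook slotted ALOHA with Poisson-distributed offered load G (frames per slot) succeeds only when exactly one frame is sent in a slot, giving throughput S = G·e⁻ᴳ with a maximum of 1/e at G = 1. This is the classical formula, not the paper's ST-Slotted-CS-ALOHA analysis:

```python
import math

def slotted_aloha_throughput(G):
    """Textbook slotted ALOHA throughput: P(exactly one arrival in a slot)
    under Poisson offered load G, i.e. S = G * e^-G."""
    return G * math.exp(-G)
```

Carrier sensing, as added in the proposed protocol, aims to push the success probability above this baseline by avoiding transmissions into busy slots.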
Simplifying and structuring qualitatively complex knowledge, and quantifying it in a certain way to make it reusable and easily accessible, are all aspects that are not new to historians. Computer science is currently approaching a solution to some of these problems, or at least making it easier to work with historical data. In this paper, we propose a historical knowledge representation model that takes into consideration the imperfection of historical data in terms of uncertainty. To do this, our model design is based on a multilayer approach in which we distinguish three informational levels: information, source, and belief, whose combination allows modeling and modulating historical knowledge. The basic principle of this model is to allow multiple historical sources to represent several versions of the history of a historical event, with associated degrees of belief. In our model, we differentiate three levels of granularity (attribute, object, relation) at which belief can be expressed, and define 11 degrees of uncertainty in belief. The proposed model lends itself to various exploitations that fall within the historian's decision-making support for assessing the plausibility of the history of historical events.

https://www.isecure-journal.com/article_90787.html
The Internet of Things (IoT) approach is empowering smart city initiatives all over the world. However, there is no specific tool or criterion for evaluating the services offered by a smart city. In this paper, a new Multilayer Fuzzy Inference System (MFIS) is proposed for the assessment of the Planet Factors of a Smart City (PFSC). The PFSC system is organized into two levels, and the proposed MFIS-based expert system can categorize the evaluation level of the planet factors of a smart city as low, satisfied, or good.

https://www.isecure-journal.com/article_90517.html
Standard face recognition algorithms that use standard feature extraction techniques always suffer from image performance degradation. Recently, singular value decomposition and low-rank matrices have been applied in many applications, including pattern recognition and feature extraction. The main objective of this research is to design an efficient face recognition approach by combining several techniques to generate efficient recognition results. The implemented approach concentrates on obtaining a significant rank matrix by applying a singular value decomposition technique. Measures of dispersion are used to indicate the distribution of the data, and among the ranks applied there is an adequate, reasonable rank that the implemented procedure should reach. Interquartile range, mean absolute deviation, range, variance, and standard deviation are applied to select the appropriate rank. Ranks 24, 12, and 6 reached an excellent 100% recognition rate with data reduction of up to 2:1, 4:1, and 8:1, respectively. In addition, proper selection of the adequate rank matrix is achieved based on the dispersion measures. Results obtained on standard face databases verify the efficiency and effectiveness of the implemented approach.

https://www.isecure-journal.com/article_90516.html
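
The five dispersion measures named above are all one-liners over a sorted sample; in the paper's setting they would be applied to the values retained at each candidate rank (e.g., the kept singular values) to judge which rank preserves enough spread. A sketch with the standard definitions (quartile interpolation convention is our own choice):

```python
def dispersion_measures(values):
    """Range, variance, standard deviation, mean absolute deviation, and IQR."""
    xs = sorted(values)
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n   # population variance
    mad = sum(abs(x - mean) for x in xs) / n     # mean absolute deviation
    def quartile(p):
        # Linear interpolation between closest ranks.
        idx = p * (n - 1)
        lo = int(idx)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (idx - lo) * (xs[hi] - xs[lo])
    return {
        "range": xs[-1] - xs[0],
        "variance": var,
        "std": var ** 0.5,
        "mad": mad,
        "iqr": quartile(0.75) - quartile(0.25),
    }
```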
Wireless networks, the Internet of Things (IoT), the Internet of Everything (IoE), and smart homes have become extremely important terms in our present-day life. Most buildings, companies, institutions, and even homes depend on these technologies for interaction, communication, automation, and everything surrounding humans. To understand the advanced topics in wireless networks and IoT devices, it is helpful to use a practical learning tool called Packet Tracer, a network simulator freely available from Cisco Networking Academy. In this paper, we use Packet Tracer to design a smart home based on wireless and IoT devices and illustrate how to create different networking scenarios to make our homes more comfortable and convenient.

https://www.isecure-journal.com/article_90511.html
Medical images are of great interest since they are needed in various medical applications. In order to decrease the size of medical images so that they can be transmitted faster, Region of Interest (ROI) and hybrid lossless compression techniques are applied to medical images to compress them without losing important data. In this paper, a proposed model is presented and assessed based on the size of the image, the Peak Signal-to-Noise Ratio (PSNR), and the time required to compress and reconstruct the original image. The major objective of the proposed model is to minimize the image size and the transmission time; moreover, improving the PSNR is a critical challenge. The results illustrate that applying hybrid lossless techniques to the ROI of medical images reduces size by 39% and gives better results in terms of compression ratio and PSNR.

https://www.isecure-journal.com/article_90502.html
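
PSNR, the quality metric used above, has a standard definition: the ratio of the peak pixel value squared to the mean squared reconstruction error, in decibels. A minimal sketch over flat pixel lists:

```python
import math

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equal-sized images
    given as flat lists of pixel intensities."""
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # lossless reconstruction
    return 10.0 * math.log10(max_val ** 2 / mse)
```

A lossless ROI, as in the paper's hybrid scheme, yields infinite PSNR inside the ROI, while the lossy background trades PSNR for compression ratio.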
In recent years, social networks (SNs) have been employed for communication and networking, socializing, marketing, and one's daily life. Billions of people in the world are connected through various SN platforms and applications, which results in the generation of massive amounts of data online. This includes personal data, or Personally Identifiable Information (PII). As more and more data is collected about users by different organizations and companies, privacy concerns on SNs have become more and more prominent. In this paper, we present a study on information privacy in SNs by exploring the general laws and regulations on the collection, use, and disclosure of information from a Canadian perspective, based on the Personal Information Protection and Electronic Documents Act (PIPEDA). The main focus of this paper is to present the results and findings of a survey.

https://www.isecure-journal.com/article_90475.html
The Internet of Things (IoT) is a very encouraging and fast-growing area that brings together the benefits of wireless systems, sensor networks, actuators, etc. A wide range of IoT applications have been targeted, and several aspects of this field have been identified to address specific issues, as well as technologies and standards developed in various domains such as radio frequency identification (RFID), sensors, and mobile telephony, to name a few. This article focuses specifically on RFID technology and its accompanying communication, authentication, risk, and security concerns as applied to the IoT field. An important part of this work is indeed focused on the security aspects that derive from the use of RFID in IoT, especially in IoT networks. The results of our research highlight an excellent integration of RFID in the field of the Internet of Things, particularly in healthcare systems.

https://www.isecure-journal.com/article_90275.html
This paper explores the algebraic matching approach for the detection of vulnerabilities in binary code. The algebraic programming system is used to implement this method. Models of vulnerabilities and the programs to be verified are presented as behavior algebra and action language specifications. The methods of algebraic matching are based on rewriting rules and techniques using conditional rewriting. This process is combined with symbolic modeling, which makes accurate detection of vulnerabilities possible. The paper provides examples of the formalization of vulnerability models and of the translation of binary code into behavior algebra expressions.

https://www.isecure-journal.com/article_90271.html
Detection of fake accounts on social networks is a challenging process. Previous methods for identifying fake accounts have not considered the strength of the users' communications, which reduces their efficiency. In this work, we present a detection method based on the users' similarities, considering the network communications of the users. In the first step, similarity measures such as common neighbors, common neighbors graph edges, cosine, and the Jaccard similarity coefficient are calculated based on the adjacency matrix of the corresponding graph of the social network. In the next step, in order to reduce the complexity of the data, Principal Component Analysis is applied to each computed similarity matrix to provide a set of informative features. Then, a set of highly informative eigenvectors is selected using the elbow method. The extracted features are employed to train a One-Class Classification (OCC) algorithm. Finally, this trained model is employed to identify fake accounts. Our experimental results indicate the promising performance of the proposed method, with a detection accuracy of 99.6% and a false negative rate of 0%. We conclude that bringing similarity measures and One-Class Classification algorithms into play, rather than multi-class algorithms, provides better results.
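As an illustration of the first step, the Jaccard similarity between two users can be read off the rows of the adjacency matrix. The sketch below is a minimal pure-Python version over a hypothetical 4-user toy graph; it is illustrative only and not the paper's pipeline or dataset.

```python
def jaccard_matrix(adj):
    """Pairwise Jaccard similarity of neighbor sets from a 0/1 adjacency matrix."""
    n = len(adj)
    neighbors = [set(j for j in range(n) if adj[i][j]) for i in range(n)]
    sim = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            union = neighbors[i] | neighbors[j]
            inter = neighbors[i] & neighbors[j]
            sim[i][j] = len(inter) / len(union) if union else 0.0
    return sim

# Toy 4-user network: users 0 and 1 share exactly the same neighbors.
adj = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [1, 1, 0, 0],
    [1, 1, 0, 0],
]
sim = jaccard_matrix(adj)
```

Users with identical neighbor sets (0 and 1) get similarity 1.0, while users with disjoint neighborhoods (0 and 2) get 0.0.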

https://www.isecure-journal.com/article_91325.html
With the advancement and development of computer network technologies, the way for intruders has become smoother; therefore, to detect threats and attacks, the importance of intrusion detection systems (IDS) as a key element of security is increasing. One of the challenges of intrusion detection systems is managing the large number of network traffic features. Removing unnecessary features is a solution to this problem. Using machine learning methods is one of the best ways to design an intrusion detection system. Focusing on this issue, in this paper, we propose a hybrid intrusion detection system using the decision tree and support vector machine (SVM) approaches. In our method, feature selection is initially done by C5.0 decision tree pruning, and then the features with the least predictor importance value are removed. After removing each feature, the least squares support vector machine (LS-SVM) is applied. The set of features having the largest area under the Receiver Operating Characteristic (ROC) curve for the LS-SVM is considered the final feature set. The experimental results on the KDD Cup 99 and UNSW-NB15 data sets show that the proposed approach improves the true positive and false positive criteria and accuracy compared to the best prior work.
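The AUC criterion that drives this feature selection can be computed directly from classifier scores. A minimal pure-Python sketch, with toy labels and scores assumed for illustration (the LS-SVM itself is not modeled here):

```python
def roc_auc(labels, scores):
    """AUC = probability that a random positive outscores a random negative
    (ties count as half); equivalent to the area under the ROC curve."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy scores: a perfect ranking and a mediocre one.
perfect = roc_auc([1, 1, 0, 0], [0.9, 0.8, 0.2, 0.1])
mediocre = roc_auc([1, 0, 1, 0], [0.6, 0.7, 0.8, 0.2])
```

The feature subset retained would be the one whose scores maximize this quantity.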

https://www.isecure-journal.com/article_91592.html
The GOST block cipher was designed in the 1970s and published in 1989 as the Soviet and Russian standard GOST 28147-89. In order to enhance the security of the GOST block cipher after various attacks on it were proposed, its designers published a modified version, namely GOST2, in 2015, which has a new key schedule and an explicit choice of S-boxes. In this paper, by using three exactly identical portions of GOST2 and the fixed point idea, enhanced fixed point attacks for filtering out wrong keys are presented. More precisely, the new attacks focus on reducing memory complexity while keeping the other complexities unchanged. The results show a significant reduction in the memory complexity of the attacks, while the time complexity is slightly increased in comparison to the previous fixed point attacks. To the best of our knowledge, the lowest memory complexity for an attack on the full-round GOST2 block cipher is provided here.
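To illustrate the fixed point idea itself (a value that a cipher portion maps back to itself), here is a hedged toy sketch with an invented 8-bit round function; it has nothing to do with GOST2's actual round structure.

```python
def toy_round(x, k):
    """A made-up 8-bit keyed round: XOR the key, then rotate left by 3."""
    x ^= k
    return ((x << 3) | (x >> 5)) & 0xFF

def fixed_points(k):
    """All x with toy_round(x, k) == x; an attacker can filter key guesses
    by checking whether observed fixed points are consistent with them."""
    return [x for x in range(256) if toy_round(x, k) == x]
```

For k = 0 the only fixed points are 0x00 and 0xFF, since a rotation by 3 (coprime to 8) fixes only the all-zero and all-one words.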

https://www.isecure-journal.com/article_89623.html
In this paper, we propose a new method of differential fault analysis of SHA-3 based on the differential relations of the algorithm. Employing those differential relations in the fault analysis of SHA-3 gives the proposed attacks new features, e.g., a high probability of fault detection, the possibility of rechecking initial faults, and the ability to recover the internal state with 22-53 faults. We also present two improvements to the above attack: using the differential relations in the reverse direction to improve the attack results, and using the algebraic relations of the algorithm to provide a second way to recover the internal state of SHA-3. Consequently, we show that with 5-8 faults on average, SHA-3's internal state can be fully recovered.
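The generic pattern behind differential fault analysis can be sketched in a few lines: the attacker compares a correct output with a faulty one in which a single internal bit was flipped, and the output difference constrains the unknown internal state. The toy mixing step below is invented for illustration and bears no relation to SHA-3's Keccak permutation.

```python
def toy_round(state):
    """An invented 8-bit mixing step (illustrative only, not Keccak)."""
    state ^= (state << 1) & 0xFF
    return state ^ (state >> 3)

correct = toy_round(0b10110010)
faulty = toy_round(0b10110010 ^ 0b00000100)  # single-bit fault injected
diff = correct ^ faulty                      # observable output difference
```

The nonzero difference pattern is what a real attack would feed into the differential relations of the target algorithm.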

https://www.isecure-journal.com/article_91736.html
The smart grid concept was introduced to modernize the power grid by utilizing new information and communication technology. The smart grid needs live power consumption monitoring to provide the required services, and for this purpose, bi-directional communication is essential. Security and privacy are the most important requirements that should be provided in this communication. Due to the complex design of smart grid systems and the use of different new technologies, there are many opportunities for adversaries to attack the smart grid system, which can result in fatal problems for customers. Recently, Mahmood et al. [1] proposed a lightweight message authentication scheme for smart grid communications and claimed that it satisfies the security requirements. We found that Mahmood et al.'s scheme has some security vulnerabilities and lacks adequate security features for utilization in the smart grid. To address these drawbacks, we propose an efficient and secure lightweight privacy-preserving authentication scheme for the smart grid. The security of our scheme is evaluated, and formal security analysis and verification are carried out via the broadly accepted BAN logic and the AVISPA tool. Finally, security and efficiency comparisons are provided, which indicate the security and efficiency of the proposed scheme as compared to other existing related schemes.
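As a hedged illustration of the general pattern behind lightweight message authentication in smart grid communication (a generic HMAC sketch, not the scheme proposed in the paper): a shared session key authenticates each meter reading with a tag that the receiver verifies in constant time.

```python
import hmac
import hashlib

def make_tag(key: bytes, reading: bytes) -> bytes:
    """Authentication tag over one meter reading."""
    return hmac.new(key, reading, hashlib.sha256).digest()

def verify(key: bytes, reading: bytes, tag: bytes) -> bool:
    """Constant-time tag check on the receiving side."""
    return hmac.compare_digest(make_tag(key, reading), tag)

key = b"shared-session-key"                    # assumed established beforehand
reading = b"meter=42;kwh=3.14;t=1700000000"    # illustrative payload format
tag = make_tag(key, reading)
```

Any tampering with the reading invalidates the tag, which is the minimal property such schemes build on before adding privacy preservation.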

https://www.isecure-journal.com/article_91969.html
In today's highly interconnected networks, the security of entities is often interdependent. This means that the security decisions of agents are influenced not only by their own costs and constraints but also by their neighbors' decisions. Game theory provides a rich set of tools for analyzing such influence networks. In the game model, players try to maximize their utilities through security investments, considering the network structure, costs, and constraints that have been set by the network owner. However, the decisions of selfish entities to maximize their utilities do not always lead to a socially optimal solution. Therefore, motivating players to reach the social optimum is of high value from the network owner's point of view. The network owner wants to maximize overall network security by designing the game's parameters. As far as we know, there is no notable work in the context of linear influence networks that introduces an appropriate game design for this purpose. This paper presents design methods that make use of adjustments to players' costs, interdependencies, and constraints to align players' incentives with a network-wide global objective. We present a comprehensive investigation of the existence and uniqueness conditions of the Nash Equilibrium in such environments. Furthermore, numerical results of applying the proposed mechanisms to a sample real-world example are illustrated.
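A minimal sketch of how such a game can be analyzed numerically: best-response dynamics on a two-player toy game with quadratic utilities and a hypothetical influence matrix W. The parameters are illustrative assumptions, not the paper's model.

```python
def best_response_dynamics(a, W, iters=200):
    """Iterate x_i <- a_i + sum_j W[i][j]*x_j, the best response for the
    toy utility u_i = x_i*(a_i + sum_j W[i][j]*x_j) - x_i**2 / 2.
    Converges to the Nash equilibrium when the spectral radius of W is < 1."""
    n = len(a)
    x = [0.0] * n
    for _ in range(iters):
        x = [a[i] + sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]
    return x

# Two players whose security investments are strategic substitutes.
a = [1.0, 1.0]
W = [[0.0, -0.5], [-0.5, 0.0]]
x_star = best_response_dynamics(a, W)
```

At the fixed point x = a + Wx, so here each player invests 2/3; a network owner's design problem is choosing a and W so that this equilibrium matches the social optimum.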

https://www.isecure-journal.com/article_92238.html
\emph{Smooth Projective Hash Functions} (SPHFs), as a specific form of zero-knowledge proof systems, are fundamental tools for building many efficient cryptographic schemes and protocols. As an application of SPHFs, the \emph{Password-Based Authenticated Key Exchange} (PAKE) protocol has been a well-studied area in the last few years. In 2009, Katz and Vaikuntanathan described the first lattice-based PAKE using the Learning With Errors (LWE) problem. In this work, we present a new efficient \emph{ring-based} smooth projective hash function ``(Ring-SPHF)'' using Lyubashevsky, Peikert, and Regev's dual-style cryptosystem based on the Learning With Errors over Rings (Ring-LWE) problem. Then, using our Ring-SPHF, we propose the first efficient password-based authenticated key exchange ``(Ring-PAKE)'' protocol over \emph{rings} whose security relies on ideal lattice assumptions.

https://www.isecure-journal.com/article_80508.html
In the biclique attack, a shorter biclique usually results in less data complexity, but at the expense of more computational complexity. The early abort technique can be used in the partial matching part of the biclique attack in order to slightly reduce the computations. In this paper, we make use of this technique, but instead of a slight improvement in computational complexity, we keep that complexity the same and enormously reduce the data complexity by means of a shorter biclique.
With this approach, we analyzed the full-round LBlock, and also LBlock with a modified key schedule (which was designed to resist the biclique attack), both with data complexity 2^12, while the data complexity of the best biclique attack on the former was 2^52, and for the latter no attack on the full-round cipher exists so far. We then propose a new key schedule that is more resistant against biclique cryptanalysis, though the low diffusion of the cipher makes it vulnerable to this attack regardless of the strength of the key schedule. Using this method, we also analyzed TWINE-80 with 2^12 data complexity. The lowest data complexity of prior attacks on TWINE-80 was 2^60. In all the attacks presented in this paper, the computational complexities are slightly improved in comparison to the existing attacks.

https://www.isecure-journal.com/article_79989.html
Nowadays there are different kinds of attacks on Field Programmable Gate Arrays (FPGAs). As FPGAs are used in many different applications, their security becomes an important concern, especially in Internet of Things (IoT) applications. Hardware Trojan Horse (HTH) insertion is one of the major security threats, and it can be implemented in the unused space of an FPGA. This unused space is unavoidable in order to meet the place and route requirements. In this paper, we introduce an efficient method to fill this space and thus leave no free space for inserting HTHs. Using a shift register in combination with a gate-chain is the best way of filling unused space, and it incurs no increase in the power consumption of the main design. Experimental results of implementing a set of IWLS benchmarks on Xilinx Virtex devices show that the proposed prevention and detection scheme imposes no power overhead and no degradation to the performance or critical path delay of the main design.

https://www.isecure-journal.com/article_82510.html
The linear diffusion layer is an important part of lightweight block ciphers and hash functions. This paper presents an efficient class of lightweight 4x4 MDS matrices such that the implementation cost of each matrix and its inverse are equal. The main target of the paper is hardware-oriented cryptographic primitives, and the implementation cost is measured in terms of the required number of XORs. Firstly, we mathematically characterize the MDS property of a class of matrices (derived from the product of binary matrices and companion matrices of $\sigma$-LFSRs, aka recursive diffusion layers) whose implementation cost is $10m+4$ XORs for 4 <= m <= 8, where $m$ is the bit length of the inputs. Then, based on this mathematical investigation, we further extend the search space and propose new families of 4x4 MDS matrices with implementation costs of 8m+4 and 8m+3 XORs. The lightest MDS matrices obtained by our new approach have the same implementation cost as the lightest existing matrices.
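The XOR-count metric itself is easy to illustrate. Below is a hedged toy sketch of the naive direct-XOR cost of a binary matrix; note the paper's $10m+4$, $8m+4$ and $8m+3$ figures concern 4x4 matrices over $m$-bit words and smarter implementations, which this naive per-bit metric does not capture.

```python
def naive_xor_count(matrix):
    """Naive XOR cost of computing y = M*x over GF(2): each output bit
    with w input taps costs w - 1 XOR gates."""
    return sum(sum(row) - 1 for row in matrix if any(row))

# Toy 4x4 binary matrices (illustrative, not the paper's MDS matrices).
identity = [[int(i == j) for j in range(4)] for i in range(4)]
dense = [[1, 1, 0, 1],
         [0, 1, 1, 0],
         [1, 0, 1, 1],
         [1, 1, 1, 1]]
```

The identity costs 0 XORs (pure wiring), while the dense example costs 2 + 1 + 2 + 3 = 8.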

https://www.isecure-journal.com/article_79447.html
While cloud computing is growing at a remarkable speed, privacy issues are far from being solved. One way to diminish privacy concerns is to store data on the cloud in encrypted form. However, encryption often hinders useful computation in cloud services. A theoretical approach is to employ so-called fully homomorphic encryption, yet the overhead is so high that it is not considered a viable solution for practical purposes. The next best thing is to craft special-purpose cryptosystems that support the set of operations required by cloud services. In this paper, we put forward one such cryptosystem, which supports efficient search over structured data types, such as timestamps, which are comprised of several segments with well-known values. The new cryptosystem, called SESOS, provides the ability to execute LIKE queries, along with search for exact matches as well as comparison. In addition, the extended version, called XSESOS, allows for verifying the integrity of ciphertexts. The overhead of executing equality and comparison operations is negligible. Compared to existing solutions, the performance of LIKE queries is improved by up to 1370X and the performance of result decryption by 520X on a database with merely 100K records.
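A hedged sketch of the general idea such segment-wise schemes build on (not the actual SESOS construction): encrypt a structured value segment by segment with a deterministic keyed PRF, so equality of one segment, e.g. the year of a timestamp, can be tested directly on ciphertexts, enabling LIKE-style prefix queries server-side.

```python
import hmac
import hashlib

KEY = b"demo-key"  # assumed secret key, for illustration only

def enc_segment(seg: str) -> str:
    """Deterministic token for one segment (truncated keyed hash)."""
    return hmac.new(KEY, seg.encode(), hashlib.sha256).hexdigest()[:16]

def enc_timestamp(ts: str):
    """'2024-05-17' -> tuple of per-segment tokens (year, month, day)."""
    return tuple(enc_segment(s) for s in ts.split("-"))

db = [enc_timestamp(t) for t in ["2024-05-17", "2024-06-01", "2023-06-01"]]

# Server-side "LIKE '2024-%'" query: compare only the first token.
q = enc_segment("2024")
matches = [row for row in db if row[0] == q]
```

Deterministic per-segment tokens leak segment equality by design; that is the trade-off such searchable schemes manage.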

https://www.isecure-journal.com/article_82692.html
Correctness verification of query results is a significant challenge in database outsourcing. Most of the proposed approaches impose high overhead, which makes them impractical in real scenarios. Probabilistic approaches have been proposed in order to reduce the computation overhead pertaining to the verification process. In this paper, we use the notion of trust as the basis of our probabilistic approach to efficiently verify the correctness of query results. The trust is computed by observing the history of interactions between clients and the service provider. Our approach exploits the Merkle Hash Tree as an authentication data structure. The amount of trust towards the service provider determines how large a portion of the tree needs to be investigated. Implementation results of our approach show that considering the trust derived from the history of interactions provides a trade-off between performance and security, and reduces the imposed overhead for both clients and the service provider in the database outsourcing scenario.
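Since the approach verifies query results by traversing a Merkle Hash Tree, here is a hedged, minimal sketch of building such a tree and checking a single-leaf proof with Python's standard hashlib, over a toy four-row "table". The paper's partial, trust-based traversal is not modeled.

```python
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves):
    """Root hash; an odd level duplicates its last node."""
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, idx):
    """Sibling hashes (with left/right flags) proving leaves[idx] is under the root."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = idx ^ 1
        proof.append((level[sib], sib < idx))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        idx //= 2
    return proof

def verify_leaf(leaf, proof, root):
    acc = h(leaf)
    for sib, sib_is_left in proof:
        acc = h(sib + acc) if sib_is_left else h(acc + sib)
    return acc == root

leaves = [b"row1", b"row2", b"row3", b"row4"]
root = merkle_root(leaves)
proof = merkle_proof(leaves, 2)
```

A client holding only the root can check any returned row with a logarithmic number of hashes, which is what makes partial verification against a trust level practical.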

https://www.isecure-journal.com/article_80601.html
In a world full of ideas turning into various kinds of products, those products need to be protected, and here comes the importance of intellectual property rights. Intellectual property has many types; however, our interest is in trademarks. The Madrid system is used by the group of countries that have joined the Madrid Agreement, allowing them and the parties that have agreements with them to register trademarks. The problem with it is that it is a text-based system; because of that, we propose a reverse image search engine, since reverse image search performs better than a text-based system.

https://www.isecure-journal.com/article_131553.html
A Chatbot is a smart software that responds to natural language input and attempts to hold a conversation in a way that simulates humans. Chatbots have the potential to save any individual's time, hassle, and tedium by automating mundane tasks. The idea of this research is to investigate how to help the user efficiently interact with a robot receptionist through an intelligent assistant dialogue. Chatbots are an effective way to improve services, with their 24/7 uptime, cost efficiency, and multiuser quality. Moreover, Chatbots reduce human errors and give more accurate answers. Successful implementation of a Chatbot requires a correct analysis of the user's query by the bot and ensures that the correct response is given to the user. This research develops a Chatbot for airports, which provides visitors to the SWE Chatbot with relevant information about the department.
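The query-analysis step a Chatbot needs can be sketched as simple intent matching: map keywords in the user's query to a canned answer. The intents and answers below are illustrative placeholders, not the SWE Chatbot's actual knowledge base.

```python
# Keyword-based intent matching: the first intent whose keyword set
# overlaps the query's words wins; otherwise fall back to a default.
INTENTS = {
    ("hours", "open", "time"): "The department is open 8am-4pm.",
    ("location", "where", "office"): "The department office is in Building A.",
}
DEFAULT = "Sorry, could you rephrase that?"

def respond(query):
    words = set(query.lower().split())
    for keywords, answer in INTENTS.items():
        if words & set(keywords):
            return answer
    return DEFAULT
```

Production bots replace this lookup with trained intent classifiers, but the analyze-then-respond structure is the same.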

https://www.isecure-journal.com/article_131483.html
Due to the increasing number of cars and the difficulty of finding vacant parking spots easily, smart parking systems are essential to save drivers' time and effort and to protect the environment from emissions and air pollution. Wireless Sensor Networks used in smart parking systems consist of a number of sensors to monitor events or changes and send the data, a cluster head to manage the linked sensors, and base stations to process and forward the data to the end system. All of these devices are used together to monitor a specific area. This paper analyzes the performance of IEEE802.11ac and compares it with IEEE802.15.4 and IEEE802.11b using three different scenarios, measuring the average end-to-end delay and throughput with respect to the number of sensors (placed manually and automatically). This is done using the ThingSpeak cloud (an open IoT platform with MATLAB 2019 analytics) for IEEE802.11ac and without a cloud setup for IEEE802.15.4 and IEEE802.11b. Three scenarios are considered in this work. First, the sensors are distributed manually in all the standards. Second, the sensors are distributed automatically in IEEE802.11ac and manually in IEEE802.15.4 and IEEE802.11b. Third, the sensors are distributed automatically in IEEE802.11ac along with the cloud, while the sensors are placed manually with grid placement without the cloud in IEEE802.15.4 and IEEE802.11b. Finally, the results show that IEEE802.11ac gives better results than the other standards and is suitable for applications with very high throughput.
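The two compared metrics are straightforward to compute from simulation traces. A hedged sketch with made-up send/receive timestamps (in seconds) and bit counts, not actual simulator output:

```python
def avg_end_to_end_delay(send_times, recv_times):
    """Mean of per-packet (receive - send) times."""
    return sum(r - s for s, r in zip(send_times, recv_times)) / len(send_times)

def throughput_bps(total_bits, duration_s):
    """Received bits per second over the measurement window."""
    return total_bits / duration_s

delay = avg_end_to_end_delay([0.0, 1.0, 2.0], [0.2, 1.3, 2.3])
rate = throughput_bps(total_bits=8_000_000, duration_s=4.0)
```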

https://www.isecure-journal.com/article_131125.html
Today, in the era of telecommunications, social media, the Internet of Things (IoT), and the virtual world, enormous amounts of data are being generated, from which knowledge is extracted. Knowledge discovery from data in the cloud-computing environment entails the extraction of new and necessary information from large and complex datasets. This study is qualitative and exploratory in nature. For this review, articles published in the last five years (2014-2018) were searched. Different databases were searched using the keywords "knowledge management" or "knowledge discovery" and "cloud computing". The literature review section is divided into three subsections based on the findings. The first two subsections present the data security and data privacy concerns under the two main techniques (big data analytics and machine learning) used in knowledge discovery, and the last subsection presents various protocols proposed to address the related security and privacy concerns.

https://www.isecure-journal.com/article_131072.html
The application of New Information and Communication Technologies (NICT) in the field of training has led to the creation of a new reality called distance learning. Described as the marriage of multimedia (sound, image, text) and the Internet (online distribution, interactivity), distance learning has no doubt made it possible to revive pedagogy in a new digital form with little or no physical presence.
Our purpose is to verify the impact of open distance learning on the development of socio-professional skills among future administrators of the Ministry of National Education in initial training. In addition, the instrumentation of these training devices also provides a framework for evaluating, monitoring and controlling the training process, using the resources of computers and the Internet.
Our research takes place at the TAZA Regional Center of Trades Education and Training (CRMEF) during the academic year 2016-2017. The use of technological tools by trainers or trainees is quite common in courses, parallel activities, self-training, trainer-trainee communication and trainee-counsellor communication, but all these forms suffer from the absence of a general frame of reference and regulation to guide the training actions carried out via these tools.
Indeed, the recommended methodology is based on the engineering of training devices and skills in a virtual environment, i.e., the "technical instrumentation" and "educational scripting" of objects and training content.
During both its creation and its implementation, our platform experienced several difficulties, including technical ones. Yet today, seeing the results of the questionnaires and following the feedback of trainee administrators, we can judge that our goal has been achieved.

https://www.isecure-journal.com/article_129217.html
The data warehouse size and the query complexity may cause unacceptable delays in decision support queries. A basic condition for the success of a data warehouse is the capability to supply decision makers with both precise information and the best response time. For this purpose, the concept of indexed views is used. Indexed views help to speed up query processing and reduce the response time for tracing queries, especially queries about past histories.
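A hedged sketch of the indexed (materialized) view idea: precompute an aggregate once, so later decision-support queries read the stored summary instead of scanning the fact table. The toy data below is illustrative only.

```python
# Toy fact table: (year, sale amount).
sales = [("2023", 100), ("2023", 250), ("2024", 300), ("2024", 50)]

# "Materialize" the view: total sales per year, computed once up front.
sales_by_year = {}
for year, amount in sales:
    sales_by_year[year] = sales_by_year.get(year, 0) + amount

# A later decision-support query becomes a dictionary lookup, not a scan.
total_2024 = sales_by_year["2024"]
```

A real indexed view additionally keeps this summary consistent under inserts and is maintained by the database engine itself.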

https://www.isecure-journal.com/article_129022.html
Microcalcifications (MCs) are considered to be the simplest signs present in mammograms for diagnosing breast cancer. Therefore, true detection of MCs is needed to enable timely diagnosis and efficient care and to minimize the death rate due to breast cancer. Evaluating and interpreting mammograms is a challenging task; moreover, due to the poor contrast of MCs relative to the rest of the tissue, as well as the small size and random shape of MC clusters, their precise identification faces several obstacles. These restrictions in the manual analysis of MCs increase the demand for an automated recognition system to help radiologists in mammogram analysis, and it is important to design a robust algorithm for this purpose. The goal of this paper is to present an efficient procedure that can be used to enhance images and extract features that give excellent classification. The classifier senses whether the region is normal, benign or malignant. A KNN classifier with fuzzy histogram equalization and Otsu's multi-threshold segmentation gives excellent detection and recognition results on mammogram images obtained from a hospital.
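One building block named above, Otsu's thresholding, picks the gray level that maximizes between-class variance. Below is a hedged single-threshold sketch over a toy 8-level histogram; the paper uses a multi-threshold variant, which this does not reproduce.

```python
def otsu_threshold(hist):
    """Index t maximizing between-class variance when pixels with level <= t
    form one class and the rest form the other."""
    total = sum(hist)
    sum_all = sum(i * c for i, c in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(len(hist)):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Bimodal toy histogram: dark pixels around level 1, bright around level 6.
hist = [10, 30, 10, 0, 0, 10, 30, 10]
t = otsu_threshold(hist)
```

The threshold lands in the valley between the two modes, splitting dark tissue from the brighter structures.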

https://www.isecure-journal.com/article_129023.html
Entrepreneurship involves an immense network of activities linked via collaborations and information propagation. Information dissemination is extremely important for entrepreneurs. Finding influential users with high levels of interaction and connectivity in social media and involving them in information spread helps disseminate information quickly, thus facilitating key entrepreneurial actors in finding and collaborating with each other. Identifying and ranking top entrepreneurial influencers is still in its infancy. This paper proposes E-Rank, a framework for topic-specific influence ranking specialized for Twitter. Firstly, it extracts four dimensions to characterize influencers: user popularity, activity, reliability, and tweet quality. Afterwards, it uses a linear combination of these dimensions to assign an influence score to each user. Experimental results on a real-life dataset containing 233,018 Arabic tweets show that E-Rank successfully ranks 8 out of 10 entrepreneurial influencers. Unlike other existing approaches, E-Rank does not require any labelled data and has a lower computational cost. To ensure the effectiveness and efficiency of E-Rank, three validation techniques were used: (1) comparing the detected influencers with the real-world influencers, (2) investigating the spread of information of the detected influencers, and (3) comparing the quality of E-Rank results with other ranking methods.
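The scoring step described above is a weighted linear combination of four dimensions. A hedged sketch: the weights and the toy user data below are illustrative assumptions, not E-Rank's actual parameters or dataset.

```python
# Hypothetical weights over the four dimensions named in the abstract.
WEIGHTS = {"popularity": 0.4, "activity": 0.2, "reliability": 0.2, "tweet_quality": 0.2}

def influence_score(user):
    """Linear combination of the four normalized dimension values."""
    return sum(WEIGHTS[d] * user[d] for d in WEIGHTS)

users = {
    "alice": {"popularity": 0.9, "activity": 0.8, "reliability": 0.9, "tweet_quality": 0.7},
    "bob":   {"popularity": 0.2, "activity": 0.9, "reliability": 0.5, "tweet_quality": 0.4},
}
ranking = sorted(users, key=lambda u: influence_score(users[u]), reverse=True)
```

Because scoring is a single pass over per-user features, no labelled data or expensive graph computation is needed, which matches the low-cost claim.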

https://www.isecure-journal.com/article_128892.html
This paper reviews the characteristics of the main digest algorithms and presents a new derivation of the leftover hash lemma, using the collision probability to derive an upper bound on the statistical distance between the key-and-seed joint distribution and the hash bit-sequence distribution.
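For context, one standard form of the collision-probability route: for any random variable $Z$ over a set of size $N$ with collision probability $\mathrm{Col}(Z)=\sum_z \Pr[Z=z]^2$, Cauchy-Schwarz bounds the distance to uniform (a generic statement, not necessarily the paper's exact formulation):

```latex
\Delta(Z, U_N) \;=\; \frac{1}{2}\sum_{z}\Big|\Pr[Z=z] - \tfrac{1}{N}\Big|
\;\le\; \frac{1}{2}\sqrt{N \cdot \mathrm{Col}(Z) - 1}
```

Applying this to the joint variable (seed, hash output) of a universal hash family is what yields leftover-hash-lemma-style extraction bounds.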

https://www.isecure-journal.com/article_128715.html
With the revolution in mobile technologies and the growing number of mobile internet users, Mobile Payment was born as a convenient channel of communication between customers and firms or organizations. Nowadays, Mobile Payments are on the way to disrupting traditional Payment methods and contributing to a massive shift to a cashless society. However, some Mobile Payment users may be resistant to changing from conventional Payment methods. Therefore, it is critical to guarantee users' continuance intention (CI) toward Mobile Payments to ensure their widespread uptake. Given this, this research aims to study how the quality of Mobile Payment services impacts users' CI in Saudi Arabia (SA). Methods: The conceptual model was constructed based on the Information System Success Model and Information System post-adoption research to support the framework of the current study. Results are drawn from a self-administered survey of a random sample of 389 respondents who regularly use Mobile Payment services in SA. Quantitative analysis is used to determine the impact of Mobile Payment quality on continuance intention in SA. Results: The current study's outcomes show that all three dimensions of quality (system quality (SYSQ), service quality (SERQ), and information quality (INFQ)) influence user satisfaction (SAT).

https://www.isecure-journal.com/article_128632.html
With present-day technological advancements, the number of devices connected to the Internet has increased dramatically. Cyber-security attacks are increasingly becoming a threat to individuals and organizations. Contemporary security frameworks incorporate network intrusion detection systems (NIDS). These systems are an essential component for ensuring the security of computer networks against attacks. In this paper, two deep learning architectures are proposed for both binary and multi-class classification of network attacks. The models, CNN-IDS and LSTM-IDS, are based on convolutional neural network and long short-term memory architectures, respectively. The models are evaluated using the well-known NSL-KDD dataset. The performance is measured in terms of accuracy, precision, recall, and F-measure. Experimental results show that the models achieve good performance in terms of accuracy and recall.
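The four reported metrics all derive from the binary confusion matrix. A minimal sketch with made-up counts, not the paper's NSL-KDD results:

```python
def metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall and F-measure from binary confusion counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f_measure

# Hypothetical counts for one attack class.
acc, prec, rec, f1 = metrics(tp=90, fp=10, fn=5, tn=95)
```

In the multi-class setting these are computed per class and then averaged.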

https://www.isecure-journal.com/article_128631.html
Motif discovery is a challenging problem in bioinformatics. It is an essential step towards understanding gene regulation. Although numerous algorithms and tools have been proposed in the literature, the accuracy of motif finding is still low. In this paper, we tackle the motif discovery problem using ensemble methods. A review and classification of current ensemble motif discovery tools is presented. We then propose our cluster-based ensemble motif discovery tool (CEMD), which is based on k-medoids clustering of state-of-the-art stand-alone motif finding tools. We evaluate the performance of CEMD on benchmark datasets and compare the results to both stand-alone and similar ensemble tools. Experimental results indicate that CEMD has better sensitivity than state-of-the-art stand-alone tools when dealing with human datasets. CEMD also achieves higher sensitivity when motifs are implanted in real promoter sequences. As for the comparison of CEMD with ensemble motif discovery tools, results indicate that CEMD achieves better results than MEME-ChIP on all evaluation measures. CEMD shows comparable performance to RSAT peak-motifs and MODSIDE.

https://www.isecure-journal.com/article_128630.html
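The k-medoids step that CEMD builds on can be sketched in a few lines. This is a generic toy implementation over 1-D points with a deterministic seed, purely to illustrate the alternating assign/update scheme; it is not the paper's code, and real usage would cluster candidate motifs under a motif-similarity distance:

```python
# Toy k-medoids: alternate between assigning items to the nearest medoid
# and re-picking each cluster's medoid as the member with minimal total
# in-cluster distance (illustrative data, not the paper's).
def k_medoids(items, dist, k, iters=10):
    medoids = list(items[:k])  # deterministic seed for this sketch
    clusters = {}
    for _ in range(iters):
        # assignment step: each item goes to its nearest medoid
        clusters = {m: [] for m in medoids}
        for x in items:
            clusters[min(medoids, key=lambda m: dist(x, m))].append(x)
        # update step: best representative of each cluster becomes medoid
        new_medoids = [
            min(members, key=lambda c: sum(dist(c, x) for x in members))
            for members in clusters.values()
        ]
        if set(new_medoids) == set(medoids):
            break  # converged
        medoids = new_medoids
    return medoids, clusters

points = [1, 2, 3, 10, 11, 12]
medoids, clusters = k_medoids(points, lambda a, b: abs(a - b), k=2)
# → medoids [2, 11], clusters {2: [1, 2, 3], 11: [10, 11, 12]}
```

Unlike k-means, the medoid is always a real member of the cluster, which is what makes the scheme applicable to motifs, where an "average" object is not well defined.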
An ad hoc network has no supporting infrastructure, so its nodes are vulnerable to many attacks. Security attacks in ad hoc networks are increasing significantly over time. The communicated and exchanged data should also be secured and kept confidential. Therefore, a hybrid cryptography scheme is proposed to prevent unauthorized access to data. Data is transmitted in an encrypted state, using keys established through Diffie-Hellman, and later decrypted by the intended party. If a third party intercepts the encrypted data, it will be difficult to decipher. The ad hoc on-demand distance vector (AODV) routing protocol is employed to determine the route to the destination. The proposed solution is a hybrid mechanism of encryption algorithms. The NS-2.3 simulator was used to evaluate the performance of the proposed security algorithm. Simulation results show that the proposed algorithm outperformed many existing security algorithms on several ad hoc network metrics.

https://www.isecure-journal.com/article_128629.html
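The key-establishment step named in the abstract can be sketched as follows. This is a generic Diffie-Hellman exchange, not the paper's implementation: the group parameters are the standardized 2048-bit MODP group from RFC 3526, and the SHA-256 key-derivation step is our assumption for turning the shared secret into a symmetric key:

```python
# Minimal Diffie-Hellman key agreement between two ad hoc nodes.
# Parameters: 2048-bit MODP group 14 from RFC 3526 (p is that group's
# prime; generator g = 2). Illustrative sketch only.
import hashlib
import secrets

P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E08"
    "8A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B"
    "302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9"
    "A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE6"
    "49286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8"
    "FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C"
    "180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF695581718"
    "3995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFF"
    "FFFFFFFF", 16)
G = 2

def dh_keypair():
    """Generate a private exponent and the corresponding public value."""
    priv = secrets.randbelow(P - 3) + 2
    return priv, pow(G, priv, P)

def dh_shared_key(priv, other_pub):
    """Derive a symmetric key from the shared DH secret (SHA-256 KDF)."""
    secret = pow(other_pub, priv, P)
    return hashlib.sha256(
        secret.to_bytes((P.bit_length() + 7) // 8, "big")).digest()

# Nodes A and B exchange only their public values over the network;
# each then derives the same 32-byte symmetric key locally.
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
key_a = dh_shared_key(a_priv, b_pub)
key_b = dh_shared_key(b_priv, a_pub)
assert key_a == key_b
```

The symmetric key derived here would then drive the bulk encryption half of the hybrid scheme; note that plain DH as sketched is unauthenticated and would need node authentication to resist man-in-the-middle attacks.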
