Recent Advances in Computer Science and Communications - Volume 13, Issue 3, 2020
-
A Study of the Cloud Computing Adoption Issues and Challenges
Authors: Dhanapal Angamuthu and Nithyanandam Pandian
Background: Cloud computing is the modern trend in high-performance computing. It has become very popular due to characteristics such as anywhere availability, elasticity, ease of use, and cost-effectiveness. Although the cloud grants various benefits, it has associated issues and challenges that prevent organizations from adopting it. Objective: The objective of this paper is to cover several perspectives of cloud computing. This includes a basic definition of the cloud and a classification of the cloud based on the delivery and deployment models. The broad classes of issues and challenges faced by organizations in adopting the cloud computing model are explored, for example data-related issues and service-availability-related issues. The detailed sub-classifications of each of the issues and challenges are discussed; for example, data-related issues are further classified into data security, data integrity, data location and multitenancy issues. This paper also covers the typical problem of vendor lock-in. The article analyzes and describes the various possible unique insider attacks in the cloud environment. Results: Guidelines and recommendations for the different issues and challenges are discussed. Most importantly, potential research areas in the cloud domain are explored. Conclusion: This paper discusses cloud computing, its classifications and the several issues and challenges faced in adopting the cloud. Guidelines and recommendations for the issues and challenges are covered, and potential research areas in the cloud domain are captured. This helps researchers, academicians and industries to focus on and address the current challenges faced by customers.
-
Malicious Route Detection in Vehicular Ad-hoc Network using Geographic Routing with Masked Data
Background: A Vehicular Ad-hoc Network (VANET) is a subset of the Mobile Ad-hoc Network, the Intelligent Transport System and the Internet of Things. The acting nodes in a VANET are the vehicles on the road at any moment. Objective: The anonymity of these vehicles opens the opportunity for malicious attacks. Malicious routes increase data retransmission and hence degrade routing performance. The main objective of this work is to identify malicious routes, avoid data transmission over them and increase the packet delivery ratio. Methods: In the proposed system, called Geographic Routing Protocol with Masked data, two binary codes called mask and share are generated to identify malicious routes. The original data is encoded using these binary codes and routed to the destination using the geographic routing protocol. It is reconstructed at the destination node, and based on the encoding technique the malicious routes and malicious nodes are identified. Simulations were conducted with varying speed and varying network size in a 20 km2 geographical area. Results: The average packet delivery ratio with varying speed is 0.817 and with varying network size is 0.733. Conclusion: The proposed geographic routing protocol with masked data outperforms the traditional geographic protocol and the Detection of Malicious Node protocol by 0.102 and 0.264 respectively with different speeds, and by 0.065 and 0.1616 respectively with different network sizes.
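The abstract does not reproduce the exact encoding, so the following is a minimal sketch assuming an XOR-style mask/share scheme: the source generates a random mask, encodes the payload into a share, and the destination reconstructs the payload; a failed verification flags the route as suspect. All names and the digest check are illustrative, not the paper's method.

```python
import hashlib
import os

def make_mask(length: int) -> bytes:
    """Generate a random binary mask of the given length (illustrative only)."""
    return os.urandom(length)

def encode(data: bytes, mask: bytes) -> bytes:
    """Encode the payload into a 'share' by XOR-ing it with the mask."""
    return bytes(d ^ m for d, m in zip(data, mask))

def reconstruct(share: bytes, mask: bytes) -> bytes:
    """Reconstruct the original payload at the destination (XOR is its own inverse)."""
    return bytes(s ^ m for s, m in zip(share, mask))

def route_flagged(received_share: bytes, mask: bytes, expected_digest: bytes) -> bool:
    """Flag the route as malicious if the reconstructed payload fails verification."""
    return hashlib.sha256(reconstruct(received_share, mask)).digest() != expected_digest

payload = b"position update"
mask = make_mask(len(payload))
share = encode(payload, mask)
digest = hashlib.sha256(payload).digest()

tampered = bytes([share[0] ^ 0xFF]) + share[1:]   # simulate modification by a node on the route
print(route_flagged(share, mask, digest))          # False -> route delivered the data intact
print(route_flagged(tampered, mask, digest))       # True  -> route flagged as malicious
```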
-
Cost-Aware Ant Colony Optimization for Resource Allocation in Cloud Infrastructure
Authors: Punit Gupta, Ujjwal Goyal and Vaishali Verma
Background: Cloud computing is a growing industry for secure, low-cost, pay-per-use resources. Efficient resource allocation is a challenging issue in the cloud computing environment. Many task scheduling algorithms are used to improve system performance, including ant colony optimization, genetic algorithms and Round Robin; these improve performance but are not cost-efficient at the same time. Objective: Earlier task scheduling algorithms do not include network cost, but the proposed ACO takes network overhead (cost) into consideration, which improves the efficiency of the algorithm compared to previous algorithms. The proposed algorithm aims to improve cost and execution time and to reduce network cost. Methods: The proposed task scheduling algorithm for the cloud uses ACO with network cost and execution cost as the fitness function. This work tries to improve the existing ACO so that it gives better results in terms of performance and execution cost for the cloud architecture. Our study includes a comparison between various other algorithms and the proposed ACO model. Results: Performance is measured using the optimization criteria of task completion time and resource operational cost during execution; network cost and user requests are also used to measure the performance of the proposed model. Conclusion: The simulation shows that the proposed cost- and time-aware technique outperforms existing approaches on the performance measurement parameters (average finish time, resource cost, network cost).
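As a rough illustration of how an ACO scheduler can use execution cost plus network cost as its fitness function, the sketch below builds task-to-VM assignments with pheromone-guided ants. The cost matrices, parameter values and update rule are illustrative, not taken from the paper.

```python
import random

def aco_schedule(exec_cost, net_cost, n_ants=20, n_iters=50, alpha=1.0, beta=2.0, rho=0.1):
    """exec_cost[t][v], net_cost[t][v]: cost of running task t on VM v (toy cost model)."""
    n_tasks, n_vms = len(exec_cost), len(exec_cost[0])
    tau = [[1.0] * n_vms for _ in range(n_tasks)]                  # pheromone trails
    eta = [[1.0 / (exec_cost[t][v] + net_cost[t][v])                # heuristic desirability
            for v in range(n_vms)] for t in range(n_tasks)]
    best, best_fit = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            assign = []
            for t in range(n_tasks):                                # probabilistic VM choice per task
                w = [(tau[t][v] ** alpha) * (eta[t][v] ** beta) for v in range(n_vms)]
                assign.append(random.choices(range(n_vms), weights=w)[0])
            fit = sum(exec_cost[t][v] + net_cost[t][v] for t, v in enumerate(assign))
            if fit < best_fit:
                best, best_fit = assign, fit
        for t in range(n_tasks):                                    # evaporate, then reinforce the best tour
            for v in range(n_vms):
                tau[t][v] *= (1 - rho)
            tau[t][best[t]] += 1.0 / best_fit

    return best, best_fit

exec_cost = [[4, 2, 6], [3, 5, 1], [2, 4, 3]]   # 3 tasks x 3 VMs, toy numbers
net_cost  = [[1, 2, 1], [2, 1, 3], [1, 1, 2]]
print(aco_schedule(exec_cost, net_cost))
```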
-
Machine Learning Based Support System for Students to Select Stream (Subject)
Authors: Kapil Sethi, Varun Jaiswal and Mohammad D. Ansari
Background: In most countries, students have to select a subject/stream in the secondary education phase. The selection is crucial because their further career proceeds according to it, and mostly the subject/stream cannot be changed later. Inappropriate selection of subjects due to parental pressure, lack of information, etc. can lead to limited success in the selected stream. Guidance for subject/stream selection based on information about successful scholars of each stream and information about students, such as interest, family background, previous education and other associated attributes, can enhance success in their careers. Methods: Data mining and machine learning based methods were developed on the above information. Data from different institutions and from students of two different streams were used for training and testing. Different machine learning algorithms were used, and methods with high accuracy (86.72%) were developed. Results: The developed methods can be extended and used for different subject/stream selection.
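To make the general pipeline concrete, a minimal sketch of the kind of classifier such a support system could use is shown below. The feature names, synthetic data and choice of random forest are hypothetical; the paper's actual attributes and the algorithm reaching 86.72% accuracy are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((200, 4))                       # e.g. interest score, prior marks, parental education, aptitude
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)      # 0 = stream A, 1 = stream B (toy labelling rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, model.predict(X_te)))
```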
-
Performance Analysis of DCF- Two Way Handshake vs RTS/CTS During Train-Trackside Communication in CBTC based on WLAN802.11b
Authors: Bhupendra Singh and Rajesh Mishra
Background: Wireless Local Area Network (WLAN) is used primarily in CBTC because of the easy availability of commercial WLAN equipment. In the present scenario, the WLAN Medium Access Control (MAC) protocol is a well-known protocol used to satisfy real-time traffic and delay-sensitive applications. Bidirectional train-trackside communication is the fundamental key of train control in CBTC. Methods: DCF describes two basic techniques for packet transmission: the Two Way Handshake (TWH) mechanism and the Four Way Handshake (FWH) mechanism. The RTS/CTS FWH protocol specified by IEEE 802.11b was introduced to rectify the Hidden Node Problem (HNP) encountered in the TWH protocol, which is why the TWH mechanism of the DCF technique suffers from higher average packet delay when applied to CBTC. A DCF Four Way Handshake (FWH) Request To Send (RTS) / Clear To Send (CTS) delay model is proposed for the Communication Based Train Control (CBTC) system. Results: FWH is applied in CBTC to overcome the packet delay and throughput limitations of the Two Way Handshake (TWH) mechanism of the Distributed Coordination Function (DCF) based technique. An experiment is designed to simulate and compare the performance of the RTS/CTS delay model against the TWH mechanism of DCF. Conclusion: It was found that the average packet delay is slightly higher and the throughput lower for RTS/CTS in comparison to the TWH method. However, comparing the performance of these two medium access mechanisms in CBTC, it was found that for multiple retransmissions with various data rates the RTS/CTS model had better packet delay than TWH.
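The basic per-packet channel occupancy of the two handshakes can be sketched directly from the frame exchange sequences. The timing and frame-size values below are nominal 802.11b figures chosen for illustration (PHY preamble and backoff are ignored); they are not the parameters used in the paper's simulation.

```python
# Nominal 802.11b parameters (microseconds / bits); values are typical, not taken from the paper.
SIFS, DIFS = 10, 50
RATE = 11e6 / 1e6                                           # 11 Mb/s expressed in bits per microsecond
RTS, CTS, ACK, MAC_HDR = 20 * 8, 14 * 8, 14 * 8, 34 * 8     # frame sizes in bits

def t(bits):
    """Transmission time of a frame in microseconds (PHY preamble ignored for brevity)."""
    return bits / RATE

def delay_basic(payload_bits):
    """Two-way handshake: DIFS + DATA + SIFS + ACK."""
    return DIFS + t(MAC_HDR + payload_bits) + SIFS + t(ACK)

def delay_rts_cts(payload_bits):
    """Four-way handshake: DIFS + RTS + SIFS + CTS + SIFS + DATA + SIFS + ACK."""
    return DIFS + t(RTS) + SIFS + t(CTS) + SIFS + t(MAC_HDR + payload_bits) + SIFS + t(ACK)

for size in (256, 1024, 8184):                               # payload sizes in bits
    print(size, round(delay_basic(size), 1), round(delay_rts_cts(size), 1))
```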
-
An Energy Efficient Routing Protocol Based On New Variable Data Packet (VDP) Algorithm for Wireless Sensor Networks
Background: Wireless Sensor Networks (WSNs) refer to a group of sensors used for sensing and monitoring the physical data of the environment and organizing the collected data at a central location. These networks enjoy several benefits because of their lower cost and their smaller and smarter sensors. However, the limited energy source and lifetime of the sensors have emerged as the major setbacks for these networks. Methods: In this work, an energy-aware algorithm has been proposed for the transmission of variable data packets from sensor nodes to the base station according to balanced energy consumption across all the nodes of a WSN. Results: The obtained simulation results verify that the lifetime of the sensor network is significantly enhanced in comparison to other existing clustering-based routing algorithms. Conclusion: The proposed algorithm is comparatively easy to implement and achieves a higher gain in the lifetime of a WSN while keeping the throughput nearly the same as the LEACH protocol.
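The energy accounting behind such schemes is usually the first-order radio model used in LEACH-style analyses; a short sketch is given below. The energy constants are typical textbook values, and the rule mapping residual energy to packet size is an assumption for illustration, not the paper's VDP algorithm.

```python
E_ELEC = 50e-9        # J/bit spent in transmitter/receiver electronics
E_AMP  = 100e-12      # J/bit/m^2 for the free-space amplifier

def tx_energy(bits, d):
    """First-order radio model: E_tx = E_elec*k + eps_amp*k*d^2."""
    return E_ELEC * bits + E_AMP * bits * d ** 2

def rx_energy(bits):
    """Reception costs only the electronics energy."""
    return E_ELEC * bits

def packet_size(residual, initial, k_max=4000, k_min=500):
    """Assumed rule: a node with less residual energy chooses a proportionally smaller packet."""
    frac = max(0.0, min(1.0, residual / initial))
    return int(k_min + frac * (k_max - k_min))

bits = packet_size(residual=0.3, initial=2.0)       # a depleted node picks a small packet
print(bits, tx_energy(bits, d=60), rx_energy(bits))
```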
-
Brain Tumor Detection from MR Images Employing Fuzzy Graph Cut Technique
Authors: Jyotsna Dogra, Shruti Jain, Ashutosh Sharma, Rajiv Kumar and Meenakshi Sood
Background: This research aims at the accurate selection of seed points from the brain MRI image for the detection of the tumor region. Since the conventional way of manual seed selection leads to inappropriate tumor extraction, a fuzzy clustering technique is employed for accurate seed selection before performing the segmentation through the graph cut method. Methods: In the proposed method, a Fuzzy Kernel Seed Selection technique is used to divide the complete brain MRI image into different groups of similar intensity. Among these groups, the most accurate kernels, those showing the highest resemblance to the tumor, are selected empirically. The concept of fuzziness helps in making the selection even at the boundary regions. Results: The proposed fuzzy kernel selection technique is applied to the BraTS dataset. Among the four modalities, the technique is applied to FLAIR images. The dataset consists of Low Grade Glioma (LGG) and High Grade Glioma (HGG) tumor images. The experiment is conducted on more than 40 images and validated by evaluating the following performance metrics: 1. Dice Similarity Coefficient (DSC), 2. Jaccard Index (JI) and 3. Positive Predictive Value (PPV). The mean DSC and PPV values obtained for LGG images are 0.89 and 0.87 respectively, and for HGG images 0.92 and 0.90 respectively. Conclusion: On comparing the proposed fuzzy kernel selection graph cut technique with existing techniques, it is observed that the former provides automatic and accurate tumor detection. It is highly efficient and can provide better performance for HGG and LGG tumor segmentation in clinical application.
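The three validation metrics have standard definitions over binary masks, sketched below on toy masks; only the metric formulas are taken as given, the masks are illustrative.

```python
import numpy as np

def dsc(pred, gt):
    """Dice Similarity Coefficient: 2*|A and B| / (|A| + |B|)."""
    inter = np.logical_and(pred, gt).sum()
    return 2 * inter / (pred.sum() + gt.sum())

def jaccard(pred, gt):
    """Jaccard Index: |A and B| / |A or B|."""
    inter = np.logical_and(pred, gt).sum()
    return inter / np.logical_or(pred, gt).sum()

def ppv(pred, gt):
    """Positive Predictive Value: TP / (TP + FP)."""
    tp = np.logical_and(pred, gt).sum()
    return tp / pred.sum()

pred = np.zeros((10, 10), bool); pred[2:7, 2:7] = True    # toy segmentation mask
gt   = np.zeros((10, 10), bool); gt[3:8, 3:8] = True      # toy ground-truth mask
print(dsc(pred, gt), jaccard(pred, gt), ppv(pred, gt))
```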
-
SEGIN-Minus: A New Approach to Design Reliable and Fault-Tolerant MIN
Authors: Shilpa Gupta and Gobind Lal Pahuja
Background: VLSI technology advancements have resulted in requirements for high computational power, which can be achieved by implementing multiple processors in parallel. These processors have to communicate with their memory modules through Interconnection Networks (IN). Multistage Interconnection Networks (MIN) are used as INs, as they provide efficient computing at low cost. Objective: The objective of the study is to introduce a new reliable MIN, named Shuffle Exchange Gamma Interconnection Network Minus (SEGIN-Minus), which provides reliability and fault tolerance with a smaller number of stages. Methods: A MUX at the input terminal and a DEMUX at the output terminal of SEGIN have been employed, with a reduction of one intermediate stage. Fault tolerance has been introduced in the form of disjoint paths formed between each source-destination node pair; hence the reliability has been improved. Results: Terminal, broadcast and network reliability have been evaluated using Reliability Block Diagrams for each source-destination node pair. The results show higher reliability values for the newly proposed network, and the cost analysis shows that the new SEGIN-Minus is a cheaper network than SEGIN. Conclusion: SEGIN-Minus has better reliability and fault tolerance than the previously proposed SEGIN.
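Terminal reliability from a reliability block diagram with redundant disjoint paths reduces to a series-parallel calculation, sketched below. The per-element reliability, the number of stages and the number of disjoint paths are illustrative placeholders, not the values derived for SEGIN-Minus.

```python
def series(reliabilities):
    """A path works only if every switching element on it works."""
    r = 1.0
    for x in reliabilities:
        r *= x
    return r

def parallel(path_reliabilities):
    """With disjoint (redundant) paths, the node pair fails only if all paths fail."""
    q = 1.0
    for r in path_reliabilities:
        q *= (1.0 - r)
    return 1.0 - q

r_switch = 0.98                                   # illustrative per-element reliability
path = series([r_switch] * 3)                     # one path through three stages
terminal_reliability = parallel([path, path])     # two disjoint paths between a source-destination pair
print(round(path, 4), round(terminal_reliability, 4))
```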
-
ANN-Based Relaying Algorithm for Protection of SVC- Compensated AC Transmission Line and Criticality Analysis of a Digital Relay
Authors: Farhana Fayaz and Gobind L. Pahuja
Background: The Static VAR Compensator (SVC) has the capability of improving the reliability, operation and control of the transmission system, thereby improving the dynamic performance of the power system. The SVC is a widely used shunt FACTS device and an important tool for reactive power compensation in high-voltage AC transmission systems. Transmission lines compensated with an SVC may experience faults and hence need a protection system against the damage caused by these faults while providing an uninterrupted supply of power. Methods: The research work reported in the paper is a successful attempt to reduce the time to detect faults on an SVC-compensated transmission line to less than a quarter of a cycle. The relay algorithm involves two ANNs, one for detection and the other for classification of faults, including identification of the faulted phase or phases. RMS (Root Mean Square) values of line voltages and ratios of sequence components of line currents are used as inputs to the ANNs. Extensive training and testing of the two ANNs have been carried out using data generated by simulating an SVC-compensated transmission line in PSCAD at a signal sampling frequency of 1 kHz, with the back-propagation method used for training. The criticality analysis of the existing relay and the modified relay has also been done using three fault-tree importance measures, i.e., Fussell-Vesely (FV) Importance, Risk Achievement Worth (RAW) and Risk Reduction Worth (RRW). Results: It is found that the relay detects any type of fault occurring anywhere on the line with 100% accuracy within a short time of 4 ms. It also classifies the type of fault and indicates the faulted phase or phases, as the case may be, with 100% accuracy within 15 ms, well before a circuit breaker can clear the fault. As demonstrated, fault detection and classification by ANNs is reliable and accurate when a large data set is available for training. The results from the criticality analysis show that the criticality ranking varies between the two designs (the existing relay and the modified relay), and the ranking of the improved measurement system in the modified relay changes from 2 to 4. Conclusion: A relaying algorithm is proposed for the protection of a transmission line compensated with a Static Var Compensator (SVC), and the criticality ranking of different failure modes of a digital relay is carried out. The proposed scheme has significant advantages over more traditional relaying algorithms: it is suitable for high-resistance faults and is affected neither by the inception angle nor by the location of the fault.
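The three importance measures used for the criticality ranking have standard fault-tree definitions, shown below in terms of the baseline system unavailability and the conditional unavailabilities with a component assumed failed or perfect. The numeric values are illustrative, not the relay's actual figures.

```python
def fussell_vesely(q_base, q_without_i):
    """FV = (Q - Q(i perfect)) / Q : fraction of system risk involving component i."""
    return (q_base - q_without_i) / q_base

def risk_achievement_worth(q_base, q_with_i_failed):
    """RAW = Q(i failed) / Q : factor by which risk grows if component i is down."""
    return q_with_i_failed / q_base

def risk_reduction_worth(q_base, q_without_i):
    """RRW = Q / Q(i perfect) : factor by which risk shrinks if component i never fails."""
    return q_base / q_without_i

q_base    = 1.2e-3   # baseline relay unavailability (illustrative)
q_perfect = 8.0e-4   # unavailability with the measurement system assumed perfect
q_failed  = 6.0e-3   # unavailability with the measurement system assumed failed
print(fussell_vesely(q_base, q_perfect),
      risk_achievement_worth(q_base, q_failed),
      risk_reduction_worth(q_base, q_perfect))
```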
-
An Intelligent Resource Manager Over Terrorism Knowledge Base
Authors: Archana Patel, Abhisek Sharma and Sarika JainThe complex and chaotic crisis created by terrorism demands for situation awareness which is possible with the proposed Indian Terrorism Knowledge Treasure (ITKT). Objective: This work is an effort at creating the largest comprehensive knowledge base of terrorism and related activities, people and agencies involved, and extremist movements; and providing a platform to the society, the government and the military personnel in order to combat the evolving threat of the global menace terrorism. Methods: For representing knowledge of the domain semantically, an ontology has been used in order to better integrate data and information from multiple heterogeneous sources. An Indian Terrorism Knowledge Base is created consisting of information about past terrorist attacks, actions taken at time of those attacks, available resources and more. An Indian Terrorism Resource Manager is conceived comprising of various use cases catering to searching a specified keyword for its description, navigating the complete knowledge base of Indian Terrorism and finding any answers to any type of queries pertaining to terrorism. Results: The managerial implications of this work are two-fold. All the involved parties, i.e., the government officials, military, police, emergency personnel, fire department, NGOs, media, public etc will be better informed in case of emergency and will be able to communicate with each other; hence improving situation awareness and providing decision support.
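To illustrate how an ontology-backed resource manager can answer such queries, the following sketch builds a tiny in-memory RDF graph and runs a SPARQL competency question with rdflib. The namespace, class and property names are hypothetical; the actual ITKT ontology is not reproduced here.

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical namespace and vocabulary for illustration only.
ITK = Namespace("http://example.org/itkt#")
g = Graph()
g.add((ITK.attack1, RDF.type, ITK.TerroristAttack))
g.add((ITK.attack1, ITK.location, Literal("Mumbai")))
g.add((ITK.attack1, ITK.year, Literal(2008)))
g.add((ITK.attack1, ITK.respondedBy, ITK.NSG))

# Competency question: which agencies responded to attacks at a given location?
q = """
SELECT ?agency WHERE {
  ?a a itk:TerroristAttack ;
     itk:location "Mumbai" ;
     itk:respondedBy ?agency .
}
"""
for row in g.query(q, initNs={"itk": ITK}):
    print(row.agency)
```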
-
Dimensionality Reduction Technique in Decision Making Using Pythagorean Fuzzy Soft Matrices
Authors: Rakesh K. Bajaj and Abhishek Guleria
Background: Dimensionality reduction plays an effective role in downsizing data having irregular factors and in acquiring an arrangement of the important factors in the information. Often, many of the attributes in the information are correlated and hence redundant. The process of dimensionality reduction has wide applicability in decision making problems where a large number of factors are involved. Objective: To take care of the impreciseness in the decision making factors in terms of Pythagorean fuzzy information given in the form of a soft matrix. In this representation the information carries the parameters degree of membership, degree of indeterminacy (neutral) and degree of non-membership, for a broader coverage of the information. Methods: We first provide a technique for finding a threshold element and value for the information provided in the form of a Pythagorean fuzzy soft matrix. Further, the proposed definitions of the object-oriented Pythagorean fuzzy soft matrix and the parameter-oriented Pythagorean fuzzy soft matrix are utilized to outline an algorithm for dimensionality reduction in the process of decision making. Results: The proposed algorithm has been applied to a decision making problem with the help of a numerical example. A comparative analysis in contrast with existing methodologies has also been presented, with comparative remarks and additional advantages. Conclusion: The example clearly validates the contribution and demonstrates that the proposed algorithm efficiently performs dimensionality reduction. The proposed technique may further be applied to enhance the performance of large-scale image retrieval.
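A Pythagorean fuzzy soft matrix stores a (membership, non-membership) pair per object-parameter cell, with mu^2 + nu^2 <= 1 and indeterminacy pi = sqrt(1 - mu^2 - nu^2). The sketch below represents such a matrix and drops weak parameters against a mean-score threshold; the score function and threshold rule are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

# Objects x parameters; each cell is a (mu, nu) pair with mu^2 + nu^2 <= 1.
M = np.array([[(0.9, 0.3), (0.4, 0.8), (0.7, 0.5)],
              [(0.8, 0.4), (0.5, 0.7), (0.6, 0.6)],
              [(0.7, 0.6), (0.3, 0.9), (0.8, 0.3)]])

mu, nu = M[..., 0], M[..., 1]
pi = np.sqrt(np.clip(1 - mu**2 - nu**2, 0, 1))     # degree of indeterminacy
score = mu**2 - nu**2                              # a common Pythagorean fuzzy score function

threshold = score.mean()                           # assumed threshold value (mean score)
keep = score.mean(axis=0) >= threshold             # parameter-oriented reduction: drop weak columns
print("indeterminacy:\n", np.round(pi, 3))
print("parameters kept:", np.where(keep)[0])
```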
-
Optimization of PV Based Standalone Hybrid Energy System using Cuckoo Search Algorithm
Authors: Vinay A. Tikkiwal, Sajai Vir Singh and Hariom Gupta
Background: Renewable sources of energy have emerged as a promising eco-friendly alternative to conventional, non-renewable sources of energy. However, the highly variable and intermittent nature of renewable sources acts as a big hurdle to their widespread adoption. Hybrid energy systems provide an efficient and reliable solution to this issue, especially for non-grid-connected or stand-alone systems. Objective: The study deals with the design and optimization of a stand-alone hybrid renewable energy system. Methods: Two different configurations consisting of PV/W/B/DG components have been modeled and optimized for lower annualized cost using cuckoo search, a meta-heuristic algorithm. Analysis of these system configurations has been carried out to meet the energy demand at the least annualized cost. Results and Conclusion: Using real-world data for an existing educational organization in India, it is shown that the proposed optimization method meets all the requirements of the system and that the PV/B/DG configuration returns a lower annualized cost as well as lower emissions.
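The core of the method, cuckoo search with Lévy flights over the component-sizing variables, can be sketched as below. The cost function is a stand-in surrogate and the bounds, step scaling and parameter values are assumptions for illustration; the paper's actual annualized cost model of the hybrid system is not reproduced.

```python
import math
import numpy as np

def cuckoo_search(cost, lb, ub, n_nests=15, n_iters=200, pa=0.25, seed=0):
    """Minimise `cost` over box bounds using Lévy-flight steps (Mantegna's algorithm)."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    nests = rng.uniform(lb, ub, (n_nests, dim))
    fit = np.array([cost(x) for x in nests])
    beta = 1.5
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    for _ in range(n_iters):
        best = nests[fit.argmin()]
        u = rng.normal(0, sigma, (n_nests, dim))
        v = rng.normal(0, 1, (n_nests, dim))
        step = u / np.abs(v) ** (1 / beta)
        new = np.clip(nests + 0.01 * step * (nests - best), lb, ub)   # Lévy flight biased toward best nest
        new_fit = np.array([cost(x) for x in new])
        better = new_fit < fit
        nests[better], fit[better] = new[better], new_fit[better]
        abandon = rng.random(n_nests) < pa                            # abandon a fraction of nests
        nests[abandon] = rng.uniform(lb, ub, (abandon.sum(), dim))
        fit[abandon] = np.array([cost(x) for x in nests[abandon]])
    i = fit.argmin()
    return nests[i], fit[i]

# Stand-in annualized-cost surrogate over (PV kW, battery kWh, diesel kW) sizing variables.
cost = lambda x: 120 * x[0] + 90 * x[1] + 200 * x[2] + 5000 / (1 + x[0] + 0.5 * x[1])
print(cuckoo_search(cost, lb=np.array([0.0, 0.0, 0.0]), ub=np.array([100.0, 200.0, 50.0])))
```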
-
Probabilistic and Fuzzy based Efficient Routing Protocol for Mobile Ad Hoc Networks
Authors: Madan M. Agarwal, Hemraj Saini and Mahesh Chandra Govil
Background: The performance of a network protocol depends on a number of parameters, such as re-broadcast probability, mobility, distance between source and destination, hop count, queue length and residual energy. Objective: In this paper, a new energy-efficient routing protocol, IAOMDV-PF, is developed based on fixed-threshold re-broadcast probability determination and best-route selection from multiple routes using fuzzy logic. Methods: In the first phase, the proposed protocol determines a fixed threshold rebroadcast probability, which is used for discovering multiple paths between the source and the destination. The threshold probability at each node decides whether received control packets are rebroadcast to its neighbors, thereby reducing routing overheads and energy consumption. The list of multiple paths obtained in the first phase is supplied to the second phase, in which a fuzzy controller selects the best path. This fuzzy controller has been named the Fuzzy Best Route Selector (FBRS); it determines the best path as a function of queue length, distance between nodes and node mobility. Results: Comparative analysis of the proposed protocol, named "Improved Ad-Hoc On-demand Multiple Path Distance Vector based on Probabilistic and Fuzzy logic" (IAOMDV-PF), shows that it is more efficient in terms of overheads and energy consumption. Conclusion: The proposed protocol reduced energy consumption by about 61%, 58% and 30% with respect to the FF-AOMDV, IAOMDV-F and FPAOMDV routing protocols, respectively. The proposed protocol has been simulated and analyzed using NS-2.
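A tiny hand-rolled version of such a fuzzy route selector is sketched below: membership functions over queue length, distance and mobility are combined by simple rules and defuzzified into a per-route score. The membership shapes, rule set and bounds are illustrative assumptions, not the FBRS design from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def low(x, hi):  return max(0.0, 1.0 - x / hi)
def high(x, hi): return min(1.0, x / hi)

def route_score(queue_len, distance, mobility, q_max=50, d_max=1000, m_max=30):
    """Rules reduced to a weighted score: prefer short queues, short paths, slow nodes."""
    good = min(low(queue_len, q_max), low(distance, d_max), low(mobility, m_max))
    bad  = max(high(queue_len, q_max), high(distance, d_max), high(mobility, m_max))
    medium = tri((queue_len / q_max + distance / d_max + mobility / m_max) / 3, 0.2, 0.5, 0.8)
    # Defuzzify by a weighted average of the three rule strengths (singleton outputs 1, 0.5, 0).
    return (1.0 * good + 0.5 * medium + 0.0 * bad) / (good + medium + bad + 1e-9)

routes = [(5, 300, 4), (30, 800, 20), (12, 450, 10)]   # (queue length, distance m, speed m/s) per path
best = max(routes, key=lambda r: route_score(*r))
print(best)
```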
-
A Novel Simplified AES Algorithm for Lightweight Real-Time Applications: Testing and Discussion
Authors: Malik Qasaimeh, Raad S. Al-Qassas, Fida Mohammad and Shadi Aljawarneh
Background: Lightweight cryptographic algorithms have been the focus of many researchers in the past few years, inspired by the potential developments of lightweight constrained devices and their applications. These algorithms are intended to overcome the limitations of traditional cryptographic algorithms in terms of execution time, complex computation and energy requirements. Methods: This paper proposes LAES, a lightweight and simplified cryptographic algorithm for constrained environments. It operates on GF(2^4), with a block size of 64 bits and a key size of 80 bits, and this simplification makes it attractive in terms of processing time and randomness levels. The fundamental architecture of LAES is expounded using mathematical proofs to compare and contrast it with a variant lightweight algorithm, PRESENT, in terms of efficiency and randomness level. Results: Three metrics were used for evaluating LAES according to the NIST cryptographic applications statistical test suite. The testing indicated competitive processing time and randomness levels of LAES compared to PRESENT. Conclusion: The study demonstrates that LAES achieves comparable results to PRESENT in terms of randomness levels and generally outperforms PRESENT in terms of processing time.
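Arithmetic in GF(2^4) is the building block of such a cipher; a minimal multiplication routine is sketched below. The reduction polynomial x^4 + x + 1 is the common choice for GF(2^4) and is an assumption here, since the paper's polynomial is not stated in the abstract.

```python
def gf16_mul(a: int, b: int, poly: int = 0b10011) -> int:
    """Multiply two GF(2^4) elements (4-bit values), reducing modulo x^4 + x + 1 (assumed polynomial)."""
    result = 0
    for _ in range(4):
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & 0x10:              # degree reached 4: reduce by the irreducible polynomial
            a ^= poly
    return result & 0xF

# Sanity check: every nonzero element of the field has a multiplicative inverse.
for x in range(1, 16):
    inv = next(y for y in range(1, 16) if gf16_mul(x, y) == 1)
    assert gf16_mul(x, inv) == 1

print(gf16_mul(0x7, 0xA))         # example product in GF(2^4) -> 0x3
```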
-
Web Service Composition in Cloud: A Fuzzy Rule Model
Authors: Hussien Alhadithy and Bassam Al-Shargabi
Background: Cloud computing has drawn much attention in industry due to its cost-efficient schema along with further prospects such as the elasticity and scalability of cloud computing. One of the main service models of a cloud is Software as a Service, where many web services are published and hosted in the cloud environment. Many web services offered in a cloud have similar functionality but different non-functional characteristics, such as Quality of Service (QoS). In addition, individual web services are limited in their capability; therefore, there is a need to compose existing services to create new functionality, in the form of a composite service, to fulfill the requirements of the cloud user for certain processes. Methods: This paper introduces a fuzzy rule approach to compose web services based on QoS from different cloud computing providers. The fuzzy rules are generated based on the QoS of web services discovered in the cloud, in order to compose only web services that match user requirements. The proposed model is based on an agent that is responsible for discovering and composing web services that satisfy user requirements. Results: The experimental results show that the proposed model is efficient in terms of time and in the use of fuzzy rules to compose web services from different cloud providers under different specifications and configurations of the cloud computing environment. Conclusion: In this paper, an agent-based model was proposed to compose web services based on fuzzy rules in a cloud environment. The agent is responsible for discovering web services and generating composition plans based on the offered QoS of each web service. The agent employs a set of fuzzy rules to carry out an intelligent selection of the best composition plan that fulfills the requirements of the end user. The model was implemented on CloudSim to ensure its validity, and performance time analysis showed good results with regard to the cloud computing configuration.
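Selecting among candidate composition plans requires aggregating QoS along each plan before any rules are applied; a minimal sketch is shown below. The QoS attributes, aggregation rules for a sequential flow and the crisp scoring function standing in for the fuzzy rules are illustrative assumptions, not the paper's model.

```python
# Each candidate plan is a sequence of services with (response_time ms, availability, cost) QoS values.
plans = {
    "plan_a": [(120, 0.99, 0.02), (80, 0.995, 0.01)],
    "plan_b": [(60, 0.97, 0.03), (70, 0.98, 0.02), (50, 0.99, 0.01)],
}

def aggregate(plan):
    """Sequential composition: response times add, availabilities multiply, costs add."""
    rt = sum(s[0] for s in plan)
    avail = 1.0
    for s in plan:
        avail *= s[1]
    cost = sum(s[2] for s in plan)
    return rt, avail, cost

def score(rt, avail, cost, rt_max=400, cost_max=0.1):
    """A crisp stand-in for the fuzzy rules: reward availability, penalise time and cost."""
    return 0.5 * avail + 0.3 * (1 - rt / rt_max) + 0.2 * (1 - cost / cost_max)

best = max(plans, key=lambda p: score(*aggregate(plans[p])))
print(best, aggregate(plans[best]))
```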
-
Project Management Knowledge Areas and Skills for Managing Software and Cloud Projects: Overcoming Challenges
Authors: Sofyan Hayajneh, Mohammed Hamada and Shadi Aljawarneh
Background: Cloud computing has already started to revolutionize storing and accessing data. Although cloud computing is on its way to becoming a huge success, some challenges arise while managing cloud services. This reveals many new knowledge areas, skills and, consequently, new challenges that need to be overcome so that software project managers can cope with and make use of the newly available cloud services. This research aims to identify the challenges faced by project managers in cloud computing and to highlight the knowledge areas and skills required to meet these challenges. Methods: The findings of this paper are presented through three stages. First, a pre-survey questionnaire to validate the skills and knowledge areas that will be selected and eventually adopted for the main survey. Second, interviews with experts in the field to discuss the challenges identified in the literature review, which are also adopted with the pre-survey to build the main survey. Third, the main survey to identify the critical skills and knowledge areas that are required for managing cloud projects. Results: This study draws recommendations so that systems and tools can be developed and integrated to overcome some of these challenges, providing guidance for managers in the field to improve the performance of project management. Conclusion: This study leads to a better understanding of the attributes of a competent cloud manager. It also determines the knowledge areas and skills that help managers effectively overcome the challenges faced in software and cloud projects.
-
A Novel Model for Aligning Knowledge Management Activities within the Process of Developing Quality Software
By Omar Sabri
Background: Currently, an organization's competitive advantage is based on critical decisions made to achieve its objectives by understanding the power of knowledge as a resource within the organization. However, there is a lack of qualitative models/frameworks for integrating the Knowledge Life Cycle (KLC) within the Software Development Life Cycle (SDLC). Therefore, the goal of this research is to involve Knowledge Management (KM) activities within the SDLC in Information Technology companies to produce quality software. With the help of knowledge movement within companies, quality software helps improve organizational performance and deliver products better and faster. Methods: This research highlights the importance of Knowledge Management activities during a typical software development process that provides the software as a final product. Moreover, the paper proposes a model to explain the relationships between knowledge management activities within the software development life cycle to produce quality software, using three basic building blocks: people, organizations and technologies. The success factors for the blocks are selected depending on their occurrence in the recent literature and on their fitness to the nature of this study. Results: The research proposes a novel model of success factors to evaluate the effects of the building blocks and workflows during the software development processes. The selected success factors for the blocks are Training, Leadership, Teamwork, Trust, IT Infrastructure, Culture and Strategies. The research also demonstrates the relationships between the KM success factors and the SDLC in producing quality software. Conclusion: In this research, we proposed a novel model to explain the relationships between knowledge management activities within the software development life cycle to produce quality software using three basic building blocks: people, organizations and technologies. We selected seven success factors for the blocks depending on 1) their importance and occurrence in the literature and 2) their fitness to the nature of this study. The success factors (Training, Leadership, Teamwork, Trust, IT Infrastructure, Culture and Strategies) of the proposed model can be used to evaluate the effects of people, organizations, technologies and workflows during the software development processes to obtain the required software quality. Finally, a quantitative study will be carried out to investigate the proposed hypotheses and to measure the factors influencing the suggested model. By assessing the degree to which these factors are present or absent within the SDLC process, managers will be able to address weaknesses by preparing a suitable plan and producing quality software.
-
Secure Digital Databases using Watermarking based on English-Character Attributes
Authors: Khalaf Khatatneh, Ashraf Odeh, Ashraf Mashaleh and Hind Hamadeen
Introduction: The watermarking procedure is based on the single space and the double space (DS) within the English-character (text) attributes of the database. In this procedure, an image is used to watermark a digital database: the image bytes are divided into binary strings that are embedded into the text attributes of the selected database. We propose an algorithm to defend against four common database attacks. Objective: To perform the embedding of the watermark and its subsequent extraction, and to describe the principles of embedding and extracting the watermark. Methods: The procedure to extract the watermark does not require knowledge of the original, unwatermarked database. This feature is extremely important because it allows the discovery of a watermark in a copy of the original database, regardless of subsequent updates to the asset. The extraction procedure is a direct reflection of the procedure used to embed the watermark and consists of six steps. Results: The new algorithm is able to produce a database watermark that makes it difficult for an attacker to remove or change the watermark without destroying the value of the object. To be judged effective, the algorithm had to create a watermark strong enough to sustain the security of the database in the face of four types of attack, including deletion of a sub-dataset and addition of a sub-dataset. Conclusion: The performance of the proposed algorithm was assessed with respect to its ability to defend the database against four common attacks over all tuple selections.
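Since the abstract mentions single- and double-space encoding but not the exact embedding steps, the following is a minimal sketch of that idea: watermark bits are written into a text attribute as single vs double spaces between words and read back from the spacing. The bit order, padding rule and function names are assumptions for illustration, not the paper's six-step procedure.

```python
def embed_bits(text: str, bits: str) -> str:
    """Encode watermark bits in a text attribute: '0' -> single space, '1' -> double space."""
    words = text.split(" ")
    out = words[0]
    for i, word in enumerate(words[1:]):
        bit = bits[i] if i < len(bits) else "0"     # pad remaining gaps with single spaces
        out += ("  " if bit == "1" else " ") + word
    return out

def extract_bits(marked: str) -> str:
    """Recover the bits by reading the spacing between consecutive words."""
    bits, i = [], 0
    while i < len(marked):
        if marked[i] == " ":
            run = 0
            while i < len(marked) and marked[i] == " ":
                run, i = run + 1, i + 1
            bits.append("1" if run == 2 else "0")
        else:
            i += 1
    return "".join(bits)

attribute = "customer record for the selected relational tuple"
marked = embed_bits(attribute, "1011")
print(extract_bits(marked)[:4])    # -> '1011'
```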