Recent Advances in Computer Science and Communications - Volume 14, Issue 4, 2021
-
Analysis of Relay Node Failure in Heterogeneous Wireless Sensor Networks
Authors: Gholamreza Kakamanshadi, Savita Gupta and Sukhwinder Singh
Introduction: Fault tolerance is an important issue for assuring data reliability, saving energy and prolonging the lifetime of wireless sensor networks. Since sensor nodes, relay nodes, etc. are prone to failure, there is a need for an effective fault tolerance mechanism.
Methods: Relay nodes are used as cluster heads, and the concept of two disjoint paths is employed to provide fault tolerance against link failure. To evaluate the fault tolerance level, the mean time to failure and, subsequently, the failure rate are calculated; these reflect the reliability of the network.
Results: The results show that as the area of the network increases, the average fault tolerance level of the network becomes constant. Furthermore, as the mean time to failure of the network decreases, the failure rate increases. This means the overall reliability of a network with a smaller size is higher than that of a larger one.
Discussion: This paper presents a detailed analysis of relay node failure under distinct network configurations in heterogeneous wireless sensor networks.
Conclusion: This analysis helps network designers decide how many relay nodes to deploy for a given fault tolerance level. It may also help prevent relay node failure by prompting appropriate actions that increase the fault tolerance level of the network as well as its reliability.
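As a rough illustration of the quantities this abstract relates, the sketch below uses the standard exponential lifetime model, in which the failure rate is the reciprocal of the mean time to failure; the series-network assumption (the network fails if any relay node fails) is ours for illustration, not necessarily the paper's model.

```python
import math

def failure_rate(mttf_hours: float) -> float:
    """Constant failure rate (per hour) under an exponential lifetime model."""
    return 1.0 / mttf_hours

def reliability(mttf_hours: float, t_hours: float) -> float:
    """Probability that a relay node survives past time t: R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate(mttf_hours) * t_hours)

def network_reliability(mttf_list, t_hours: float) -> float:
    """Series-system assumption: the network works only if every relay node works."""
    r = 1.0
    for m in mttf_list:
        r *= reliability(m, t_hours)
    return r

# Fewer relay nodes retain higher reliability, matching the abstract's
# observation that overall reliability drops as network size grows.
print(network_reliability([10_000.0] * 5, t_hours=1_000.0))   # 5 relay nodes
print(network_reliability([10_000.0] * 20, t_hours=1_000.0))  # 20 relay nodes
```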
-
An Empirical Study of Predictive Model for Website Quality Analytics Using Dataset of Different Domains of Websites
By Divya Gupta
Introduction: Web analytics is the process of examining websites to uncover patterns, correlations, trends, insights, and other useful information that can be utilized to optimize web usage and improve the quality of a website.
Methods: This research proffers an approach that associates website assessment with user satisfaction and acceptance. The proposed Website Quality Analytics (WQA) Model considers websites from seven domains and, using 13 UX-based quality attributes, evaluates the quality of websites in each domain. The quality assessment is automated using supervised learning models to predict good, average, and bad websites.
Results: A real-time dataset of website domains was assessed, and websites were predicted as good, average, or bad using the algorithms.
Discussion: A website quality model essentially consists of a set of criteria used to determine whether a website reaches certain levels of fineness. User Experience (UX) directly measures the quality of site interactions and is an indirect indicator of site success and customer conversions: a bad UX drives visitors away to seek a more reliable website. Every second a user spends on a website is directly attributable to the usability a good UX provides. Hence, evaluating the quality of websites is essential to determine user acceptance; that is, users are the measure of the site's success.
Conclusion: The feature (attribute)-based predictive model for quality analytics is empirically analyzed for five classification algorithms. A qualitative analysis of the domain-wise classification of websites is also presented.
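A minimal sketch of the kind of three-class supervised prediction the abstract describes, using synthetic placeholder data for the 13 UX attributes and a single illustrative classifier (the paper evaluates five algorithms; this is not its implementation):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.random((600, 13))         # 13 UX-based quality attributes (placeholder)
y = rng.integers(0, 3, size=600)  # 0=bad, 1=average, 2=good (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te),
                            target_names=["bad", "average", "good"]))
```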
-
Adaptive Energy-Aware Algorithms to Minimize Power Consumption and SLA Violation in Cloud Computing
Authors: Monika Singh, Pardeep Kumar and Sanjay Tyagi
Objective: With the establishment of virtualized datacenters on a large scale, cutting-edge technology requires more energy to deliver services 24x7. With this expansion and the accumulation of information on a massive scale in data centers, the consumption of an excessive amount of power results in high operational costs. Therefore, there is an urgent need to make the environment more adaptive and dynamic, where the over-utilization and under-utilization of hosts are known to the system and active measures can be taken accordingly. To serve this purpose, an energy-efficient method for the detection of overloaded and under-loaded hosts is proposed in this paper. For implementing VM migration, a VM placement decision is also taken to save energy and reduce the SLA (Service Level Agreement) violation rate over the cloud.
Methods: The paper presents a novel adaptive heuristic approach concerning the utilization of resources for dynamic consolidation of VMs, based on data gathered from the resource usage of VMs, while ensuring a high level of adherence to the SLA. After identifying under-loaded and overloaded hosts, the VM placement decision is taken in the way that consumes minimum energy. A minimum-migration policy is adopted in the proposed methodology to minimize execution time. The effectiveness and efficiency of the suggested approach are validated using real-world workload traces in the CloudSim simulator.
Results: The results show that the proposed methodology is ideal for SLA but costs more for VM migration.
Conclusion: A deep analysis must be done of existing energy-efficient approaches, and a new platform should be suggested to save energy in real life.
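The sketch below illustrates the general threshold idea behind overload/under-load detection plus a largest-first eviction heuristic; the threshold values and the eviction rule are assumptions for illustration, not the paper's algorithm:

```python
# Assumed utilization thresholds, not the values used in the paper.
OVER, UNDER = 0.8, 0.2

def classify_hosts(cpu_util):
    """Split hosts into overloaded / under-loaded by CPU utilization."""
    over = [h for h, u in cpu_util.items() if u > OVER]
    under = [h for h, u in cpu_util.items() if u < UNDER]
    return over, under

def vms_to_migrate(host_vms, host_util, target=OVER):
    """One way to keep migrations few: evict the largest VMs first until
    the host drops below the overload threshold."""
    chosen, util = [], host_util
    for vm, u in sorted(host_vms.items(), key=lambda kv: kv[1], reverse=True):
        if util <= target:
            break
        chosen.append(vm)
        util -= u
    return chosen

hosts = {"h1": 0.93, "h2": 0.15, "h3": 0.55}
print(classify_hosts(hosts))
print(vms_to_migrate({"vm1": 0.30, "vm2": 0.10, "vm3": 0.25}, host_util=0.93))
```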
-
Enhance the Quality of Collaborative Filtering Using Tagging
Authors: Latha Banda and Karan Singh
Background: Due to the enormous amount of data on websites, recommending every item to each user is impossible. To address this problem, Recommender Systems (RS) were introduced. RS are categorized into Content-Based (CB), Collaborative Filtering (CF) and Hybrid RS. Based on these techniques, recommendations are made to the user. Among these, CF is the most recent technique used in RS, in which tagging features are also provided.
Objective: Three main issues occur in RS: the scalability problem, which occurs when there is huge data; the sparsity problem, which occurs when rating data is missing; and the cold-start user or item problem, which occurs when a new user or new item enters the system. To avoid these issues, we propose a Tag and Time weight model with a GA in Collaborative Tagging.
Methods: We propose a Collaborative Tagging (CT) method with a Tag and Time weight model and a real-valued genetic algorithm, which enhances recommendation quality by removing the issues of sparsity and cold-start users with the help of missing-value prediction. In this system, the sparsity problem is removed using missing-value prediction, and cold-start problems are removed using the tag and time weight model with the GA.
Results: In this study, we compare the results of Collaborative Filtering with Cosine Similarity (CF-CS), Collaborative Filtering with Diffusion Similarity (CF-DS), the Tag and Time Weight Model with Diffusion Similarity (TAW-TIW-DS) and the Tag and Time Weight Model using Diffusion Similarity and a Genetic Algorithm (TAW-TIW-DS-GA).
Conclusion: We compare the proposed approach with the baseline approaches; the metrics used are MAE, prediction percentage, hit-rate and hit-rank. Based on these metrics, for every split TAW-TIW-DS-GA showed the best results compared to the existing approaches.
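A hedged sketch of what tag-and-time weighting could look like: an exponential time decay so recent tags count more, plus a simple tag-frequency weight. The half-life value and the exact formulas are illustrative assumptions, not the authors' model:

```python
import time

def time_weight(tag_timestamp: float, now: float, half_life_days: float = 30.0) -> float:
    """Exponential time decay: recent tags weigh more than old ones.
    The 30-day half-life is an assumption for illustration."""
    age_days = (now - tag_timestamp) / 86_400
    return 0.5 ** (age_days / half_life_days)

def tag_weight(tag_count_for_item: int, total_tags_for_user: int) -> float:
    """Relative frequency of a tag in a user's tagging profile."""
    return tag_count_for_item / total_tags_for_user if total_tags_for_user else 0.0

now = time.time()
print(time_weight(now - 7 * 86_400, now))  # a week-old tag
print(tag_weight(3, 12))                   # tag used 3 times out of 12
```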
-
Network Selection in Wireless Heterogeneous Environment Based on Available Bandwidth Estimation
Authors: Kiran Ahuja, Brahmjit Singh and Rajesh Khanna
Background: With multiple wireless network options available simultaneously, Always Best Connected (ABC) operation requires dynamic selection of the best network and access technology.
Objective: In this paper, a novel dynamic access-network selection algorithm based on real-time measurements is proposed. The Available BandWidth (ABW) of each network must be estimated to solve the network selection problem.
Methods: The proposed algorithm estimates available bandwidth by taking averages, peaks, low points and a bootstrap approximation for network selection. It monitors the internet connection in real time and resolves the connection selection issue. The proposed algorithm is capable of adapting to prevailing network conditions in a heterogeneous environment of 2G, 3G and WLAN networks without user intervention. It is implemented in the temporal and spatial domains to check its robustness. Estimation error, overhead, estimation time with varying traffic size, and reliability are used as the performance metrics.
Results: Numerical results show that the proposed algorithm's ABW estimation based on bootstrap approximation gives improved performance in terms of estimation error (less than 20%), overhead (varying from 0.03% to 83%) and reliability (approx. 99%) with respect to existing techniques.
Conclusion: Our proposed network selection criterion estimates the available bandwidth by taking averages, peaks, low points and the bootstrap approximation method (standard deviation) to select a network in a wireless heterogeneous environment. It monitors the real-time internet connection and resolves the connection selection issue. All the real-time usage and test results demonstrate the efficiency and adequacy of available bandwidth estimation with bootstrap approximation as a practical solution for seamless communication among heterogeneous wireless networks through precise network selection for multimedia services.
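A small sketch of bootstrap-based available-bandwidth estimation: resample the measured throughput probes, take the bootstrap mean and standard deviation, and prefer the network with the highest conservative score. The probe values and the scoring rule are assumptions for illustration, not the paper's procedure:

```python
import numpy as np

def bootstrap_abw(samples, n_boot=1000, seed=0):
    """Bootstrap estimate of available bandwidth: resample the measured
    throughput samples and return the mean and its standard deviation."""
    rng = np.random.default_rng(seed)
    means = [rng.choice(samples, size=len(samples), replace=True).mean()
             for _ in range(n_boot)]
    return float(np.mean(means)), float(np.std(means))

# Hypothetical per-network throughput probes (Mbps).
probes = {"WLAN": [18.2, 20.1, 17.5, 22.0], "3G": [3.1, 2.8, 3.5, 3.0]}
scores = {}
for net, vals in probes.items():
    mean, std = bootstrap_abw(vals)
    scores[net] = mean - std        # conservative lower-bound-style score

print(max(scores, key=scores.get), scores)
```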
-
Cost-Effective Cluster-Based Energy Efficient Routing for Green Wireless Sensor Network
Authors: Sandeep Verma, Neetu Sood and Ajay K. Sharma
Background: Green Information and Communications Technologies (ICTs) have brought a revolution in uplifting technology efficiently to facilitate the human sector in the best possible way. Green Wireless Sensor Network (WSN) research tactically focuses on improving the survival period of nodes deployed in a target area, as they have a limited battery.
Objective: To address this concern, the main objective is to improve routing in WSN. Cluster-based routing helps achieve this with appropriate Cluster Head (CH) selection. The use of energy-heterogeneous nodes, which normally comprise high-energy nodes, puts a large financial burden on users as they incur a huge cost, thus becoming a bottleneck for the growth of green WSN. Therefore, another objective of the study is to reduce this network cost.
Methods: A cost-effective routing protocol is proposed that introduces energy-efficient CH selection by incorporating parameters namely node density, residual energy, the total energy of the network and a distance factor. The proposed protocol is termed the Cost-Effective Cluster-based Routing Protocol (CECRP), as it performs remarkably better with only two energy-level nodes compared to state-of-the-art protocols with three energy-level nodes.
Results: The simulation results show that CECRP outperforms state-of-the-art protocols on different performance metrics.
Conclusion: Furthermore, it is evident from the simulation results that CECRP is 33.33% more cost-effective than the competing protocols; hence CECRP favors the green WSN.
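A toy sketch of how the four named parameters might be combined into a single cluster-head fitness score; the weights and the combination formula are assumptions, not CECRP's actual selection rule:

```python
def ch_score(residual_energy, total_energy, node_density, dist_to_sink,
             w=(0.4, 0.3, 0.3)):
    """Weighted cluster-head fitness over the parameters the abstract names;
    the weights and the exact combination are illustrative assumptions."""
    w1, w2, w3 = w
    return (w1 * residual_energy / total_energy   # favor energy-rich nodes
            + w2 * node_density                   # favor dense neighborhoods
            + w3 / (1.0 + dist_to_sink))          # favor nodes near the sink

# (residual energy, total energy, normalized density, distance to sink)
candidates = {"n1": (0.9, 1.0, 0.6, 40.0), "n2": (0.5, 1.0, 0.9, 10.0)}
best = max(candidates, key=lambda n: ch_score(*candidates[n]))
print(best)
```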
-
Enhanced Auxiliary Cluster Head Selection Routing Algorithm in Wireless Sensor Networks
Authors: Gaurav K. Nigam and Chetna Dabas
Background & Objective: Wireless sensor networks are made up of a huge number of low-powered small sensor nodes that can monitor the surroundings, collect meaningful data, and send it to the base station. Various energy-management schemes that seek to lengthen the lifetime of the overall network have been proposed over the years, but energy conservation remains the major challenge, as the sensor nodes have finite batteries and low computational capabilities. Cluster-based routing is the most fitting scheme for load balancing, fault tolerance, and reliable communication to improve the performance parameters of a wireless sensor network. Low Energy Adaptive Clustering Hierarchy (LEACH) is an efficient clustering-based hierarchical protocol used to enhance the lifetime of sensor nodes in a wireless sensor network. It has some basic flaws that need to be overcome in order to reduce energy utilization and extend node lifetime.
Methods: In this paper, an effective auxiliary cluster head selection is used to propose a new enhanced GC-LEACH algorithm in order to minimize energy utilization and prolong the lifespan of the wireless sensor network.
Results & Conclusion: Simulation is performed in NS-2, and the outcomes show that GC-LEACH outperforms conventional LEACH and its existing variants in the context of frequent cluster head rotation across rounds and the number of data packets collected at the base station, as well as reducing energy consumption by 14%-19% and prolonging system lifetime by 8%-15%.
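For context, the classic LEACH cluster-head election threshold that GC-LEACH builds on can be sketched as follows; the probability value is illustrative:

```python
import random

def leach_threshold(p: float, r: int) -> float:
    """Classic LEACH election threshold T(n) = p / (1 - p * (r mod 1/p));
    GC-LEACH extends this baseline with auxiliary cluster-head selection."""
    return p / (1 - p * (r % round(1 / p)))

def elects_itself(p: float, r: int) -> bool:
    """A node not yet elected in the current epoch becomes CH with prob. T(n)."""
    return random.random() < leach_threshold(p, r)

print(leach_threshold(0.05, r=3))  # desired CH fraction p = 0.05, round 3
```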
-
Reliability Analysis and Modeling of Green Computing Based Software Systems
Authors: Sangeeta Malik, Kapil Sharma and Manju Bala
Background: Software industries are growing very fast to develop new solutions and ease people's lives. Software reliability is considered a critical factor in today's growing digital world. Software reliability models are among the most widely used mathematical tools for the estimation of software reliability. These reliability models can be applied to the development of sustainable, green-computing-based software with its constrained development environments.
Objective: This paper proposes a new reliability estimation model for software systems based on a green IT environment.
Methods: In this paper, a new failure-rate-behavior-based model centered on the green software development life cycle process has been developed. This model integrates a new modulation factor to incorporate changing needs in each phase of the green software development methodology. Parameter estimation for the proposed model has been done using a hybrid of Particle Swarm Optimization and the Gravitational Search Algorithm. The proposed model has been tested on real-world datasets.
Results: Experimental results show the enhanced capability of the proposed model in simulating a real green software development environment. On the GC-1 and GC-2 datasets, the proposed model scores about 60.05%, which is more significant than the other models.
Conclusion: This paper proposed a new failure rate model for software developed under a green IT environment.
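As a hedged illustration of metaheuristic parameter estimation for a reliability model, the sketch below fits the well-known Goel-Okumoto mean-value function with plain PSO; both stand in for the paper's own failure-rate model and its hybrid PSO-GSA:

```python
import numpy as np

# Goel-Okumoto m(t) = a(1 - e^{-bt}) stands in for the paper's model;
# plain PSO stands in for the hybrid PSO-GSA. Synthetic data for illustration.
t = np.arange(1, 21, dtype=float)
observed = 100 * (1 - np.exp(-0.12 * t)) + np.random.default_rng(1).normal(0, 2, t.size)

def sse(params):
    """Sum of squared errors between observed and modeled cumulative failures."""
    a, b = params
    return float(np.sum((observed - a * (1 - np.exp(-b * t))) ** 2))

rng = np.random.default_rng(0)
n, lo, hi = 30, np.array([50.0, 1e-3]), np.array([200.0, 1.0])
pos = rng.uniform(lo, hi, (n, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([sse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)              # keep particles in bounds
    vals = np.array([sse(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("estimated (a, b):", gbest)                 # should approach (100, 0.12)
```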
-
Multisensory Decision Level Fusion for Improvement in Urban Land Classification
Authors: Rubeena Vohra and Kailash C. Tiwari
Background: Planning and development of urban areas to meet human developmental requirements is an ongoing process across the globe. This, in turn, requires continuous mapping of urban developmental patterns. Remote sensing and image processing techniques have greatly facilitated the study of urban developmental patterns by mapping urban areas.
Objective: The objective of the paper is to carry out object-based classification in urban environments by using fusion techniques on multisensory data to classify natural and man-made objects. Multisensory data fusion using spectral and spatial features is performed to improve classification accuracy.
Methods: In our research, the performance of the proposed framework has been verified by investigating multistage feature-level fusion. Spatial and spectral features are explored using feature-level fusion between multisensory data, and the database is then classified using a linear SVM classifier. The individual probabilities (confidence measures) from all such pairs of binary SVMs are combined to uniquely assign the object feature to one of the classes. After summing all the probabilities, the class with the highest probability value represents the object through decision-level fusion.
Results: The results explore spatial and spectral features in a unique way using connected component analysis. An improvement in classification accuracy has also been achieved.
Conclusion: The results reveal that the overall accuracies of the SVM classifier are in the range of 53% to 90% for the various classes, which is improved to 96-98% by the majority voting rule.
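The probability-summing fusion step the abstract describes can be sketched directly; the per-sensor probability matrices below are hypothetical confidence measures, not the paper's data:

```python
import numpy as np

def decision_level_fusion(prob_maps):
    """Sum per-sensor class-probability matrices and pick the class with the
    highest combined probability, as the abstract describes."""
    combined = np.sum(prob_maps, axis=0)           # shape (n_objects, n_classes)
    return combined.argmax(axis=1)

# Hypothetical confidence measures from two feature sources (spectral, spatial)
spectral = np.array([[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]])
spatial  = np.array([[0.5, 0.4, 0.1], [0.1, 0.2, 0.7]])
print(decision_level_fusion([spectral, spatial]))  # fused class per object
```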
-
An Energy Efficient Routing Approach to Enhance Coverage for Application-Specific Wireless Sensor Networks Using Genetic Algorithm
Authors: Amandeep K. Sohal, Ajay K. Sharma and Neetu Sood
Background: Information gathering is a typical and important task in agriculture monitoring and military surveillance. In these applications, minimization of energy consumption and maximization of network lifetime are of prime importance for green computing. As wireless sensor networks comprise a large number of sensors with limited battery power, deployed at remote geographical locations for monitoring physical events, it is imperative to minimize energy consumption while maintaining network coverage. WSNs help in accurate monitoring of remote environments by collecting data intelligently from the individual sensors.
Objective: The paper is motivated by the green-computing aspect of wireless sensor networks, and an Energy-efficient Weight-based Coverage Enhancing protocol using a Genetic Algorithm (WCEGA) is presented. WCEGA is designed to achieve continuous monitoring of remote areas for a longer time with the least power consumption.
Methods: The cluster-based algorithm consists of two phases: cluster formation and data transmission. In cluster formation, the selection of cluster heads and cluster members is based on energy- and coverage-efficiency parameters; the governing parameters are residual energy, overlapping degree, node density and neighbor degree. The data transmission between CHs and the sink is based on the well-known evolutionary search algorithm, i.e., the Genetic Algorithm.
Results: The results of WCEGA are compared with other established protocols and show significant improvements in full coverage and lifetime, of approximately 40% and 45% respectively.
Conclusion: This paper proposes an evolutionary method to improve an energy-efficient clustering protocol for longer full coverage.
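A compact, illustrative GA of the kind the protocol relies on: bit-strings mark candidate cluster heads, and selection, one-point crossover and point mutation evolve the population. The fitness function is a stand-in for the abstract's energy/coverage criteria, not WCEGA's actual one:

```python
import random

N, POP, GENS = 20, 30, 100                 # nodes, population size, generations
energy = [random.random() for _ in range(N)]

def fitness(bits):
    """Toy fitness: reward having roughly 5 cluster heads with high average
    residual energy. A stand-in for the abstract's weighted criteria."""
    heads = [i for i, b in enumerate(bits) if b]
    if not heads:
        return 0.0
    coverage = min(1.0, len(heads) / 5)
    return coverage * sum(energy[i] for i in heads) / len(heads)

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:POP // 2]             # truncation selection
    children = []
    for _ in range(POP - len(survivors)):
        a, b = random.sample(survivors, 2)
        cut = random.randrange(1, N)
        child = a[:cut] + b[cut:]          # one-point crossover
        child[random.randrange(N)] ^= 1    # point mutation
        children.append(child)
    pop = survivors + children

print(max(pop, key=fitness))               # best cluster-head assignment found
```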
-
SQL Versus NoSQL Databases to Assess Their Appropriateness for Big Data Application
Authors: Mohammad A. Kausar and Mohammad Nasar
Background: Nowadays, the digital world is growing rapidly and becoming very complex in the quantity, diversity, and speed of its data. Recently, there have been two major changes in data management: NoSQL databases and Big Data analytics. Although they evolved for different reasons, their independent growths complement each other, and their convergence would greatly benefit organizations in making on-time decisions over large, multifaceted datasets that may be semi-structured, structured, or unstructured. Several software solutions have emerged to support Big Data analytics on the one hand, while on the other hand several NoSQL database packages are available in the market.
Aim and Methods: The main goal of this article is to provide an understanding of their perspectives and a complete study relating the futures of several important emerging NoSQL data models.
Results: Evaluating NoSQL databases for Big Data analytics against traditional SQL performance shows that a NoSQL database is a superior alternative for industry conditions that need high-performance analytics, adaptability, simplicity, and distributed scalability over large data.
Conclusion: In this article we conclude with the adoption of NoSQL in various markets.
-
Link Stability Based Approach for Route Discovery in MANET Using DSR
Authors: Ganesh K. Wadhwani, Sunil K. Khatri and Sunil K. Muttoo
Background: A Mobile Ad-hoc NETwork (MANET) is a set of devices capable of communicating with each other without the help of any central entity or fixed infrastructure. The absence of fixed access points makes a MANET flexible and deployable in extreme geographical territories. Each device has routing capabilities to facilitate communication among nodes in the network.
Objectives: 1) To select the stable path with the lowest hop count. 2) To use a backup path in case of link breakage, minimizing the delay incurred in finding an alternate path.
Methods: Dynamic Source Routing (DSR) is modified to choose the most stable path, and a backup path is cached to save route discovery time in case of link failure.
Results: The modified DSR based on link stability and hop count performs better than DSR most of the time.
Conclusion: A modified DSR is proposed that selects the path using hop count and link stability as parameters. Its advantage is that if a link breaks during data communication, the backup path can be used to carry on the data transfer. Analysis is done by varying the node density and counting the number of packets received. Modified DSR gives better results for light to moderate networks, while DSR performs better when the number of nodes increases beyond a certain limit.
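A small sketch of the route-selection idea: among sufficiently stable candidate routes, pick the one with the fewest hops as primary and cache the next best as backup. The route records and the stability threshold are assumptions for illustration:

```python
def select_routes(candidates, stability_threshold=0.7):
    """Pick the most stable shortest path as primary and cache the next best
    as backup, mirroring the modified-DSR idea; the threshold is an assumption."""
    stable = [r for r in candidates if r["stability"] >= stability_threshold]
    ranked = sorted(stable, key=lambda r: (r["hops"], -r["stability"]))
    primary = ranked[0] if ranked else None
    backup = ranked[1] if len(ranked) > 1 else None
    return primary, backup

routes = [
    {"path": ["S", "A", "D"], "hops": 2, "stability": 0.9},
    {"path": ["S", "B", "C", "D"], "hops": 3, "stability": 0.95},
    {"path": ["S", "E", "D"], "hops": 2, "stability": 0.5},   # too unstable
]
primary, backup = select_routes(routes)
print(primary["path"], backup["path"])
```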
-
Supervised Classifier Approach for Intrusion Detection on KDD with Optimal MapReduce Framework Model in Cloud Computing
Authors: Ilayaraja Murugan, Hemalatha S., Manickam P., Sathesh K. K. and Shankar K.
Background: Cloud computing is characterized as the provision of resources or available services by cloud service providers over the web to their clients. It delivers everything as a service over the web according to client demand, for example the operating system, hardware organization, storage, resources, and software. Nowadays, Intrusion Detection Systems (IDS) play a powerful role, working under the supervision of experts who act when a system is hacked or under intrusion. Most intrusion detection frameworks are built on machine learning strategies, and since datasets play a major role, the one utilized as part of intrusion detection here is the Knowledge Discovery in Databases (KDD) dataset.
Methods: In this paper, the intruded data is detected and classified utilizing Machine Learning (ML) with a MapReduce model. The primary objective of the Hadoop MapReduce model is to reduce the extent of the database using an ideal weight decided for the reducer model, with a second stage utilizing a Decision Tree (DT) classifier for data detection. This DT classifier uses an appropriate classifier to decide the class labels for non-homogeneous leaf nodes. The decision tree fragment provides a coarse segment profile, while the leaf-level classifier yields information about the attributes that influence the label within a segment.
Results: From the proposed results, the detection accuracy was 96.21%, in comparison with existing classifiers such as the Neural Network (NN), Naive Bayes (NB) and K-Nearest Neighbor (KNN).
Conclusion: This study introduced a Hadoop MapReduce model to create diverse mappers and reduce the data utilizing the OBL-GWO strategy.
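A toy mapper/shuffle/reducer pass followed by a per-partition decision tree, sketching the general MapReduce-plus-DT pipeline; the record layout and keying-by-protocol are placeholders, not the KDD schema or the paper's OBL-GWO weighting:

```python
from collections import defaultdict
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Placeholder KDD-style records: (protocol, feature vector, label)
records = [("tcp", [0.1, 200, 5], "normal"), ("udp", [0.9, 10, 50], "attack"),
           ("tcp", [0.2, 180, 7], "normal"), ("tcp", [0.8, 15, 60], "attack")]

def mapper(rec):
    proto, feats, label = rec
    return proto, (feats, label)                  # key records by protocol

shuffled = defaultdict(list)
for rec in records:
    k, v = mapper(rec)
    shuffled[k].append(v)                         # shuffle/sort stage

for proto, group in shuffled.items():             # one reducer per key
    X = np.array([f for f, _ in group])
    y = [label for _, label in group]
    clf = DecisionTreeClassifier().fit(X, y)      # leaf-level DT classifier
    print(proto, clf.predict(X))
```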
-
Transaction Issues in Mobile Distributed Real-Time Database Systems
Authors: Prakash K. Singh and Udai Shanker
In recent years, a large share of the population has become dependent on mobile database technology, and it is difficult to imagine our lifestyle in the absence of databases. Today's handy portable mobile devices take part in emerging technologies for sharing distributed applications and/or information among many users, even on the move (from one network to another). Managing the resulting large volume of data in a wireless environment under time constraints such as deadlines makes this a fertile area of research. Fast transaction processing in many industrial applications needs efficient algorithms and protocols in the field of Mobile Distributed Real-Time Database Systems (MDRTDBS). Transaction execution in a mobile environment raises various interesting research issues, such as low bandwidth, storage capacity, power backup, priority scheduling policy, concurrency control and commit protocols, security, checkpointing, etc. We first address the performance issues important to MDRTDBS and then survey the research done so far. This paper thus provides background knowledge for addressing the performance issues important to mobile distributed real-time databases and helps identify future areas of research in the field of MDRTDBS.
-
Predicting Suitable Agile Method Using Fuzzy AHP
Authors: Rajbala Singh, Deepak Kumar and Bharat B. Sagar
Agile methodology embraces changing requirements, and its journey in the software industry has come a long way, giving an edge as well as new dimensions to software products. According to the statistics, agile projects are more successful than traditional projects. The agile team manages the project more efficiently and delivers a quality product. In agile methodology, customer satisfaction is a priority due to rapid development as well as continuous delivery, which is the core of Dynamic System Development (DSD). Thus, the industry keeps pace with new expertise and changing market situations. This paper presents the use of the Fuzzy Analytic Hierarchy Process (FAHP), a Multi-Criteria Decision Making (MCDM) technique used in the software industry, in which the selected alternatives are correlated using triangular fuzzy numbers to decide on the best agile process. Thus, the Fuzzy AHP methodology for agile process selection is discussed in this research paper. Moreover, by working through Fuzzy AHP comprehensively and numerically, the best likely process selection for agile methodology among various criteria and decision-making issues is implemented.
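A runnable sketch of fuzzy AHP with triangular fuzzy numbers, using Buckley's geometric-mean method and centroid defuzzification (the paper may use a different extent-analysis variant); the pairwise judgements below are illustrative, not the paper's data:

```python
import math

# Fuzzy pairwise comparisons of 3 hypothetical agile-method criteria,
# each judgement a triangular fuzzy number (l, m, u).
matrix = [
    [(1, 1, 1),       (2, 3, 4),     (1, 2, 3)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/3, 1/2, 1),   (1/3, 1/2, 1), (1, 1, 1)],
]

def geo_mean(row):
    """Component-wise geometric mean of a row of triangular fuzzy numbers."""
    n = len(row)
    return tuple(math.prod(t[i] for t in row) ** (1 / n) for i in range(3))

g = [geo_mean(row) for row in matrix]
total = tuple(sum(t[i] for t in g) for i in range(3))
# Fuzzy weight g_i / total: divide l by the total's u and u by the total's l.
fuzzy_w = [(t[0] / total[2], t[1] / total[1], t[2] / total[0]) for t in g]
crisp = [sum(t) / 3 for t in fuzzy_w]             # centroid defuzzification
weights = [c / sum(crisp) for c in crisp]         # normalize to sum to 1
print(weights)
```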
-
Analysis of Optimum Precoding Schemes in Millimeter Wave System
Authors: Divya Singh and Aasheesh Shukla
Background: Millimeter wave technology is an emerging technology in wireless communication due to the increased demand for data traffic and its numerous advantages; however, it suffers from severe attenuation. To mitigate this attenuation, phased antenna arrays are used for directional power distribution. Initial access is needed to make a connection between the base station and users in a millimeter wave system. The high complexity and cost can be mitigated by the use of hybrid precoding schemes. Hybrid precoding techniques are developed to reduce complexity, power consumption and cost by using phase shifters in place of converters. The use of phase shifters also increases spectral efficiency.
Objective: To analyze optimum precoding schemes in millimeter wave systems.
Methods: In this paper, the suitability of existing hybrid precoding solutions is explored on the basis of the different algorithms and architectures used to increase the average achievable rate. Previous work on hybrid precoding is also compared on the basis of the resolution of the phase shifters and the digital-to-analog converter (DAC).
Results: A comparison of previous work is made on the basis of different parameters such as the resolution of the phase shifters and DAC, power consumption and spectral efficiency. Spectral efficiency shows the average achievable rate of the different algorithms at SNR = 0 dB and 5 dB. The paper also compares the performance achieved by the hybrid precoder in the fully connected structure with two existing approaches, the dynamic subarray structure with and without switches and the sub-connected (partially connected) structure, and gives a comparative analysis of hybrid precoding with different resolutions of the phase shifters and DAC.
Conclusion: In this paper, the available literature on hybrid precoding in millimeter wave communication is reviewed and summarized. Current hybrid precoding solutions are also reviewed and compared in terms of their efficiency, power consumption, and effectiveness. The limitations of the existing hybrid precoding algorithms are the selection of groups and the resolution of the phase shifters. mmWave massive MIMO is feasible largely due to hybrid precoding.
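A minimal numerical sketch of the hybrid-precoding rate computation: the analog stage is constrained to unit-modulus phase-shifter entries, and the achievable rate is evaluated at SNR = 0 dB. The dimensions, the random channel, and the random (unoptimized) precoder are illustrative assumptions, not any paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
Nt, Nrf, Ns = 32, 4, 2                            # TX antennas, RF chains, streams
snr = 10 ** (0 / 10)                              # SNR = 0 dB, as in the abstract

# Narrowband random channel (illustrative; not a geometric mmWave model)
H = (rng.normal(size=(Ns, Nt)) + 1j * rng.normal(size=(Ns, Nt))) / np.sqrt(2)

# Analog precoder: unit-modulus phase-shifter entries only
F_rf = np.exp(1j * rng.uniform(0, 2 * np.pi, (Nt, Nrf))) / np.sqrt(Nt)
# Digital baseband precoder (random here; a real design would optimize it)
F_bb = rng.normal(size=(Nrf, Ns)) + 1j * rng.normal(size=(Nrf, Ns))
F = F_rf @ F_bb
F *= np.sqrt(Ns) / np.linalg.norm(F)              # enforce total power constraint

Heff = H @ F
det = np.linalg.det(np.eye(Ns) + (snr / Ns) * (Heff @ Heff.conj().T)).real
print(f"achievable rate: {float(np.log2(det)):.2f} bits/s/Hz")
```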