Recent Advances in Computer Science and Communications - Volume 14, Issue 2, 2021
ECG Analysis: A Brief Review
Authors: Saumendra K. Mohapatra and Mihir N. Mohanty
In recent years, the incidence of cardiac problems has been found to grow in proportion to technological development. As the cardiac signal (electrocardiogram) reflects the electrical activity of the heart of a living being, supporting technology develops day by day to assist physicians and diagnosis, and it has many applications beyond the purely medical ones. Accurate analysis of the electrocardiogram (ECG) signal can provide information for detection, classification, and diagnosis. This paper offers a short review of earlier techniques and aims to point researchers in new directions. The review covers preprocessing, feature extraction, classification, and the different measures used as evidence of accuracy, and it summarizes related work reported during the last decade.
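As a concrete illustration of the preprocessing and feature-extraction stages such a review surveys, the following minimal Python sketch band-pass filters an ECG trace and detects R-peaks. The sampling rate, filter band and peak thresholds are illustrative assumptions, not values taken from any reviewed work.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_r_peaks(ecg, fs=360.0):
    """Band-pass filter an ECG trace and locate R-peaks (illustrative values)."""
    # 0.5-40 Hz Butterworth band-pass removes baseline wander and high-frequency noise.
    b, a = butter(3, [0.5 / (fs / 2), 40.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # R-peaks are assumed to stand at least ~0.25 s apart and well above the mean.
    peaks, _ = find_peaks(filtered, distance=int(0.25 * fs),
                          height=filtered.mean() + 1.5 * filtered.std())
    return filtered, peaks

if __name__ == "__main__":
    fs = 360.0
    t = np.arange(0, 10, 1 / fs)
    # Synthetic stand-in signal: sharp 1.2 Hz "heartbeat" spikes plus noise.
    ecg = np.sin(2 * np.pi * 1.2 * t) ** 63 + 0.05 * np.random.randn(t.size)
    _, peaks = detect_r_peaks(ecg, fs)
    print(f"Detected {peaks.size} R-peak candidates in 10 s of signal")
```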
Low Cost and Centimeter-Level Global Positioning System Accuracy Using Real-Time Kinematic Library and Real-Time Kinematic GPS
Authors: Hemant K. Gianey, Mumtaz Ali, Varadarajan Vijayakumar, Ashutosh Sharma and Rajiv Kumar
Introduction: The accuracy and the total design and implementation cost of a GPS framework determine the viability of GPS-based projects. As most advanced frameworks, including telemetry, IoT, Cloud, and AUTOSAR systems, use GPS to obtain exact outcomes, finding a software-controlled error correction becomes important. The use of an open-source library such as RTKLIB helps in controlling and correcting GPS errors. Methods: The project utilizes RTKLIB along with two stations for better accuracy. The RTK-GPS framework runs in a Linux environment embedded on a BeagleBoard. Communication between the GPS modules is set up using both a serial communication protocol and the TCP/IP suite. Results: To achieve high precision within the network, two GPS modules are utilized: one is mounted on the rover and the other serves as the base station of the setup. Both GPS modules use a dual-antenna setup to improve reception, reduce noise, and obtain centimeter-level precision. For long-range communication, the rover uses Wi-Fi with the TCP/IP protocol stack. In this research paper, the setup is intended to accomplish centimeter-level precision through libraries in a Linux environment. Conclusion: The design will be set up and tested on a college campus under various conditions and error parameters to achieve low-cost, centimeter-level GPS accuracy.
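As a rough illustration of how a rover could expose its position over TCP/IP, the sketch below reads and parses NMEA GGA sentences from a TCP stream (RTKLIB tools can be configured to output NMEA over a network port). The host, port and use of GGA sentences are assumptions for illustration, not the authors' configuration.

```python
import socket

def dm_to_deg(value: str, hemi: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm format to signed decimal degrees."""
    head, minutes = value.split(".")
    deg = float(head[:-2]) + float(head[-2:] + "." + minutes) / 60.0
    return -deg if hemi in ("S", "W") else deg

def read_gga(host="192.168.1.10", port=5000, count=10):
    """Read NMEA GGA sentences streamed by the rover over TCP (host/port assumed)."""
    with socket.create_connection((host, port), timeout=10) as sock:
        buf, fixes = b"", []
        while len(fixes) < count:
            buf += sock.recv(4096)
            while b"\r\n" in buf:
                line, buf = buf.split(b"\r\n", 1)
                fields = line.decode(errors="ignore").split(",")
                # $GPGGA / $GNGGA: fields 2/3 = latitude, 4/5 = longitude, 6 = fix quality.
                if fields[0].endswith("GGA") and len(fields) > 9 and fields[2]:
                    fixes.append((dm_to_deg(fields[2], fields[3]),
                                  dm_to_deg(fields[4], fields[5]),
                                  int(fields[6])))  # quality 4 = RTK fixed, 5 = RTK float
        return fixes

if __name__ == "__main__":
    for lat, lon, quality in read_gga():
        print(f"lat={lat:.8f} lon={lon:.8f} fix_quality={quality}")
```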
Pose and Illumination Invariant Hybrid Feature Extraction for Newborn
Authors: Rishav Singh, Ritika Singh, Aakriti Acharya, Shrikant Tiwari and Hari Om
Background: Abduction, swapping and mix-ups are unfortunate events that can happen to newborns on hospital premises, and medical personnel find it difficult to curb such incidents. Accurate patient identification (ID) is essential for patient safety, especially with our smallest and most vulnerable paediatric patients. The level of security is a crucial issue in the maternity ward, and the problem of missing and swapped newborns is of prime concern to the persons involved and affected. There is a common perception in society that nothing can be done to prevent this tragedy. Compared to developed nations, developing countries face more challenges because of overcrowding and scarcity of medical facilities in hospitals. The face of a newborn baby changes appearance under different lighting conditions in spite of various contrast-enhancement techniques. Images captured under similar environmental conditions (inter-class) may show more differences than intra-class images captured under different illumination conditions, which gives rise to misclassification. Objective: The main objective of this paper is to perform newborn face recognition using a hybrid approach combining Speeded-Up Robust Features (SURF) and Local Binary Patterns (LBP). Methods: The scientific contributions of the proposed work are: • face recognition of newborns in a semi-controlled environment; • overcoming the pose and illumination challenge; • use of a single gallery image; • a hybrid approach to improve the results. Results: The average rank-1 accuracy of the proposed method is 93.65%, whereas that of the existing algorithms is considerably lower. At rank 1, the proposed technique is 12.8% more accurate than LBP, 10.5% more accurate than SURF, 12.8% more accurate than LDA and 18.1% more accurate than PCA. Conclusion: In this paper, a new semi-supervised technique is used to demonstrate the improved performance of the newborn face recognition system under different illumination and pose conditions.
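The following minimal sketch illustrates only the LBP half of such a hybrid descriptor: a uniform LBP histogram per face and a chi-square rank-1 match against a single-image gallery. The patch parameters and distance measure are assumptions; in the paper's hybrid approach, SURF features would be fused with this representation.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray_face, points=8, radius=1):
    """Uniform LBP histogram of a grayscale face image (parameters are illustrative)."""
    lbp = local_binary_pattern(gray_face, points, radius, method="uniform")
    n_bins = points + 2  # uniform patterns plus one "non-uniform" bin
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

def match_rank1(probe, gallery):
    """Return the gallery identity whose LBP histogram is closest (chi-square distance)."""
    def chi2(p, q):
        return 0.5 * np.sum((p - q) ** 2 / (p + q + 1e-10))
    distances = {name: chi2(probe, hist) for name, hist in gallery.items()}
    return min(distances, key=distances.get)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Random 64x64 images stand in for the single gallery image per newborn.
    gallery = {f"baby_{i}": lbp_histogram(rng.integers(0, 256, (64, 64), dtype=np.uint8))
               for i in range(3)}
    probe = lbp_histogram(rng.integers(0, 256, (64, 64), dtype=np.uint8))
    print("Rank-1 match:", match_rank1(probe, gallery))
```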
Intuitionistic Fuzzy Shapley-TOPSIS Method for Multi-Criteria Decision Making Problems Based on Information Measures
Authors: Reetu Kumari, Arunodaya R. Mishra and Deelip K. Sharma
Aims & Background: Cloud Computing (CC) offers uniquely scalable and always-available services to the user. As service provision is an important part of the cloud computing concept, the proper choice of the desired service according to the user's needs is of the utmost relevance. Various organizations, such as Microsoft and Facebook, have made significant investments in CC and currently offer services with top levels of reliability. A well-organized and precise evaluation of cloud-based communication networks is an essential step in assuring both business continuity and continuously open services. However, given the vast diversity of cloud services, selecting a suitable cloud service is a very challenging task for a user in an unpredictable environment. Due to the multidimensional criteria of Quality of Service (QoS), cloud service selection problems are treated as Multiple Criteria Decision-Making (MCDM) problems. Objectives & Methodology: In the present paper, an MCDM method named the Shapley-TOPSIS method, an extension of the classical TOPSIS method, is developed for cloud service selection. Thereafter, new divergence measures for intuitionistic fuzzy sets (IFSs) with multiple parameters are studied, and interesting properties relating the developed divergence and entropy measures are derived. The criterion weights are then ascertained by the Shapley function via entropy and fuzzy-measure approaches. Next, Shapley divergence measures are applied to calculate the closeness coefficient of each alternative. Finally, a cloud service selection problem is presented to show the applicability of the new method, which is also compared with some existing methods. Results & Conclusion: A decision-making problem of a cloud computing service provider has been considered to illustrate the proposed intuitionistic fuzzy Shapley-TOPSIS method; the outcomes coincide with those of already developed methods, which confirms the soundness of the proposed method. In the future, we plan to apply the proposed approach to different multi-criteria decision-making problems and to extend it to other uncertain environments using other decision-making methods.
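For orientation, the sketch below implements the crisp, classical TOPSIS closeness coefficient that the Shapley extension builds upon; the intuitionistic fuzzy divergence and Shapley-weighting steps of the proposed method are not reproduced. The ratings, weights and criterion types are invented for illustration.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Classical TOPSIS: rank alternatives by closeness to the ideal solution.

    decision_matrix: alternatives x criteria; weights: criterion weights;
    benefit: True for benefit criteria, False for cost criteria.
    """
    X = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    V = w * X / np.linalg.norm(X, axis=0)
    # Positive and negative ideal solutions per criterion.
    pis = np.where(benefit, V.max(axis=0), V.min(axis=0))
    nis = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - pis, axis=1)
    d_neg = np.linalg.norm(V - nis, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness coefficient, higher is better

if __name__ == "__main__":
    # Four cloud services rated on availability, throughput (benefit) and cost (cost).
    ratings = [[0.99, 120, 30], [0.95, 200, 45], [0.97, 150, 25], [0.90, 180, 20]]
    scores = topsis(ratings, weights=[0.5, 0.3, 0.2], benefit=[True, True, False])
    print("Closeness coefficients:", np.round(scores, 3))
```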
A Novel and Secure Hybrid iWD-MASK Algorithm for Enhanced Image Security
Authors: Righa Tandon and P.K. Gupta
Background: Most prior work has used DES, AES, and MAES for key generation in image encryption. These algorithms give better results for text-based encryption than for digital image encryption, and they are less secure and efficient for images. A new hybrid, chaos-based approach has therefore been introduced for digital image encryption; it is designed for image data, and the obtained results outperform previous encryption results. Objective: To recover the original image after decryption at the receiver's end. Methods: The Matrix Array Symmetric Key (MASK) algorithm is used for key generation, in which the same key is used for the encryption and decryption processes. The total number of iterations must be fixed: 10, 12 or 14 iterations for 128-, 192- and 256-bit key lengths, respectively. Once the key is generated, the iWD algorithm is applied for encryption at the sender's end, and the same key has to be used for decryption at the receiver's end; otherwise, the system stops functioning and reports an error. Results & Conclusion: The proposed hybrid concept of MASK with the iWD algorithm is implemented on different sets of images. The obtained results are satisfactory when the proposed approach is compared with other existing approaches on parameters such as precision, recall and F-measure. In future work, the proposed algorithm can also be applied to other types of cryptography, such as text and video sharing.
A Fully Convolutional Neural Network for Recognition of Diabetic Retinopathy in Fundus Images
Authors: Manaswini Jena, Smita P. Mishra and Debahuti Mishra
Background: Diabetic retinopathy is one of the complications of diabetes and a major cause of vision loss worldwide, arising from prolonged diabetes. Several technical approaches have been proposed for the automatic detection of diabetic retinopathy from fundus images. The spatial arrangement of units in a convolutional neural network makes it particularly suitable for processing visual information; convolutional neural networks are at their peak of development, and the best results can be obtained by proper use of the technique. Local connectivity, parameter sharing and pooling of hidden units are advantageous for various prediction tasks. Objective: The objective of this paper is to design a model for the classification of diabetic retinopathy. Methods: A fully convolutional neural network model is developed to classify diseased and healthy fundus images. The proposed network consists of six convolutional layers with rectified linear unit activations and max-pooling layers. The absence of a fully connected layer reduces the computational complexity of the model, which therefore trains faster than traditional convolutional neural network models. Results and Conclusion: The proposed model is validated by training it on a publicly available high-resolution fundus image database. The model is also compared with various existing state-of-the-art methods and shows competitive results. A behavioural study of different parameters of the network model is presented. The intelligence of the model lies in its ability to re-tune weights to overcome outliers encountered in the future. The proposed model works well, with satisfactory performance.
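A minimal Keras sketch of such a fully convolutional classifier is shown below: six convolutional layers with ReLU and max pooling, with a 1x1 convolution and global average pooling in place of a fully connected head. The filter counts and input size are assumptions, not the configuration reported in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_fcn(input_shape=(256, 256, 3)):
    """Fully convolutional binary classifier: six conv blocks, no dense layer."""
    model = models.Sequential([layers.Input(shape=input_shape)])
    for filters in (16, 32, 64, 64, 128, 128):       # six convolutional layers
        model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
        model.add(layers.MaxPooling2D(2))
    # 1x1 convolution + global average pooling replaces the fully connected head.
    model.add(layers.Conv2D(1, 1))
    model.add(layers.GlobalAveragePooling2D())
    model.add(layers.Activation("sigmoid"))           # healthy vs. diseased fundus
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

if __name__ == "__main__":
    build_fcn().summary()
```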
Threshold Detection Scheme Based on Parametric Distribution Fitting for Optical Fiber Channels
Authors: Mohammed Usman, Mohd Wajid, Muhammad Z. Shamim, Mohd D. Ansari and Vinit K. Gunjan
Background: When data is transmitted by encoding it in the amplitude of the transmitted signal, as in Amplitude Shift Keying, the receiver makes a decision on the transmitted data by observing the amplitude of the received signal. Objective: Depending on the level of the received signal, the receiver has to decode the received data with minimum probability of error; for binary transmission, this is achieved by choosing a decision threshold. Methods: A threshold detection mechanism that fits a parametric probability distribution to the received signal is evaluated in this paper. Results: Building on our earlier work on visible light communication, the idea is extended to a fiber-optic channel, where the threshold value obtained by fitting a Rayleigh distribution to the received data yields an error probability approaching zero. Conclusion: The proposed threshold estimation method is found to work well for both equal and unequal a priori probabilities of the binary symbols, and it adapts when the a priori probabilities of 1s and 0s change.
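One plausible reading of the scheme is sketched below: Rayleigh distributions are fitted to received amplitudes of the two binary symbols and the decision threshold is placed where the prior-weighted densities cross. The simulated levels, noise scale and priors are assumptions; the paper's exact decision rule may differ.

```python
import numpy as np
from scipy.stats import rayleigh

def fit_threshold(samples0, samples1, p0=0.5, p1=0.5):
    """Fit Rayleigh distributions to the received levels of '0' and '1' symbols
    and pick the amplitude where the prior-weighted densities cross."""
    loc0, scale0 = rayleigh.fit(samples0)
    loc1, scale1 = rayleigh.fit(samples1)
    x = np.linspace(min(samples0.min(), samples1.min()),
                    max(samples0.max(), samples1.max()), 2000)
    diff = p0 * rayleigh.pdf(x, loc0, scale0) - p1 * rayleigh.pdf(x, loc1, scale1)
    crossings = np.where(np.diff(np.sign(diff)) != 0)[0]
    # Fall back to the midpoint of the two means if the densities never cross.
    return x[crossings[-1]] if crossings.size else 0.5 * (samples0.mean() + samples1.mean())

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    rx0 = rayleigh.rvs(scale=0.3, size=5000, random_state=rng)        # received '0' level
    rx1 = 1.0 + rayleigh.rvs(scale=0.3, size=5000, random_state=rng)  # received '1' level
    thr = fit_threshold(rx0, rx1, p0=0.7, p1=0.3)  # unequal a priori probabilities
    errors = np.sum(rx0 > thr) + np.sum(rx1 <= thr)
    print(f"threshold={thr:.3f}, empirical errors={errors} of 10000 symbols")
```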
A Fast Method for Defogging of Outdoor Visual Images
Background: Capturing images in severe atmospheric conditions, especially fog, critically degrades image quality and reduces visibility, which in turn affects several computer vision applications such as visual surveillance, intelligent vehicles and remote sensing. Acquiring a clear view is thus the prime requirement of any image. In the last few years, many approaches have been directed towards solving this problem. Methods: In this article, a comparative analysis of different existing image-defogging algorithms is made, and a technique for image defogging based on the dark channel prior strategy is proposed. Results: Experimental results show that the proposed method performs efficiently, significantly improving the visual quality of images captured in foggy weather. The much higher computational time of existing techniques is also reduced by the proposed method. Discussion: A qualitative assessment was performed on both benchmark and real-time data sets to determine the efficacy of the technique. Finally, the work concludes with the relative advantages and shortcomings of the proposed technique.
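For reference, a simplified dark-channel-prior dehazing pass is sketched below; the patch size, omega and transmission floor are typical values assumed for illustration, and refinements such as guided filtering of the transmission map are omitted.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    """Dark channel: per-pixel minimum over the RGB channels and a local patch."""
    return minimum_filter(img.min(axis=2), size=patch)

def defog(img, patch=15, omega=0.95, t0=0.1):
    """Simplified dark-channel-prior dehazing (constants are typical, assumed values)."""
    dark = dark_channel(img, patch)
    # Atmospheric light: mean colour of the brightest 0.1% dark-channel pixels.
    flat = dark.ravel()
    idx = np.argsort(flat)[-max(1, flat.size // 1000):]
    A = img.reshape(-1, 3)[idx].mean(axis=0)
    # Transmission estimate and scene radiance recovery.
    t = 1.0 - omega * dark_channel(img / A, patch)
    t = np.clip(t, t0, 1.0)[..., None]
    return np.clip((img - A) / t + A, 0.0, 1.0)

if __name__ == "__main__":
    foggy = np.clip(np.random.rand(120, 160, 3) * 0.3 + 0.6, 0, 1)  # synthetic hazy frame
    clear = defog(foggy)
    print("output range:", clear.min().round(3), clear.max().round(3))
```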
Improving Scalability, Sparsity and Cold Start User Issues in Collaborative Tagging with Incremental Clustering and Trust
Authors: Latha Banda and Karan Singh
Background: Due to the huge amount of data on web sites, manually recommending every product to users is impossible; Recommender Systems (RS) were introduced for this problem. RS are categorized into Content-Based (CB), Collaborative Filtering (CF) and hybrid RS, and recommendations are made to users based on these techniques. Among these, CF is the most recent technique used in RS, in which a tagging feature is also provided. Objective: Three main issues occur in RS: the scalability problem, which occurs when there is a huge amount of data; the sparsity problem, which occurs when rating data is missing; and the cold-start user or item problem, which occurs when a new user or item enters the system. To avoid these issues, we propose incremental clustering and trust in collaborative tagging. Methods: We propose Collaborative Tagging (CT) with incremental clustering and trust, which enhances recommendation quality by removing the scalability issue with the help of incremental clustering, while the sparsity and cold-start user or item problems are resolved with the help of trust. Results: We compare the results of collaborative tagging with incremental clustering and trust (CFT-EDIC-TS) against the baseline approaches of CT with cosine similarity (CFT-CS), CT with Euclidean distance and incremental clustering (CFT-EDIC) and CT with trust (CFT-TS). Conclusion: The proposed approach is compared with the baseline approaches using the metrics MAE, prediction percentage, precision and recall. On these metrics, for every split, CFT-EDIC-TS shows the best results compared to the other baseline approaches.
A Method for Webpage Classification Based on URL Using Clustering
Authors: Sunita, Gurvinder Singh and Vijay Rana
Background: Pattern mining is the mechanism of extracting useful information from a large dataset. Within web mining, noisy data extraction from user queries is considered along with redundancy handling; the redundancy-handling mechanism employed in the existing literature is known as ambiguity handling. The clustering mechanisms employed in existing systems include k-means and semantic search, against the backdrop of the incremental growth of the internet. Aims: The proposed work comprises an analysis of techniques used to extract useful URLs and to replace noisy data. Methods: We consider noisy data extraction from user queries together with redundancy (ambiguity) handling. The clustering mechanisms used in existing systems, k-means and semantic search, are static, causing performance degradation in terms of execution time; this work therefore suggests a performance-improvement mechanism. Results: The MPV (Most Probable Values) clustering and N-gram techniques considered in the existing literature can be further improved using the research methodology specified in this work. Conclusion: In the proposed system, results are based on MPV clustering with N-gram techniques. The N-gram analysis examines the instances of a word or phrase across all query data. The evaluation reports results in terms of execution time and the number of URLs retrieved for webpage classification.
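As a small illustration of clustering URLs for page classification, the sketch below builds character n-gram features from URLs and groups them with k-means; the URLs, n-gram range and number of clusters are invented for illustration, and the MPV clustering step of the proposed system is not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# A handful of made-up URLs standing in for a crawled query log.
urls = [
    "http://example.com/sports/cricket/live-scores",
    "http://example.com/sports/football/fixtures",
    "http://example.com/finance/markets/stock-quotes",
    "http://example.com/finance/banking/interest-rates",
    "http://example.com/health/nutrition/diet-plans",
    "http://example.com/health/fitness/workouts",
]

# Character n-grams tolerate the token noise typical of URLs (slashes, hyphens, ids).
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
features = vectorizer.fit_transform(urls)

# Cluster URLs into candidate page classes; k is an assumption, not a tuned value.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
for url, label in zip(urls, kmeans.labels_):
    print(label, url)
```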
Performance Analysis of TCP Variants Using AODV and DSDV Routing Protocols in MANETs
Authors: Rajnesh Singh, Neeta Singh and Aarti G. Dinker
Background: TCP is the most reliable transport layer protocol, providing reliable data delivery from source to destination node. TCP is known to work well in wired networks, but it is generally considered less suitable for ad-hoc networks; however, TCP can be modified to improve its performance in such networks. Various researchers have proposed improved variants of TCP but have evaluated them using only one or two measures. Objective: One or two measures are not sufficient for a proper analysis of improved versions of TCP. Therefore, our objective in this paper is to evaluate the performance of different TCP variants, and their results are investigated with the DSDV and AODV routing protocols. Methods: Our methodology includes the analysis of several performance measures: throughput, delay, packet drop, packet delivery ratio and the number of acknowledgements. The simulations are carried out by varying the number of nodes in the network simulator tool NS2. Results: It is observed that TCP Newreno achieves higher throughput and packet delivery ratio with both the AODV and DSDV routing protocols, whereas TCP Vegas achieves minimum delay and packet loss with both protocols. TCP SACK requires the minimum number of acknowledgements with both AODV and DSDV. Conclusion: The comparison of these TCP variants leads to the conclusion that TCP Newreno provides better performance with both the AODV and DSDV protocols.
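The performance measures themselves are straightforward to compute once send and receive events are extracted from a simulation trace; a minimal sketch is given below, with the record layout and packet size assumed rather than taken from an actual NS2 trace file.

```python
def link_metrics(sent, received, packet_size_bits=8 * 1040):
    """Compute throughput, packet delivery ratio and mean delay from trace records.

    `sent` and `received` map packet id -> timestamp in seconds; the packet size
    and the record layout are assumptions standing in for a parsed NS2 trace.
    """
    delivered = set(sent) & set(received)
    duration = max(received.values()) - min(sent.values())
    throughput_bps = len(delivered) * packet_size_bits / duration
    pdr = len(delivered) / len(sent)
    mean_delay = sum(received[p] - sent[p] for p in delivered) / len(delivered)
    return throughput_bps, pdr, mean_delay

if __name__ == "__main__":
    sent = {1: 0.00, 2: 0.05, 3: 0.10, 4: 0.15}
    received = {1: 0.03, 2: 0.09, 4: 0.21}          # packet 3 was dropped
    tput, pdr, delay = link_metrics(sent, received)
    print(f"throughput={tput:.0f} bps, PDR={pdr:.2f}, mean delay={delay*1000:.1f} ms")
```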
Logistic Step Size Based Artificial Bee Colony Algorithm for Optimum Induction Motor Speed Control
Authors: Fani B. Sharma and Shashi R. Kapoor
Background: Disruption of the parameters of an Induction Motor (IM) is quite frequent, and in the case of substantial parameter disruption, the IM becomes unreliable. Methods: To deal with this complication, Proportional-Integral (PI) controllers are utilized. The selection of PI controller parameters is process dependent, and an inappropriate choice of controller settings may lead to instability. Nowadays, the tuning of PI controller parameters for an IM is performed using a prominent swarm intelligence algorithm, namely the Artificial Bee Colony (ABC) algorithm. Further, the ABC algorithm is amended, drawing inspiration from the mathematical logistic map. Results and Conclusion: The proposed algorithm, titled the logistic ABC algorithm, is applied to PI controller tuning of the IM. The outcomes reveal that the proposed algorithm performs better than other state-of-the-art strategies available in the literature.
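A compact sketch of the idea is given below: a standard ABC loop (onlooker-bee selection omitted for brevity) in which the perturbation step size is driven by a chaotic logistic map, applied to a stand-in cost function for a (Kp, Ki) pair. The real tuning cost would come from simulating the induction motor's speed response; all constants here are assumptions.

```python
import numpy as np

def cost(params):
    """Stand-in objective for PI tuning: distance from a 'good' (Kp, Ki) pair."""
    kp, ki = params
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2

def logistic_abc(n_food=20, iters=100, limit=10, bounds=(0.0, 5.0), r=4.0, seed=0):
    """Basic ABC where the perturbation step size follows a chaotic logistic map."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    foods = rng.uniform(lo, hi, size=(n_food, 2))
    fits = np.array([cost(f) for f in foods])
    trials = np.zeros(n_food, dtype=int)
    z = 0.7                                   # logistic-map state in (0, 1)
    for _ in range(iters):
        for i in range(n_food):
            z = r * z * (1.0 - z)             # chaotic step-size factor
            k = rng.integers(n_food)
            d = rng.integers(2)
            candidate = foods[i].copy()
            candidate[d] += (2.0 * z - 1.0) * (foods[i, d] - foods[k, d])
            candidate = np.clip(candidate, lo, hi)
            c = cost(candidate)
            if c < fits[i]:                   # greedy selection, reset trial counter
                foods[i], fits[i], trials[i] = candidate, c, 0
            else:
                trials[i] += 1
            if trials[i] > limit:             # scout phase: abandon exhausted source
                foods[i] = rng.uniform(lo, hi, 2)
                fits[i], trials[i] = cost(foods[i]), 0
    best = fits.argmin()
    return foods[best], fits[best]

if __name__ == "__main__":
    (kp, ki), best_cost = logistic_abc()
    print(f"tuned Kp={kp:.3f}, Ki={ki:.3f}, cost={best_cost:.4f}")
```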
A Hybrid Protocol Using Fuzzy Logic and Rough Set Theory for Target Coverage
Authors: Pooja Chaturvedi and Ajai K. Daniel
Background: Target coverage is a significant problem in wireless sensor networks; it aims at monitoring a given set of targets with the required confidence level so that network lifetime can be enhanced while respecting resource constraints. Objective: To maximize the lifetime of the sensor network and minimize the overhead involved in the scheduling approach, such that the pre-specified set of targets is monitored for a longer duration with the required confidence level. Methods: The paper uses a fuzzy logic system based on Mamdani inference in which a node's decision to remain in the active state is determined on the basis of coverage probability, trust values and node contribution. The rule set for determining the set cover is optimized using rough set theory, which determines node validity for the trust calculation. Results: The results show that the proposed approach improves network performance in terms of processing time, throughput and energy conservation by 50%, 74% and 74%, respectively, compared to existing approaches. Conclusion: The paper proposes a scheduling strategy for nodes in target coverage as an enhancement of the Energy Efficient Coverage Protocol (EECP) based on rough set theory. Optimizing the rule set for determining the set cover with rough set theory improves network performance in terms of processing time, throughput and energy consumption.
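A toy sketch of Mamdani-style min-max inference over the three inputs (coverage probability, trust, contribution) with centroid defuzzification is given below; the membership functions and the two rules are invented for illustration and do not reproduce the protocol's rule base or its rough-set reduction.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function (a < b < c assumed)."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Discretized output universe for the activation decision, with 'low' / 'high' sets.
u = np.linspace(0.0, 1.0, 101)
act_low = tri(u, -0.6, 0.0, 0.6)
act_high = tri(u, 0.4, 1.0, 1.6)

def node_activation(coverage, trust, contribution):
    """Mamdani-style min-max inference with centroid defuzzification."""
    def high(v):  # membership of an input value in its 'high' fuzzy set on [0, 1]
        return float(tri(np.asarray(v, dtype=float), 0.3, 1.0, 1.7))
    cov_hi, trust_hi, contr_hi = high(coverage), high(trust), high(contribution)
    # Rule 1: coverage high AND trust high AND contribution high -> stay active.
    r1 = min(cov_hi, trust_hi, contr_hi)
    # Rule 2: coverage low OR trust low -> go to sleep.
    r2 = max(1.0 - cov_hi, 1.0 - trust_hi)
    aggregated = np.maximum(np.minimum(r1, act_high), np.minimum(r2, act_low))
    return float((aggregated * u).sum() / (aggregated.sum() + 1e-12))

if __name__ == "__main__":
    print("activation score:", round(node_activation(0.9, 0.8, 0.7), 3))
    print("activation score:", round(node_activation(0.2, 0.4, 0.9), 3))
```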
A Hybrid Technique for Selection and Minimization of Test Cases in Regression Testing
Authors: Leena Singh, Shailendra N. Singh and Sudhir Dawra
Background: In today's era, modification of software is a common customer requirement. When changes are made to existing software, re-running all test cases is required to ensure that the newly introduced changes have no unwanted effect on the behaviour of the software. However, re-testing all test cases would be both time consuming and expensive, so a technique is needed that reduces the number of tests to be performed. Regression test selection is one such method; it seeks to identify the test cases that are relevant to a set of recent changes. Objective: Most studies have used different selection techniques and have focused on only one parameter to reduce test suite size without compromising the performance of regression testing. To the best of our knowledge, no study has considered two or more parameters, such as coverage and/or execution time, in a single testing cycle. This paper presents a hybrid technique that combines regression test selection using a slicing technique with test case minimization using a modified firefly algorithm, taking both coverage and execution time into account in a single testing cycle. Methods: A hybrid technique is described that combines selection and minimization. Selection of test cases is based on a slicing technique, while minimization is done using the firefly algorithm. The hybrid technique selects and minimizes the test suite using information on statement coverage and execution time. Results: The proposed technique gives a 43.33% better result than the other hybrid approach in terms of a significantly reduced number of test cases, and the resulting test cases were effective enough to cover 100% of the statements for all the programs. The proposed technique was also tested on four different programs, namely Quadratic, Triangle, Next Day and Commission, for test suite selection and minimization, and gave comparatively superior results in terms of the percentage reduction in the number of test cases required for testing. Conclusion: The combination of parameters used in the slicing-based approach reduces the number of test cases, making software testing economical, feasible and time saving without introducing faults in the source code. The proposed technique can be used by software practitioners and experts to reduce the time, effort and resources needed for selection and minimization of test cases.
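As a simple stand-in for the minimization objective (full statement coverage at low execution time), the sketch below greedily picks test cases by newly covered statements per unit execution time; the paper's slicing-based selection and firefly search are not reproduced, and the coverage data are hypothetical.

```python
def minimize_suite(coverage, exec_time):
    """Greedily pick test cases that maximize newly covered statements per second
    until all statements reachable by the suite are covered."""
    remaining = set().union(*coverage.values())
    selected = []
    candidates = set(coverage)
    while remaining and candidates:
        # Benefit = newly covered statements per unit execution time.
        best = max(candidates,
                   key=lambda t: len(coverage[t] & remaining) / exec_time[t])
        if not coverage[best] & remaining:
            break
        selected.append(best)
        remaining -= coverage[best]
        candidates.remove(best)
    return selected

if __name__ == "__main__":
    # Hypothetical statement coverage and execution times for five test cases.
    coverage = {"t1": {1, 2, 3}, "t2": {3, 4}, "t3": {4, 5, 6, 7},
                "t4": {1, 5}, "t5": {6, 7, 8}}
    exec_time = {"t1": 1.0, "t2": 0.5, "t3": 2.0, "t4": 0.4, "t5": 1.0}
    print("minimized suite:", minimize_suite(coverage, exec_time))
```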
A Novel and Secure Image Encryption Scheme Based on Chaotic Logistic Map and Random Scrambling Using a Special Parameter of Gray Code Transformation
Authors: Sudeept S. Yadav and Yashpal Singh
Background: Image security is a major concern in digital communication media, and many symmetric and asymmetric algorithms are available for transmitting images securely. Among symmetric-key algorithms, many are based on bit-plane decomposition and random scrambling using chaotic maps. Objective: A secure image encryption algorithm is proposed based on the combination of the (n, k, p) gray code and a chaotic logistic map, to improve the performance of decomposition-based encryption methods. Methods: We suggest a new algorithm that improves the security of image encryption based on the (n, k, p) gray code and the bit-plane decomposition method. The new encryption technique uses bit-plane decomposition with the (n, k, p) gray code, then random scrambling of each bit plane, and finally a chaotic logistic map applied through a pixel-substitution method. Results: The performance and security of the proposed algorithm have been analyzed on different evaluation metrics such as MSE, PSNR, UACI, NPCR and the correlation coefficient. The experimental results show that the proposed algorithm is secure and provides better results than the XOR-based substitution method. Conclusion: This new method can be utilized in imaging systems such as medical tomography, tele-medicine, digital video and video conferencing in communication systems. Further, we will study and examine the use of the (n, k, p) gray code for data hiding and image de-noising.
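A minimal sketch of the chaotic-logistic-map part of such a scheme is shown below: the map drives both a pixel-position permutation and a byte-substitution keystream, with (x0, r) acting as the key. The (n, k, p) gray-code bit-plane decomposition is not reproduced, and all constants are illustrative assumptions.

```python
import numpy as np

def logistic_keystream(length, x0=0.3141, r=3.99):
    """Chaotic logistic-map sequence; (x0, r) act as the secret key (r near 4)."""
    x, out = x0, np.empty(length)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

def encrypt(img, x0=0.3141, r=3.99):
    """Scramble pixel positions and substitute pixel values with a chaotic keystream."""
    flat = img.ravel()
    ks = logistic_keystream(flat.size, x0, r)
    perm = np.argsort(ks)                          # chaotic permutation of positions
    key_bytes = (ks * 255).astype(np.uint8)        # chaotic substitution bytes
    return (flat[perm] ^ key_bytes).reshape(img.shape)

def decrypt(cipher, x0=0.3141, r=3.99):
    ks = logistic_keystream(cipher.size, x0, r)
    perm = np.argsort(ks)
    flat = cipher.ravel() ^ (ks * 255).astype(np.uint8)
    out = np.empty_like(flat)
    out[perm] = flat                               # undo the chaotic permutation
    return out.reshape(cipher.shape)

if __name__ == "__main__":
    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    cipher = encrypt(img)
    assert np.array_equal(decrypt(cipher), img)
    print("round-trip OK; plain/cipher correlation:",
          round(float(np.corrcoef(img.ravel(), cipher.ravel())[0, 1]), 4))
```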
An Un-Supervised Approach for Backorder Prediction Using Deep Autoencoder
Authors: Gunjan Saraogi, Deepa Gupta, Lavanya Sharma and Ajay Rana
Background: Backorders are a common anomaly affecting the supply chain and logistics, sales, customer service, and manufacturing, and they often lead to low sales and low customer satisfaction. A predictive model can identify which products are most likely to experience backorders, giving the organization information and time to adjust and thereby act to maximize its profit. Objective: To address the issue of predicting backorders, this paper proposes an un-supervised approach to backorder prediction using a deep autoencoder. Methods: In this paper, artificial intelligence paradigms are researched in order to introduce a predictive model for the present imbalanced-data problem, where the number of products going on backorder is rare. Results: Un-supervised anomaly detection using deep autoencoders shows a better area under the Receiver Operating Characteristic curve and better precision-recall curves than supervised classification techniques employed with resampling techniques for imbalanced data problems. Conclusion: We demonstrated that un-supervised anomaly detection methods, specifically deep autoencoders, can be used to learn a good representation of the data. The method can be used as a predictive model for inventory management and helps to reduce the bullwhip effect, raise customer satisfaction and improve operational management in the organization. This technology is expected to create the sentient supply chain of the future, able to feel, perceive and react to situations at an extraordinarily granular level.
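A minimal sketch of the idea is given below: a small dense autoencoder is trained only on the majority (non-backorder) class, and reconstruction error is used as the anomaly score. The layer widths, feature count and synthetic data are assumptions, not the paper's architecture or dataset.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_autoencoder(n_features):
    """Small dense autoencoder; layer widths are illustrative assumptions."""
    inp = layers.Input(shape=(n_features,))
    encoded = layers.Dense(8, activation="relu")(inp)
    encoded = layers.Dense(3, activation="relu")(encoded)      # bottleneck
    decoded = layers.Dense(8, activation="relu")(encoded)
    decoded = layers.Dense(n_features, activation="linear")(decoded)
    model = models.Model(inp, decoded)
    model.compile(optimizer="adam", loss="mse")
    return model

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Train only on the majority class (products that did NOT go on backorder).
    normal = rng.normal(size=(2000, 12)).astype("float32")
    ae = build_autoencoder(12)
    ae.fit(normal, normal, epochs=5, batch_size=64, verbose=0)
    # Score new items by reconstruction error; high error flags likely backorders.
    test = np.vstack([rng.normal(size=(5, 12)),
                      rng.normal(3.0, 1.0, size=(5, 12))]).astype("float32")
    errors = np.mean((test - ae.predict(test, verbose=0)) ** 2, axis=1)
    print(np.round(errors, 3))
```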
A Hybrid Combination of LS-SVM and KPCA with Bat Algorithm for Intrusion Detection
Authors: Thiruppathy K. V. and Loheswaran K.
Background: An Intrusion Detection System is software designed to secure a network or an information system by automatically alerting administrators when someone tries to compromise the system through malicious activities. An intrusion is a process of causing a system to enter an insecure state; an intruder is thus a person or attacker who attempts to violate security measures by interrupting the integrity and confidentiality of a system. Objective: Security threats are becoming a big issue with high-speed network connections, and one solution is an intrusion detection system that promises to provide network security. This paper proposes a Least Squares Support Vector Machine (LS-SVM) based on the Bat Algorithm (BA) for intrusion detection. Methods: The proposed technique is divided into two phases. In the first phase, KPCA is utilized as a preprocessing step for the LS-SVM to decrease the dimension of the feature vectors and shorten training time, in order to reduce the noise caused by feature differences and enhance the performance of the LS-SVM. In the second phase, the least squares support vector machine with BA is applied for detection classification. BA uses automatic zooming to balance exploration and exploitation during the search process. Finally, given the optimal feature subset, the feature weights and the parameters of the LS-SVM are optimized at the same time. Results: The proposed algorithm is named Kernel Principal Component Analysis based Least Squares Support Vector Machine with Bat Algorithm (KPCA-BA-LS-SVM). To show the adequacy of the proposed method, experiments are carried out on the KDD'99 dataset, which is viewed as an accepted benchmark for assessing the performance of intrusion detection. Furthermore, the proposed hybridization achieves reasonable performance in terms of precision and efficiency. Conclusion: A BA model with a novel hybrid KPCA-LS-SVM was proposed in this paper for an intrusion detection system. The parameters of the LS-SVM classifier are chosen using BA, the essential features of the intrusion detection data are extracted using KPCA, and the multi-layer SVM classifier checks whether any activity constitutes an attack. The N-RBF supports the Gaussian kernel by reducing the training time and improving the performance of the LS-SVM classifier. As future work, further algorithms should be developed by combining other available classification methods with kernel techniques, resulting in efficient planning of examination and online intrusion detection.
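Since LS-SVM is not available in scikit-learn, the sketch below hand-rolls its closed-form training (a single linear system) after a KernelPCA reduction; the bat-algorithm search over (gamma, sigma) is not reproduced, and the toy data and hyperparameters are assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.metrics.pairwise import rbf_kernel

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Closed-form LS-SVM training: solve the KKT linear system (y in {-1, +1})."""
    K = rbf_kernel(X, X, gamma=1.0 / (2 * sigma ** 2))
    n = len(y)
    Omega = np.outer(y, y) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = y, y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]                     # bias b, support values alpha

def lssvm_predict(X_train, y, b, alpha, X_test, sigma=1.0):
    K = rbf_kernel(X_test, X_train, gamma=1.0 / (2 * sigma ** 2))
    return np.sign(K @ (alpha * y) + b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in for KDD'99 records: two classes in 20 dimensions.
    X = np.vstack([rng.normal(0, 1, (100, 20)), rng.normal(1.5, 1, (100, 20))])
    y = np.concatenate([-np.ones(100), np.ones(100)])
    # Phase 1: KPCA reduces feature dimensionality before classification.
    Z = KernelPCA(n_components=5, kernel="rbf", gamma=0.05).fit_transform(X)
    # Phase 2: LS-SVM classification; (gamma, sigma) would be tuned by the bat algorithm.
    b, alpha = lssvm_train(Z, y, gamma=10.0, sigma=1.0)
    preds = lssvm_predict(Z, y, b, alpha, Z, sigma=1.0)
    print("training accuracy:", float((preds == y).mean()))
```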