Recent Advances in Electrical & Electronic Engineering - Volume 18, Issue 2, 2025
- Thematic Issue: Artificial Intelligence Models, Tools, and Applications Using Machine Learning Techniques
-
Blockchain Security Attacks, Difficulty, and Prevention
Authors: Amrita Jyoti, Vikash Yadav and Mayur Rahul
Blockchain technology is increasingly attracting young people because it is well suited to the digital age. The blockchain concept relies on a decentralised data management system to store and share data and transactions throughout the network. This study investigates various types of risks associated with blockchain technology. The research covers different aspects of blockchain, including the architecture, consensus mechanism, smart contracts, and underlying cryptographic algorithms. It also examines the risks associated with the adoption and implementation of blockchain in various industries, such as finance, healthcare, and supply chain management.
Moreover, this study identifies several types of risks, including technical risks, such as scalability, interoperability, and security, as well as non-technical risks, such as regulatory compliance, legal liability, and governance issues. This study also discusses the potential impact of these risks on blockchain-based systems and the strategies that can be used to mitigate them.
-
An Evolutionary Review on Resource Scheduling Algorithms Used for Cloud Computing with IoT Network
Authors: Santosh Shakya and Priyanka Tripathi
Cloud computing is a distributed computing paradigm that requires a large pool of resources and aims to share those resources as services delivered over the internet. Task scheduling is a very significant stage in today's cloud computing: the task scheduling method must map tasks to virtual machines while lowering the makespan and cost. Various academics have proposed many scheduling methods for organizing work in cloud computing environments. Scheduling is considered especially important for cloud computing since it directly impacts a system's performance, including the efficiency of resource utilization and running costs. This paper compares the algorithms already in use, which work on different parameters, and offers better solutions for resource allocation and resource scheduling. In this study, various swarm optimization, evolutionary, physical, evolving, and fusion meta-heuristic scheduling methods are categorized according to the environment of the scheduling problem, the main scheduling goal, the task-resource mapping pattern, and the scheduling constraint. The fundamental concepts of cloud task scheduling are also presented in an accessible manner.
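As a concrete illustration of the scheduling objective discussed above, the following minimal Python sketch computes makespan and a simple running cost for a given mapping of tasks to virtual machines; all task lengths, VM speeds, prices, and the assignment itself are hypothetical values chosen only for illustration.

```python
# Minimal sketch of the task-scheduling objective: given a hypothetical
# mapping of tasks to virtual machines, compute makespan and total cost.
task_lengths = {"t1": 40, "t2": 25, "t3": 60, "t4": 15}   # million instructions (assumed)
vm_speed = {"vm1": 10, "vm2": 5}                           # MIPS (assumed)
vm_cost_per_sec = {"vm1": 0.04, "vm2": 0.02}               # $/s (assumed)
assignment = {"t1": "vm1", "t2": "vm2", "t3": "vm1", "t4": "vm2"}

# Execution time of each task on its assigned VM.
exec_time = {t: task_lengths[t] / vm_speed[vm] for t, vm in assignment.items()}

# A VM finishes when all of its tasks finish (sequential execution assumed).
vm_finish = {}
for t, vm in assignment.items():
    vm_finish[vm] = vm_finish.get(vm, 0.0) + exec_time[t]

makespan = max(vm_finish.values())          # completion time of the whole workload
cost = sum(vm_cost_per_sec[vm] * busy for vm, busy in vm_finish.items())
print(f"makespan = {makespan:.1f} s, cost = ${cost:.2f}")
```

A scheduling algorithm in the surveyed families (swarm, evolutionary, hybrid meta-heuristics) searches over such assignments to minimize objectives like these.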
-
Inherent Insights using Systematic Analytics of Development Tools in Ethereum Blockchain Smart Contracts
Authors: Amrita Jyoti, Pradeep Gupta, Sonam Gupta, Harsh Khatter and Anurag Mishra
Ethereum is an open-source, public, blockchain-based distributed computing platform and operating system that allows the development and execution of distributed applications without the risk of downtime, fraud, control, or intervention from a third party. Along with serving as a platform, Ethereum also offers a Turing-complete blockchain programming language that aids in the publication of distributed applications. One of the major Ethereum projects is Microsoft's collaboration with ConsenSys, which provides Ethereum Blockchain as a Service (EBaaS) on Microsoft Azure to give enterprise clients and developers access to a cloud-based blockchain development environment with a single click. The size of the Ethereum blockchain depends on the client implementation: Geth's copy of the Ethereum blockchain is around 11 GB, compared to Parity's 6 GB, although the blockchain in its entirety may exceed 60 GB. Even though the required toolset may vary depending on the specific blockchain, the majority of tools are compatible with Ethereum; we therefore highlight the various development tools used for implementing blockchain applications on the Ethereum platform.
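As an illustration of the kind of development tooling surveyed in this article, the short sketch below uses the web3.py library to connect to a locally running Geth node and read basic chain data; the RPC endpoint, the library choice, and the connectivity check are assumptions for illustration and are not taken from the paper.

```python
# Sketch only: querying a local Geth node with web3.py (assumed to be
# installed and listening on the default HTTP-RPC port 8545).
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))

if w3.is_connected():                      # web3.py v6 API; older releases use isConnected()
    latest = w3.eth.get_block("latest")    # fetch the most recent block
    print("chain id:", w3.eth.chain_id)
    print("latest block:", latest.number, "with", len(latest.transactions), "transactions")
else:
    print("No Ethereum node reachable at the assumed RPC endpoint.")
```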
-
An Efficient Image Captioning Method Based on Beam Search
Authors: Tarun Jaiswal, Manju Pandey and Priyanka Tripathi
Background: An image captioning system is a crucial component in the domains of computer vision and natural language processing. Deep neural networks have been an increasingly popular tool for the generation of descriptive captions for photos in recent years.
Objective: These models, however, frequently suffer from producing captions that are unoriginal and repetitive. Beam search is a well-known search technique that can be used to produce image descriptions effectively and efficiently.
Methods: The algorithm keeps track of a set of partial captions and expands them iteratively by choosing probable next words at each step until a complete caption is generated. The set of partial captions, also known as the beam, is updated at each step based on the predicted probabilities of the next words. This research paper presents an image caption generation system based on beam search. The system is trained on a deep neural network architecture to encode the image data and generate captions.
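To make the beam-search procedure concrete, here is a minimal, framework-agnostic Python sketch built around a hypothetical next-word scoring function `next_word_log_probs` (any trained captioning decoder could supply it); the beam width, token names, and toy distribution are assumptions for illustration, not the paper's implementation.

```python
import math

def beam_search(next_word_log_probs, start_token="<s>", end_token="</s>",
                beam_width=3, max_len=20):
    """Keep the beam_width best partial captions, extend each with its most
    probable next words, and repeat until captions terminate."""
    beam = [([start_token], 0.0)]              # (partial caption, cumulative log-prob)
    completed = []
    for _ in range(max_len):
        candidates = []
        for tokens, score in beam:
            if tokens[-1] == end_token:        # caption already finished
                completed.append((tokens, score))
                continue
            # next_word_log_probs(tokens) -> {word: log_prob}; interface assumed here.
            for word, logp in next_word_log_probs(tokens).items():
                candidates.append((tokens + [word], score + logp))
        if not candidates:                     # every beam entry has finished
            break
        # Keep only the beam_width highest-scoring partial captions.
        beam = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    else:
        completed.extend(beam)                 # length budget exhausted; keep best partials
    return max(completed, key=lambda c: c[1])[0]

# Toy usage with a fixed (hypothetical) next-word distribution, for illustration only.
toy = lambda tokens: {"a": math.log(0.5), "cat": math.log(0.3), "</s>": math.log(0.2)}
print(beam_search(toy))
```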
Results: This architecture brings together the benefits of a CNN and an RNN. The beam search method is then executed to produce the completed captions, resulting in a more diverse and descriptive set of captions than traditional greedy decoding approaches. The experimental outcomes indicate that the suggested system outperforms existing image caption generation techniques in terms of the precision and variety of the generated captions.
Conclusion: This demonstrates the effectiveness of beam search in enhancing the efficiency of image caption generation systems.
-
An Approach to Forecast Quality of Water Effectively Using Machine Learning Algorithms
Authors: Manjusha Nambiar P.V. and Giridhar Urkude
Background: The quality of water directly or indirectly impacts health and environmental well-being. Data about water quality can be evaluated using a Water Quality Index (WQI). Computing the WQI is a quick and affordable technique to accurately summarise the quality of water.
Objective: The objective of this study is to find data-preparation strategies to categorize a water-quality dataset from two remote Indian villages in different geographic locations, to predict water quality, and to identify low-quality water before it is made available for human consumption.
Methods: Four water-quality features crucial for human consumption (Nitrate, pH, Residual Chlorine, and Total Dissolved Solids) are considered to determine water quality. Handling these features involves data pre-processing with min-max normalization, computing the WQI, using feature correlation to identify each parameter's importance to the WQI, and applying supervised machine learning regression models such as Random Forest (RF), Multiple Linear Regression (MLR), Gradient Boosting (GB), and Support Vector Machine (SVM) for WQI prediction. Then, a variety of machine learning classification models, including K-Nearest Neighbour (KNN), Support Vector Classifier (SVC), and Multi-layer Perceptron (MLP), are ensembled with Logistic Regression (LR) acting as a meta-learner to create a stacked ensemble classifier that predicts the Water Quality Class (WQC) more accurately.
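As a sketch of this modelling pipeline, the following Python code combines min-max scaling with a stacked ensemble (KNN, SVC, and MLP as base learners, Logistic Regression as meta-learner) using scikit-learn; the synthetic arrays merely stand in for the villages' water-quality measurements, which are not available here, and hyperparameters are illustrative defaults.

```python
# Minimal sketch (scikit-learn) of the stacked WQC classifier described above.
# The random data is a stand-in for the Nitrate / pH / Residual Chlorine / TDS features.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import StackingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((300, 4))                       # 4 features (synthetic placeholder)
y = rng.integers(0, 3, size=300)               # 3 hypothetical water-quality classes

stack = make_pipeline(
    MinMaxScaler(),                            # min-max normalization, as in the paper
    StackingClassifier(
        estimators=[("knn", KNeighborsClassifier()),
                    ("svc", SVC(probability=True)),
                    ("mlp", MLPClassifier(max_iter=1000))],
        final_estimator=LogisticRegression(),  # LR as meta-learner
    ),
)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
stack.fit(X_tr, y_tr)
print("accuracy on synthetic data:", stack.score(X_te, y_te))
```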
Results: The examination of the testing model revealed that the RF regression and MLR algorithms performed best in predicting the WQI, with mean absolute errors (MAE) of 0.003 and 0.001 respectively. The mean square error (MSE), root mean square error (RMSE), R-squared (R²), and Explained Variance Score (EVS) are 0.002, 0.005, 0.988, and 0.998 respectively with RF, and 0.001, 0.031, 0.999, and 0.999 respectively with MLR. Meanwhile, for predicting the WQC, the stack model classifier showed the best performance, with an Accuracy of 0.936, an F1 score of 0.93, and a Matthews Correlation Coefficient (MCC) of 0.893 on the Lalpura dataset, and an Accuracy of 0.991, an F1 score of 0.991, and an MCC of 0.981 on the Heingang dataset.
Conclusion: This study explores a method for predicting water quality that combines simple, feasible water-quality measurements with machine learning. According to this study, the stack model classifier performed best for multiclass classification. To ensure that the highest quality of water is supplied throughout the year, the information from this study will motivate researchers to investigate the underlying root causes of the quality variations.
-
An Approach of Privacy Preservation and Data Security in Cloud Computing for Secured Data Sharing
Authors: Revati Raman Dewangan, Sunita Soni and Ashish Mishal
Introduction: Cloud computing has revolutionized how individuals and businesses engage with data and software, turning the internet into a powerful computing platform by centralizing resources. Despite its benefits, safeguarding sensitive information stored externally remains a challenge. Cryptography faces threats, particularly chosen-ciphertext attacks aimed at secret keys or system information. While more common in public-key encryption, these attacks are less frequent in symmetric-key systems. Security efforts include validating system resilience and continuous improvement, which are vital in countering evolving threats such as adaptive chosen-ciphertext attacks.
Methods: In the evaluation model, stringent measures emphasize robust encryption for system security. Despite the planning, no ciphertext attack guarantees success, necessitating adaptive security protocols. Adaptive attacks like CCA2 expose vulnerabilities, enabling attackers to manipulate ciphertexts persistently.
Results: We observe an average gain of 65% for the decryption algorithm. Efforts focus on strengthening security, and the flawed 32-bit key-based encryption in the modified Cramer-Shoup structure undergoes remediation.
Conclusion: Conventional uncertainties validate resilience, emphasizing continuous evaluation and enhancement to counter evolving threats.
-
Improved DevOps Lifecycle by Integrating a Novel Tool V-Git Lab
Authors: Anurag Mishra and Ashish Sharma
Aims: We propose a tool that can automatically generate datasets for software defect prediction from GitHub repositories.
Background: DevOps is a software development approach that emphasizes collaboration, communication, and automation in order to improve the speed and quality of software delivery.
Objective: This study aims to demonstrate the effectiveness of the tool; to do so, a series of experiments was conducted on several popular GitHub repositories, comparing the performance of our generated datasets with existing datasets.
Methods: The tool works by analyzing the commit history of a given repository and extracting relevant features that can be used to predict defects. These features include code complexity metrics, code churn, and the number of developers involved in a particular code change.
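The commit-history mining step can be sketched with the GitPython library as follows; the repository path and the choice of features (per-file churn and number of distinct authors) are illustrative assumptions, not the V-Git Lab implementation itself.

```python
# Illustrative sketch (GitPython): mining churn-style features from a repository's
# commit history. This is not the V-Git Lab tool's actual code.
from collections import defaultdict
from git import Repo

repo = Repo("path/to/local/repo")              # hypothetical local clone
churn = defaultdict(int)                       # lines added + deleted per file
authors = defaultdict(set)                     # distinct developers per file

for commit in repo.iter_commits("HEAD"):
    for path, stats in commit.stats.files.items():
        churn[path] += stats["insertions"] + stats["deletions"]
        authors[path].add(commit.author.email)

# One row per file: candidate features for a defect-prediction dataset.
for path in churn:
    print(path, churn[path], len(authors[path]))
```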
Results: Our results show that the datasets generated by our tool are comparable in quality to existing datasets and can be used to train effective software defect prediction models.
Conclusion: Overall, the proposed tool provides a convenient and effective way to generate high-quality datasets for software defect prediction, which can significantly improve the accuracy and reliability of prediction models.
-
A Robust Approach to Object Detection in Images using Improved T_CenterNet Method
Authors: Tarun Jaiswal, Manju Pandey and Priyanka Tripathi
Aims: A robust approach to object detection in images using the improved T_CenterNet method.
Background: Currently, two-stage image object detectors are the most common method of detecting objects. However, due to their slow processing, two-stage detectors cannot always be applied in real time.
Objective: To develop a more robust and effective method for detecting objects in images by building upon and improving the T_CenterNet algorithm.
Methods: In this research paper, we propose T_CenterNet, a method based on a one-stage detector that builds on the heatmap branch of CenterNet, the most accurate anchor-free detector. To improve detection outcomes on subsequent images as well as detection efficiency, we propagate previously accurate long-term detections using a heatmap and a cascade guiding framework.
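For readers unfamiliar with CenterNet's heatmap branch, the following PyTorch sketch shows the standard way object centres are read off a predicted heatmap (max-pooling as a cheap non-maximum suppression, then top-k peaks); the tensor shapes are assumptions, and this is generic CenterNet-style decoding rather than the paper's T_CenterNet code.

```python
# Generic CenterNet-style heatmap decoding (PyTorch); not the paper's exact code.
import torch
import torch.nn.functional as F

def decode_centers(heatmap, k=100):
    """heatmap: (B, C, H, W) per-class centre heatmap after sigmoid."""
    # 3x3 max-pooling keeps only local peaks (a cheap NMS on the heatmap).
    pooled = F.max_pool2d(heatmap, kernel_size=3, stride=1, padding=1)
    peaks = heatmap * (pooled == heatmap).float()

    B, C, H, W = peaks.shape
    scores, idx = torch.topk(peaks.view(B, -1), k)   # top-k peaks over classes and positions
    classes = idx // (H * W)                         # which class map the peak came from
    ys = (idx % (H * W)) // W                        # row of the peak
    xs = (idx % (H * W)) % W                         # column of the peak
    return scores, classes, ys, xs

# Toy usage with a random "heatmap" just to show the shapes involved.
scores, classes, ys, xs = decode_centers(torch.rand(1, 80, 128, 128), k=5)
print(scores.shape, classes.shape)
```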
Results: On the COCO dataset, our technique obtains a detection AP50 of 65.5%, with 33.8 AP on small objects and 47.0 AP on medium objects, at 38 frames per second. The results of the study show the superiority of the suggested approach over the conventional one.
Conclusion: In this research, we have introduced an enhanced approach to image object detection using the improved T_CenterNet method. While the original CenterNet algorithm demonstrated superior precision compared to other single-stage algorithms, it suffered from slow detection speed. Our proposed approach, the Hourglass-208 model built upon CenterNet, significantly improved detection performance. The proposed T_CenterNet algorithm improves upon the original CenterNet by incorporating the Hourglass backbone and feature-map fusion techniques, enabling enhanced object detection. Additionally, we applied the Smooth L1 loss function to object size estimation within T_CenterNet, greatly enhancing overall detection performance by improving object size regression.
-
Enhancing Recommendation Systems with Skew Deviation Bias for Shilling Attack Detection
Authors: Sarika Gambhir, Sanjeev Dhawan and Kulvinder Singh
Introduction: Recommender systems serve as a powerful tool to address the challenges of information overload by delivering personalized recommendations. However, their susceptibility to profile injection, or shilling, attacks poses a significant threat. Malicious entities can introduce fabricated profiles into the user database to manipulate the popularity of specific items, subsequently influencing prediction outcomes.
Methods: Detecting and mitigating the impact of such attacks is critical for preserving recommendation accuracy and user trust. The primary objective of this study is to develop an integrated framework for robust shilling attack detection and data-sparsity mitigation in recommendation systems. This approach aims to make the system more resistant to manipulative attacks and to improve recommendation quality, especially when dealing with limited data. This paper introduces Skew Deviation Bias (SDB), a novel metric that gauges the skewness within rating distributions, enabling the identification of both fabricated shilling profiles and the anomalous rating behaviors exhibited by attackers. Building upon this foundation, SDB is integrated with other statistical metrics such as Rating Deviation from Mean Agreement (RDMA), Weighted Deviation from Mean Agreement (WDMA), Weighted Degree of Agreement (WDA), and length variance. This research investigates the impact of incorporating SDB alongside existing attributes in countering various attack scenarios, including random, average, and bandwagon attacks.
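For context, one of the established detection attributes named above, RDMA, can be computed as in the short sketch below, following the standard formulation from the shilling-detection literature (a user's average item-level rating deviation, each deviation divided by how many ratings that item received); the toy rating matrix is purely illustrative, and SDB itself is the paper's contribution and is not reproduced here.

```python
# Sketch of the standard RDMA attribute on a toy rating matrix (NaN = unrated).
import numpy as np

R = np.array([[5.0, np.nan, 3.0],
              [4.0, 2.0,    np.nan],
              [5.0, 1.0,    4.0]])          # rows = users, columns = items (toy data)

item_mean = np.nanmean(R, axis=0)           # average rating of each item
item_count = np.sum(~np.isnan(R), axis=0)   # number of ratings each item received

def rdma(user_ratings):
    rated = ~np.isnan(user_ratings)
    dev = np.abs(user_ratings[rated] - item_mean[rated]) / item_count[rated]
    return dev.sum() / rated.sum()          # mean weighted deviation over rated items

for u, row in enumerate(R):
    print(f"user {u}: RDMA = {rdma(row):.3f}")
```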
Results: Extensive experiments are conducted to compare the effectiveness of SDB when integrated with existing attributes against scenarios employing only existing attributes. These experiments cover a range of attack sizes while maintaining a fixed 50% filler size. The results of thorough comparative analyses demonstrate the consistent superiority of the SDB-integrated approach, resulting in higher accuracy across all attack types compared to scenarios using only existing attributes. Notably, the random attack scenario shows the most significant accuracy improvement among the evaluated scenarios.
Conclusion: The approach achieves a detection accuracy of 97.08% for random shilling attacks, affirming its robustness. Furthermore, in the context of data sparsity, the approach notably enhances recommendation quality.
-
On the Improvements of the Performance of Bayes-Net Classification for Diagnosis of Diabetes: A Cross-validation Approach
Authors: Sarvachan Verma, Anupama Sharma, Sachin Jain and Anuj Kumar
Introduction: This study intends to support medical professionals in the early and precise detection and diagnosis of type 1 and type 2 diabetes. Many complications occur if diabetes remains unidentified and untreated. According to World Health Organization data, 9.3% of individuals worldwide had diabetes as of 2019, and this figure is anticipated to rise to close to 11% of the global population by 2045.
Diabetes mellitus is a chronic disease, one of the deadliest, in which blood glucose (blood sugar) levels rise above normal. Diabetes can be type 1 (T1D) or type 2 (T2D). Type 1 diabetes is an autoimmune condition frequently found in young children, whereas type 2 diabetes, which is not autoimmune, is frequently diagnosed in adults. Machine learning, a branch of artificial intelligence, has the potential to be successful in the field of prediction.
Methods: This research aims to design a machine learning model to predict diabetes in a patient. Four machine learning classification algorithms, namely Bayes-Net, Naïve-Bayes, J48, and Random Forest, were used to predict diabetes at an early stage using 10-fold cross-validation, and their performance was evaluated on measures such as accuracy, precision, sensitivity, and specificity.
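As a sketch of this evaluation protocol, the code below runs 10-fold cross-validation with two of the four listed classifiers that have direct scikit-learn equivalents (Naïve Bayes and Random Forest); Bayes-Net and J48 are typically run in the Weka toolkit and are not reproduced here, and the synthetic arrays merely stand in for the PIMA Indian diabetes dataset.

```python
# Sketch: 10-fold cross-validation of two of the listed classifiers (scikit-learn).
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
X = rng.random((768, 8))                 # stand-in for the 8 PIMA features
y = rng.integers(0, 2, size=768)         # stand-in for the diabetic / non-diabetic label

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Random Forest", RandomForestClassifier(random_state=42))]:
    scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```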
Results: For experiments, a standard PIMA Indian diabetes dataset is used. According to the results, Bayes-Net performs better than other algorithms, with an accuracy rate of 87.72%.
Conclusion: Diabetes mellitus may harm the kidneys, nerves, eyes, and other organs if it is not diagnosed in time. Therefore, early diagnosis and treatment could save countless lives. The conventional process is tedious because patients must visit a diagnostic facility and see a doctor, but various machine learning and deep learning approaches can address this clinical problem earlier.
-
Real-Time Multiple Object Detection Using Raspberry Pi and Tiny-ML Approach
Authors: Tarun Jaiswal, Manju Pandey and Priyanka Tripathi
Introduction: Object detection has been an essential task in computer vision for decades, and modern developments in computer vision and deep learning have greatly increased the accuracy of detection systems. However, the high computational requirements of deep learning-based object detection algorithms limit their applicability to resource-constrained systems, such as embedded devices.
Methods: With the advent of Tiny Machine Learning (TinyML) devices such as the Raspberry Pi, it has become possible to deploy object detection systems on small, low-power devices. Owing to their accessibility and low cost, single-board TinyML devices such as the extremely popular Raspberry Pi have recently attracted a great deal of attention.
Results: In this study, we present an enhanced SSD-based object detection approach and deploy the model on a TinyML device, i.e., a Raspberry Pi.
Conclusion: The proposed object detection model is lightweight and built using MobileNet-V2 as its underlying backbone.
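To illustrate the kind of deployment described here, the following sketch runs a quantized SSD MobileNet-V2 TensorFlow Lite model with the tflite_runtime interpreter, as is typical on a Raspberry Pi; the model file name and the output-tensor ordering are assumptions based on common SSD detection models, not details taken from the paper.

```python
# Sketch: running a TFLite SSD detector on a Raspberry Pi with tflite_runtime.
# Model path and output ordering are assumptions (typical of SSD MobileNet models).
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="ssd_mobilenet_v2.tflite")   # hypothetical model file
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
outs = interpreter.get_output_details()

# Dummy frame with the input shape the model expects (a camera frame in practice).
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()

boxes = interpreter.get_tensor(outs[0]["index"])    # typically [1, N, 4] normalized boxes
classes = interpreter.get_tensor(outs[1]["index"])  # typically [1, N] class ids
scores = interpreter.get_tensor(outs[2]["index"])   # typically [1, N] confidences
for box, cls, score in zip(boxes[0], classes[0], scores[0]):
    if score > 0.5:
        print(f"class {int(cls)} at {box} with confidence {score:.2f}")
```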
-
Blockchain Security: "Botnets and Bitcoin Mining" - A Study on the Impacts and Countermeasures
Introduction: Botnets have become a significant threat to cybersecurity, as they can be used for a wide range of malicious activities, including Distributed Denial-of-Service (DDoS) attacks, spamming, and cryptocurrency mining. Bitcoin mining, in particular, has become a lucrative target for cybercriminals, as it requires massive computing power and can generate significant profits.
Methods: In this paper, the author presents a study of a botnet that uses an HTA file to gain initial access and execute code on a victim's device, followed by the installation of mining software to infect the device and mine bitcoins.
Results: The author analyzes the botnet's behaviour, including its evasion techniques and Bitcoin-mining activities, and discusses the implications of the findings for cybersecurity and Bitcoin mining.
Conclusion: Future research should investigate the use of different command-and-control servers and other advanced attack frameworks in botnet operations and examine the potential connections between botnets and other cybercrime activities, such as ransomware and espionage.
