Recent Advances in Computer Science and Communications - Volume 15, Issue 1, 2022
Image Encryption Based on Fractional Wavelet Transform, Arnold Transform with Double Random Phases in the HSV Color Domain
Authors: Aarushi Shrivastava, Janki B. Sharma and Sunil D. Purohit
Objective: In modern multimedia technology, images play an integral role in communication. In this paper, we propose a new color image encryption method using the Fractional Wavelet Transform (FWT), double random phases and the Arnold transform in the HSV color domain. Methods: First, the image is converted to the HSV domain and encoded using the FWT, which combines the fractional Fourier transform with the wavelet transform; two random phase masks are applied for double-random phase encoding. The inverse DWT is taken at the end to obtain the encrypted image. The Arnold transform with different iteration counts is used to scramble the matrices. The fractional orders of the FRFT, the wavelet family and the iteration counts of the Arnold transform serve as secret keys, enhancing the security of the proposed method. Results: The performance of the scheme is analyzed through its PSNR and SSIM values, keyspace, entropy, and statistical analysis, which demonstrate the effectiveness and feasibility of the proposed technique. The simulation results verify its robustness in comparison with related schemes. Conclusion: This method provides better security and an enlarged, sensitive keyspace with improved PSNR and SSIM. The FWT, which captures time-frequency information, adds flexibility through additional variables and makes the method more suitable for secure transmission.
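As a point of reference for the scrambling step, the sketch below applies the standard Arnold cat map to a square image channel, with the iteration count acting as one of the secret keys. This is only a minimal illustration of the Arnold-transform idea in Python with NumPy; the array size and key value are made up, and it does not reproduce the authors' full FWT/double-random-phase pipeline.

```python
import numpy as np

def arnold_scramble(channel: np.ndarray, iterations: int) -> np.ndarray:
    """Scramble a square image channel with the Arnold cat map.
    The iteration count acts as a secret key; the map is periodic,
    so the scrambling can be undone by iterating up to the period."""
    n = channel.shape[0]
    assert channel.shape[0] == channel.shape[1], "Arnold map needs a square matrix"
    out = channel.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                # Arnold cat map: (x, y) -> ((x + y) mod n, (x + 2y) mod n)
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

# Example: scramble the V channel of a 64x64 HSV image with key = 7 iterations
v_channel = np.random.rand(64, 64)
cipher_v = arnold_scramble(v_channel, iterations=7)
```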
Wireless Sensor Network in IoT and Performance Optimization
Authors: Sunita Gupta, Sakar Gupta and Dinesh Goyal
A serious problem in Wireless Sensor Networks (WSNs) is attaining high energy efficiency, since sensor nodes are powered by batteries with limited stored energy that cannot conveniently be replaced or recharged. The emergence of renewable energy harvesting techniques and their integration with sensor devices gives rise to Energy Harvesting Wireless Sensor Networks (EHWSNs). The Internet of Things (IoT) is now becoming part of our lives, simplifying our routines and work. It connects, computes, communicates, and performs the required tasks. IoT is essentially a network of physical devices or things that can interact with each other to share information. This paper gives an overview of WSN and IoT, related work, different ways of connecting a WSN to the internet, the development of the smart home, challenges for WSNs, etc. Next, a framework for performance optimization in IoT is given, and the QC-PC-MCSC heuristic, a power-efficient topology management application, is analyzed in terms of energy efficiency and sensor lifetime over the energy-latency-density design space. QC-PC-MCSC and QC-MCSC are then compared on the same criteria.
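For context on why battery energy dominates WSN design, the following back-of-the-envelope sketch estimates a duty-cycled node's lifetime from its battery capacity and average current draw. The figures are purely illustrative, and this is not the QC-PC-MCSC heuristic analyzed in the paper.

```python
def node_lifetime_hours(battery_mah: float, active_ma: float,
                        sleep_ma: float, duty_cycle: float) -> float:
    """Rough lifetime estimate for a duty-cycled sensor node.
    duty_cycle is the fraction of time the radio/sensing hardware is active."""
    avg_current_ma = duty_cycle * active_ma + (1.0 - duty_cycle) * sleep_ma
    return battery_mah / avg_current_ma

# Example: 2000 mAh battery, 20 mA active, 0.05 mA sleep, 1% duty cycle
print(round(node_lifetime_hours(2000, 20, 0.05, 0.01), 1), "hours")
```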
Distributed Time-Dependent Key Management Scheme for Internet of Things
Authors: C. Tamizhselvan and V. Vijayalakshmi
The Internet of Things (IoT) faces a variety of challenges and issues in administering security due to its heterogeneous nature and open communication medium. The fundamental security requirement of authentication is expected to secure communication between devices against anonymous access. In this article, we present a Distributed Time-Dependent Key Management (DTKM) technique for securing the communication sessions of IoT devices. This key management technique considers the communication time and interval of each device independently to generate session keys for authentication. Key management is preceded by consideration of request authentication and resource utilization. Based on the communication time session, the validity of the keys is determined and updated using the current random number for authentication. This helps to prevent unnecessary exploitation of storage and also reduces computation time.
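A generic way to realize time-dependent session keys is to bind a key to the current time window and a fresh random nonce, for example with an HMAC over a pre-shared device secret. The sketch below shows that common pattern only as an assumption of how such a scheme might look; it is not the DTKM construction itself, and the interval length and secret are placeholders.

```python
import hashlib
import hmac
import os
import time

def derive_session_key(shared_secret: bytes, interval_seconds: int = 300, nonce: bytes = None):
    """Derive a short-lived session key bound to the current time window.
    The key changes whenever the time window or the nonce changes."""
    window = int(time.time()) // interval_seconds            # current time slot
    nonce = nonce if nonce is not None else os.urandom(16)   # fresh randomness per session
    message = window.to_bytes(8, "big") + nonce
    key = hmac.new(shared_secret, message, hashlib.sha256).digest()
    return key, nonce

session_key, nonce = derive_session_key(b"pre-shared-device-secret")
print(session_key.hex()[:32], nonce.hex())
```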
Door Knob Security
Authors: Mohit Kumar, Kushagra Goel, Rakshit Kandpal, Sanchit Jain, Shikha Saxena and Sharmila
Introduction: This paper deals with research on the safety of houses that are vulnerable through their door locking systems. Usually, doors are locked with keys, whether electronic or mechanical, but a key-based unlocking system is vulnerable because keys can be lost, stolen or duplicated. Methods: The proposed methodology uses a Morse code that is delivered to the person standing outside the door in order to get in. The Morse code is generated randomly in software, so the code cannot be duplicated. Results: The project was designed to accomplish the goal of securing the door lock. The student worked on the project individually, with assistance from the professor whenever necessary. The project could potentially become a group project for two or more students, allowing each student to focus on a particular aspect, such as hardware or software. Conclusion: Most of the documentation available online was missing configuration steps, and all of it used a button to regenerate the secret code. Those documents do not cover all the potential issues that are addressed here. Discussion: Artificial intelligence will be added so that the door-knob mechanism does not operate if a member is already inside. The mechanism will also be made accessible remotely so that the owner can let someone in when not present.
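The random code generation described above can be sketched as follows: a fresh alphanumeric code is drawn with a cryptographically secure generator and translated to Morse for delivery to the person at the door. The code length and alphabet are illustrative assumptions, not the project's exact parameters.

```python
import secrets
import string

MORSE = {"A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
         "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
         "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
         "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
         "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
         "Z": "--..", "0": "-----", "1": ".----", "2": "..---", "3": "...--",
         "4": "....-", "5": ".....", "6": "-....", "7": "--...",
         "8": "---..", "9": "----."}

def random_morse_code(length: int = 4):
    """Generate a fresh random code and its Morse encoding for one unlock attempt."""
    alphabet = string.ascii_uppercase + string.digits
    code = "".join(secrets.choice(alphabet) for _ in range(length))
    morse = " ".join(MORSE[ch] for ch in code)
    return code, morse

code, morse = random_morse_code()
print(code, "->", morse)
```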
VitaFALL: Advanced Multi-Threshold Based Reliable Fall Detection System
Authors: Warish D. Patel, Chirag Patel and Monal Patel
Background: A major challenge in our technologically advanced society is the well-being of aging and differently-abled people. Falls are a leading cause of serious injury and early death among senior citizens and differently-abled people. The possibility of automatically detecting falls has increased the demand for such devices; a high detection rate is achieved using wearable sensors, and this technology has a considerable social and monetary impact on society. Therefore, an automatic fall detection and vital-sign monitoring system has become a necessity, even for the daily activities of aged people. Objectives: This research work aims to help aged people, and anyone else who needs it, by monitoring their vital signs and predicting falls. The VitaFALL (Vital Signs and Fall Monitoring) device analyzes measurements in all three orthogonal directions using a triple-axis accelerometer and monitors vital-sign parameters (heart rate, heartbeat, and temperature) for aged and differently-abled people. Methods: In comparison with existing algorithms, the implemented algorithm offers benefits in privacy, success rate, and device design over the ubiquitous algorithm. Results: The experimental outcomes show that the accuracy achieved is up to 94%. The ADXL335 is a 3-axis accelerometer module that collects the accelerations of aged people for the VitaFALL device. A guardian can be notified by a text message via GSM and GPRS modules so that the aged person can be helped. Conclusion: However, a time delay can be noticed when comparing the gradient and minimum value to determine the state of the older person. The results of the experiment show the adequacy of the proposed approach.
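A minimal sketch of the multi-threshold idea, assuming the accelerometer output is expressed in g: the magnitude over the three axes is compared against a free-fall threshold and an impact threshold. The threshold values are illustrative placeholders, not the tuned values of the VitaFALL system.

```python
import math

def detect_fall(ax: float, ay: float, az: float,
                lower_g: float = 0.4, upper_g: float = 2.5) -> bool:
    """Simple multi-threshold check on the acceleration magnitude (in g).
    A fall typically shows a free-fall dip below lower_g followed by an
    impact spike above upper_g; here we only flag the instantaneous sample."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return magnitude < lower_g or magnitude > upper_g

# Example samples from a triple-axis accelerometer such as the ADXL335
print(detect_fall(0.0, 0.1, 1.0))   # normal posture -> False
print(detect_fall(1.9, 1.2, 2.1))   # impact spike   -> True
```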
Artificial Intelligence-based Myelinated White Matter Segmentation for a Pediatric Brain - A Challenging Task
Authors: S. J. Jemila and A. Brintha Therese
Background: Segmentation of a baby brain, in particular myelinated white matter, is a very challenging and important task in medical image analysis because of the ongoing process of myelination and the structural differences present in magnetic resonance images of a baby. Most available algorithms for segmenting a baby brain are atlas-based, which may not be accurate because baby brain Magnetic Resonance Images (MRI) are very subjective. Objective: Artificial intelligence-based methods for myelinated white matter segmentation. Methods: Fuzzy C-Means clustering with Level Set Method (FCMLSM), Adaptively Regularized Kernel-based Fuzzy C-Means clustering (ARKFCM), Multiplicative Intrinsic Component Optimization (MICO) and Particle Swarm Optimization (PSO). Results: Signal-to-Noise Ratio (SNR), Edge Preservation Index (EPI), Structural Similarity Index (SSIM), Peak Signal-to-Noise Ratio (PSNR), accuracy, precision, Dice and Jaccard values remain good, and the Mean Squared Error (MSE) is lower for FCMLSM. Conclusion: FCMLSM is a very suitable method for myelinated white matter segmentation when compared to ARKFCM, MICO and PSO.
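Since FCMLSM builds on fuzzy C-means, the sketch below shows a bare-bones fuzzy C-means loop on 1-D intensities with NumPy, assuming synthetic data and three tissue classes. It omits the level-set refinement and everything MRI-specific; it is only meant to illustrate the membership and center updates.

```python
import numpy as np

def fuzzy_c_means(pixels, n_clusters=3, m=2.0, n_iter=50, seed=0):
    """Minimal fuzzy C-means on flattened intensities (shape: [n_samples, 1])."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(pixels), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                            # memberships sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ pixels) / um.sum(axis=0)[:, None]      # cluster centers
        dist = np.abs(pixels - centers.T) + 1e-9                 # |x_i - c_k|
        inv = dist ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)                 # membership update
    return u.argmax(axis=1), centers

# Example: cluster a synthetic 1-D intensity image into 3 tissue classes
image = np.concatenate([np.random.normal(mu, 5, 200) for mu in (40, 120, 200)])
labels, centers = fuzzy_c_means(image[:, None])
```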
Novel Noise Filter Techniques and Dynamic Ensemble Selection for Classification
Authors: Thi Ngoc A. Nguyen, Quynh P. Nhu and Vijender K. Solanki
Background: Ensemble selection is one of the most researched topics in ensemble learning. Researchers have been attracted to selecting a subset of base classifiers that may perform better than the whole ensemble. Dynamic Ensemble Selection (DES) is one of the most effective techniques for classification problems. DES systems select the most appropriate classifiers from the candidate classifier pool. Ensemble models that balance diversity and accuracy during training perform better than using all classifiers. Objective: In this paper, novel techniques are proposed that combine a Noise Filter (NF) and Dynamic Ensemble Selection (DES) to achieve better predictive accuracy: the noise filter makes the data cleaner, and DES improves the classification performance. Methods: The proposed NF-DES model was demonstrated on twelve datasets, including three credit scoring datasets, with accuracy as the performance measure. Results: The results show that our proposed model is better than other models. Conclusion: A novel noise filter and dynamic ensemble learning scheme aimed at improving classification ability are presented. To improve classification performance, the noise filter combined with dynamic ensemble learning moves noisy data toward the correct class. The dynamic ensemble then chooses an appropriate subset of classifiers from the pool of base classifiers.
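To make the dynamic-selection idea concrete, here is a hand-rolled sketch: a bagged pool of trees is built, and for each test sample only the classifiers that are accurate on its nearest neighbours in a held-out selection set are allowed to vote. The data, pool size, neighbourhood size and 0.5 competence cut-off are all assumptions for illustration, and the noise-filtering stage of NF-DES is omitted.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1500, n_features=20, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_dsel, X_test, y_dsel, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Pool of base classifiers trained on the training split
pool = BaggingClassifier(DecisionTreeClassifier(max_depth=5),
                         n_estimators=15, random_state=0).fit(X_train, y_train)
knn = NearestNeighbors(n_neighbors=7).fit(X_dsel)

def des_predict(x):
    """Keep only classifiers that are accurate in the local region of x, then vote."""
    _, idx = knn.kneighbors(x.reshape(1, -1))
    region_X, region_y = X_dsel[idx[0]], y_dsel[idx[0]]
    competent = [c for c in pool.estimators_
                 if (c.predict(region_X) == region_y).mean() >= 0.5]
    votes = [int(c.predict(x.reshape(1, -1))[0]) for c in (competent or pool.estimators_)]
    return np.bincount(votes).argmax()

accuracy = np.mean([des_predict(x) == t for x, t in zip(X_test, y_test)])
print(f"DES accuracy: {accuracy:.3f}")
```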
Stacking Regression Algorithms to Predict PM2.5 in the Smart City Using Internet of Things
Authors: Alisha Banga, Ravinder Ahuja and Subhash C. Sharma
Background: With the increase in population in urban areas, pollution also increases. Air pollution is one of the most challenging environmental issues in smart cities. Objective: Real-time monitoring of air quality can help the administration take appropriate decisions on time. Advances in Internet of Things based sensors have changed the way air quality is monitored. Methods: In this paper, we apply two-stage regression. At the first stage, ten regression algorithms (Decision Tree, Random Forest, Elastic Net, AdaBoost, Extra Tree, Linear Regression, Lasso, XGBoost, LightGBM, and Multi-Layer Perceptron) are applied; at the second stage, the best four algorithms are selected and a stacking ensemble is applied, using Python, to predict the PM2.5 pollutant in the air. Datasets of five Chinese cities (Beijing, Chengdu, Guangzhou, Shanghai, and Shenyang) are considered and compared based on MAE (Mean Absolute Error), RMSE (Root Mean Square Error) and R2. Results: We observed that, of the ten regression algorithms applied, the Extra Tree algorithm exhibited the best performance on all five datasets, and stacking further improved the performance. Conclusion: Feature importance for the Shenyang and Beijing datasets was computed using three regression algorithms, and we found that the four most important features are humidity, wind speed, wind direction and dew point.
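A compact sketch of the second-stage stacking step using scikit-learn's StackingRegressor on synthetic data, with three tree-based base learners and a ridge meta-learner standing in for the paper's best four algorithms. The feature set, models and data are assumptions; only the evaluation metrics (MAE, RMSE, R2) mirror the abstract.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import (ExtraTreesRegressor, GradientBoostingRegressor,
                              RandomForestRegressor, StackingRegressor)
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Stand-in for the PM2.5 data: features such as humidity, wind speed, dew point
X, y = make_regression(n_samples=2000, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stack = StackingRegressor(
    estimators=[("extra_trees", ExtraTreesRegressor(n_estimators=100, random_state=0)),
                ("random_forest", RandomForestRegressor(n_estimators=100, random_state=0)),
                ("gbrt", GradientBoostingRegressor(random_state=0))],
    final_estimator=Ridge())
stack.fit(X_train, y_train)

pred = stack.predict(X_test)
print("MAE :", mean_absolute_error(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
print("R2  :", r2_score(y_test, pred))
```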
A Novel Approach for Extraction of Distinguishing Emotions for Semantic Granularity Level Sentiment Analysis in Multilingual Context
Authors: Midde V. Naik, Vasumathi D and A.P. S. Kumar
Introduction: Extracting distinguishing semantic-level emotions expressed in multiple languages on social media is an essential task in sentiment analysis or opinion mining. Extracting emotions expressed in Dravidian or other local languages, mixed with multiple languages on social media, has become an essential challenge in big-data sentiment analysis. Methods: In the proposed approach, an innovative framework for recognizing the sentiments of users in multilingual or Dravidian-language text data using scientific linguistic theories is defined. The proposed method uses machine learning techniques such as naïve Bayes and support vector machines for fine-grained classification of multilingual text with the help of lexicon-based feature groups. Results: The results of experiments conducted on collected benchmark datasets show that the proposed approach outperforms corpus-based, word-level and phrase-level sentiment analysis for multilingual text. Conclusion: Among the machine learning techniques, SVM performed best for sentiment and emotion extraction.
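A minimal sketch of the fine-grained classification step with scikit-learn, assuming a tiny English toy corpus in place of the multilingual social-media data. Character n-gram TF-IDF features are used here only because they transfer reasonably across mixed scripts; the lexicon-based feature groups of the paper are not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Tiny illustrative corpus; real data would be multilingual social-media posts
texts = ["the movie was wonderful and uplifting",
         "worst service ever, completely disappointed",
         "superb performance, loved every minute",
         "terrible plot and boring characters"]
labels = ["positive", "negative", "positive", "negative"]

# Character n-grams help when scripts and languages are mixed
svm_clf = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)), LinearSVC())
nb_clf = make_pipeline(TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)), MultinomialNB())

for name, clf in [("SVM", svm_clf), ("Naive Bayes", nb_clf)]:
    clf.fit(texts, labels)
    print(name, clf.predict(["absolutely loved it", "what a boring disaster"]))
```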
IoT Enabled Crop Prediction and Irrigation Automation System Using Machine Learning
Authors: Raj Kumar and Vivek Singhal
Aim: India is an agricultural country in which more than 50% of the population depends on agriculture as the main source of income. An increasing population, shrinking agricultural land and land fragmentation may create severe challenges to food security in India. Therefore, there is an intense need to apply technological innovations in the agriculture sector to improve its growth. Objectives: The objective of the study is to design and develop an automated crop yield maximization system with an automated irrigation process using the Internet of Things (IoT) and machine learning. The proposed study focuses on maximizing crop yield and the farmer's profit using crop prediction and automation of the irrigation process, leading to water conservation. Methods: The proposed system first automates the irrigation system to ease the burden of supplying water to the plants when they need it. In the automatic irrigation system, data sensed by the temperature sensor and soil sensor in real time are sent to the microcontroller board, which, based on the values received, instructs the relay to turn the water pump on or off, thus automating the irrigation process. Second, the system predicts suitable crop types, from soil parameters and weather conditions, that may be planted by the farmers, using machine learning techniques, with results displayed on an Android-based application. Results: The system proposed in this work recommends suitable crops according to the current weather and soil conditions in a particular geographical area by applying the XGBoost classifier, together with automation of the irrigation process. The study also presents a comparative performance analysis of the XGBoost classifier based on the F1 score and accuracy. Conclusion: The study proposes a microcontroller-based model for crop maximization and irrigation automation. The system fulfills two major objectives of agriculture: maximization of crop yield and the farmer's profit using crop prediction, and automation of the irrigation process, leading to water conservation.
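The relay decision described in the Methods can be illustrated with a few lines of Python; the moisture and temperature thresholds below are invented placeholders, and the crop-prediction side (the XGBoost classifier) is not shown.

```python
def irrigation_command(soil_moisture_pct: float, temperature_c: float,
                       moisture_low: float = 30.0, moisture_high: float = 60.0) -> str:
    """Decide whether the microcontroller should switch the pump relay on or off.
    Thresholds are illustrative and would be tuned per crop and soil type."""
    if soil_moisture_pct < moisture_low:
        return "RELAY_ON"          # soil too dry: start the pump
    if soil_moisture_pct > moisture_high or temperature_c < 5.0:
        return "RELAY_OFF"         # wet enough (or too cold): stop the pump
    return "HOLD"                  # keep the current pump state

print(irrigation_command(soil_moisture_pct=22.5, temperature_c=31.0))  # RELAY_ON
```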
Indian Sign Language Recognition on PYNQ Board
Authors: Sukhendra Singh, G. N. Rathna and Vivek Singhal
Introduction: Sign language is the only way for speech-impaired people to communicate, but it is not known to most other people, which creates a communication barrier. In this paper, we present a system that captures hand gestures with a Kinect camera and classifies each hand gesture into its correct symbol. Methods: We used a Kinect camera rather than an ordinary web camera because an ordinary camera does not capture the 3D orientation or depth of an image, whereas the Kinect captures 3D images, making classification more accurate. Results: The Kinect camera produces different images for the hand gestures for '2' and 'V', and similarly for '1' and 'I', whereas a simple web camera cannot distinguish between them. We used hand gestures from Indian Sign Language, and our dataset contained 46,339 RGB images and 46,339 depth images. 80% of the images were used for training and the remaining 20% for testing. In total, 36 hand gestures were considered: 26 for the alphabets A-Z and 10 for numerals. Conclusion: Along with a real-time implementation, we also compare the performance of various machine learning models and find that a CNN working on depth images is more accurate than the other models. All these results were obtained on the PYNQ Z2 board. Discussion: We performed labeling of the dataset, training, and classification on the PYNQ Z2 FPGA board for static images using SVM, logistic regression, KNN, multilayer perceptron, and random forest algorithms. For this experiment, we used our own four datasets of ISL alphabets prepared in our lab. We analyzed both RGB images and depth images.
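A small sketch of the classical-model comparison mentioned in the Discussion, run with scikit-learn on randomly generated stand-in data shaped like flattened 32x32 depth images with 36 classes. It does not include the CNN or anything PYNQ-specific; dataset size and image resolution are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Stand-in for flattened depth images: 36 gesture classes (A-Z and 0-9)
rng = np.random.default_rng(0)
X = rng.random((720, 32 * 32))
y = rng.integers(0, 36, size=720)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

models = {"SVM": SVC(), "LogReg": LogisticRegression(max_iter=1000),
          "KNN": KNeighborsClassifier(), "MLP": MLPClassifier(max_iter=500),
          "RandomForest": RandomForestClassifier()}
for name, model in models.items():
    print(name, model.fit(X_train, y_train).score(X_test, y_test))
```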
A New Approach for EOQ Calculation Using Modified Opportunity Cost
Authors: Juhi Singh, Mandeep Mittal and Sarla Pareek
Introduction: Optimal inventory levels are necessary for a firm to avoid a shortage or excess of an item. A shortage of an item leads to stock-out conditions, resulting in loss of profit. When items are correlated with each other, the stock-out of one item may also result in the non-purchase of its associated items, which further reduces the profit. In this paper, this loss in profit is used to modify the opportunity cost of an item, resulting in its modified EOQ. Methods: An illustrative example is discussed that incorporates purchase dependencies in retail multi-item inventory management. The model discussed in this research paper will be motivational for researchers and inventory managers and provides a method for incorporating correlation among items while managing inventory. Results: The EOQs of items are estimated both with the traditional method and with the modified opportunity cost (modeled as lost profit). Results show that for the frequent itemset {A, B, D}, the EOQs of all three items increase when the correlation among them is considered, resulting in an increase in profit. Discussion: In an inventory management system, to increase a firm's profit, the EOQs of items need to be calculated in order to avoid a shortage or excess of inventory. For explaining the approach, a very small database consisting of only 5 items and 10 transactions is used, so the increase in profit is minimal; however, when the approach is applied to a real database consisting of thousands of items and transactions, the increase in profit will be significant. Conclusion: One of the major focus areas of inventory management is determining when and how much of each item needs to be ordered so that the total inventory cost is minimized and the firm's profit is maximized. However, while calculating the true value of an item and the profit it brings to the firm, it is essential to analyze its effect on the sale of other items. Association rule mining provides a way to correlate items by calculating support and confidence factors.
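For reference, the sketch below computes the textbook EOQ with a planned-backorder (shortage/opportunity-cost) term, which is the kind of quantity the paper modifies by folding lost profit from associated items into the opportunity cost. The numbers are illustrative and the formula shown is the standard one, not the authors' modified model.

```python
from math import sqrt

def eoq_with_shortage(demand: float, ordering_cost: float,
                      holding_cost: float, shortage_cost: float) -> float:
    """EOQ with planned backorders: sqrt(2DS/H) * sqrt((H + b)/b),
    where b is the per-unit shortage (opportunity) cost."""
    basic = sqrt(2.0 * demand * ordering_cost / holding_cost)
    return basic * sqrt((holding_cost + shortage_cost) / shortage_cost)

# In this standard variant the order quantity depends on the shortage (opportunity)
# cost; the paper instead derives a modified opportunity cost from lost profit on
# associated items found via association rules. Parameter values are illustrative.
print(round(eoq_with_shortage(1200, 50, 2.0, shortage_cost=5.0), 1))
print(round(eoq_with_shortage(1200, 50, 2.0, shortage_cost=9.0), 1))
```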
Assessment of Risks for Successful Implementation of Industry 4.0
Authors: Rimalini Gadekar, Bijan Sarkar and Ashish Gadekar
Purpose: The transformation happening globally, though referred to by different names and nomenclatures, has a clear overall objective: to drive digitalization and smart practices by reducing human intervention and enhancing machine intelligence, taking global manufacturing and production to another level of excellence. However, earlier research has lacked a strategic approach to evaluating and analyzing the risks related to Industry 4.0 (I4.0) adoption and implementation, which has ultimately deprived organizations of many of the benefits of I4.0. This research proposes a systematic methodology for understanding and evaluating the most evident risks in the context of I4.0 implementation. Design/Methodology/Approach: The research is mainly based on inputs from experts and consultants, a robust literature review, and the researchers' experience in risk handling. The MCDM methods used for investigation and assessment are Fuzzy AHP and Fuzzy TOPSIS. The outcomes of the study are further validated through sensitivity analysis and a real-world scenario. Results: Technical and Information Technology (IT) risks are found to be at the top of the priority list and need urgent attention when embarking on I4.0 adoption in industry, and the most important criterion requiring urgent attention is Information Security. The paper also develops an 'Industry 4.0 Risks Iceberg model' and systematically categorizes the challenges into five dimensions for easy assessment and analysis. Practical Implications: This systematic and holistic study of the risks associated with I4.0 can be used to identify the most critical risks, based on which strategies and policies may be modified to harness the best of I4.0. This will not only ensure returns on investment but also build trust in the system. The research would be beneficial to managers, academicians, researchers, and technocrats involved in I4.0 implementation.
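To indicate how the ranking step works, here is a crisp TOPSIS sketch in NumPy: alternatives are vector-normalized, weighted, and ranked by closeness to the ideal solution. The risk dimensions, scores and weights are invented for illustration, and the fuzzy extensions used in the paper (Fuzzy AHP weighting, Fuzzy TOPSIS) are not implemented.

```python
import numpy as np

def topsis(decision_matrix: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
    """Crisp TOPSIS closeness coefficients (higher = better ranked alternative).
    benefit[j] is True for benefit criteria and False for cost criteria."""
    norm = decision_matrix / np.linalg.norm(decision_matrix, axis=0)   # vector-normalize columns
    v = norm * weights                                                 # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))            # positive ideal solution
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))             # negative ideal solution
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Rows: risk dimensions (e.g. technical/IT, organizational, financial);
# columns: evaluation criteria (illustrative scores and weights).
scores = np.array([[8.0, 7.0, 9.0], [6.0, 5.0, 4.0], [7.0, 6.0, 5.0]])
cc = topsis(scores, weights=np.array([0.5, 0.3, 0.2]), benefit=np.array([True, True, True]))
print(cc.round(3))
```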
Optimal Inventory Policies for Price and Time Sensitive Demand Under Advertisement
Authors: Nita Shah, Ekta Patel and Kavita Rabari
Aims: This article analyzes an inventory system for deteriorating items. The demand is a quadratic function of time and depends on time, price and advertisement. Shortages are allowed and partially backlogged. Background: Demand and pricing are the two most crucial factors in inventory policy for any business to be successful. In today's competitive environment, products are promoted through advertisement, which plays a vital role in changing the demand pattern in the community. Marketing an item through advertisement in media such as TV, radio and newspapers, and also through salespeople, attracts customers to buy more of it. However, this is not always true for goods such as wheat, vegetables, fruits, food grains, medicines and other perishables, whose deteriorating nature in turn decreases demand. Deterioration may be defined as decay, damage, spoilage, evaporation, obsolescence or pilferage of an item. Hence, the deterioration effect is a major part of inventory control theory. So, in this article, the demand rate is considered to be a function of the selling price and the time of occurrence of the advertisement. Objective: A solution procedure is obtained to find the optimal price change and the optimal selling price that maximize the total profit. Methods: Classical optimization is applied under the necessary conditions of the Kuhn-Tucker method. The optimal values of the decision variables are obtained by setting the partial derivatives of the objective function equal to zero. Sufficiency conditions are checked through the second-order derivatives of the objective function with respect to the decision variables. Results: From the sensitivity analysis table, it can be seen that the optimal profit is highly sensitive to the advertisement coefficient and the purchase cost. With an increase in the rate of deterioration, the selling price decreases. Scale demand has a reasonable effect on cycle time and selling price: when its value increases, the cycle length and profit decrease. With an increase in demand, the total profit increases significantly. Price elasticity has a negative impact on the selling price. If the backlogging rate increases, the profit decreases. The inventory parameters of holding cost, back-order cost and lost-sale cost have a marginal effect on the total profit. Conclusion: In this article, an inventory model is proposed for deteriorating items with variable demand depending on the advertisement, the selling price of the item and the time within the deteriorating cycle. Shortages of such items are allowed and partially backlogged, and the backlogging rate depends on the waiting time for the next replenishment. One can conclude that some parameters have little effect on the optimal profit, cycle time and selling price, while the remaining parameters have a practical effect on the total profit.
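As a numerical counterpart to the classical-optimization procedure, the sketch below maximizes a deliberately simple price-and-advertisement-dependent profit function with SciPy. The demand form, parameter values and bounds are assumptions made up for the example; they are not the paper's model, which also includes deterioration and partial backlogging.

```python
import numpy as np
from scipy.optimize import minimize

def profit(x, a=200.0, b=4.0, k=15.0, adv=2.0, cost=20.0):
    """Illustrative single-period profit: demand falls with price and rises with
    the advertisement frequency; all parameters are made up for this sketch."""
    price, ad_freq = x
    demand = max(a - b * price + k * np.sqrt(ad_freq), 0.0)
    return (price - cost) * demand - adv * ad_freq ** 2   # revenue minus advertising spend

# Maximize profit by minimizing its negative over bounded decision variables
res = minimize(lambda x: -profit(x), x0=[30.0, 5.0],
               bounds=[(20.0, 80.0), (0.0, 50.0)])
print("optimal price, advertisement frequency:", res.x.round(2),
      "profit:", round(-res.fun, 1))
```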
Convolution Neural Network Based Visual Speech Recognition System for Syllable Identification
Authors: Hunny Pahuja, Priya Ranjan, Amit Ujlayan and Ayush Goyal
Introduction: This paper introduces a novel and reliable approach to assist people with speech impairment in communicating effectively in real time. A deep learning technique, the convolutional neural network (CNN), is used as the classifier. With the help of this algorithm, words are recognized from visual speech input alone, disregarding the audible or acoustic properties. Methods: The network extracts features from mouth movements across different images. Non-audible mouth movements are taken as input from a source and then segregated into subsets to obtain the desired output. The complete data is then arranged to recognize the word as an affricate. Results: The convolutional neural network is one of the most effective algorithms for extracting features, performing classification and producing the desired output from the input images for the speech recognition system. Conclusion: Recognizing syllables in real time from visual mouth-movement input is the main objective of the proposed method. When the proposed system was tested, the data accuracy and the quantity of training sets proved to be satisfactory. A small data set was taken as the first step of learning; in future, a larger data set can be considered for analysis. Discussion: The network proposed in this paper is tested for its precision level on the basis of the type of data. The network is able to identify syllables, but it fails when syllables belong to the same set. Higher-end graphics processing units are required to reduce time consumption and increase the efficiency of the network.
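A minimal CNN classifier of the kind described, written with tf.keras on random stand-in frames: two convolution/pooling stages followed by dense layers and a softmax over syllable classes. The input size, number of classes and training data are placeholders, not the authors' architecture or dataset.

```python
import numpy as np
import tensorflow as tf

# Stand-in data: grayscale mouth-region frames, 10 syllable classes (illustrative)
num_classes = 10
x_train = np.random.rand(256, 64, 64, 1).astype("float32")
y_train = tf.keras.utils.to_categorical(np.random.randint(0, num_classes, 256), num_classes)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=1)
```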