14th International Conference on Networks & Communications (NeTCoM 2022)

June 25 ~ 26, 2022, Copenhagen, Denmark

Accepted Papers


Supervised Machine Learning Algorithm for the Prediction of Fluid Flow Parameters

Dr. P. Priyadharshini and M. Vanitha Archana, Department of Mathematics, PSG College of Arts and Science, Coimbatore, India

ABSTRACT

An incompressible MHD nanofluid boundary-layer flow over a stretching surface is investigated using Buongiorno's model, considering convective heat and mass transfer conditions. The Brownian motion and thermophoresis effects are used to implement the nanofluid model. Applying similarity transformations converts the governing nonlinear partial differential equations into ordinary differential equations incorporating multiple slips (momentum, energy, and concentration), which are then solved by a program written with the stiffness-shifting method in the Wolfram Language. The effects of different physical parameters (magnetic parameter M, Brownian motion parameter Nb, thermophoresis parameter Nt, Lewis number Le, suction parameter, and temperature and concentration Biot numbers) on the multiple slips, as well as the variation of the skin friction coefficient and the reduced Nusselt and Sherwood numbers with respect to M for various values of the physical parameters, are obtained graphically and then compared with other recent works in tabular form. Finally, a new machine learning environment is introduced that performs a sensitivity analysis based on an iterative method to predict the variation of the skin friction coefficient, reduced Nusselt number, and reduced Sherwood number with respect to the magnetic parameter. The cost function is estimated by comparing actual and expected outcomes to obtain the best-fit line, presented in graphical form. Machine learning provides a powerful and intelligent data-processing architecture and may greatly enrich existing research techniques and industrial applications of fluid mechanics.
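The best-fit-line step mentioned in the abstract can be sketched as ordinary least squares with a mean-squared-error cost function. This is a minimal illustration, not the paper's implementation; the parameter/response values below are toy data, not simulation results.

```python
# Illustrative sketch: fit a best-fit line to (parameter, response) pairs
# and evaluate the mean-squared-error cost between actual and predicted values.

def fit_line(xs, ys):
    """Ordinary least squares for a single predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    intercept = my - slope * mx
    return slope, intercept

def mse(xs, ys, slope, intercept):
    """Cost function: mean squared error of predictions against actuals."""
    return sum((y - (slope * x + intercept)) ** 2
               for x, y in zip(xs, ys)) / len(xs)

# Toy values standing in for magnetic parameter M vs skin-friction coefficient
M = [0.0, 0.5, 1.0, 1.5, 2.0]
Cf = [1.0, 1.2, 1.45, 1.6, 1.85]
m, b = fit_line(M, Cf)
print(round(m, 3), round(b, 3), round(mse(M, Cf, m, b), 5))
```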

KEYWORDS

Magnetohydrodynamics, Nanofluid, Stiffness Shifting, Machine Learning Algorithms.


Mining Biomedical Literature to Discover Natural Cure for Recurrent Disease

Farhi Marir1, Hussein Fakhry1 and Aida Azar2, 1College of Technological Innovation, Zayed University, Academic City, Dubai, UAE, 2College of Medicine, Mohammed Bin Rashid University of Medicine and Health Sciences (MBRU), Dubai Health Care City, Dubai, United Arab Emirates

ABSTRACT

Advances in digital data collection and storage technology have allowed huge numbers of medical publications to be stored in the MEDLINE database, which contains more than 25 million references to journal articles and abstracts in the life sciences and biomedicine. This work builds on Swanson's use of the implicit association between A and C concepts/terms through a list of B concepts/terms retrieved from medical articles: articles containing either A&B or B&C term pairs link A to C. Swanson discovered evidence that fish oil (A) treats blood vessel disorders (C) and that magnesium (A) helps with migraine headaches (C), findings that were clinically confirmed two years later. In this paper we present a co-occurrence mining algorithm and an A&C pre-defined domain knowledge base (containing, for instance, garlic composition and causes of blood pressure) to filter and reduce the exponential number of shared B terms retrieved from MEDLINE articles using Swanson's Arrowsmith machine. The reduced number of relevant B terms makes it easier to build scientific evidence validating publicly known remedies for recurrent diseases, for instance that garlic reduces blood pressure or that honey relieves coughs and soothes a sore throat.
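The ABC linking step described above can be sketched as a set intersection of B-terms co-occurring with A and with C, pruned by a domain knowledge base. All term lists below are illustrative toy data, not drawn from the paper's actual knowledge base or MEDLINE.

```python
# Hedged sketch of Swanson-style ABC co-occurrence linking: B-terms that
# co-occur with A in one article set and with C in another connect A to C.

def shared_b_terms(ab_cooccur, bc_cooccur, knowledge_base=None):
    """Return B-terms linking A to C, optionally filtered by a domain KB."""
    candidates = set(ab_cooccur) & set(bc_cooccur)
    if knowledge_base is not None:
        candidates &= set(knowledge_base)
    return sorted(candidates)

# Terms co-occurring with "garlic" (A) in some articles (illustrative):
a_terms = {"allicin", "sulfur compounds", "cholesterol", "taste"}
# Terms co-occurring with "blood pressure" (C) in other articles:
c_terms = {"allicin", "sodium", "cholesterol", "stress"}
# Pre-defined domain knowledge base used to prune irrelevant B-terms:
kb = {"allicin", "cholesterol", "sodium"}

print(shared_b_terms(a_terms, c_terms, kb))  # → ['allicin', 'cholesterol']
```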

KEYWORDS

Co-occurrence Text Mining, ABC Arrowsmith Discovery Machine, Dietary Aliments & Disease Knowledge Base, and MEDLINE medical database.


Brexit: Predicting the Brexit UK Election Results by Constituency using Twitter Location based Sentiment and Machine Learning

James Usher and Pierpaolo Dondio, School of Computing Technological University Dublin, Dublin, Ireland

ABSTRACT

After parliament failed to approve his revised version of the 'Withdrawal Agreement', UK Prime Minister Boris Johnson called a snap general election in October 2019 to capitalise on his growing support to 'Get Brexit Done'. Johnson's belief was that he had enough support countrywide to gain a majority to push his Brexit mandate through parliament based on a parliamentary-seat-majority strategy. The increased availability of large-scale Twitter data provides rich information for the study of constituency dynamics. On Twitter, the location of tweets can be identified via GPS and the location field. This provides a mechanism for location-based sentiment analysis: the use of natural language processing or machine learning algorithms to extract, identify, or distinguish the sentiment content of a tweet according to its location of origin. This paper examines location-based Twitter sentiment for UK constituencies per country and aims to understand whether location-based Twitter sentiment majorities per UK constituency could determine the outcome of the UK Brexit election. Tweets are gathered from the first whisperings of the UK Brexit election on 4 September 2019 until polling day, 12 December 2019. A Naive Bayes classification algorithm is applied to assess political public Twitter sentiment. We identify the sentiment of Twitter users per constituency per country towards the political parties' mandates on Brexit and plot our findings for visualisation. We compare the grouping of location-based sentiment per constituency for each of the four UK countries to the final Brexit election first-party results per constituency to determine the accuracy of location-based sentiment in determining the Brexit election result.
Our results indicate that location-based sentiment had the single biggest effect on constituency result predictions in Northern Ireland and Scotland and a marginal effect on Wales-based constituencies, whilst there was no significant prediction accuracy for England's constituencies. Decision tree, neural network, and Naïve Bayes machine learning algorithms are then created to forecast the election results per constituency using location-based sentiment and constituency-based data from the UK electorate at the national level. The predictive accuracy of the machine learning models was compared comprehensively to a computed baseline model. The comparison results show that the machine learning models outperformed the baseline model in predicting Brexit election constituency results at the national level, with accuracy rates of 97.87%, 95.74%, and 93.62%, respectively. The results indicate that location-based sentiment is a useful variable in predicting elections.
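The per-constituency aggregation step can be sketched as a majority vote over classified tweets. This is an illustrative sketch only: the classifier output is assumed here as pre-computed labels, and the constituency names and labels are invented for the example, not taken from the paper's data.

```python
from collections import Counter, defaultdict

def constituency_majorities(tweets):
    """Aggregate (constituency, sentiment) pairs into a majority
    sentiment label per constituency."""
    tallies = defaultdict(Counter)
    for constituency, sentiment in tweets:
        tallies[constituency][sentiment] += 1
    return {c: counts.most_common(1)[0][0] for c, counts in tallies.items()}

# Hypothetical classified tweets (labels would come from a Naive Bayes model)
sample = [("Uxbridge", "pro-Brexit"), ("Uxbridge", "pro-Brexit"),
          ("Uxbridge", "anti-Brexit"), ("Islington North", "anti-Brexit")]
print(constituency_majorities(sample))
# → {'Uxbridge': 'pro-Brexit', 'Islington North': 'anti-Brexit'}
```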

KEYWORDS

BREXIT Election, Twitter, Sentiment, UK Election.


Enhancing Motor Imagery Decoding Via Transfer Learning

Olawunmi George and Sheikh Iqbal Ahamed, Computer Science Department, Marquette University, Milwaukee, Wisconsin, USA

ABSTRACT

Motor imagery (MI) is arguably one of the most common brain-computer interface (BCI) paradigms. The decoding process, in many cases, involves the use of small amounts of data gathered over a period. The decoding performance might therefore be limited, due to the size of available data. Also, the non-stationarity of signals across sessions and subjects can pose a challenge to effective decoding. To address these challenges, transfer learning is proposed as a suitable approach, which could yield optimal performance even with small amounts of data and handle the non-stationarity of signals through adaptation. It has been applied across domains and tasks where only small amounts of data are available and where signal distribution changes are more rapid. Transfer learning (TL) aids the decoding process by utilizing knowledge learnt in previous decoding to enhance future decoding. In this study, we apply the concept of transfer learning in the classifier space, yielding improvements in the decoding process of up to 3%. We investigate its effect across two main scenarios: within- and across-subject motor imagery decoding. Within each of these scenarios, we consider how to optimally transfer useful knowledge across sessions to aid the decoding process, taking into consideration factors such as the update mode and the data sources to be used for transfer across subjects. To our knowledge, this is one of the few works making such considerations in the application of transfer learning in electroencephalography (EEG)-based MI. From our results, we conclude that transfer learning is useful in motor imagery experiments, where small amounts of data are available, and can be used to mitigate the effect of non-stationarity across sessions and subjects.

KEYWORDS

EEG, BCI, motor imagery, deep learning, machine learning, transfer learning, knowledge transfer, supervised learning.


Study on Emotional State Change based on Dynamic Expression Similarity

Yan Zhang, Xiangyang Feng and Ming Zhu, College of Computer Science and Technology, Donghua University, Shanghai, China

ABSTRACT

Facial expressions can express different emotions. Similar facial expressions usually correspond to the same emotions, and the changing process of emotional states is reflected in the dynamic changes of facial expressions. However, existing studies mainly focus on instantaneous emotional states, which cannot reflect the intensity of emotions. This paper proposes a method to study the process of emotion change based on dynamic expression similarity, which can evaluate not only changes in emotional state but also changes in emotional intensity. First, the features of dynamic expressions are extracted based on the VGG16 network model. Then, the cosine similarity of the expression features is calculated to match the corresponding emotions. At the same time, the expression intensity of each frame is calculated to evaluate the change in emotional intensity. The experimental results show that the similarity calculated in this paper is increased by 9.7% on average, which can be used for the study of emotional states.
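The matching step described above reduces to a cosine similarity between two feature vectors. A minimal sketch follows; the vectors here are toy values, not real VGG16 activations.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Illustrative expression feature vectors (a real pipeline would use
# high-dimensional VGG16 features extracted per frame)
happy_ref = [0.9, 0.1, 0.3]
frame_feat = [0.8, 0.2, 0.35]
print(round(cosine_similarity(happy_ref, frame_feat), 3))
```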

KEYWORDS

Dynamic Expression Similarity, The Emotional State Change, Emotional Intensity.


Anti-Virus Autobots: Predicting more Infectious Virus Variants for Pandemic Prevention through Deep Learning

Glenda Tan Hui En1*, Koay Tze Erhn1* and Shen Bingquan2, 1Raffles Institution, Singapore, 2DSO National Laboratories, Singapore

ABSTRACT

More infectious virus variants can arise from rapid mutations in their proteins, creating new infection waves. These variants can evade one's immune system and infect vaccinated individuals, lowering vaccine efficacy. Hence, to improve vaccine design, this project proposes Optimus PPIme, a deep learning approach to predict future, more infectious variants of an existing virus (exemplified by SARS-CoV-2). The approach comprises (i) an algorithm which acts as a "virus" attacking a host cell: to increase infectivity, the "virus" mutates to bind better to the host's receptor, and two search strategies were attempted, greedy search and beam search; and (ii) a transformer network we developed, with a high accuracy of 90%, that assesses the strength of this variant-host binding. With both components, beam search proposed more infectious variants. Therefore, this approach can potentially enable researchers to develop vaccines that provide protection against future infectious variants before they emerge, pre-empting outbreaks and saving lives.
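The mutate-and-search loop described above can be sketched as beam search over single-residue substitutions. This is a hedged illustration: the scoring function below is a toy stand-in for the paper's transformer-based binding predictor, and the sequence is invented.

```python
# Hypothetical sketch of beam search over single amino-acid mutations.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def beam_search(seq, score, beam_width=3, steps=2):
    """Keep the beam_width highest-scoring mutants at each step."""
    beam = [(score(seq), seq)]
    for _ in range(steps):
        candidates = []
        for _, s in beam:
            for i in range(len(s)):
                for aa in AMINO_ACIDS:
                    if aa != s[i]:
                        mutant = s[:i] + aa + s[i + 1:]
                        candidates.append((score(mutant), mutant))
        beam = sorted(candidates, reverse=True)[:beam_width]
    return beam[0][1]

# Toy score counting 'W' residues, standing in for predicted binding strength
best = beam_search("ACD", lambda s: s.count("W"))
print(best)
```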

KEYWORDS

Virus Variants, Transformers, Deep Learning.


Sustainable Hardware for LPWAN-based Location Tracking of Construction Assets within Smart Cities

Anthony S. Deese1, Thomas Brennan2, Andrew Bechtel2, Joe Jesson1, and Efrain Rodriguez3, 1Department of Electrical and Computer Engineering, The College of New Jersey, Pennington, New Jersey, USA, 2Department of Civil Engineering, The College of New Jersey, Pennington, New Jersey, USA, 3Tenna LLC, Edison, New Jersey, USA

ABSTRACT

This paper discusses the design and testing of an application of digital IoT technology to track relatively small and inexpensive non-powered assets, such as concrete roadway barriers, within a smart city, focusing on knowledge innovation, organizational systems and technology, and location intelligence. This application is of significant interest to many private corporations. The authors' objective is to develop, implement, and test wireless digital IoT hardware able to track construction assets, sufficiently inexpensive to warrant placement on smaller assets, and with sufficient range to track and communicate with assets up to 2.5 km away. The proposed solution is durable enough to withstand the stresses of use on construction sites and sustainable enough to operate for 5+ years without manual intervention or an external energy source to recharge the unit. This paper provides significant physical test results that validate the efficacy of the approach in practical use-case scenarios.

KEYWORDS

Internet of Things (IoT), Wireless Network, Transportation.


Verifying Outsourced Computation in an Edge Computing Marketplace

Christopher Harth-Kitzerow & Gonzalo Munilla Garrido, Technical University of Munich, Germany

ABSTRACT

An edge computing marketplace could enable IoT devices (Outsourcers) to outsource computation to any participating node (Contractor) in their proximity. In return, these nodes receive a reward for providing computation resources. In this work, we propose a scheme that verifies the integrity of arbitrary deterministic functions and is resistant to both dishonest Outsourcers and Contractors who try to maximize their expected payoff. We tested our verification scheme with state-of-the-art pre-trained Convolutional Neural Network models designed for object detection. On all devices, our verification scheme causes less than 1 ms of computational overhead and a negligible network bandwidth overhead of at most 84 bytes per frame. Our implementation can also perform the verification scheme's tasks in parallel with the object detection to eliminate any latency overhead. Compared to other proposed verification schemes, ours resists a comprehensive set of protocol violations without sacrificing performance.

KEYWORDS

Edge Computing, Internet of Things, Function Verification, Computing Marketplaces.


Parametric Security and Privacy Computing for Internet of Things

Oluwadare Oatunji1, Oluwashola Adeniji2 and Ukeme Sunday3, 1Department of Computer Science, University of Ibadan, Nigeria, 2Department of Computer Science, University of Ibadan, Nigeria, 3Lagos Business School, Lagos, Nigeria

ABSTRACT

The Internet of Things (IoT) refers to the collective network of connected devices and the technology that facilitates communication between those devices and the cloud. It comprises various components that enable a high level of technology in the digital world. IoT has helped to revolutionize many fields, such as digital technology, e-health, artificial intelligence, cybernetics, e-currency, and robotics, as well as various aspects of everyday life. Since its introduction in the new millennium, IoT has become the backbone of many organizations through their networks. Despite its enormous benefits, security and privacy challenges have been a major factor militating against IoT, manifesting as insecurities, zero-day attacks, cyber threats, and cyberattacks. Internet hijackers and cyber fraudsters are threatening the future of IoT.

KEYWORDS

cyber-attack, Internet of things (IOT), Network Telescope, Computational Algorithm, security, Hardware, software, cloud computing.


The Development of a Resource Allocation Management System [RAMS] for Company A

Samantha Hilario, Nicole Mallari, Martin Dizon, Inigo Salazar, Joel Mendoza, University of Asia and the Pacific, Pasig City, Metro Manila, Philippines

ABSTRACT

The researchers discovered that Company A is experiencing challenges in managing and tracking its furniture, appliances, and IT resources. The company currently uses MS Excel for asset management; however, as the company grows and gains more clients, this has become too inefficient for its business operations. The researchers met with the client to discuss the issues, business processes, possible solutions, and other constraints, and implemented an Agile methodology in developing the Resource Allocation Management System [RAMS]. They provide a web-based system that keeps track of each asset's whole life cycle (specifications, procurement, current user and location, and usage history), where every asset record is protected. In conclusion, the researchers found that the Agile approach was effective in identifying issues and designing solutions tailor-fit to the client's business.

KEYWORDS

Asset management, Agile, Asset, Resources, MS Excel.


Multimedia Network Performance Assessment Under Different Operating Configurations

M.A. Ba Humaish, School of Science and Engineering, The American University in Cairo, Cairo, Egypt

ABSTRACT

Routing packets from source to destination is the main responsibility of the network layer, and many routing algorithms in this layer can be used to select the best routes and packet data structures. In this paper, we use two routing protocols, Routing Information Protocol (RIP) and Open Shortest Path First (OSPF), to analyze the performance of a network. Different applications such as video, FTP, print, and voice were configured with different parameters. Additionally, we examine the same network's behavior when Quality of Service is implemented using the Weighted Fair Queuing (WFQ) algorithm and Multi-Protocol Label Switching (MPLS) to ensure that multimedia network performance is guaranteed. The simulation results were analyzed using the Optimized Network Engineering Tools (OPNET); we compared the effectiveness and performance of these protocols in the network topology, analyzed the multimedia applications, and determined and quantified the most essential QoS requirements.

KEYWORDS

Quality of Service (QoS), Multimedia Applications, Open Shortest Path First (OSPF), Routing Information Protocol (RIP), Weighted Fair Queuing algorithm (WFQ) and Multi-Protocol Label Switching (MPLS).


The Usability Challenges in E-banking and Mobile Banking Applications

Lul Mohamed Osman1 and Dr. Collins Oduor2, 1Student, Department of Computing, United States International University-Africa, Nairobi, Kenya, 2Assistant Professor, School of Science and Technology, United States International University-Africa, Nairobi, Kenya

ABSTRACT

The study aimed to identify the usability challenges of e-banking and mobile banking applications. Both quantitative and qualitative analyses were used in this study. For the qualitative analysis, a review and analysis were done, enabling the researcher to collect information on the usability challenges of e-banking and mobile banking applications. This study's findings show that users of internet banking and mobile banking applications encountered several challenges, such as a language barrier due to the use of only English and Arabic in the banking applications. Some banking applications could not adapt to different screen sizes. Compatibility issues were also noted: users on certain mobile operating systems, such as iOS, encountered issues accessing the applications. Some application pages take a long time to load, and response times to customers are slow. Some applications have no support center and no user guide for problems and application usage.

KEYWORDS

Usability, E-banking, mobile banking.


Architect: A Framework for the Migration to Microservices

Evgeny Volynsky, Merlin Mehmed and Stephan Krusche, Technical University of Munich, Munich, Germany

ABSTRACT

The migration from a monolithic to a microservice architecture is a recurring step in many software projects. In this paper, we describe a case study of the migration process of the learning management system Artemis. We migrate the shared database into multiple databases based on the database-per-microservice pattern. We developed Architect, a framework based on a domain-specific language for building dependable distributed systems, as a template to ensure the data consistency of distributed transactions using the Saga pattern. The Architect framework helped to reduce the complexity of using the Saga pattern: it introduced eventual consistency in a distributed database system and decreased the coupling of the data storage. Architect currently does not provide a software evolution approach; we will add support for reengineering projects, which can facilitate the migration of existing systems.

KEYWORDS

Microservices architecture, Migration of monolithic application, Saga pattern, Domain-specific-language, Distributed transactions.


A Novel Approach to Network Intrusion Detection System using Deep Learning for SDN: Futuristic Approach

Mahmood Radhi Hadi1 and Adnan Saher Mohammed2, 1Department of Computer Engineering, Karabük University, Karabük, Turkey, 2Karabük University, Karabük, Turkey

ABSTRACT

Software-Defined Networking (SDN) is a next-generation approach that changes the architecture of traditional networks and is one of the promising solutions for restructuring internet networks. Attacks have become more common due to the centralized nature of the SDN architecture, so it is vital to provide security for SDN. In this study, we propose a Network Intrusion Detection System-Deep Learning (NIDS-DL) approach in the context of SDN. Our suggested method combines Network Intrusion Detection Systems (NIDS) with several types of deep learning algorithms. It employs 12 features extracted from the 41 features in the NSL-KDD dataset using a feature selection method. We employed five classifiers (CNN, DNN, RNN, LSTM, and GRU), which produced accuracies of 98.63%, 98.53%, 98.13%, 98.04%, and 97.78%, respectively. Our proposed approach was successful in binary classification and attack detection, implying that NIDS-DL might be used with great efficiency in the future.

KEYWORDS

Network Intrusion Detection System, Software Defined Networking, Deep Learning.


Quality Increases as the Error Rate Decreases

Fabrizio d'Amore, Department of Computer, Control and Management Engineering, Sapienza University of Rome, Italy

ABSTRACT

In this paper we propose an approach to the design of processes and software that aims at decreasing the human and software errors that so frequently occur, forcing affected people to waste considerable time fixing them. We base our statements on the natural relationship between quality and error rate: the former increases as the latter decreases. We classify errors into several types and present techniques to reduce the likelihood of making mistakes, depending on the type of error. We focus this approach on organization, management, and software design, allowing greater effectiveness and efficiency in a period in which mankind has been affected by a severe pandemic and we need to be more efficient and effective in all processes, aiming at an industrial renaissance which we know to be not too far off and easily reachable once the path to follow has been characterized, also in the light of experience.

KEYWORDS

Errors, Quality, Processes, Governance, Digitalization, Digital hygiene, Information technology, Computer Science.


Evaluation of Semantic Answer Similarity Metrics

Farida Mustafazade1 and Peter F. Ebbinghaus2, 1GAM Systematic, 2Teufel Audio

ABSTRACT

We propose a cross-encoder-augmented BERTScore model for semantic answer similarity, trained on our new dataset of non-common names in English contexts. There are several issues with existing general machine translation and natural language generation evaluation metrics, and question answering (QA) systems are no different in that respect. To build robust QA systems, we need equivalently robust evaluation systems to verify whether model predictions to questions are similar to ground-truth annotations. The ability to compare similarity based on semantics, as opposed to pure lexical overlap, is important both to compare models fairly and to set more realistic acceptance criteria in real-life applications. We build upon the first paper, to our knowledge, that uses transformer-based model metrics to assess semantic answer similarity, and we achieve higher correlations with human judgement in the case of no lexical overlap.
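The lexical-overlap baseline the abstract argues against can be sketched as SQuAD-style token F1: semantically equivalent answers with no shared tokens score zero, which is exactly the failure mode semantic metrics address. The answer strings below are illustrative, not from the paper's dataset.

```python
from collections import Counter

def token_f1(prediction, truth):
    """Token-level F1 between a predicted and a gold answer string."""
    pred, gold = prediction.lower().split(), truth.lower().split()
    common = Counter(pred) & Counter(gold)  # multiset intersection
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision, recall = overlap / len(pred), overlap / len(gold)
    return 2 * precision * recall / (precision + recall)

print(token_f1("the netherlands", "holland"))      # → 0.0 (semantically close, no overlap)
print(token_f1("the netherlands", "netherlands"))  # partial lexical overlap
```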

KEYWORDS

Question-answering, semantic textual similarity, exact match, pre-trained language models, cross-encoder, bi-encoder.


Native and Non-native Speech Fluency Analysis Using Praat Software: A Case Study of Native and Non-native English Speakers' Speech Rate

Abderrahim Bouderbane, Department of English, University of Mila, Algeria

ABSTRACT

This research investigates the characteristics of speaking fluently and its components, such as pauses, filled pauses, and hesitations. These components are very important in speaking: they are used to generate ideas, plan what to say next, and organize the content. The aim of this research is to segment samples of recorded speech extracts of non-native speakers of English in order to distinguish between pauses, fillers, and hesitations. A sample of ten native speakers (British) and ten non-native speakers (Algerian teachers of English) is evaluated using software called Praat. This software is mainly used in phonetics to describe pitch, stress, and intonation, and it can also be used to calculate the total speaking time and demonstrate pauses graphically as they are produced by speakers. The measurement of a single pause is made reliable by Praat insofar as the measurement is adjustable at all levels (Styler 2013); the adjustment can range from 0.00 Hz to 5000 or 6000 Hz. The software cannot tell where one word starts and where it ends, so the researcher needs to segment sound files with annotations when using any sort of automated analysis. This is generally done by creating a TextGrid annotation in a text file, which is saved separately. These annotations are composed of different tiers, which are marked either by intervals or by specific points in the file. The results show that pauses and hesitations are more frequent and longer in non-native speech production.
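Once a TextGrid-style interval tier is available, pause counting and speaking-time measurement reduce to simple arithmetic over (start, end, label) intervals, where an empty label marks silence. A minimal sketch follows; the intervals, labels, and the 0.25 s pause threshold are illustrative assumptions, not the study's actual settings.

```python
def pause_stats(intervals, min_pause=0.25):
    """Count silent pauses at or above min_pause seconds and sum speaking
    time, given TextGrid-like (start, end, label) intervals where an
    empty label means silence."""
    pauses = [end - start for start, end, label in intervals
              if label == "" and (end - start) >= min_pause]
    speaking = sum(end - start for start, end, label in intervals if label)
    return len(pauses), round(sum(pauses), 2), round(speaking, 2)

# Illustrative annotated tier: two silent gaps, one below the threshold
tier = [(0.0, 0.8, "well"), (0.8, 1.3, ""), (1.3, 2.0, "I think"),
        (2.0, 2.1, ""), (2.1, 3.0, "that")]
print(pause_stats(tier))  # → (1, 0.5, 2.4)
```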

KEYWORDS

Fluency, speech processing, native speakers, non-native speakers, Praat software.


A Distributed Energy-Efficient Unequal Clustering based Kruskal Heuristic for IoT Networks

Mohamed Sofiane BATTA1, 2, Zibouda ALIOUAT2, Hakim MABED1, and Malha MERAH2, 1FEMTO-ST Institute/DISC, University of Bourgogne Franche-Comte, Montbeliard, France, 2LRSD Laboratory, Computer Science Dept, Ferhat Abbas University Setif 1, Setif, Algeria

ABSTRACT

Energy efficiency is a major concern and a critical issue for energy-constrained wireless networks. In this context, clustering is commonly used for topology management and for maximizing the network lifetime. Clustering approaches typically use a multi-hopping mechanism in which Cluster Heads (CHs) near the Base Station (BS) consume more energy since they relay data from farther CHs. Therefore, nodes close to the BS are burdened with an overloaded routing task and tend to die earlier than their intended lifetime, which affects the network performance. This situation is known as the hot spot problem and induces unbalanced energy consumption among CHs. The concern in this work is to address the intra-clustering structure in large-scale environments to tolerate network scaling and reasonably balance the energy consumption among CHs. In this regard, we propose a new Unequal Clustering algorithm based on the Kruskal heuristic (UCKA) to optimize the network lifetime. UCKA applies the Kruskal heuristic in a distributed fashion to build a minimum spanning tree within large clusters, which strengthens the intra-cluster routing structure and reduces the energy devoted to wireless communications. To the best of our knowledge, this is the first solution that combines the Kruskal heuristic with unequal clustering to extend device durability and alleviate the hot spot problem. Simulation results indicate that UCKA can effectively reduce the energy consumption and lengthen the network lifetime.
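The centralized form of the Kruskal heuristic underlying UCKA's spanning-tree step can be sketched with a union-find structure; the node IDs and edge weights below are illustrative toy values (e.g. link costs standing in for radio energy), not the paper's distributed protocol.

```python
def kruskal(n, edges):
    """Minimum spanning tree over n nodes; edges are (weight, u, v)
    tuples. Returns the chosen (u, v, weight) edges."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):          # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                        # skip edges that form a cycle
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

# 4 nodes in a cluster; weights stand in for communication energy cost
edges = [(1, 0, 1), (4, 0, 2), (2, 1, 2), (5, 1, 3), (3, 2, 3)]
print(kruskal(4, edges))  # → [(0, 1, 1), (1, 2, 2), (2, 3, 3)]
```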

KEYWORDS

IoT, WSN, Energy-aware protocols, Unequal Clustering, Hot spot energy problem, Kruskal Heuristic.


Development of a Web Spatial Database System for the Royal Projects Under the Inspiration of King Rama 9th

Ponthip Limlahapun and Puntip Jongkroy, Geography Department, Kasetsart University, Bangkok, Thailand

ABSTRACT

His Majesty King Bhumibol Adulyadej of Thailand, conferred with the title King Bhumibol or King Rama IX, was the ninth monarch of Thailand in the Chakri dynasty. His Majesty initiated more than 2,000 projects throughout his 70-year reign. He worked with diligence and inspired the Thai people in their living, gratitude, and morality. The Royal Development Projects Board (RDPB) collects all information, approves plans and activities, and directs, monitors, and coordinates the operation of government agencies and state enterprises concerning Royal Development Projects. The projects of various kinds are spread all over the country, from north to south. Presenting their geolocations would benefit younger generations who may not realize where those projects took place. More than 50 years ago, His Majesty made an effort to visit the entirety of Thailand, even inconvenient places with poor road systems. Therefore, this research aims to develop an online web spatial database of the Royal Projects under the inspiration of King Rama 9th, which manages various information (i.e., project name, covered areas, responsible agencies) systematically based on geographic locations. Online data can be accessed anywhere and anytime, and users can select their area or project of interest by place and project type, which is more flexible than information in printed text formats. The research process comprised requesting data from the office of the RDPB, organizing it in a format suitable for geospatial web development, locating the projects, and designing a map display. The process revealed that the geocoding function embedded in geographic information system (GIS) software (i.e., ArcGIS or QGIS) could not identify the locations in Thailand; therefore, a custom program was written to search for the locations via Google, which worked faster.
Since there are more than 2,000 projects, online mapping should consider how quantitative data relates in spatial form with map communication, aesthetics, and sufficient information. GIS can be applied in a variety of fields, e.g., disaster management, public facilities, and utilities. Data presented only in number or text format may lack project distribution or references to surrounding areas. Therefore, the interactive map helps supplement the existing information to make it clearer and easier to understand. Additionally, a virtual learning platform was developed to attract younger generations.

KEYWORDS

geolocation, geospatial web-based system, royal projects, virtual learning platform.


On Accountable and Distributed Audit of Outsourced Data

Amit Kumar Dwivedi1, Naveen Kumar1 and Manik Lal Das2, 1IIIT Vadodara, Gujarat, India, 2DA-IICT, Gandhinagar, Gujarat, India

ABSTRACT

Data outsourcing to a third-party storage server is a cost-effective business model, widely used by small and medium-sized enterprises. As data is managed by a third party, the accountability of the involved entities becomes an important aspect of this business model. Proof of violation of the agreed terms between the data owner, users, and the storage service provider ensures accountability and is desirable as it enhances the trust of the data owner towards the other system entities. Furthermore, strong accountability identifies a misbehaving entity's activities and protects honest entities from false accusations. In this paper, a strong, accountable, distributed auditing scheme is proposed for outsourced data in a public cloud setup. A proof-based solution with fine-grained read-write access is used to address strong accountability and the auditing of data as well as services. The scheme is analyzed and compared with related schemes. The experimental results show that the proposed scheme is efficient and practical.

KEYWORDS

Accountability, Distributed audit, Data integrity, Distributed cloud.