8,919 results for "Data flow diagram"
Search Results
2. Method of information technology for structure analysis of urban network fire-rescue units
- Author
-
Svitlana Danshyna and Artem Nechausov
- Subjects
p-median model, geospatial analysis, model of information flows of the process, idef0-model, data flow diagram, fire service coverage area, Computer engineering. Computer hardware, TK7885-7895, Electronic computers. Computer science, QA75.5-76.95 - Abstract
The subject of this study is the process of analyzing the structure of the network of fire-rescue units of the city in the context of optimizing their spatial distribution. The purpose of this work is to increase the objectivity of decisions made while forming a city network of fire-rescue units by creating an information technology (IT) method for analysis of its structure based on the use of spatially distributed data. Objectives: to find ways to improve the level of fire safety, analyze existing approaches to the formation of a network of fire-rescue units, considering the peculiarities of building and organizing a network, and adapt the classical placement problem; to propose a method for solving it that minimizes the distance between a fire-rescue unit and the possible place of fire while ensuring maximum coverage of the territory by the fire service; and to develop an IT structure for its implementation based on the information flow model of the process of analyzing the fire-rescue unit network using a geospatial approach. The following results were obtained. The study of the classical location problem and its adaptation to the real problems arising from the analysis of the urban network of fire-rescue units made it possible to represent it as a set of independent complete bipartite graphs. To search for the locations of network nodes while solving the adapted problem, an IT method is developed which, based on the p-median model, combines the authors' methodology for studying information processes with methods of geospatial analysis. Summarizing the requirements of the current legislation, a set of input and output IT data and a set of operations have been formed. The representation of the IT structure in the form of a data flow diagram explains how the set of factors is processed and generalized when making decisions on the creation and/or improvement of the existing city fire department. Conclusions.
The results of the bibliographic search confirm the need to consider the spatial features of the area where a fire-rescue unit is planned, as well as the spatial configuration of the urban network of existing fire stations, and to evaluate its effectiveness using an integrated indicator. This requires the development of specialized methods focused on the use of geo-information systems for their implementation in decision support systems. Scientific and methodological support for the IT has been developed, which gives local authorities a tool for analyzing fire safety in the city in order to create and/or improve the existing fire protection. An experiment studying the capabilities of the proposed method, based on volunteered geographic information on Kharkiv city, showed its effectiveness for solving classical placement problems under the accepted restrictions on the spatial availability of fire-rescue units. At the same time, additional opportunities appear in forming options for improving the network of fire-rescue units, considering their spatial distribution, workload, accessibility, and the resulting areas of coverage/non-coverage by the fire service. For example, the fire service coverage area of the existing structure of fire stations has been assessed. Within the regulated time, it reaches 70% of the Kharkiv city area and, depending on real road traffic, can vary from 64.61% to 73.44%. It is illustrated that the creation of two additional fire-rescue units in the northern and southern parts of Kharkiv would increase the coverage area by approximately 5%, on average raising it to 75.1%.
- Published
- 2023
- Full Text
- View/download PDF
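The p-median placement problem that the abstract above adapts can be sketched as a brute-force search: choose p facility sites minimizing total demand-weighted travel cost. This is an illustrative toy only; the distance matrix, demand weights, and candidate sites below are invented and do not come from the paper, which additionally integrates geospatial analysis.

```python
from itertools import combinations

# Brute-force p-median sketch (illustrative; not the authors' method).
# dist[i][j] = travel cost from candidate site j to demand point i.
def p_median(dist, demand, candidates, p):
    best_sites, best_cost = None, float("inf")
    for sites in combinations(candidates, p):
        # each demand point is served by its nearest selected site
        cost = sum(w * min(dist[i][j] for j in sites)
                   for i, w in enumerate(demand))
        if cost < best_cost:
            best_sites, best_cost = sites, cost
    return best_sites, best_cost

# hypothetical 4 demand points x 3 candidate sites
dist = [[2, 9, 7],
        [8, 3, 6],
        [5, 4, 1],
        [9, 8, 2]]
demand = [1, 1, 2, 1]          # invented demand weights
sites, cost = p_median(dist, demand, candidates=[0, 1, 2], p=2)
```

Exhaustive search is exponential in the number of candidates; real fire-station studies at city scale rely on heuristics or integer programming instead.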
3. A data model for integrating BIM and blockchain to enable a single source of truth for the construction supply chain data delivery
- Author
-
Hijazi, Amer A., Perera, Srinath, Calheiros, Rodrigo N., and Alashwal, Ali
- Published
- 2023
- Full Text
- View/download PDF
4. A Web Application for Moving from One Spot to Another by Using Different Transport—A Scenario of the Research
- Author
-
Shoilekova, Kamelia, Ivanova, Boyana, Blazhev, Blagoy, Kacprzyk, Janusz, Series Editor, Gomide, Fernando, Advisory Editor, Kaynak, Okyay, Advisory Editor, Liu, Derong, Advisory Editor, Pedrycz, Witold, Advisory Editor, Polycarpou, Marios M., Advisory Editor, Rudas, Imre J., Advisory Editor, Wang, Jun, Advisory Editor, Silhavy, Radek, editor, and Silhavy, Petr, editor
- Published
- 2023
- Full Text
- View/download PDF
6. Systematic development of a data model for the blockchain-based embodied carbon (BEC) Estimator for construction
- Author
-
Rodrigo, M.N.N., Perera, Srinath, Senaratne, Sepani, and Jin, Xiaohua
- Published
- 2022
- Full Text
- View/download PDF
7. From User Stories to Data Flow Diagrams for Privacy Awareness: A Research Preview
- Author
-
Herwanto, Guntur Budi, Quirchmayr, Gerald, Tjoa, A. Min, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Woeginger, Gerhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Gervasi, Vincenzo, editor, and Vogelsang, Andreas, editor
- Published
- 2022
- Full Text
- View/download PDF
8. Getting the UML’s Behavior and Interaction Diagrams by Extracting Business Rules Through the Data Flow Diagram
- Author
-
Kharmoum, Nassim, Rhalem, Wajih, Retal, Sara, bouchti, Karim El, Ziti, Soumia, Kacprzyk, Janusz, Series Editor, Pal, Nikhil R., Advisory Editor, Bello Perez, Rafael, Advisory Editor, Corchado, Emilio S., Advisory Editor, Hagras, Hani, Advisory Editor, Kóczy, László T., Advisory Editor, Kreinovich, Vladik, Advisory Editor, Lin, Chin-Teng, Advisory Editor, Lu, Jie, Advisory Editor, Melin, Patricia, Advisory Editor, Nedjah, Nadia, Advisory Editor, Nguyen, Ngoc Thanh, Advisory Editor, Wang, Jun, Advisory Editor, Balas, Valentina E., editor, and Ezziyyani, Mostafa, editor
- Published
- 2022
- Full Text
- View/download PDF
9. Mathematical Model Investigation of a Technological Structure for Personal Data Protection.
- Author
-
Romansky, Radi
- Subjects
DATA protection, DIGITAL technology, MATHEMATICAL models, ELECTRONIC data processing, PERSONALLY identifiable information, CLOUD computing - Abstract
The contemporary digital age is characterized by the massive use of different information technologies and services in the cloud. This raises the following question: "Are personal data processed correctly in global environments?" It is known that there are many requirements that the Data Controller must fulfil. For this reason, this article presents a point of view on transferring some personal data processing activities from a traditional system to a cloud environment. The main goal is to investigate the differences between the two versions of data processing. To achieve this goal, a preliminary deterministic formalization of the two cases using a Data Flow Diagram is made. The second phase is a mathematical (stochastic) model investigation on the basis of a Markov chain apparatus. Analytical models are designed and their solutions are determined. The final probabilities for the important states are determined by analytical calculation, with the higher values for the traditional version obtained for data processing in registers ("2": write/read access, 0.353; "3": personal data updating, 0.212). The investigation of the cloud-based version shows that the probability of state "2" increases. A discussion of the obtained assessment, based on a graphical presentation of the analytical results, shows the differences between the final probabilities for the states in the two versions of personal data processing.
- Published
- 2023
- Full Text
- View/download PDF
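The final state probabilities this abstract refers to are the stationary distribution of a Markov chain. A minimal numerical sketch of that idea follows; the 4-state transition matrix and the state labels in the comments are invented for illustration and do not reproduce the paper's analytical models or its values.

```python
# Approximate the stationary distribution of a discrete Markov chain by
# repeatedly applying the transition matrix P (power iteration).
def stationary(P, iters=500):
    n = len(P)
    pi = [1.0 / n] * n                       # start from a uniform guess
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical 4-state chain (rows sum to 1); labels are invented.
P = [[0.1, 0.6, 0.2, 0.1],    # "1": request handling
     [0.3, 0.3, 0.3, 0.1],    # "2": write/read access
     [0.2, 0.5, 0.2, 0.1],    # "3": data updating
     [0.4, 0.3, 0.2, 0.1]]    # "4": completion
pi = stationary(P)
```

Comparing two such distributions, one per system variant, is the kind of contrast the paper draws between traditional and cloud-based processing.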
10. K-harmonic Mean-Based Approach for Testing the Aspect-Oriented Systems
- Author
-
Vats, Richa, Kumar, Arvind, Kacprzyk, Janusz, Series Editor, Pal, Nikhil R., Advisory Editor, Bello Perez, Rafael, Advisory Editor, Corchado, Emilio S., Advisory Editor, Hagras, Hani, Advisory Editor, Kóczy, László T., Advisory Editor, Kreinovich, Vladik, Advisory Editor, Lin, Chin-Teng, Advisory Editor, Lu, Jie, Advisory Editor, Melin, Patricia, Advisory Editor, Nedjah, Nadia, Advisory Editor, Nguyen, Ngoc Thanh, Advisory Editor, Wang, Jun, Advisory Editor, Singh, Vijendra, editor, Asari, Vijayan K., editor, Kumar, Sanjay, editor, and Patel, R. B., editor
- Published
- 2021
- Full Text
- View/download PDF
11. Perkembangan Riset Desain Sistem Informasi Menggunakan Pendekatan Terstruktur : Sistematic Literature Review
- Author
-
Yerik Afrianto Singgalen
- Subjects
data flow diagram, entity relationship diagram, systematic literature review, Mathematics, QA1-939, Electronic computers. Computer science, QA75.5-76.95 - Abstract
This article analyzes the dynamics of information system design using a structured model, based on the year of publication, the scope of application of the information systems, and the platform designed for (website, desktop, mobile). The method adopted is a literature study using a Systematic Literature Review (SLR) approach, with Nvivo 12 Plus as the supporting instrument. The articles evaluated are limited to journals indexed by Garuda (garuda.ristekbrin.go.id) from 2010 to 2021. The results of the evaluation of structured-model-based information system design are classified by three platform characteristics: website-based, desktop, and mobile information systems. The classification by case study consists of five institutional characteristics: government institutions, educational institutions, health institutions, profit institutions, and non-profit organizations. The tools evaluated are limited to Data Flow Diagrams (DFD) and Entity Relationship Diagrams (ERD). Based on the results of the literature study, the design of information systems using a structured approach with Data Flow Diagrams (DFD) increased in 2018 and 2019, while database design using structured-approach Entity Relationship Diagrams (ERD) increased in 2019. The most studied scopes of application, by institutional characteristics, are profit institutions, educational institutions, and government agencies.
- Published
- 2021
- Full Text
- View/download PDF
12. Systematics: Classification and Grouping
- Author
-
Banning, Edward B. and Eerkens, Jelmer, Series Editor
- Published
- 2020
- Full Text
- View/download PDF
13. Data-Driven Threat Analysis for Ensuring Security in Cloud Enabled Systems.
- Author
-
Alwaheidi, Mohammed K. S. and Islam, Shareeful
- Subjects
INDUSTRIAL management, CLOUD computing, SURFACE analysis, ELECTRONIC data processing, INFRASTRUCTURE (Economics), CLOUD storage - Abstract
Cloud computing offers many benefits, including business flexibility, scalability, and cost savings, but despite these benefits there exist threats that require adequate attention for secure service delivery. Threats in a cloud-based system need to be considered from a holistic perspective that accounts for data, application, infrastructure, and service, each of which can pose potential risks. Data plays a critical role within the whole ecosystem, and organisations should take account of and protect data from any potential threats. Due to the variation of data types, status, and location, understanding the potential security concerns in cloud-based infrastructures is more complex than in a traditional system. Existing threat modeling approaches lack the ability to analyse and prioritise data-related threats. The main contribution of the paper is a novel data-driven threat analysis (d-TM) approach for cloud-based systems. The main motivation of d-TM is the integration of data across three levels of abstraction (management, control, and business) and three phases within each level (storage, process, and transmittance). The d-TM provides a systematic flow of attack surface analysis from the user agent to the cloud service provider based on the threat layers in cloud computing. Finally, a cloud-based use case scenario was used to demonstrate the applicability of the proposed approach. The result shows that d-TM revealed four critical threats out of the seven identified for the assets, targeting management and business data in general, and data in process and in transit more specifically.
- Published
- 2022
- Full Text
- View/download PDF
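The level-by-phase abstraction grid described in this abstract can be pictured as a 3x3 inventory in which each cell collects the threats mapped to one (level, phase) combination. The sketch below is a hypothetical data structure for that idea; the two threat names are invented examples, not the paper's findings.

```python
from itertools import product

# Hypothetical sketch of a d-TM-style grid: three data levels crossed
# with three data phases, each cell holding the threats mapped to it.
LEVELS = ("management", "control", "business")
PHASES = ("storage", "process", "transmittance")

grid = {cell: [] for cell in product(LEVELS, PHASES)}

def map_threat(threat, level, phase):
    """Record an identified threat against one (level, phase) cell."""
    grid[(level, phase)].append(threat)

# invented example threats, purely for illustration
map_threat("credential theft", "management", "transmittance")
map_threat("tampered job queue", "business", "process")
```

Enumerating threats cell by cell is what lets a data-driven analysis prioritise, e.g., business data in process and in transit, as the use case above reports.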
14. A V-Model Software Development Application for Sustainable and Smart Campus Analytics Domain
- Author
-
Kadir Hızıroğlu, Semih Bitim, and Onur Doğan
- Subjects
real-time analytics, smart campus, software development methodology, data flow diagram, context diagram, Electronic computers. Computer science, QA75.5-76.95, Information technology, T58.5-58.64 - Abstract
As small cities, university campuses contain many opportunities for smart city applications to increase service quality and the efficient use of public resources. Enabling technologies for Industry 4.0 play an important role in the goal of building a smart campus. An earlier work by the authors proposed a framework for the development of smart campus applications as part of the digital transformation process of İzmir Bakırçay University, a newly established university in Turkey. This study addresses the final part of that framework. It aims to systematically develop software for a sustainable and smart campus. The V-model software development methodology was followed and applied to the corresponding stage, which mainly includes real-time analytics, monitoring, reporting, and performance management. The data flow diagrams were presented at three levels: a context diagram for the basic form of the system, a parent diagram for the detailed software modules, and a child diagram for a selected module. This study can guide subsequent research on creating a smart campus framework and real-time analytics software.
- Published
- 2021
- Full Text
- View/download PDF
15. Ripple Effect Analysis Method of Data Flow Diagrams in Modifying Data Flow Requirements
- Author
-
Heayyoung, Jo, Omori, Takayuki, Ohnishi, Atsushi, Howlett, Robert James, Series Editor, Jain, Lakhmi C., Series Editor, Virvou, Maria, editor, Kumeno, Fumihiro, editor, and Oikonomou, Konstantinos, editor
- Published
- 2019
- Full Text
- View/download PDF
16. Design of Information Systems for Research Permit Application with Agile Method and Website Based Laravel Framework
- Author
-
Augie David Manuputty, Steven Hendrawan, and Budi Haryanto
- Subjects
laravel, waterfall, unified modelling language, data flow diagram, white box testing, framework, Mathematics, QA1-939, Electronic computers. Computer science, QA75.5-76.95 - Abstract
The present era is a digital one; nearly everyone uses technology for their daily activities. Within the Salatiga City Government, technology is needed to support the work performed in managing data on practical work (internship) applications, data collection, and surveys. Therefore, during an internship at the Office of Communication and Informatics (Diskominfo) of the Salatiga City Government, the researchers were tasked with building a system to support the work of employees of the National Unity and Politics Agency (Bakesbangpol) of the Salatiga City Government in managing application data and producing reports. The researchers used the Agile development method and the Laravel framework to build the website. User requirements were modeled with the Unified Modelling Language (UML). The application was tested using White Box Testing.
- Published
- 2020
- Full Text
- View/download PDF
18. Enrichment of UML by Introjection of Functional Model
- Author
-
Handigund, Shivanand M., Arunakumari, B. N., Chikkamannur, Ajeet, Howlett, Robert James, Series Editor, Jain, Lakhmi C., Series Editor, Satapathy, Suresh Chandra, editor, Bhateja, Vikrant, editor, and Das, Swagatam, editor
- Published
- 2018
- Full Text
- View/download PDF
19. Project Management Model with Designed Data Flow Diagram: The Case of ICT Hybrid Learning of Elderly People in the Czech Republic
- Author
-
Svobodová, Libuše, Černá, Miloslava, Hutchison, David, Series Editor, Kanade, Takeo, Series Editor, Kittler, Josef, Series Editor, Kleinberg, Jon M., Series Editor, Mattern, Friedemann, Series Editor, Mitchell, John C., Series Editor, Naor, Moni, Series Editor, Pandu Rangan, C., Series Editor, Steffen, Bernhard, Series Editor, Terzopoulos, Demetri, Series Editor, Tygar, Doug, Series Editor, Weikum, Gerhard, Series Editor, Nguyen, Ngoc Thanh, editor, Pimenidis, Elias, editor, Khan, Zaheer, editor, and Trawiński, Bogdan, editor
- Published
- 2018
- Full Text
- View/download PDF
20. eReserba Cardinal: An Integrated Room Reservation System for Higher Education Institutions.
- Author
-
Barzaga, Paz Clariz A., German, Josephine D., Binoya, Guiller O., Bucao, Samantha Dominique C., Ibe, Samantha Cyrine R., and Yap, Dave Cullen G.
- Subjects
INFORMATION technology, FLOW charts, CHARTS, diagrams, etc., UNIVERSITIES & colleges - Abstract
The advancement of technology has led most organizations today to utilize integrated information systems. Such systems have been shown to improve different types of processes by eliminating delays and minimizing errors. This study was conducted to optimize the scheduling of room reservations at a higher education institution in the Philippines, a process found to be highly manual, time-consuming, and tedious. Different units were responsible for managing various rooms, and their offices were scattered around the campus, which required excessive traveling. Similarly, delays were experienced due to the limited availability of the authorized personnel who approve the usage of rooms. Through information system design, an online system called eReserba Cardinal was created to facilitate room management, provide real-time information, and offer a convenient means of room reservation for students, faculty, and personnel.
- Published
- 2020
22. Systems Design
- Author
-
Sajja, Priti Srinivas
- Published
- 2017
- Full Text
- View/download PDF
23. An Intelligence-Defined Networking Architecture With Importance-Based Network Resource Control
- Author
-
Won-Tae Kim, Seongjin Yun, Hanjin Kim, Hyeong-su Kim, and Deun-Sol Cho
- Subjects
Network complexity, Computer Networks and Communications, Computer science, Distributed computing, Throughput, Computer Science Applications, Data flow diagram, Resource (project management), Network interface controller, Hardware and Architecture, Control theory, Signal Processing, Resource allocation, The Internet, Information Systems - Abstract
As network complexity increases because of the diversification of applications and services based on the internet of autonomous things, it is difficult for humans to design optimal control rules for software-defined networking controllers. Intelligence-defined networking (IDN) is proposed to overcome this limitation through machine learning algorithms. Since existing IDN approaches are mostly designed to optimize only network quality of service, including throughput, jitter, and latency, the controllers do not consider the importance of data to the applications and services. This causes the controller to allocate insufficient resources to crucial data flows, which leads to critical problems such as self-driving car accidents. To prevent this, we propose an importance-based IDN architecture that enables network controllers to manage network traffic according to the importance levels of data flows. First, we devise an importance estimation scheme to set the importance level of each flow. Second, a dynamic resource allocation model for the controllers is developed by means of deep learning algorithms in order to make optimal network resource allocations. Additionally, an online learning mechanism based on weighted auto-labeling is adopted to keep enhancing the adaptability of the resource allocation model at runtime as network conditions change. The evaluation results of the proposed architecture under various autonomous-things scenarios show that the loss rate for data flows of higher importance is reduced by one-quarter compared to a network controller without importance levels, and that the bandwidth waste ratio is reduced by ten percent compared to the rule-based model.
- Published
- 2023
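The core intuition of the abstract above, that more important flows should receive a larger share of the link, can be shown with a proportional toy heuristic. This is not the authors' deep-learning allocation model; the flow names, weights, and capacity below are invented for illustration.

```python
# Toy importance-weighted bandwidth split (invented values; the paper
# itself learns the allocation with deep learning, not this heuristic).
def allocate(flows, capacity):
    """flows: {flow name: importance weight}; returns bandwidth per flow."""
    total = sum(flows.values())
    return {name: capacity * w / total for name, w in flows.items()}

shares = allocate({"self-driving telemetry": 5,   # hypothetical flows
                   "firmware update": 2,
                   "log upload": 1},
                  capacity=80.0)                  # hypothetical link (Mbps)
```

Under this split the highest-importance flow gets 50 of the 80 units, which is the qualitative behavior the importance-based controller is designed to guarantee even under congestion.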
24. Systematic Synthesis of Multiple-Input and Multiple-Output DC–DC Converters for Nonisolated Applications
- Author
-
Wei Liu, Dachu Dong, Hao Zhang, Han Ren, and Feng Zheng
- Subjects
Optimal design ,Data flow diagram ,Buck converter ,Computer science ,Interfacing ,Energy Engineering and Power Technology ,Topology (electrical circuits) ,Electrical and Electronic Engineering ,Converters ,Inductor ,Topology ,Network topology - Abstract
Multiple-input converters (MICs) and multiple-output converters (MOCs) are attractive solutions for interfacing various voltage levels. In order to reveal the intrinsic relationships among the diverse topologies and provide as many viable topologies as possible for practical applications, this paper analyzes the topology construction principles and proposes a systematic approach to derive MICs and MOCs. To begin with, the general principle of topology derivation is analyzed according to circuit network theory. Inspired by the idea of designing controllable inductor power-flow loops (IPFLs), five construction types are proposed to create a multiple power-flow network (MPN) in MICs and MOCs. Then, four basic switching cells for topology synthesis are proposed, and a flow diagram for the optimal design procedure is provided to guide the topology derivation. As one example, a family of viable and optimized MICs and MOCs with various characteristics is derived from the typical Buck converter. Based on the analysis of the derived converters in continuous conduction mode (CCM) and discontinuous conduction mode (DCM), the number of magnetic components is not increased at all, so they are promising candidates for applications requiring compact size and high integration. In addition, topology comparison and selection among a family of MICs is conducted in view of practical specifications. Finally, one derived dual-input converter is analyzed in detail and experimentally verified to demonstrate the theoretical results.
- Published
- 2022
25. Bibliography
- Author
-
Koelsch, George
- Published
- 2016
- Full Text
- View/download PDF
26. RDBMS Design and Implementation Tools
- Author
-
Lakhe, Bhushan
- Published
- 2016
- Full Text
- View/download PDF
27. SISTEM ANALISIS PENYAKIT MATA BERBASIS PHP DAN MYSQL MENGGUNAKAN METODE TF-IDF
- Author
-
Sobari, Dicky Iskandar
- Abstract
The development of information technology has had a positive impact on the healthcare sector, particularly in disease analysis. Eye diseases are among the health issues that require early detection for effective prevention and treatment. In this context, this research proposes the development of a PHP and MySQL-based Eye Disease Analysis System utilizing the Term Frequency-Inverse Document Frequency (TF-IDF) method. The system is designed to provide a web platform facilitating the analysis of textual data related to eye diseases. Involving PHP as the programming language and MySQL as the database, the system allows efficient storage, management, and retrieval of medical information. The TF-IDF method is employed to analyze unique characteristics within textual data, enabling the identification of symptoms, diagnosis, and treatment of eye diseases. Context Diagram, Data Flow Diagram (DFD), and Entity-Relationship Diagram (ERD) serve as visual guides in designing this system. The Context Diagram provides a general overview of the system's interaction with its external environment, while DFD illustrates the flow of data within the system. ERD is used to represent the data structure within the database. The research results are expected to significantly contribute to the efficiency of eye disease analysis, providing faster information to medical professionals and enhancing the general public's understanding of eye diseases. This system is anticipated to serve as a foundation for further developments in the application of information technology in the healthcare sector, particularly in disease analysis.
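The abstract names the TF-IDF weighting but gives no formulas. A minimal sketch of the weighting in Python (the system itself is written in PHP; the symptom documents below are invented placeholders, not the system's medical corpus):

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenized documents.

    tf(t, d) = count of t in d / number of tokens in d
    idf(t)   = log(N / number of documents containing t)
    """
    n = len(docs)
    df = Counter()                       # document frequency per term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf, total = Counter(doc), len(doc)
        weights.append({t: (c / total) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

# Hypothetical symptom descriptions standing in for the system's texts.
docs = [["red", "itchy", "eye"],
        ["blurry", "vision", "eye"],
        ["red", "eye", "pain"]]
w = tf_idf(docs)
# "eye" appears in every document, so its idf -- and its weight -- is 0;
# rarer terms such as "itchy" score higher than common ones such as "red".
```

Terms shared by all documents are down-weighted to zero, which is what lets the method surface the symptom words that distinguish one disease description from another.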
- Published
- 2023
28. SISTEM INFORMASI PELAYANAN PENDAFTARAN PASIEN RAWAT INAP DI PUSKESMAS MENGGUNAKAN PEMOGRAMAN PHP
- Author
-
Haq, Haris Nizhomul, Muhajir, Akrom, and Sobari, Dicky Iskandar
- Abstract
This research discusses the development of an Inpatient Patient Registration Service Information System at Community Health Centers (Puskesmas) using PHP programming, as well as implementing DFD (Data Flow Diagram) and ERD (Entity Relationship Diagram) models. The system is designed to enhance efficiency and accuracy in the inpatient patient registration process, covering patient information management, bed availability scheduling, and coordination among medical staff. The DFD model diagram is utilized to illustrate the data flow within the system, while the ERD is employed to detail the relationships between entities in the database. The implementation of this system employs PHP programming to ensure code reliability and readability. The research results indicate that the system can expedite the patient registration process, improve information management, and optimize the utilization of inpatient beds. This study is expected to make a positive contribution to improving healthcare services at Community Health Centers and provide a foundation for the development of similar systems in a broader healthcare service environment. Consequently, the research findings are anticipated to offer significant benefits in enhancing the efficiency and effectiveness of inpatient patient registration services.
- Published
- 2023
29. Security in vehicle networks of connected cars
- Author
-
Happel, Armin, Ebert, Christof, Bargende, Michael, editor, Reuss, Hans-Christian, editor, and Wiedemann, Jochen, editor
- Published
- 2015
- Full Text
- View/download PDF
30. Information Systems Development Methodologies
- Author
-
Isaias, Pedro and Issa, Tomayess
- Published
- 2015
- Full Text
- View/download PDF
31. Purwarupa Aplikasi Android Jual Beli Ikan Hias Online
- Author
-
Yaman Khaeruzzaman and Firman Arrosyid
- Subjects
Blackbox Testing ,Database ,Entity Relationship Diagram ,MySQL ,PHP ,Java ,Data Flow Diagram - Abstract
Online buying and selling transactions are increasing, especially for inanimate goods, because shipping logistics for such items face no special delivery-policy constraints. Trading living creatures such as ornamental fish is different: it requires special logistics handling, such as packing the fish in leak-proof bags of oxygenated water, and short delivery times to keep the fish alive and healthy. Other problems include the difficulty of understanding sellers' location descriptions, long shipping times, fish that do not match their pictures and descriptions, prices that are not transparent or listed, and sellers or buyers who are hard to contact. The need for an Android application for buying and selling ornamental fish online therefore arose in the Mega Mendung Betta Cirebon ornamental fish community, with the goal of building a system that connects sellers and buyers and facilitates the buying and selling process, for example by providing more accurate seller and buyer locations and estimated delivery times from seller to buyer, and by helping promote ornamental fish products more widely. This research uses a descriptive analysis method and the prototype system development method. The data sources are primary and secondary data obtained through observation, interviews, and literature study. The application is implemented in the PHP and Java programming languages, with MySQL as the database engine. The system design is modeled using Data Flow Diagrams (DFD) and Entity Relationship Diagrams (ERD). The prototype of the Android application for buying and selling ornamental fish online was successfully built and tested using black-box testing, with 23 of 24 test items (95.83%) passing as expected.
- Published
- 2023
- Full Text
- View/download PDF
32. A Method of Data Flow Diagram Drawing Based on Word Segmentation Technique
- Author
-
Yuwen, Shuli, Wang, Kaifei, Li, Shaozi, editor, Jin, Qun, editor, Jiang, Xiaohong, editor, and Park, James J. (Jong Hyuk), editor
- Published
- 2014
- Full Text
- View/download PDF
33. An Evaluation of Argument Patterns Based on Data Flow
- Author
-
Yamamoto, Shuichiro, Linawati, editor, Mahendra, Made Sudiana, editor, Neuhold, Erich J., editor, Tjoa, A Min, editor, and You, Ilsun, editor
- Published
- 2014
- Full Text
- View/download PDF
34. Some Remarks on the Argument of Probability
- Author
-
Rocchi, Paolo
- Published
- 2014
- Full Text
- View/download PDF
35. Balanced Scorecard Framework for Knowledge Management Solution Implementation
- Author
-
Ganesh, K., Mohapatra, Sanjay, and Nagarajan, S.
- Published
- 2014
- Full Text
- View/download PDF
36. DFI: The Data Flow Interface for High-Speed Networks
- Author
-
Jan Skrzypczak, Carsten Binnig, Lasse Thostrup, Matthias Jasny, and Tobias Ziegler
- Subjects
Data processing ,Remote direct memory access ,business.industry ,Computer science ,Data management ,Distributed computing ,Interface (computing) ,02 engineering and technology ,Network interface ,Data processing system ,Data flow diagram ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,business ,Software ,Abstraction (linguistics) ,Information Systems - Abstract
In this paper, we propose the Data Flow Interface (DFI) as a way to make it easier for data processing systems to exploit high-speed networks without the need to deal with the complexity of RDMA. By lifting the level of abstraction, DFI factors out much of the complexity of network communication and makes it easier for developers to declaratively express how data should be efficiently routed to accomplish a given distributed data processing task. As we show in our experiments, DFI is able to support a wide variety of data-centric applications with high performance at a low complexity for the applications.
- Published
- 2022
37. Topology Derivation of Multiple-Port DC–DC Converters Based on Voltage-Type Ports
- Author
-
Min Jing, Feng Zheng, Dachu Dong, Wei Liu, and Hao Zhang
- Subjects
Optimal design ,Data flow diagram ,Control and Systems Engineering ,Buck converter ,Computer science ,Topology optimization ,Topology (electrical circuits) ,Electrical and Electronic Engineering ,Converters ,Topology ,Network topology ,Voltage - Abstract
Multiple-port DC-DC converters come in many varieties with a large number of circuit topologies. This paper aims to reveal the intrinsic relationships among the topologies of multiple-port DC-DC converters and to propose a topology derivation method. Firstly, voltage- and current-type ports are identified in basic DC-DC converters, and an approach for converting current-type ports into voltage-type ports is discussed. Then, according to Kirchhoff's laws, four types of multiple-input multiple-output converters are presented: input-port-series output-port-series (IPSOPS), input-port-series output-port-parallel (IPSOPP), input-port-parallel output-port-series (IPPOPS), and input-port-parallel output-port-parallel (IPPOPP). Secondly, the topology derivation method for multiple-port bidirectional DC-DC converters based on voltage-type ports is studied in terms of power flow paths in various operation modes, and a topology optimization method is proposed. In particular, a flow diagram of the optimal design procedure is given to guide topology derivation for specific requirements. Based on the proposed approach, a family of multiple-port DC-DC converters can be derived, providing many viable candidates for practical engineering. Furthermore, one derived converter, the parallel-type three-port bidirectional Buck converter (Buck-TPC), is analyzed in three operation modes to demonstrate the topology derivation. Finally, the theoretical analysis is verified by experimental results.
- Published
- 2022
38. IMPLEMENTATION OF HIGH FIDELITY PROTOTYPING IN ONLINE FIELD PLACEMENT PROGRAM SYSTEM DESIGN
- Author
-
Goldie Gunadi and Agus Budiyantara
- Subjects
Process (engineering) ,Computer science ,business.industry ,media_common.quotation_subject ,Fidelity ,Field (computer science) ,Data flow diagram ,Business Process Model and Notation ,Unified Modeling Language ,Systems design ,General Agricultural and Biological Sciences ,Software engineering ,business ,Root cause analysis ,computer ,media_common ,computer.programming_language - Abstract
Service activities carried out by the Academic and Student Administration Bureau (BAAK) of STMIK Widuri for students who will carry out the field placement program (KKP) are currently still performed conventionally. As a result, producing KKP cover letters and assigning supervisors takes a long time and is inefficient. To overcome these problems, a web-based application system is needed to simplify and speed up student KKP administration services. In this study, the current system was observed and modeled using Business Process Modeling Notation (BPMN) and Data Flow Diagrams (DFD). The problem analysis process uses the Root Cause Analysis (RCA) method. The system design is expressed with UML diagrams, namely Use Case, Activity, and Deployment Diagrams. The prototype was built with the high-fidelity concept using the PHP programming language, HTML 5, CSS, the jQuery, Bootstrap, and DataTables frameworks, and the Chart.js library. This research produced a prototype that describes the overall functionality of the Online MPA system, is interactive, and has an attractive interface.
- Published
- 2022
39. A Learning-Based Data Placement Framework for Low Latency in Data Center Networks
- Author
-
Zhiwu Huang, Kaiyang Liu, Jianping Pan, Jun Peng, Zhuofan Liao, Boyang Yu, and Jingrong Wang
- Subjects
Dynamic network analysis ,Artificial neural network ,Computer Networks and Communications ,Computer science ,Distributed computing ,020206 networking & telecommunications ,02 engineering and technology ,Computer Science Applications ,Data flow diagram ,Hardware and Architecture ,Asynchronous communication ,020204 information systems ,Distributed data store ,0202 electrical engineering, electronic engineering, information engineering ,Reinforcement learning ,Data as a service ,Latency (engineering) ,Software ,Information Systems - Abstract
Low-latency data service is an increasingly critical challenge for data center applications. In modern distributed storage systems, proper data placement helps reduce data movement delay, which can contribute substantially to reducing service latency. Existing data placement solutions often assume a prior distribution of data requests or discover it via trace analysis. However, data placement is a difficult online decision-making problem in the face of dynamic network conditions and time-varying user request patterns, and conventional static model-based solutions handle such dynamic systems poorly. With an overall consideration of data movement and analytical latency, we develop a reinforcement learning-based framework, DataBot+, that automatically learns optimal placement policies. DataBot+ adopts neural networks, trained with a variant of Q-learning, whose input is real-time data flow measurements and whose output is a value function estimating near-future latency. For instantaneous decision making, DataBot+ is decoupled into two asynchronous production and training components, ensuring that training delay does not introduce extra overhead into handling the data flows. Evaluation results driven by real-world traces demonstrate the effectiveness of our design.
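The core estimator the abstract describes is a Q-learned value function over placement actions. A toy sketch of the underlying Q-update in its simplest stateless (bandit) form, with invented node latencies (DataBot+ itself uses neural networks and asynchronous training, which this does not attempt to reproduce):

```python
import random

# Toy data placement: choose one of 3 storage nodes for each request;
# the reward is negative observed latency (all numbers hypothetical).
NODES = 3
ALPHA, EPS = 0.5, 0.1                       # learning rate, exploration rate
latency = {0: 30.0, 1: 5.0, 2: 20.0}        # node 1 is closest to clients

Q = [0.0] * NODES                           # estimated value per placement
random.seed(0)
for _ in range(500):
    # epsilon-greedy: mostly exploit the best-known node, sometimes explore
    if random.random() < EPS:
        a = random.randrange(NODES)
    else:
        a = max(range(NODES), key=Q.__getitem__)
    reward = -latency[a]
    # Stateless Q-update; full Q-learning adds a discounted next-state term.
    Q[a] += ALPHA * (reward - Q[a])

best = max(range(NODES), key=Q.__getitem__)  # converges to the low-latency node
```

After a few hundred requests the value estimates approach the true negative latencies, so the greedy policy settles on the node that minimizes data movement delay.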
- Published
- 2022
40. RMLIM: A Runtime Machine Learning Based Identification Model for Approximate Computing on Data Flow Graphs
- Author
-
Yanxin Liu, C.G. Wang, Gang Qu, Ye Wang, and Jian Dong
- Subjects
Control and Optimization ,Renewable Energy, Sustainability and the Environment ,business.industry ,Computer science ,Computation ,Process (computing) ,Machine learning ,computer.software_genre ,Data flow diagram ,Embedded software ,Software ,Computational Theory and Mathematics ,Hardware and Architecture ,Overhead (computing) ,Artificial intelligence ,business ,Representation (mathematics) ,computer ,Data-flow analysis - Abstract
Approximate computing (AC) is an effective energy-efficient method for applications that have intrinsic error resilience. Early research efforts select a noncritical portion of the computation, i.e., operations that have little impact on the accuracy of the results, for approximation. They ignore runtime information and result in either under-approximation, which fails to reach the full potential of energy saving, or over-approximation, which causes unacceptable errors in the computation. A recently proposed runtime approach first estimates the noncritical portion for given input values and then performs accurate computation only on the critical portion. However, its complicated estimation process brings large runtime overhead and may not be suitable for real-time embedded software. In this paper, we solve this problem by proposing a Runtime Machine Learning based Identification Model (RMLIM) to locate the noncritical portion in the data flow graph representation of a software program. RMLIM is trained offline on a carefully generated training data set and then applied at runtime for each input, reducing the runtime complexity of identifying noncritical parts. Our experiments show that, compared with the existing runtime AC method, our machine learning based approach maintains similar energy efficiency and computation accuracy while reducing execution time by 40%-60%.
- Published
- 2022
41. Data integrity verification using HDFS framework in data flow material environment using cloud computing
- Author
-
P. Senthil Kumari and Nurul Aisha Binti Aboo Bucker
- Subjects
Data flow diagram ,business.industry ,Computer science ,Data integrity ,Distributed computing ,Server ,Big data ,Checksum ,Data analysis ,Data Corruption ,Cloud computing ,business - Abstract
A distributed computing platform is preferred to meet the current challenges of the cloud computing paradigm. Hadoop manages the distribution of large volumes of big data, which is stored using cluster analysis. The Hadoop Distributed File System (HDFS) is used to manage the problems of handling big data and can easily store terabytes of data on commodity servers. The proposed system provides an efficient data integrity verification service for big data management based on HDFS. HDFS has a data integrity commitment, since it handles large volumes of data: the Hadoop HDFS framework uses CRC-32C checksum verification on data-node content blocks to identify data corruption. A data-aware module implemented in Hadoop provides further clustering and reduces the computing load on the server. A broad range of data types is analyzed and clustered. The proposed research work uses balanced and proxy encryption techniques with the Cloud me tool and yields optimized query time and resource usage.
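The verify-on-read pattern the abstract attributes to HDFS can be sketched as follows. This is illustrative only: Python's standard zlib exposes plain CRC-32 rather than the CRC-32C polynomial Hadoop uses, and the 512-byte chunk size mirrors Hadoop's default bytes-per-checksum setting, not its API.

```python
import zlib

BLOCK = 512  # bytes per checksum; 512 is Hadoop's default chunk size

def checksums(data):
    """Checksum each fixed-size chunk of a block, as HDFS does on write."""
    return [zlib.crc32(data[i:i + BLOCK]) for i in range(0, len(data), BLOCK)]

def verify(data, stored):
    """Recompute checksums on read; return indices of corrupted chunks."""
    return [i for i, c in enumerate(checksums(data)) if c != stored[i]]

data = bytes(range(256)) * 8           # 2048 bytes = 4 chunks
stored = checksums(data)               # written alongside the block

corrupted = bytearray(data)
corrupted[600] ^= 0xFF                 # flip one bit inside chunk 1
assert verify(bytes(corrupted), stored) == [1]
```

Because checksums are kept per chunk rather than per file, a mismatch pinpoints which chunk is corrupt, so only that chunk needs to be re-fetched from another replica.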
- Published
- 2022
42. Building of online evaluation system based on socket protocol
- Author
-
Haijian Chen, Hai Sun, Peng Jiang, and Kexin Yan
- Subjects
General Computer Science ,business.industry ,Process (engineering) ,Computer science ,media_common.quotation_subject ,Port (computer networking) ,Data flow diagram ,Scalability ,Asynchronous I/O ,Software engineering ,business ,Function (engineering) ,Implementation ,Protocol (object-oriented programming) ,media_common - Abstract
As an important part of evaluation reform, an online evaluation system can effectively improve the efficiency of evaluation work, which has drawn the attention of teaching institutions. An online evaluation system needs to support safe and stable transmission of information between the client and the server; the socket protocol establishes the connection through a listening port, which makes message transmission and process control straightforward. Because it meets the construction requirements of an online evaluation system well, it is applied in our study. Building the online evaluation system based on the socket protocol includes the function design for students and teachers, data flow design, evaluation difficulty grading design, and system implementation. The system is developed in the Java language with the MVC pattern, giving it good scalability and platform independence. It realizes a paperless examination process and greatly reduces the workload of teachers. The contribution of this paper is mainly reflected in two aspects. First, it explores the construction of an online evaluation system based on the socket protocol and provides an asynchronous I/O technical solution for network communication between the student client and the server, offering a reference for the development of similar systems. Second, it gives a method for grading the difficulty of the evaluation and classifies the difficulty of the test questions, laying the foundation for personalized testing and evaluation.
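The listening-port pattern the abstract describes can be sketched as follows (the actual system is written in Java; the message strings here are placeholders, not the system's protocol):

```python
import socket
import threading

def serve(sock):
    """Accept one connection, read a submission, send an acknowledgement."""
    conn, _ = sock.accept()
    with conn:
        answer = conn.recv(1024)            # receive the student's answer
        conn.sendall(b"graded:" + answer)   # reply over the same connection

# Server side: bind to a listening port and wait for clients.
server = socket.socket()
server.bind(("127.0.0.1", 0))               # port 0 = let the OS pick one
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve, args=(server,), daemon=True).start()

# Client side: connect to the listening port and exchange one message.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"answer=42")
reply = client.recv(1024)                   # b"graded:answer=42"
client.close()
```

The same accept/recv/send loop, run per connection, is the backbone on which the system's asynchronous I/O and process control are layered.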
- Published
- 2022
43. Sistem Informasi Berbasis Web Pada Kantor Urusan Agama Di Tamasari Kabupaten Bogor
- Author
-
Anton Sukamto, Ade Mulyana, Suci Sri Utami Sutjipto, and M. Alvy Eka Fauzi
- Subjects
Data flow diagram ,World Wide Web ,Data processing ,Data collection ,Process (engineering) ,Computer science ,Information system ,Line (text file) ,Database application ,Witness - Abstract
In this fast-paced era, precise, fast, and accurate information is a necessity. Effectiveness and efficiency should be pursued according to need so that information remains relevant to its users. Thus, a registration information system that has been manual in agencies or organizations can be moved online with accurate and timely data. The same holds for the Tamansari Religious Affairs Office in Bogor Regency. The current marriage registration system at the Tamansari Office of Religious Affairs, Bogor Regency, is no longer in line with current developments: it does not use computers as data processing tools, and the results are still poorly structured, especially for processing the data of prospective brides and grooms. The system processes registration data, bridal data, guardian data, witness data, admin data, and other data. The problem at present is that data processing is slow, and producing a report requires several rounds of data collection. To fulfill this final project, we explain how to design and build a web-based marriage registration information system. The tools used in designing this system are MySQL, Entity Relationship Diagrams, and Data Flow Diagrams, while data collection uses observation, interviews, and literature study. The database application is developed with MySQL, and the programming language is PHP. This application automatically yields the benefits of processing data on brides, guardians, witnesses, and so on. Reports also help by showing what is needed to process data quickly and easily. Keywords: Information System, PHP My Admin, MySQL, Website, Entity Relationship Diagram, and Data Flow Diagram
- Published
- 2021
44. Global Green Production by Integration of Automated Decision Layers
- Author
-
Hosseini, Reza, Helo, Petri, and Azevedo, Américo, editor
- Published
- 2013
- Full Text
- View/download PDF
45. Steps That Lead to the Diagnosis of Thyroid Cancer: Application of Data Flow Diagram
- Author
-
Paschali, Kallirroi, Tsakona, Anna, Tsolis, Dimitrios, Skapetis, Georgios, Iliadis, Lazaros, editor, Maglogiannis, Ilias, editor, Papadopoulos, Harris, editor, Karatzas, Kostas, editor, and Sioutas, Spyros, editor
- Published
- 2012
- Full Text
- View/download PDF
46. A Heuristic Local-sensitive Program-Wide Diffing Method for IoT Binary Files
- Author
-
Hui Huang, Lu Yu, Pan Zulie, Yuliang Lu, and Yi Shen
- Subjects
Multidisciplinary ,Source code ,Computer science ,Heuristic (computer science) ,Firmware ,media_common.quotation_subject ,Distributed computing ,Code reuse ,Denial-of-service attack ,Dependence analysis ,computer.software_genre ,Data flow diagram ,Precision and recall ,computer ,media_common - Abstract
Code reuse brings vulnerabilities in third-party libraries to many Internet of Things (IoT) devices, opening them to attacks such as distributed denial of service. Program-wide binary diffing technology can help detect these vulnerabilities in IoT devices whose source code is not public. Considering that the architectures of IoT devices may vary, we propose a data-aware program-wide diffing method that works across architectures and optimization levels. We rely on defined anchor functions and call relationships to expand the comparison scope within the target file, reducing the impact of different architectures on the diffing result. To make the diffing result more accurate, we extract semantic features that represent the code by data-flow dependence analysis. Earth mover's distance is used to calculate the similarity of functions in two files based on these semantic features. We implemented a proof-of-concept, DAPDiff, and compared it with the baselines BinDiff, TurboDiff, and Asm2vec. Experiments showed the availability and effectiveness of our method across optimization levels and architectures. DAPDiff outperformed BinDiff in recall and precision by 41.4% and 9.2% on average when diffing between standard third-party libraries and real-world firmware files. This proves that DAPDiff is applicable to vulnerability detection in IoT devices.
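The paper scores function similarity with the earth mover's distance (EMD) over semantic features. For one-dimensional histograms EMD has a closed form: the sum of absolute differences of cumulative mass. A minimal sketch, with invented feature histograms (the data-flow feature extraction itself is not shown):

```python
def emd_1d(p, q):
    """EMD between two histograms over the same bins (each sums to 1)."""
    total, cum = 0.0, 0.0
    for pi, qi in zip(p, q):
        cum += pi - qi          # running surplus that must be moved rightward
        total += abs(cum)       # cost of carrying that surplus one bin over
    return total

# Hypothetical feature histograms for two binary functions (say, the
# distribution of operand kinds). Identical functions give distance 0,
# and smaller distances mean more similar functions.
f1 = [0.5, 0.3, 0.2]
f2 = [0.2, 0.3, 0.5]
assert emd_1d(f1, f1) == 0.0
score = emd_1d(f1, f2)          # 0.6: 0.3 of mass carried across two bins
```

Unlike a bin-by-bin comparison, EMD accounts for how far mass must move, so histograms that are merely shifted score as near neighbors rather than as completely different.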
- Published
- 2021
47. Toward Taming the Overhead Monster for Data-flow Integrity
- Author
-
Huang, Jiayi, Feng, Lang, Hu, Jiang, and Huang, Jeff
- Subjects
FOS: Computer and information sciences ,business.industry ,Computer science ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,Data flow diagram ,Software ,Embedded system ,Hardware Architecture (cs.AR) ,Overhead (computing) ,Electrical and Electronic Engineering ,Computer Science - Hardware Architecture ,business ,Range (computer programming) ,Monster - Abstract
Data-Flow Integrity (DFI) is a well-known approach to effectively detecting a wide range of software attacks. However, its real-world application has been quite limited so far because of the prohibitive performance overhead it incurs. Moreover, the overhead is enormously difficult to overcome without substantially lowering the DFI criterion. In this work, an analysis is performed to understand the main factors contributing to the overhead. Accordingly, a hardware-assisted parallel approach is proposed to tackle the overhead challenge. Simulations on the SPEC CPU 2006 benchmark show that the proposed approach can completely enforce the DFI defined in the original seminal work while reducing performance overhead by 4x, on average. (24 pages, 16 figures; published in ACM Transactions on Design Automation of Electronic Systems.)
- Published
- 2021
48. GLYFE: review and benchmark of personalized glucose predictive models in type 1 diabetes
- Author
-
Mounim A. El Yacoubi, Mehdi Ammi, and Maxime De Bois
- Subjects
Blood Glucose ,Source code ,Computer science ,media_common.quotation_subject ,Biomedical Engineering ,Machine learning ,computer.software_genre ,Field (computer science) ,Humans ,Time series ,media_common ,Artificial neural network ,business.industry ,Blood Glucose Self-Monitoring ,Reproducibility of Results ,Usability ,Benchmarking ,Computer Science Applications ,Data flow diagram ,Diabetes Mellitus, Type 1 ,Glucose ,Benchmark (computing) ,Artificial intelligence ,business ,computer - Abstract
Because diabetes-related data are sensitive, preventing them from being easily shared between studies, and because data processing pipelines differ widely between studies, progress in the field of glucose prediction is hard to assess. To address this issue, we introduce GLYFE (GLYcemia Forecasting Evaluation), a benchmark of machine learning-based glucose predictive models. We present the accuracy and clinical acceptability of nine different models from the literature, from standard autoregressive models to more complex neural network-based models. These results are obtained on two different datasets, the UVA/Padova Type 1 Diabetes Metabolic Simulator (T1DMS) and Ohio Type-1 Diabetes Mellitus (OhioT1DM), featuring artificial and real type 1 diabetic patients respectively. By providing extensive details about the data flow as well as the whole source code of the benchmarking process, we ensure the reproducibility of the results and the usability of the benchmark by the community. These results serve as a basis of comparison for future studies. In a field where data are hard to obtain, and where comparison of results from different studies is often irrelevant, GLYFE offers the opportunity to gather researchers around a standardized common environment.
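The simplest model family in the benchmark is the autoregressive predictor: forecast the next glucose value from recent history. A minimal AR(1) fit by ordinary least squares on a synthetic trace (real CGM data, as the abstract stresses, cannot be freely shared; the readings below are invented):

```python
def fit_ar1(series):
    """Fit y[t+1] = a*y[t] + b by ordinary least squares."""
    x, y = series[:-1], series[1:]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx           # slope and intercept

# Hypothetical CGM readings (mg/dL) sampled every 5 minutes.
glucose = [110, 114, 119, 123, 128, 132, 137, 141]
a, b = fit_ar1(glucose)
prediction = a * glucose[-1] + b    # forecast for the next 5-minute step
```

On this steadily rising trace the fitted slope is close to 1 and the intercept captures the per-step increment, so the one-step-ahead forecast continues the trend. The benchmark's more complex neural models target the same one-step (or multi-step) forecasting task.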
- Published
- 2021
49. DAQ readout prototype for JUNO
- Author
-
Kejun Zhu, Fei Li, Xiaolu Ji, and Tong Zhou
- Subjects
Ethernet ,Nuclear and High Energy Physics ,Photomultiplier ,Computer science ,business.industry ,Detector ,Gigabit Ethernet ,Data flow diagram ,Data acquisition ,Nuclear Energy and Engineering ,business ,Computer hardware ,Graphical user interface ,Jiangmen Underground Neutrino Observatory - Abstract
The Jiangmen Underground Neutrino Observatory (JUNO) is a multi-purpose underground neutrino experiment. About 18000 20-inch photomultiplier tubes (PMTs) are instrumented in the Central Detector to detect photons, and their signals are captured with a high-speed, high-resolution full-waveform sampling technique. In addition, about 25000 3-inch PMTs are instrumented to capture T/Q hit information. This work builds a DAQ readout prototype system for JUNO that can be used to test the full readout chain for the 20-inch and 3-inch PMTs. The system is designed to continuously read out the data flow from multiple electronics channels, check the raw data, and finally save them to disk. Meanwhile, a graphical user interface (GUI) provides a real-time display of the sampled waveforms. The system is developed with the open-source Qt platform. The design and performance of the system have been verified in detail, particularly through performance tests based on Gigabit Ethernet and 10-Gigabit Ethernet. The DAQ readout prototype has been successfully applied to an electronics testing system in Italy and a PMT testing system in Guangdong.
- Published
- 2021
50. ASSESSING THE IMPLICATIONS OF SCHREMS II FOR EU–US DATA FLOW
- Author
-
Maria Helen Murphy
- Subjects
Data flow diagram ,Political science ,Political Science and International Relations ,Econometrics ,Law - Abstract
With the constant flow of data across jurisdictions, issues regarding conflicting laws and the protection of rights arise. This article considers the EU–US data transfer relationship in the aftermath of the decision in Data Protection Commissioner v Facebook Ireland and Maximillian Schrems where the Court of Justice of the European Union (CJEU) invalidated an EU–US data transfer agreement for the second time in just five years. This judgment continues the line of cases emphasising the high value the Court places on securing EU personal data in accordance with EU data protection standards and fundamental rights. This article assesses the implications of the ruling for the vulnerable EU–US data transfer relationship.
- Published
- 2021