Search Results (5,754 results)
2. Discussion paper: implications for the further development of the AUD2IT algorithm, successfully implemented in emergency medicine.
- Author
- Przestrzelski, Christopher; Jakob, Antonina; Jakob, Clemens; Hoffmann, Felix R.
- Subjects
- DOCUMENTATION; CURRICULUM; HUMAN services programs; EMERGENCY medicine; EXPERIENCE; MEDICAL records; ELECTRONIC publications; ALGORITHMS; PATIENTS' attitudes
- Abstract
The AUD2IT algorithm is a tool for structuring the data collected during emergency treatment. Its goal is, on the one hand, to structure the documentation of that data and, on the other, to provide a standardised data structure for the report given when an emergency patient is handed over. The AUD2IT algorithm was developed to give residents a documentation aid that helps them structure medical reports without getting lost in unimportant details or forgetting important information. The sequence of anamnesis, clinical examination, consideration of differential diagnoses, technical diagnostics, interpretation, and therapy is more an academic classification than a description of the real workflow; in a real setting, most of these steps take place simultaneously. The application of the AUD2IT algorithm should therefore follow the real processes. A major advantage of the AUD2IT algorithm is that it can serve as a structure for the entire treatment process and can also be used as a handover protocol within that process, ensuring that the current state of knowledge is shared at each team timeout. The PR-E-(AUD2IT) algorithm makes it possible to document a treatment process that, in principle, need not be limited to emergency medicine. It could also be used and further developed in outpatient care; one example is the preparation and allocation of needed resources in general practice. The algorithm is a standardised tool that can be used by healthcare professionals at any level of training and gives users a sense of security in their daily work. [ABSTRACT FROM AUTHOR]
- Published
- 2024
3. Data Paper as a Reward? Motivation, Consideration, and Perspective behind Data Paper Submission.
- Author
- Huang, Pao Pei; Jeng, Wei
- Subjects
- CITATION analysis; PUBLICATIONS; SCHOLARLY communication; DATA; SCHOLARLY publishing
- Abstract
Data papers, as one channel for encouraging researchers to open up research data under the open science movement, are expected to provide strong incentives through formal citations. However, few studies have investigated the drivers of this emerging type of publication. This study examines researchers' motivations and considerations for data paper submission, as well as their perspectives on this form of scholarly publication. Through in-depth interviews with ten data paper authors, our preliminary results found that researchers are often driven by extrinsic factors to increase their publication counts, and that data papers are sometimes viewed as territory claims made before further research. Although the academic community widely recognizes the benefits of publishing data papers, some still cast a doubtful eye on their academic value and impact. We anticipate that such insights into the driving forces behind, and views of, data papers could provide opportunities for stakeholders to fill gaps and strengthen the open science ecosystem. [ABSTRACT FROM AUTHOR]
- Published
- 2022
4. Analysis of Arch Bridge Condition Data to Identify Network-Wide Controls and Trends.
- Author
- Campbell, Kristopher; Lydon, Myra; Stevens, Nicola-Ann; Taylor, Su
- Subjects
- ARCH bridges; DATABASE management; BRIDGE inspection; ANALYSIS of variance
- Abstract
This paper outlines an initial analysis of 20 years of data held on an electronic bridge management database for approximately 3500 arch bridges across Northern Ireland (NI) by the Department for Infrastructure. Arch bridges represent the largest group of bridge types, making up nearly 56% of the total bridge stock in NI. This initial analysis aims to identify trends that might help inform future maintenance decisions. Consideration of the average Bridge Condition Indicator (BCI) value for the overall arch bridge stock indicates the potential for regional variations in overall condition and for human bias in inspections. The paper presents the most prevalent structural elements and associated defects recorded in inspections of arch bridges, which indicated a link to scour and undermining for the worst-conditioned arch bridges. An analysis of variance (ANOVA) identified function, number of spans, and deck width as significant factors during the various deterioration stages of a bridge's lifecycle. [ABSTRACT FROM AUTHOR]
- Published
- 2024
5. Equality, diversity and inclusion: A way forward for aquaculture in Scotland.
- Author
- Kelling, Ingrid; Lawan, Ibrahim
- Subjects
- AQUACULTURE; EQUALITY; SOCIAL justice; VALUE chains; SEAFOOD
- Abstract
This paper focuses on the importance of equality, diversity, and inclusion (EDI) in the aquaculture industry, with particular emphasis on the sector in Scotland. Aquaculture is an especially important industry for EDI, given its potential to address the Sustainable Development Goals on gender equality and diversity. The paper highlights the increasing attention being paid to EDI in many areas and the significant benefits to businesses that adopt EDI policies, including improved reputation, increased innovation, and greater profitability. Drawing on a survey of EDI in Scottish aquaculture, a workshop, and interviews with industry experts, the paper suggests concrete actions that could improve EDI in the sector. Key priorities are the collection and publication of workforce data in Scottish aquaculture, industry champions who raise awareness and promote EDI, and support for cross-sector organizations that provide EDI training. We conclude by calling for more research to support the development of EDI in Scottish aquaculture, which will contribute to future resilience and fairness as well as a dynamic, relevant, and accessible industry. [ABSTRACT FROM AUTHOR]
- Published
- 2024
6. Medical Device Industry Growth, Challenges and Opportunities: An Overview.
- Author
- Chaturvedi, Prerna; Goyal, Devesh; Dwivedi, Sumeet
- Subjects
- GROWTH industries; MEDICAL software; GOVERNMENT websites; MEDICAL equipment; BUDGET; PRESS releases
- Abstract
Any tool, apparatus, machine, appliance, implant, reagent for in vitro use, software, material, or other similar or related item that the producer intends to be used, alone or in combination, for medical purposes is considered a medical device. This paper presents a thorough review of medical device data and the medical device industry. Some of the data incorporated in this paper were taken from government websites, press releases, media reports, the Deloitte Report, and the Union Budget 2022-23, and are duly cited. [ABSTRACT FROM AUTHOR]
- Published
- 2024
7. Geospatial Data Development for Rural Roads Planning, Construction and Management: Case Study of ADRAMP-2 Project.
- Author
- Naphtali, Geoffrey
- Subjects
- GEOSPATIAL data; RURAL roads; GEOGRAPHIC information systems; INFORMATION retrieval; DATA analysis
- Abstract
Geospatial data describe objects and features in relation to geographic space, usually with location coordinates in a spatially referenced system. Rural roads are geospatial entities that can be captured and stored using geographic information system (GIS) techniques; a GIS is therefore an essential tool for understanding spatial and non-spatial data over space and time. Data required for this paper include high-resolution satellite imagery (QuickBird, SPOT, IKONOS); Landsat (EO-1 Hyperion, DEM); local, state, and international boundaries; all edges of transport routes connecting all settlements in the state; settlement data; stream network data; and terrain data. Road attributes include the location of potholes, bumps, and drainages; drainage direction; date of last repair; highest point; lowest point; mean elevation; maximum slope; average slope; and road wear and tear, expressed as road condition. Road geometry data include the length of each road edge, width, and referential measurement, along with data on the nature of surfacing, such as tar, asphalt, concrete, and laterite. Other road data include name, type, classification, and geotagged pictures and videos of all roads in Adamawa State. The field survey involved trailing the whole length of the roads from a referenced baseline at vehicle speed using GPS waypoint navigators, handheld GPS units, and the RoadLab application on an iPad. These devices were used to collect data on the road roughness index (expressed as excellent, good, or bad); visual assessment of road conditions and drainages was carried out during the field survey. While navigating the roads, road data, geotagged pictures, videos, and coordinates of event areas were recorded. However, the use of RoadLab in assessing road conditions was limited to Trunk A, B, and C roads across the state, since these are the most tarred roads in the regional road classification; rigorous physical and visual surveys and assessments of all other rural roads were therefore conducted. The results indicate that Trunk B, Trunk C, and feeder roads are in bad shape, and a geospatial database of the entire road network in Adamawa State was developed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
8. Doing place through data: Proliferation, profiling and the perils of portrayal in local climate action.
- Author
- Knox, Hannah
- Abstract
Building on work that has shown the role of digital technologies in reframing environmental relations, this paper explores ethnographically how environmental data is reconfiguring the concept of place. The paper takes as its focus an action-research project within a UK-based, citizen-oriented initiative called Newtown Energy Futures, in which we sought to enfold climate and energy data into a social-justice-informed attempt at climate action. By exploring how the project used data as an invitation for citizens to engage with and participate in local infrastructural and environmental dynamics, the paper sheds light on how environmental data came to participate in the making of place and, in doing so, raised questions about how to rebuild the socio-material relations through which 'a sense of place' might be reproduced. As climate and energy data increasingly demand that places become enrolled in environmental projects, our findings suggest that data enables place to emerge as a 'socio-technical potentiality', an observation that has implications for both engagement with, and the study of, data and place. In practical terms, we suggest that this refiguration of place creates hopeful trajectories for change, whilst also posing difficult questions about the limits of participation in a data-infused form of place-based politics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
9. Understanding a Key Electoral Tool: A New Dataset on the Global Distribution of Voter Identification Laws.
- Author
- Barton, Tom
- Subjects
- VOTER identification laws; ELECTIONS
- Abstract
Relating electoral laws to electoral integrity has long been a focus of academic research. Using the strategic-relational approach, as outlined in this special issue, it is possible to better understand how electoral laws shape the voter experience and electoral outcomes. This paper contributes to this understanding by looking at voter identification (ID) laws. With no consolidated dataset of voter ID laws existing outside the U.S.A., it is difficult to answer the research questions put forward in this special issue, especially the second. This paper begins to address this shortfall by presenting the Comparative Voter ID Law (CVIL) index, which has collected data on 246 individual electoral jurisdictions. The data presented show how voter ID laws are distributed globally, regionally, by regime type, and by level of democracy. The second part of the analysis describes voter ID laws by whether a jurisdiction has compulsory ID laws, how many different types of ID are accepted, and the minimum number of ID documents that must be shown. Thirdly, other variables within the dataset are described. The CVIL will provide opportunities to understand how voter ID laws are part of institutional design, are used by actors, and shape the voter experience and electoral outcomes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
10. Data-Driven Participatory Decision-making in Municipalities: Case Finland.
- Author
- Leppäniemi, Osmo; Lipping, Tarmo; Stenvall, Jari
- Subjects
- TECHNOLOGICAL innovations; DEEP learning; ARTIFICIAL intelligence; INNOVATION management; DECISION making
- Abstract
The decision-making process within public administration is a multifaceted procedure encompassing various stages, such as initiation, preparation, the actual decision, and execution. This paper investigates the structuring of data within municipal decision-making boards, highlighting its importance in fostering efficient data-driven decision-making practices. Specifically, the study focuses on the extraction of data from municipal systems for analysis using artificial intelligence techniques. Two primary analyses are conducted: categorization of decision-making cases and extraction of emotions from textual data. Initial findings from these analyses are presented, along with a discussion of the implications for improving public administration decision-making processes. Furthermore, this research addresses a significant gap in the literature on data-driven decision-making in public administration, with a particular emphasis on enhancing citizen participation in decision-making processes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
11. Assessing the enabling conditions for investment in Armenia's water security: Scorecard pilot test.
- Author
- Trancon, Delia Sanchez; Halpern, Guy
- Subjects
- WATER management; SUSTAINABLE investing; WATERMARKS; PUBLIC investments; WATER security
- Abstract
Copyright of OECD Environment Working Papers is the property of Organisation for Economic Cooperation & Development and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
12. Assessing the enabling conditions for investment in water security.
- Author
- Trancon, Delia Sanchez; Woodruff, Allison; Leflaive, Xavier; Davies, Lylah; Agustsson, Sigurjon
- Subjects
- WATER security; WATER management; SUSTAINABLE investing
- Abstract
- Published
- 2024
13. REFINING DATA TO PREDICT VALVE FAILURES.
- Author
- Golfetto, Guilherme
- Subjects
- RELIEF valves; VALVES; PAPER industry; CULTURAL maintenance; MANUFACTURING processes; MAINTENANCE costs; FACTORIES
- Abstract
A model is presented for using and refining data to monitor control and on-off valves and to identify failures early. This model can help increase valve reliability, strengthen predictive maintenance, decrease operational risk, and evolve programs toward Industry 4.0. The maintenance strategy applied in the pulp and paper industry has a direct impact on operation and maintenance costs, and in the valve department a time-based preventive maintenance culture still persists. The purpose of this work is therefore to present a model for using and refining data, providing the steps to structure and develop a predictive work methodology for valves based on the history and performance trend data available in valve controllers in the pulp and paper industry. The methodology aims to support fast decision-making, allowing work to focus on data refinement, that is, on identifying and understanding what your valves are "saying" through history or performance trends, to boost operational safety and valve performance. A Gartner Group study found that in industrial plant processes, 50% of preventive maintenance tasks were not necessary and 10% were actually harmful: the company allocated resources to the "wrong" problems, and problems were even introduced into good valves. Using this methodology, oriented at refining data and focusing on valve triage, guides maintenance teams toward the "right" problems rather than spending time in the field working on the "wrong" ones, and consequently focuses tasks on the valves that really need attention. The number of maintenance tasks in the field can be decreased, making industrial plants safer by reducing maintenance teams' exposure to risk without the real need to remove valves, thereby decreasing potential failures and accidents. [ABSTRACT FROM AUTHOR]
- Published
- 2024
14. 2022 BenchCouncil International Symposium on benchmarking, measuring and optimizing (Bench 2022) call for papers.
- Author
- Chunjie Luo and Wanling Gao
- Subjects
- BENCHMARKING (Management); DATA management; HARDWARE; COMPUTER software; DATA
- Published
- 2022
15. A future of AI-driven personalized care for people with multiple sclerosis.
- Author
- Praet, Jelle; Anderhalten, Lina; Comi, Giancarlo; Horakova, Dana; Ziemssen, Tjalf; Vermersch, Patrick; Lukas, Carsten; van Leemput, Koen; Steppe, Marjan; Aguilera, Cristina; Kadas, Ella Maria; Bertrand, Alexis; van Rampelbergh, Jean; de Boer, Erik; Zingler, Vera; Smeets, Dirk; Ribbens, Annemie; Paul, Friedemann
- Subjects
- DIAGNOSIS; QUALITY of life; PROGNOSTIC models; CENTRAL nervous system; MULTIPLE sclerosis
- Abstract
Multiple sclerosis (MS) is a devastating immune-mediated disorder of the central nervous system resulting in progressive disability accumulation. As there is no cure available yet for MS, the primary therapeutic objective is to reduce relapses and to slow down disability progression as early as possible during the disease to maintain and/or improve health-related quality of life. However, optimizing treatment for people with MS (pwMS) is complex and challenging due to the many factors involved and in particular, the high degree of clinical and subclinical heterogeneity in disease progression among pwMS. In this paper, we discuss these many different challenges complicating treatment optimization for pwMS as well as how a shift towards a more pro-active, data-driven and personalized medicine approach could potentially improve patient outcomes for pwMS. We describe how the ‘Clinical Impact through AI-assisted MS Care’ (CLAIMS) project serves as a recent example of how to realize such a shift towards personalized treatment optimization for pwMS through the development of a platform that offers a holistic view of all relevant patient data and biomarkers, and then using this data to enable AI-supported prognostic modelling. [ABSTRACT FROM AUTHOR]
- Published
- 2024
16. Fake news detection: recent trends and challenges.
- Author
- Thakar, Hemang; Bhatt, Brijesh
- Abstract
The proliferation of fake news in the digital age has spurred extensive research efforts toward developing effective detection techniques. This abstract delves into recent trends and challenges within the domain of fake news detection. The ubiquity of social media platforms and user-generated content has led to the rapid dissemination of misinformation, necessitating robust mechanisms for differentiating between authentic and fabricated news. This paper explores emerging approaches, such as advanced machine learning models, natural language processing techniques, and cross-modal analysis, which leverage textual, visual, and contextual cues to enhance detection accuracy. However, as fake news tactics become more sophisticated, challenges like adversarial attacks, data scarcity, and domain adaptation come to the forefront. This abstract highlights the ongoing efforts to address these challenges and emphasizes the importance of interdisciplinary collaboration to devise comprehensive solutions for combating the intricate landscape of fake news dissemination. [ABSTRACT FROM AUTHOR]
- Published
- 2024
17. Data-driven pre-control of grinding surface quality: a review.
- Author
- Fu, Xiaojing; Lv, Lishu; Chen, Bing; Deng, Zhaohui; Wu, Mingtao
- Abstract
Grinding surface quality is a key indicator of part performance and product reliability. During the grinding process, pre-control technology makes appropriate adjustments in advance at key stages to improve grinding surface quality. This paper therefore constructs a data-driven pre-control system for grinding surface quality comprising a target layer, acquisition layer, analysis layer, and application layer, and reviews the research progress in data-driven control of grinding surface quality from the perspective of this system. In the target layer, the connotation, formation mechanism, and main influencing indexes of grinding surface quality are described in detail. In the acquisition layer, the acquisition and processing methods for grinding data, along with their advantages and disadvantages, are comprehensively analyzed. In the analysis layer, methods of using result data, process data, and real-time signal data to characterize grinding surface quality are introduced. In the application layer, the research status of grinding surface quality prediction, optimization, and control is comprehensively and systematically summarized. Finally, the paper presents a summary and outlook on data-driven improvement of grinding surface quality. [ABSTRACT FROM AUTHOR]
- Published
- 2024
18. Responsibility for the Environmental Impact of Data-Intensive Research: An Exploration of UK Health Researchers.
- Author
- Samuel, Gabrielle
- Abstract
Concerns about research's environmental impacts have been articulated in the research arena, but questions remain about what types of role responsibilities, if any, are appropriate to place on researchers. The research question of this paper is: what are the views of UK health researchers who use data-intensive methods on their responsibilities to consider the environmental impacts of their research? Twenty-six interviews were conducted with UK health researchers using data-intensive methods. Participants expressed a desire to take responsibility for the environmental impacts of their research; however, they were often unable to act on this because obstacles prevented them from taking on such role responsibilities. They suggested strategies to address this, predominantly related to the need for regulation to monitor their own behaviour. This paper discusses the implications of adopting such a regulatory approach as a mechanism to promote researchers' role responsibilities, using a neo-liberal critique. [ABSTRACT FROM AUTHOR]
- Published
- 2024
19. Frankfurt School Legacy and the Critical Sociology of Media: Lifeworld in Digital Capitalism.
- Author
- Bilić, Paško
- Subjects
- FRANKFURT school of sociology; WORKING class; SOCIOLOGY; MASS media; CAPITALISM
- Abstract
Just as the Frankfurt School responded to the radicalisation of the working class in Germany and the rise of post-war consumerism in the United States, today, we are confronted by platform monopolies, automated hyper-consumption and technological control. Critical approaches to digital media have exposed the structural coupling of Internet use and capital accumulation for almost two decades. However, many authors building on this tradition can struggle to understand how online social interaction is controlled beyond the worn-out critique of false consciousness or beyond conceptualising all digital activity mediated by data as labour. This paper will attempt to theoretically untangle the Marxian ontology of labour and the Frankfurt School-inspired critique of everyday life. This is not just theoretical nit-picking. Society becomes completely dominated if we accept no difference between wage labour and lifeworld activities. Each contains its internal struggles. The value form regulates both in different ways. [ABSTRACT FROM AUTHOR]
- Published
- 2024
20. Supervised machine learning for the prediction of post‐operative clinical outcomes of hip and knee replacements: a review.
- Author
- Ghadirinejad, Khashayar; Milimonfared, Roohollah; Taylor, Mark; Solomon, Lucian B.; Graves, Stephen; Pratt, Nicole; de Steiger, Richard; Hashemi, Reza
- Subjects
- TOTAL knee replacement; TOTAL hip replacement; SUPERVISED learning; MACHINE learning; DATA analytics
- Abstract
Prediction models are being increasingly used in the medical field to identify risk factors and possible outcomes, and some are presently being used to develop guidelines for improving clinical practice. The application of machine learning (ML), a powerful set of computational tools for analysing data, has been clearly expanding in predictive modelling. This paper reviews the latest developments in supervised ML techniques used to analyse data related to post-operative total hip and knee replacements. The aim was to review the most recent findings of relevant published studies by outlining the methodologies employed (the most widely used supervised ML techniques), data sources, domains, limitations of predictive analytics, and the quality of predictions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
21. MetaConfigurator: A User-Friendly Tool for Editing Structured Data Files.
- Author
- Neubauer, Felix; Bredl, Paul; Xu, Minye; Patel, Keyuriben; Pleiss, Jürgen; Uekermann, Benjamin
- Abstract
Textual formats such as JSON, XML, and YAML are widely used for structuring data in various domains, from configuration files to research data. However, manually editing data in these formats can be complex and time-consuming. Graphical user interfaces (GUIs) can significantly reduce manual effort and assist the user in editing such files, but developing a file-format-specific GUI requires substantial development and maintenance effort. To address this challenge, we introduce MetaConfigurator, an open-source web application that generates its GUI from a given schema. Our approach differs from other schema-to-UI approaches in three key ways: 1) it offers a unified view that combines the benefits of both GUIs and text editors, 2) it enables schema editing within the same tool, and 3) it supports advanced schema features, including conditions and constraints. In this paper, we discuss the design and implementation of MetaConfigurator, backed by insights from a small-scale qualitative user study. The results indicate the effectiveness of our approach in retrieving information from data and schemas and in editing them. [ABSTRACT FROM AUTHOR]
- Published
- 2024
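To make the "conditions and constraints" feature of schema-driven tools like MetaConfigurator concrete, the sketch below shows a hypothetical JSON Schema fragment with an "if/then" condition and a minimal Python check for just the keyword subset it uses. The schema, field names, and `check` function are invented for illustration and are not MetaConfigurator's code or API.

```python
# Hypothetical JSON Schema fragment: CSV output additionally
# requires a delimiter (an "if/then" condition).
schema = {
    "type": "object",
    "required": ["format"],
    "properties": {
        "format": {"enum": ["csv", "json"]},
        "delimiter": {"type": "string"},
    },
    "if": {"properties": {"format": {"const": "csv"}}},
    "then": {"required": ["delimiter"]},
}

def check(doc: dict, s: dict) -> bool:
    """Validate `doc` against the tiny schema subset used above:
    required, enum, string type, and if/then with const."""
    for key in s.get("required", []):
        if key not in doc:
            return False
    for key, sub in s.get("properties", {}).items():
        if key in doc:
            if "enum" in sub and doc[key] not in sub["enum"]:
                return False
            if sub.get("type") == "string" and not isinstance(doc[key], str):
                return False
    if "if" in s:
        cond = s["if"].get("properties", {})
        matches = all(k in doc and doc[k] == v.get("const")
                      for k, v in cond.items())
        if matches and not check(doc, s["then"]):
            return False
    return True

print(check({"format": "json"}, schema))                   # True
print(check({"format": "csv"}, schema))                    # False: delimiter missing
print(check({"format": "csv", "delimiter": ";"}, schema))  # True
```

A schema-driven GUI would use the same conditional information to show or hide the delimiter field depending on the chosen format.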
22. SubEpiPredict: A tutorial-based primer and toolbox for fitting and forecasting growth trajectories using the ensemble n-sub-epidemic modeling framework.
- Author
- Chowell, Gerardo; Dahal, Sushma; Bleichrodt, Amanda; Tariq, Amna; Hyman, James M.; Luo, Ruiyan
- Subjects
- EPIDEMICS; COVID-19; EPIDEMIOLOGY; DATA; PLATEAUS
- Abstract
An ensemble n-sub-epidemic modeling framework that integrates sub-epidemics to capture complex temporal dynamics has demonstrated powerful forecasting capability in previous works. This modeling framework can characterize complex epidemic patterns, including plateaus, epidemic resurgences, and epidemic waves with multiple peaks of different sizes. In this tutorial paper, we introduce and illustrate SubEpiPredict, a user-friendly MATLAB toolbox for fitting and forecasting time series data using the ensemble n-sub-epidemic modeling framework. The toolbox can be used for model fitting, forecasting, and evaluation of model performance over the calibration and forecasting periods using metrics such as the weighted interval score (WIS). We also provide a detailed description of these methods, including the concept of the n-sub-epidemic model and the construction of ensemble forecasts from the top-ranking models. To illustrate the toolbox, we use publicly available daily COVID-19 death data at the national level for the United States. The MATLAB toolbox introduced in this paper can be useful to a wide audience, including policymakers, and can easily be used by those without extensive coding or modeling backgrounds. [ABSTRACT FROM AUTHOR]
- Published
- 2024
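The core idea behind the n-sub-epidemic framework described above is that an observed epidemic curve with multiple peaks can be represented as the sum of overlapping sub-epidemic curves. The toolbox itself is MATLAB; the sketch below is a Python illustration of that idea only, with two logistic sub-epidemics whose parameters (K, r, t0) are invented for the example and are not fitted to any data.

```python
import math

def logistic_incidence(t, K, r, t0):
    """Daily incidence implied by a logistic cumulative curve
    C(t) = K / (1 + exp(-r (t - t0))), i.e. dC/dt = r C (1 - C/K)."""
    C = K / (1.0 + math.exp(-r * (t - t0)))
    return r * C * (1.0 - C / K)

# Two hypothetical sub-epidemics: an early large wave and a
# later, smaller resurgence (parameters chosen for illustration).
sub_epidemics = [
    {"K": 10000.0, "r": 0.15, "t0": 40.0},
    {"K": 4000.0, "r": 0.25, "t0": 110.0},
]

def aggregate_incidence(t):
    """The observed curve is the sum of the sub-epidemic curves."""
    return sum(logistic_incidence(t, **p) for p in sub_epidemics)

curve = [aggregate_incidence(t) for t in range(0, 180)]
peaks = [t for t in range(1, 179)
         if curve[t] > curve[t - 1] and curve[t] > curve[t + 1]]
print(peaks)  # two local maxima, one near each sub-epidemic's t0
```

Fitting such a model to data (as the toolbox does) means estimating the number of sub-epidemics and their parameters, then ranking and ensembling the candidate models for forecasting.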
23. DIKW as a General and Digital Twin Action Framework: Data, Information, Knowledge, and Wisdom.
- Author
- Grieves, Michael
- Subjects
- DIGITAL twins; WISDOM; ARTIFICIAL intelligence; GOAL (Psychology); PHRONESIS
- Abstract
This paper will discuss Data, Information, Knowledge, and Wisdom, which is commonly referred to as DIKW. The DIKW Pyramid Model is a hierarchical model that is often referenced in both academic and practitioner circles. This model will be discussed and shown to be faulty on several levels, including a lack of definitional agreement. A new DIKW framework with systems orientation will be proposed that focuses on what the DIKW elements do in the way humans think, not what they are by definition. Information as a replacement for wasted physical resources in goal-oriented tasks will be a central organizing point. The paper will move the DIKW discussion to the computer-based concept of Digital Twins (DTs) and its augmentation of how we can use DIKW to be more effective and efficient. This will especially be the case as we move toward Intelligent Digital Twins (IDTs) with Artificial Intelligence (AI). [ABSTRACT FROM AUTHOR]
- Published
- 2024
24. AUTOMATED DECISION-MAKING AND ACCESS TO DATA.
- Author
- DACAR, Rok
- Subjects
- DECISION making; LEGAL instruments; ANTITRUST law; INTERNET marketing; EUROPEAN Union law; PERSONALLY identifiable information
- Abstract
This paper explores the mechanisms by which companies can gain access to data necessary for automated decision-making in scenarios without direct contractual agreements, focusing on market-driven approaches. It introduces the concept of the essential facilities doctrine under EU competition law and examines its applicability to sets of data, alongside an examination of current ex-ante regulatory instruments which grant data access rights, such as the Type Approval Regulation, the Open Data Directive, the Electricity Directive, the Digital Markets Act, and the Data Act. These legal instruments are analysed in terms of their ability to facilitate access to data necessary for the automation of decision-making processes. In addition, the study looks at the challenges and opportunities presented by these legal instruments, including the nuances of applying the essential facilities doctrine to data. The article concludes that the most efficient way for a company to gain access to sets of data required for automated decision-making (in the absence of a contractual agreement) is to base its data access claim on an act of ex-ante regulation. If, however, such legal basis does not exist, a company could still base its data access claim on the essential facilities doctrine. The practical applicability of the doctrine to sets of data, however, remains unclear. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. Security Concerns and Data Breaches for Data Deduplication Techniques in Cloud Storage: A Brief Meta-Analysis.
- Author
-
Goel, Anjuli, Prabha, Chander, Malik, Meenakshi, and Sharma, Preeti
- Subjects
CLOUD storage ,DATA security failures ,DATA warehousing ,LITERATURE reviews ,CLOUD computing ,TELECOMMUNICATION systems - Abstract
Over the last decade, data has exploded on cloud storage, and outsourcing data to cloud storage has become an appealing trend, although it is not a fully reliable service. Data growth has made cloud storage management difficult. To provide more efficient and secure data storage on the cloud, cloud service providers employ a deduplication technique. Cloud data deduplication has developed as a popular research subject to boost the efficiency of cloud storage and minimize network communication traffic. It is not easy to provide security while the volume of data on the cloud keeps increasing. Users are increasingly concerned about the security of their data, as data on the cloud is not secure and safe even after encryption. Various encryption and decryption schemes are available for securing data, but conventional encryption cannot be employed in data deduplication. To maintain the confidentiality and integrity of data, convergent encryption and proof of ownership are applied. Various other approaches, such as proof of retrievability, Dekey, DupLESS, identity-based encryption, message-locked encryption, attribute-based encryption, provable data possession, and proof of storage with deduplication, are current security research topics. This paper presents a review of the literature on several proposed methodologies for safe deduplication techniques in cloud storage and current research trends. The primary contribution of the paper is to offer a complete understanding of the issues and solutions associated with safe data deduplication in cloud storage systems. The study dives into the various encryption solutions, security concerns, and potential challenges of data deduplication in the cloud. [ABSTRACT FROM AUTHOR]
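The convergent encryption idea mentioned in this abstract derives the encryption key from the content itself, so identical plaintexts produce identical ciphertexts and can be deduplicated without the provider seeing the data. A minimal sketch of the principle (the XOR keystream is a toy stand-in for a real deterministic cipher, and the names are illustrative, not from the paper):

```python
import hashlib

def convergent_encrypt(data: bytes) -> tuple[bytes, bytes]:
    # Key is derived from the content itself, so identical plaintexts
    # always yield the same key and ciphertext. The XOR keystream below
    # is a toy illustration, not a production cipher.
    key = hashlib.sha256(data).digest()
    keystream = b""
    counter = 0
    while len(keystream) < len(data):
        keystream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(a ^ b for a, b in zip(data, keystream))
    return key, ciphertext

# Deduplication store keyed by a hash of the ciphertext.
store: dict[str, bytes] = {}

def upload(data: bytes) -> str:
    key, ct = convergent_encrypt(data)
    tag = hashlib.sha256(ct).hexdigest()
    store.setdefault(tag, ct)  # duplicate uploads share one stored copy
    return tag

t1 = upload(b"quarterly report")
t2 = upload(b"quarterly report")  # same content -> same tag, nothing new stored
assert t1 == t2 and len(store) == 1
```

Because the key depends only on the content, conventional randomized encryption (where the same plaintext encrypts differently each time) would defeat deduplication, which is the tension the abstract points to.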
- Published
- 2024
- Full Text
- View/download PDF
26. Theorising Digital Afterlife as Techno-Affective Assemblage: On Relationality, Materiality, and the Affective Potential of Data.
- Author
-
Harju, Anu A.
- Subjects
AFTERLIFE ,AFFECT (Psychology) ,DATA management ,SOCIAL factors ,DEATH threats ,RITES & ceremonies - Abstract
In the ongoing academic discussion regarding what happens to our data after we die, how our data are utilised for commercial profit-making purposes, and what kinds of death-related practices our posthumous data figure in, the notion of digital afterlife is attracting increasing attention. While the concept of digital afterlife has been approached in different ways, the main focus remains on the level of individual loss. The emphasis tends to be on the role of posthumous digital artefacts in grief practices and death-related rituals or on data management issues relating to death. Building on a socio-technical view of digital afterlife, this paper offers, as a novel contribution, an understanding of digital afterlife as a techno-affective assemblage. It argues for the necessity of examining technological and social factors as mutually shaping and brings into the discussion of digital afterlife the notions of relationality, materiality, and the affective potential of data. The paper ends with ruminations about digital afterlife as a posthumanist project. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Mobilizing New Sources of Data: Opportunities and Recommendations.
- Author
-
Grégoire, Denis A., Ter Wal, Anne L. J., Little, Laura M., Bermiss, Sekou, Kotha, Reddi, and Gruber, Marc
- Subjects
DATA ,RESEARCH methodology ,RESEARCH ethics ,MACHINE learning ,THEORY ,ORGANIZATIONAL research - Abstract
An editorial is presented in which the authors propose opportunities and recommendations for the mobilization of new sources of data in management and organizational research. Topics discussed include methodology to analyze new data sources, including the use of machine learning for big data analysis, transparency and ethical considerations in the acquisition and handling of new data sources, and the importance of aligning theory and method.
- Published
- 2024
- Full Text
- View/download PDF
28. Data Privacy and Technological Ethics in Rural Area.
- Author
-
Porji, N. S., Mhatre, V. J., and Naik, R. P.
- Subjects
DATA privacy ,DIGITAL technology ,TECHNOLOGICAL innovations ,RURAL geography ,ETHICS ,TECHNOLOGY assessment - Abstract
In today's world of rapidly advancing technology, data privacy and ethics have become a major concern. The complexity of data and ethics poses serious issues arising from the collection of users' personal information in the digital world. Digital service companies usually bear the responsibility of protecting users' personal data from unauthorized access; however, issues arise when service providers and platforms fail to protect the data they collect, which leads to data misuse and the exposure of sensitive personal information. Handling data responsibly and ethically is important for safeguarding people's rights and preserving public trust. This study examines the ethical challenges in emerging technologies, including data privacy and ethics. The paper aims to raise awareness of data privacy and ethics in technology. [ABSTRACT FROM AUTHOR]
- Published
- 2023
29. Unintended consequences: data practice in the backstage of social media.
- Author
-
Zheng, Ken
- Subjects
INFORMATION technology personnel ,SOCIAL media ,SOCIAL impact ,DIGITAL technology ,SOCIAL sciences education - Abstract
Through an ethnographic study of Chinese IT professionals who integrate a form of data culture into the digital platforms they design, maintain, and operate daily within one of China's tech giants, this paper reveals numerous overlaps and interrelations between the data practices of Chinese IT professionals and the broader social implications that arise from them. The aim is to foster a more productive dialogue between the social studies of quantification and platform studies. This original research proposes the backstage as a potent methodology for inquiring into the role of Chinese IT professionals and domestic tech giants in advancing measuring systems and audit culture. This paper concludes by suggesting that such an approach can also be applied to wider studies of the paradox in quantification between its general claims and specific effects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. A Critical Analysis of the Escalating Cybercrime and its Impact in Bangladesh.
- Author
-
Rana, Md. Sohel
- Subjects
CRITICAL analysis ,COMPUTER crimes ,LAW enforcement ,COMPUTER hacking - Abstract
Cybercrime is a growing concern in Bangladesh, and this paper will critically analyze the nature and extent of cybercrime in Bangladesh and its impact on individuals, businesses, and the economy as a whole. The research will begin by exploring the various types of cybercrime prevalent in Bangladesh and the methods cybercriminals use to carry out these crimes. It will then examine the reasons behind the increase in cybercrime in Bangladesh, including factors such as inadequate cyber-security measures, low awareness among the public, and the widespread use of digital technology. Moreover, the research question for this paper is: What are the main causes of the growing cybercrime problem in Bangladesh, and what can be done to mitigate its impact? Finally, this paper also discusses the challenges faced by law enforcement agencies in tackling cybercrime and the measures that can be taken to address this issue. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. A Study of Entity Relationship Extraction Algorithms Based on Symmetric Interaction between Data, Models, and Inference Algorithms.
- Author
-
Feng, Ping, Su, Nannan, Xing, Jiamian, Bian, Jing, and Ouyang, Dantong
- Subjects
MACHINE learning ,ALGORITHMS ,CHINESE language ,WORD recognition ,SEMANTICS - Abstract
The purpose of this paper is to address the extraction of entities and relationships from unstructured Chinese text, with a particular emphasis on the challenges of Named Entity Recognition (NER) and Relation Extraction (RE). This is achieved by integrating external lexical information and utilizing the abundant semantic information available in Chinese. We adopt a pipeline model that treats NER and RE separately, introducing an innovative NER model that integrates Chinese pinyin, characters, and words to enhance recognition capabilities. Simultaneously, we incorporate information such as entity distance, sentence length, and part-of-speech to improve the performance of relation extraction. We also delve into the interactions among data, models, and inference algorithms to improve learning efficiency in addressing this challenge. In comparison to existing methods, our model has achieved significant results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Different kinds of data: samples and the relational framework.
- Author
-
Potiron, Aline
- Abstract
This paper proposes an original definition of samples as a kind of data within the relational framework of data. The distinction between scientific objects (e.g., samples, data, models) often needs to be clarified in the philosophy of science to understand their role in the scientific inquiry. The relational framework places data at the forefront of knowledge construction. Their epistemic status depends on their evaluation as potential evidence in a research situation and their ability to circulate among researchers. While samples are significant in data-generating science, their role has been underexplored in the philosophy of data literature. I draw on a case study from data-centric microbiology, viz. amplicon sequencing, to introduce specifications of the relational framework. These specifications capture the distinctive epistemic role of samples, allowing the discussion of their significance in the inquiry process. I argue that samples are necessarily transformed to be considered as evidence, portable in the limits of a situation, and they act as world anchors for claims about a phenomenon. I compare these specifications with other data and evidence frameworks and suggest they are compatible. The paper concludes by considering the extension of these criteria in the context of biobanking. The specifications proposed here help analyze other life sciences cases and deepen our understanding of samples and their epistemological role in scientific research. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. TRENDS IN DATA PROCESSING CONTROL IN CONTINUOUS PRODUCTION SYSTEMS.
- Author
-
DANEL, Roman, GAJDZIK, Bożena, and ROPYAK, Liubomyr
- Subjects
PROCESS control systems ,CONTINUOUS processing ,ELECTRONIC data processing ,INFORMATION storage & retrieval systems ,INDUSTRY 4.0 - Abstract
Purpose: This paper presents application solutions related to the use of Fourth Industrial Revolution technologies in industry. The aim of the paper is to contribute to a discussion about new trends in the development of information systems supporting the monitoring, control and diagnostics of production processes. The objective was realised using the example of continuous production in steelworks. Design/methodology/approach: The aim of the paper was realised in a theoretical and a practical part. The empirical part took the form of a case study: an approach to a mathematical model for predicting the enthalpy of foundry ladles. Findings: The conclusion is simple: each industrial branch has to follow its own transformation path towards Industry 4.0. Practical implications: The prepared approaches, based on an analysis of trends in data processing control systems in continuous production, can be used to build the algorithm of a predictive model in such a system. Research limitations/implications: The authors are aware of the limitations resulting from the excessive generality of the model, but they will continue research on the implementation of Industry 4.0 technology in the metallurgical (metal) industry, with particular emphasis on the continuous course of manufacturing processes. Originality/value: The value of the paper lies in the prepared approaches to a mathematical model for predicting the enthalpy of foundry ladles. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
34. Are We Cobblers without Shoes? Making Computer Science Data FAIR: In search of more efficient data sharing.
- Author
-
Noy, Natasha and Goble, Carole
- Subjects
COMPUTER science ,DATA ,INFORMATION sharing - Abstract
The article discusses the lack of efficiency in how research data within the Computer Science discipline are shared. The author uses the acronym FAIR -- findable, accessible, interoperable, and reusable -- to capture how data should be made available at conferences and in journals.
- Published
- 2023
- Full Text
- View/download PDF
35. A century of statistical Ecology.
- Author
-
Gilbert, Neil A., Amaral, Bruna R., Smith, Olivia M., Williams, Peter J., Ceyzyk, Sydney, Ayebare, Samuel, Davis, Kayla L., Leuenberger, Wendy, Doser, Jeffrey W., and Zipkin, Elise F.
- Subjects
ECOSYSTEMS ,STATISTICAL models ,ECOLOGISTS ,HISTORY of science ,BEST practices - Abstract
As data and computing power have surged in recent decades, statistical modeling has become an important tool for understanding ecological patterns and processes. Statistical modeling in ecology faces two major challenges. First, ecological data may not conform to traditional methods, and second, professional ecologists often do not receive extensive statistical training. In response to these challenges, the journal Ecology has published many innovative statistical ecology papers that introduced novel modeling methods and provided accessible guides to statistical best practices. In this paper, we reflect on Ecology's history and its role in the emergence of the subdiscipline of statistical ecology, which we define as the study of ecological systems using mathematical equations, probability, and empirical data. We showcase 36 influential statistical ecology papers that have been published in Ecology over the last century and, in so doing, comment on the evolution of the field. As data and computing power continue to increase, we anticipate continued growth in statistical ecology to tackle complex analyses and an expanding role for Ecology to publish innovative and influential papers, advancing the discipline and guiding practicing ecologists. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Sport, surveillance and the data economy: an expanding horizon for research and governance.
- Author
-
Bowles, Harry and McGee, Darragh
- Subjects
SERVER farms (Computer network management) ,PRACTICE (Sports) ,SPORTS ,GOVERNMENT policy ,DIGITAL technology ,CHILDREN'S rights - Abstract
Sport has undergone a data revolution. The unrelenting extraction of athlete data has become a topic of controversy and the focus of recent public campaigns and policy proposals. However, research and governance are lagging in addressing the full scope and complexity of sport's data ecosystem and the commercial assemblage involved in the generation and subsequent exploitation of athlete data. This paper examines the need to move beyond the current emphasis on the role and use of data as a product of situated surveillance practices in the sporting workplace to the capitalist orientation at the centre of a burgeoning data economy. In so doing, two interdependent theoretical concepts – surveillance culture and surveillance capitalism – are introduced as an analytical framework to shape future research, policy and debate aimed at understanding and protecting the rights of athletes in the light of their exposure to highly surveillant digital technologies used in the production of elite performance, and sport as a multi-mediated form of consumption. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. On informational injustice and epistemic exclusions.
- Author
-
Bagwala, Abbas
- Abstract
Information is a unique resource. Asymmetries that arise out of information access or processing capacities, therefore, enable a distinctive form of injustice. This paper builds a working conception of such injustice and explores it further. Let us call it informational injustice. Informational injustice is a consequence of informational asymmetries between at least two agents, which are deeply exacerbated due to modern information and communication technologies but do not necessarily originate with them. Informational injustice is the injustice of having information from an informational surplus being used to disadvantage the agent with less information. This paper argues that informational injustice exploits an agent as a knower, specifically exploiting the agent’s limitation in possessing or processing information—an agent is exploited because she is not informed or lacks in her ability to process information. In the case of lack of information, the agent simply does not know the information under consideration; a person is algorithmically manipulated or nudged to buy a product or vote for someone. In the case of a lack of capacity to process information, the agent simply cannot use the information, despite having access to it, to reach epistemically valuable states such as knowledge; a lawyer dupes you because he knows more about the inner workings of a courtroom and the law. Technically, you have access to the information your lawyer has, but you cannot make use of it due to constraints on time and cognitive effort. Informational injustice excludes the harmed agent from participating in knowledge practices. Thus, informational injustice is also a kind of epistemic exclusion. After fixing the concept of informational injustice, the paper distinguishes between two kinds of informational injustices: interactional informational injustice and structural informational injustice. The former concerns interactions between agents, while the latter concerns social structures that emerge out of interactions between agents. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Harmonizing data on correlates of sleep in children within and across neurodevelopmental disorders: lessons learned from an Ontario Brain Institute cross-program collaboration.
- Author
-
McPhee, Patrick G., Vaccarino, Anthony L., Naska, Sibel, Nylen, Kirk, Santisteban, Jose Arturo, Chepesiuk, Rachel, Andrade, Andrea, Georgiades, Stelios, Behan, Brendan, Iaboni, Alana, Wan, Flora, Aimola, Sabrina, Cheema, Heena, and Gorter, Jan Willem
- Subjects
SLEEP interruptions ,QUALITY of life ,DATA harmonization ,NEURAL development ,INTERNALIZING behavior ,DATA management - Abstract
There is an increasing desire to study neurodevelopmental disorders (NDDs) together to understand commonalities to develop generic health promotion strategies and improve clinical treatment. Common data elements (CDEs) collected across studies involving children with NDDs afford an opportunity to answer clinically meaningful questions. We undertook a retrospective, secondary analysis of data pertaining to sleep in children with different NDDs collected through various research studies. The objective of this paper is to share lessons learned for data management, collation, and harmonization from a sleep study in children within and across NDDs from large, collaborative research networks in the Ontario Brain Institute (OBI). Three collaborative research networks contributed demographic data and data pertaining to sleep, internalizing symptoms, health-related quality of life, and severity of disorder for children with six different NDDs: autism spectrum disorder; attention deficit/hyperactivity disorder; obsessive compulsive disorder; intellectual disability; cerebral palsy; and epilepsy. Procedures for data harmonization, derivations, and merging were shared and examples pertaining to severity of disorder and sleep disturbances were described in detail. Important lessons emerged from data harmonizing procedures: prioritizing the collection of CDEs to ensure data completeness; ensuring unprocessed data are uploaded for harmonization in order to facilitate timely analytic procedures; the value of maintaining variable naming that is consistent with data dictionaries at time of project validation; and the value of regular meetings with the research networks to discuss and overcome challenges with data harmonization. Buy-in from all research networks involved at study inception and oversight from a centralized infrastructure (OBI) identified the importance of collaboration to collect CDEs and facilitate data harmonization to improve outcomes for children with NDDs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Why and how did LifeWatch emerge?
- Author
-
Los, Wouter
- Subjects
DIGITAL technology ,BIODIVERSITY ,GRID computing ,ELECTRONIC data processing ,RESEARCH - Abstract
The original vision of what later became LifeWatch ERIC started about a quarter of a century ago, in 1996. In those days, the promise of digital technologies entered biodiversity and ecosystem research, not only by digitising relevant information but also through applications to process such data. While several (inter)national initiatives embarked on specific topics, there was also an idea that the emerging view of grid computing provided attractive solutions for federated data sources, together with strong computing capacity. This paper presents the history from conception to early actions, up to the actual preparations towards a research infrastructure on the European scale. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. Best practices for Core Argo floats - part 1: getting started and data considerations.
- Author
-
Morris, Tamaryn, Scanderbeg, Megan, West-Mack, Deborah, Gourcuff, Claire, Poffa, Noé, Bhaskar, T. V. S. Udaya, Hanstein, Craig, Diggs, Steve, Talley, Lynne, Turpin, Victor, Zenghong Liu, and Owens, Breck
- Subjects
TELECOMMUNICATION satellites ,BEST practices ,METADATA ,DATA management ,QUALITY control ,RESEARCH teams - Abstract
Argo floats have been deployed in the global ocean for over 20 years. The Core mission of the Argo program (Core Argo) has contributed well over 2 million profiles of salinity and temperature of the upper 2000 m of the water column for a variety of operational and scientific applications. Core Argo floats have evolved such that the program currently consists of more than eight types of Core Argo float, some of which belong to second- or third-generation developments, three unique satellite communication systems (Argos, Iridium and Beidou) and two types of Conductivity, Temperature and Depth (CTD) sensor systems (Seabird and RBR). This, together with a well-established data management system, delayed-mode data quality control, and FAIR and open data access, makes the program a very successful ocean observing network. Here we present Part 1 of the Best Practices for Core Argo floats in terms of how users can get started in the program, recommended metadata parameters and the data management system. The objective is to encourage new and developing scientists, research teams and institutions to contribute to the OneArgo Program, specifically to the Core Argo mission. Only by leveraging sustained contributions from current Core Argo float groups with new and emerging Argo teams and users who are eager to get involved and are actively encouraged to do so, can the OneArgo initiative be realized. This paper presents a list of best practices for getting started in the program, setting up the recommended metadata, and implementing the data management system, with the aim of encouraging new scientists, countries and research teams to contribute to the OneArgo Program. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. 'I know you like the back of my hand': biometric practices of humanitarian organisations in international aid.
- Author
-
Açιkyιldιz, Çağlar
- Subjects
BIOMETRIC identification ,BIOMETRY ,DATA protection ,SEMI-structured interviews ,HUMAN fingerprints ,HUMANITARIAN assistance - Abstract
Copyright of Disasters is the property of Wiley-Blackwell and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
42. PLATFORMS AND THE CRITIQUE OF POLITICAL ECONOMY.
- Author
-
BILIĆ, Paško
- Subjects
POLITICAL economic analysis ,CAPITALISM ,MEDIA studies ,IMPERIALISM ,SOCIAL sciences ,LABOR theory of value - Abstract
Copyright of Etkileşim: Academic Journal of Uskudar University Faculty of Communication is the property of Etkilesim and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
43. Chest Wall Motion Model of Cardiac Activity for Radar-Based Vital-Sign-Detection System.
- Author
-
Fan, Shaocan and Deng, Zhenmiao
- Subjects
MOTION capture (Human mechanics) ,MOTION capture (Cinematography) ,DEEP learning ,HEART beat ,MATHEMATICAL functions ,VITAL signs - Abstract
An increasing number of studies on non-contact vital sign detection using radar are now beginning to turn to data-driven neural network approaches rather than traditional signal-processing methods. However, there are few radar datasets available for deep learning due to the difficulty of acquiring and labeling the data, which require specialized equipment and physician collaboration. This paper presents a new model of heartbeat-induced chest wall motion (CWM) with the goal of generating a large amount of simulation data to support deep learning methods. An in-depth analysis of published CWM data collected by the VICON Infrared (IR) motion capture system and continuous wave (CW) radar system during respiratory hold was used to summarize the motion characteristics of each stage within a cardiac cycle. In combination with the physiological properties of the heartbeat, appropriate mathematical functions were selected to describe these movement properties. The model produced simulation data that closely matched the measured data as evaluated by dynamic time warping (DTW) and the root-mean-squared error (RMSE). By adjusting the model parameters, the heartbeat signals of different individuals were simulated. This will accelerate the application of data-driven deep learning methods in radar-based non-contact vital sign detection research and further advance the field. [ABSTRACT FROM AUTHOR]
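The abstract evaluates simulated chest-wall traces against measured ones with dynamic time warping (DTW) and root-mean-squared error (RMSE). A minimal sketch of both metrics for 1-D sequences, using the textbook dynamic-programming formulation of DTW rather than the authors' implementation (the example sequences are illustrative):

```python
import math

def dtw_distance(a: list[float], b: list[float]) -> float:
    # Classic O(len(a) * len(b)) dynamic-programming DTW with |x - y| cost.
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of: match, insertion, deletion.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def rmse(a: list[float], b: list[float]) -> float:
    # Root-mean-squared error for equal-length sequences.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

sim = [0.0, 1.0, 2.0, 1.0, 0.0]
meas = [0.0, 1.0, 1.0, 2.0, 1.0, 0.0]  # same shape, one sample repeated
print(dtw_distance(sim, meas))  # 0.0: warping absorbs the time stretch
print(rmse(sim, sim))           # 0.0 for identical traces
```

DTW complements RMSE here because heartbeat-induced motion varies in timing between beats: RMSE penalizes any temporal misalignment, while DTW measures shape similarity after optimally aligning the two time axes.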
- Published
- 2024
- Full Text
- View/download PDF
44. Data use and data needs in critical infrastructure risk analysis.
- Author
-
Larsson, Aron and Große, Christine
- Subjects
INFRASTRUCTURE (Economics) ,RISK assessment ,BASIC needs ,GEOSPATIAL data ,SCIENTIFIC literature - Abstract
This paper provides an overview and mapping of the needs and use of data related to formal risk analysis within the context of critical infrastructures, including risk assessment and risk modelling activities as part of preventive work against major accidents and crises. The aim is to contribute to a greater understanding of the type of data that is actually used in published sources where different risk assessment or risk analysis methods are applied for critical infrastructure protection. The study focuses specifically on the presentation of applications of quantitative or semi-quantitative risk analysis in the scientific literature within the domain of societally important services and critical infrastructures. The survey was delimited to peer-reviewed research papers between the years 2010 and 2020 and resulted in a total of 183 papers subject to evaluation. The results provide insights into the types of data that are used, missing, or difficult to obtain when applying the identified methods. To obtain a comprehensive critical infrastructure risk analysis, data needs relate to three different data dimensions: geospatial topology data, socio-economic data, and infrastructure data. However, no databases are currently available with the explicit purpose of supporting critical infrastructure risk analysis. Even though this is not viewed as a problem in the examined papers, collecting such data is resource intensive, which is a barrier to a more systematic use of formal risk analysis methods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
45. The Editor and the Algorithm: Recommendation Technology in Online News.
- Author
-
Peukert, Christian, Sen, Ananya, and Claussen, Jörg
- Subjects
TREATMENT effect heterogeneity ,INNOVATION adoption ,PERSONALLY identifiable information ,INFORMATION storage & retrieval systems ,COST estimates - Abstract
We run a field experiment to study the relative performance of human curation and automated personalized recommendation technology in the context of online news. We build a simple theoretical model that captures the relative efficacy of personalized algorithmic recommendations and curation based on human expertise. We highlight a critical tension between detailed, yet potentially narrow, information available to the algorithm versus broad (often private), but not scalable, information available to the human editor. Empirically, we show that, on average, algorithmic recommendations can outperform human curation with respect to clicks, but there is significant heterogeneity in this treatment effect. The human editor performs relatively better in the absence of sufficient personal data and when there is greater variation in preferences. These results suggest that reverting to human curation can mitigate the drawbacks of personalized algorithmic recommendations. Our computations show that the optimal combination of human curation and automated recommendation technology can lead to an increase of up to 13% in clicks. In absolute terms, we provide thresholds for when the estimated gains are larger than our estimate of implementation costs. This paper was accepted by Chris Forman, information systems. Funding: C. Peukert acknowledges funding from the Swiss National Science Foundation [Grant No. 100013_197807]. Supplemental Material: The e-companion and data files are available at https://doi.org/10.1287/mnsc.2023.4954. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Emerging issues in fisheries science by fisheries scientists.
- Author
-
Murray, David S., Campón‐Linares, Victoria, O'Brien, Carl M., Thorpe, Robert B., Vieira, Rui P., and Gilmour, Fiona
- Subjects
SCIENTIFIC knowledge ,GLOBAL environmental change ,MARINE resource management ,SUSTAINABLE fisheries ,FISHERY sciences - Abstract
The current epoch in fisheries science has been driven by continual advances in laboratory techniques and increasingly sophisticated approaches to analysing datasets. We now have the scientific knowledge and tools to proactively identify obstacles to the sustainable management of marine resources. However, in addition to technological advances, there are predicted global environmental changes, each with inherent implications for fisheries. The 2023 symposium of the Fisheries Society of the British Isles called for "open and constructive knowledge exchange between scientists, stakeholders, managers and policymakers" (https://fsbi.org.uk/symposium-2023/), a nexus of collaborative groups best placed to identify issues and solutions. Arguably, the Centre of Environment, Aquaculture and Fisheries Science (Cefas) and their Scientific Advice for Fisheries Management (SAFM) Team sit at the centre of such a network. SAFM regularly engages with managers and stakeholders, undertakes scientific research, provides fisheries advice to the UK government, and are leading experts within the International Council for the Exploration of the Sea (ICES). As such, this paper is an opinion piece, linked to individual authors specialisms, that aims to highlight emerging issues affecting fisheries and suggest where research efforts could be focused that contribute to sustainable fisheries. [ABSTRACT FROM AUTHOR]
- Published
- 2024
47. Drawing Data Together: Inscriptions, Asylum, and Scripts of Security.
- Author
-
Perret, Sarah and Aradau, Claudia
- Subjects
POLITICAL refugees, SCHOLARLY method, BORDER security, INSCRIPTIONS, SCRIPTS - Abstract
Data have become a vital device of border governance and security. Recent scholarship on the datafication of borders and migration at the intersection of science and technology studies and critical security studies has privileged concepts attuned to messiness, contingency, and friction such as data assemblages and infrastructures. This paper proposes to revisit and expand the analytical vocabulary of script analysis to understand what comes to count as data, what forms of data come to matter and how "drawing data together" reconfigures power and agency at Europe's borders. Empirically, we analyze controversies about the practices of asylum decision-making and age assessment in Greece. We show that agency of "users" is unequally distributed through anticipations of subscription and dis-inscription, while asylum seekers are conscripted within security scripts that restrict their agency. Moreover, as a multiplicity of inscriptions are produced, migrants' claims can be disqualified through circumscriptions of data and ascriptions of expertise. [ABSTRACT FROM AUTHOR]
- Published
- 2024
48. Closing the information gaps: a systematic review of research on delay and disruption claims.
- Author
-
Ali, Babar, Aibinu, Ajibade A., and Paton-Cole, Vidal
- Abstract
Purpose: Delay and disruption claims involve a complex process that often results in disputes, unnecessary expenses and time loss on construction projects. This study aims to review and synthesize the contributions of previous research undertaken in this area and propose future directions for improving the process of delay and disruption claims. Design/methodology/approach: This study adopted a holistic systematic review of the literature following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. A total of 230 articles related to delay and disruption claims in construction were shortlisted using the Scopus and Web of Science databases. Findings: Six research themes were identified and critically reviewed, including delay analysis, disruption analysis, claim management, contract administration, dispute resolution, and delay and disruption information and records. The systematic review showed that there is a dearth of research on managing the wide-ranging information required for delay and disruption claims, ensuring transparency and uniformity in delay and disruption claims' information, and adopting an end-user-centred research approach for resolving problems in the process of delay and disruption claims. Practical implications: Complexities in delay and disruption claims are real-world problems faced by industry practitioners. The findings will help the research community and industry practitioners to prioritize their energies toward information management of delay and disruption claims. Originality/value: This study contributes to the body of knowledge in delay and disruption claims by identifying the need for more research on their information requirements and management.
Subsequently, it provides insight into the use of modern technologies such as drones, building information modeling, radio frequency identifiers, blockchain, big data and machine learning as tools for more structured and efficient attainment of the required information in a transparent and consistent manner. It also recommends greater use of the design science research approach for delay and disruption claims. This will help to make the delay and disruption claims process less complex and less dispute-prone. [ABSTRACT FROM AUTHOR]
- Published
- 2024
49. Frontiers: Digital Hermits.
- Author
-
Miklós-Thal, Jeanine, Goldfarb, Avi, Haviv, Avery, and Tucker, Catherine
- Subjects
INFORMATION sharing - Abstract
As firms accumulate more data, users' data-sharing decisions may polarize. Some users may share all data, whereas others may share no data, becoming "digital hermits." When users share multidimensional data about themselves with a firm, the firm learns about the correlations between different dimensions of user data. We incorporate this type of learning into a model of a data market in which a firm acquires data from users with privacy concerns. Each user can share no data, only nonsensitive data, or their full data with the firm. As the firm collects more data and becomes better at drawing inferences about a user's privacy-sensitive data from their nonsensitive data, the share of new users who share no data ("digital hermits") grows. This growth of digital hermits occurs even though the firm offers higher compensation for a user's nonsensitive data and a user's full data as its ability to draw inferences improves. At the same time, the share of new users who share their full data also grows. The model thus predicts a polarization of users' data-sharing choices away from nonsensitive data sharing to no sharing and full sharing. Our model suggests that recent privacy policies, which are focused on control of data rather than inferences, may be misplaced. History: Anthony Dukes served as the senior editor. This paper was accepted through the Marketing Science: Frontiers review process. Funding: Partial financial support was received from the Social Sciences and Humanities Research Council of Canada [Grant 435-2023-0492]. [ABSTRACT FROM AUTHOR]
- Published
- 2024
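The polarization mechanism described above (better firm inference makes the middle "nonsensitive only" option leak sensitive information, pushing users toward no sharing or full sharing) can be sketched as a toy discrete-choice model. The utility form, payments, and sensitivity values below are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical toy model of data-sharing polarization: each user picks the
# option with the highest utility = compensation - privacy cost. As the
# firm's inference ability rho grows, sharing nonsensitive data effectively
# reveals a fraction rho of the sensitive data, so the middle option fades.

def best_choice(sensitivity: float, rho: float,
                pay_partial: float = 1.0, pay_full: float = 2.0) -> str:
    """Return 'none', 'partial', or 'full' for a user with the given
    privacy sensitivity, at inference ability rho in [0, 1]."""
    utils = {
        "none": 0.0,
        "partial": pay_partial - sensitivity * rho,  # inferred leakage
        "full": pay_full - sensitivity,              # direct disclosure
    }
    return max(utils, key=utils.get)

def shares(rho, sensitivities):
    """Fraction of users choosing each option at inference ability rho."""
    counts = {"none": 0, "partial": 0, "full": 0}
    for s in sensitivities:
        counts[best_choice(s, rho)] += 1
    return {k: v / len(sensitivities) for k, v in counts.items()}

population = [0.5, 1.5, 2.5, 3.5, 4.5]  # heterogeneous privacy sensitivity
low = shares(0.1, population)   # weak inference: middle option is popular
high = shares(0.9, population)  # strong inference: choices polarize
```

Under these invented parameters, moving rho from 0.1 to 0.9 empties the "partial" option while both the "digital hermit" share and the full-sharing share grow, mirroring the paper's polarization prediction.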
50. Regulating AI-Based Medical Devices in Saudi Arabia: New Legal Paradigms in an Evolving Global Legal Order.
- Author
-
Solaiman, Barry
- Subjects
MEDICAL equipment, ARTIFICIAL intelligence, MACHINE learning, MEDICAL technology, BEST practices - Abstract
This paper examines the Saudi Food and Drug Authority's (SFDA) Guidance on Artificial Intelligence (AI) and Machine Learning (ML) technologies based Medical Devices (the MDS-G010). The SFDA has pioneered binding requirements designed for manufacturers to obtain Medical Device Marketing Authorization. The regulation of AI in health is at an early stage worldwide. Therefore, it is critical to examine the scope and nature of the MDS-G010, its influences, and its future directions. It is argued that the guidance is a patchwork of existing international best practices concerning AI regulation, incorporates adapted forms of non-AI-based guidelines, and builds on existing legal requirements in the SFDA's existing regulatory architecture. There is particular congruence with the approaches of the US Food and Drug Administration (FDA) and the International Medical Device Regulators Forum (IMDRF), but the SFDA goes beyond those approaches to incorporate other best practices into its guidance. Additionally, the binding nature of the MDS-G010 is complex. There are binding 'components' within the guidance, but the incorporation of non-binding international best practices which are subordinate to national law results in a lack of clarity about how penalties for non-compliance will operate. [ABSTRACT FROM AUTHOR]
- Published
- 2024