Search Results (142 results)
2. The research of human individual’s conformity behavior in emergency situations
- Author
- Chen, Min
- Published
- 2020
- Full Text
- View/download PDF
3. Towards the Augmentation of Digital Twin Performance.
- Author
- Charrier, Quentin, Hakam, Nisar, Benfriha, Khaled, Meyrueis, Vincent, Liotard, Cyril, Bouzid, Abdel-Hakim, and Aoussat, Améziane
- Subjects
DIGITAL twins ,CYBER physical systems ,MANUFACTURING processes ,KEY performance indicators (Management) ,DATA analysis - Abstract
Digital Twin (DT) aims to provide industrial companies with an interface to visualize, analyze, and simulate the production process, improving overall performance. This paper proposes to extend existing DT by adding a complementary methodology to make it suitable for process supervision. To implement our methodology, we introduce a novel framework that identifies, collects, and analyses data from the production system, enhancing DT functionalities. In our case study, we implemented Key Performance Indicators (KPIs) in the immersive environment to monitor physical processes through cyber representation. First, a review of the Digital Twin (DT) allows us to understand the status of the existing methodologies as well as the problem of data contextualization in recent years. Based on this review, performance data in Cyber–Physical Systems (CPS) are identified, localized, and processed to generate indicators for monitoring machine and production line performance through DT. Finally, a discussion reveals the difficulties of integration and the possibilities to respond to other major industrial challenges, like predictive maintenance. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
4. Upgraded Thoth: Software for Data Visualization and Statistics.
- Author
- Laher, Russ R., Masci, Frank J., Rebull, Luisa M., Schurr, Steven D., Burt, Wendy, Laity, Anastasia, Swain, Melanie, Shupe, David L., Groom, Steve, Rusholme, Benjamin, Kong, Mih-Seh, Good, John C., Gorjian, Varoujan, Akeson, Rachel, Fulton, Benjamin J., Ciardi, David R., and Carey, Sean
- Subjects
DATA visualization ,LAPTOP computers ,VISUALIZATION ,ASTRONOMY ,DATA analysis - Abstract
Thoth is a free desktop/laptop software application with a friendly graphical user interface that facilitates routine data-visualization and statistical-calculation tasks for astronomy and astrophysical research (and other fields where numbers are visualized). This software has been upgraded with many significant improvements and new capabilities. The major upgrades consist of: (1) six new graph types, including 3D stacked-bar charts and 3D surface plots, made by the Orson 3D Charts library; (2) new saving and loading of graph settings; (3) a new batch-mode or command-line operation; (4) new graph-data annotation functions; (5) new options for data-file importation; and (6) a new built-in FITS-image viewer. There is now the requirement that Thoth be run under Java 1.8 or higher. Many other miscellaneous minor upgrades and bug fixes have also been made to Thoth. The newly implemented plotting options generally make possible graph construction and reuse with relative ease, without resorting to writing computer code. The illustrative astronomy case study of this paper demonstrates one of the many ways the software can be utilized. These new software features and refinements help make astronomers more efficient in their work of elucidating data. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
5. ANALYSIS OF THE QUALITY LEVEL OF TRANSPORT SERVICES USING SPECIALISED SOFTWARE.
- Author
- Jonasíková, Dominika
- Subjects
CUSTOMER satisfaction ,CHI-squared test ,DATA analysis - Abstract
Copyright of Young Science / Mladá Veda is the property of Vydavatelstvo Universum and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
6. Cash Flow Impact on Business Results.
- Author
- Pepur, Petar and Turić, Nikolina
- Subjects
CASH flow ,WELL-being ,BUSINESS success ,FINANCIAL statements ,DATA analysis - Abstract
A frequently asked question is which indicator matters more for a company's business: cash flow or the business result. It is difficult to say, because both are essential parameters of business, so both need to be tracked. Every company aims to generate positive business results and to operate with positive cash flow. Another common question is how the two are related. Understanding how cash flow relates to business results, and vice versa, is crucial for the success and prosperity of a business: a positive business result does not necessarily mean positive cash flow, and vice versa. The aim of this paper is to analyse the relationship between cash flow and business results. The research objective was to explore the correlation between cash flow and business results using net cash flow and business result data from financial statements. The research was conducted on 20 companies from the Croatian capital market using static panel data analysis for the period from 2014 to 2017. The results show that cash flow affects business results and that positive cash flow is associated with higher business results. The paper thus implies an important, though not statistically significant, role of cash flow in determining business results. [ABSTRACT FROM AUTHOR]
- Published
- 2020
7. MODELS FOR ANALYSIS OF THE IMPORT OF GOODS BASED ON ACCOUNTING INFORMATION.
- Author
- Petrova, Diana Dimitrova
- Subjects
INTERNATIONAL trade ,ACCOUNTING information storage & retrieval systems ,ORGANIZATION management ,INFORMATION retrieval ,DATA analysis - Abstract
Accounting information plays an extremely important role in the analysis of the import activities of enterprises in the contemporary global business environment. This analysis is a key prerequisite for effective management of import transactions and must be supplied with detailed information through a rationally organized system of analytical accounting. This paper examines approaches for the effective use of accounting information for the detailed analysis and overall management of the import activities of enterprises engaged in international trade. It explores the problems of providing accounting information for detailed analysis of import transactions. Specific models for improving the methodology and organization of the analysis of imports of goods based on accounting information are proposed. Special attention is paid to approaches for analyzing the final profit from transactions for the supply of goods from abroad and their sale on the domestic market. These approaches provide the opportunity for a precise assessment of the profit from each import transaction, taking into account the influence of changes in exchange rates during its implementation. The research is based on the systematic approach: the detailed analysis of import activities is considered a component of the overall analysis system of an enterprise engaged in foreign trade. The research methodology also involves the methods of comparison, analogy, induction and deduction, factor modelling, detailing, and others. Every enterprise engaged in import activities strives to minimize the costs of supplying goods from abroad. For this reason, one of the priority aspects of the analytical study of import activities is the analysis of import costs and the factors that influence them.
Forecast calculations of the profit from import transactions are performed before decisions on their realization are made. In the process of analysis, the predicted data are compared with the actual reported data for the import operations. The same approach can be applied in accounting-based analysis of import activities relating to two different reporting periods. The profit from the sale of imported goods on the Bulgarian market is a significant criterion for assessing import activities. Therefore, the detailed analysis of this indicator is particularly important for Bulgarian enterprises engaged in international trade. An essential prerequisite for effective analytical work is the proper determination of the direct factors affecting profit and the dependencies between them. When analyzing the final profit of transactions for the supply of goods from abroad and their sale on the domestic market, the differences due to exchange rate fluctuations during the import must be taken into account. Foreign exchange gains and losses have an enormous influence on the financial results of enterprises carrying out foreign trade operations, as well as on the value of a number of significant business analysis indicators. All this makes the problems related to accounting for the effects of exchange rate fluctuations during import transactions extremely important and topical. [ABSTRACT FROM AUTHOR]
- Published
- 2022
8. Final Consumption and Foreign Trade for Romania and European Union – A Granger causality-based analysis.
- Author
- Păunică, Mihai, Manole, Alexandru, Motofei, Cătălina, and Tănase, Gabriela-Lidia
- Subjects
GROSS domestic product ,DATA analysis ,STATISTICAL hypothesis testing - Abstract
In this paper, the authors seek to analyze if the relationship between final consumption and foreign trade indicators, at a macroeconomic level, manifests as a Granger causality. The graphical representation of the datasets reveals that the evolutions follow a similar pattern (with the exception of the net export). The indicators on both sides of the causality have been widely approached by researchers, as they contribute to the formation of the Gross Domestic Product. The research methodology follows the Toda-Yamamoto procedure for measurement of Granger causality, as the variables were expected to be (and were found to be in the initial step of the data analysis) non-stationary, and the results on the processing in levels can provide more accurate information. The research hypotheses were designed in order to detail the main topic of the paper on foreign trade indicators. None of the hypotheses has been validated, and the authors consider, in the future, the application of other methods to assess the quantitative side of the links between foreign trade and final consumption. The authors consider that a significant contribution brought by this study is the type of data analysis method applied and the approach towards the two macroeconomic components of the economy, for the cases of Romania and the entire European Union, of which Romania is a member. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
9. Comprehensive framework for the integration and analysis of geo-environmental data for urban geohazards.
- Author
- Zhang, Xinyu, Zhang, Junqiang, Liu, Gang, Tian, Yiping, Sun, Yongzheng, Xu, Lirui, and Wang, Song
- Subjects
GEOLOGICAL modeling ,DATA analysis ,SPATIAL systems ,DATA integration ,DECISION making ,INFORMATION storage & retrieval systems - Abstract
Geo-environmental information is an important basis for geohazard analysis and the integration of geo-environmental data is crucial in the construction of urban emergency management systems. In existing urban spatial information systems, the integration of geo-environmental data is neither intuitive nor efficient enough to support the analysis of geohazards well. On the basis of Web virtual globe, this paper proposes a comprehensive framework for the integration and analysis of geo-environmental data. This framework can effectively integrate geological data with a 3D geological model as a carrier, seamlessly connect geographic data, dynamically load real-time monitoring data, and build 3D visualisation and analysis scenes of urban full-space temporal information in the browser. The application example shows that the proposed framework can provide good geo-environmental data and practical data analysis functions for geohazard early warning and decision making, and improve the efficiency of government departments' response to geohazards. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
10. Applying Linear Regression to The World Happiness Report.
- Author
- Dabholkar, Ankur A.
- Subjects
REGRESSION analysis ,HAPPINESS ,DATA analysis ,SOCIAL sciences ,MATHEMATICAL variables - Abstract
In this paper, we use The World Happiness Report to illustrate the use of linear regression. We perform linear regression on the dataset values to assess the effectiveness of linear regression in data analysis. The results show that linear regression can be used to precisely characterise trends in the data between the output variable and the input variables. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
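The simple linear regression used in the abstract above can be sketched in a few lines. The data points below are hypothetical stand-ins, not values from The World Happiness Report:

```python
# Sketch of simple (one-variable) ordinary least squares regression.
# The data points are hypothetical, not World Happiness Report values.

def fit_simple_ols(xs, ys):
    """Return (slope, intercept) minimising the sum of squared errors."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical input factor (a GDP-like score) vs. a happiness-like score.
gdp = [0.5, 1.0, 1.5, 2.0, 2.5]
happiness = [4.1, 5.0, 5.9, 7.1, 8.0]

slope, intercept = fit_simple_ols(gdp, happiness)
print(f"happiness ~= {slope:.2f} * gdp + {intercept:.2f}")  # slope ~ 1.98
```

In practice a library routine (e.g. `statistics.linear_regression` in Python 3.10+) would replace the hand-rolled formula; the closed form is shown only to make the trend-fitting step concrete.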
11. Preventing and Protecting Against Internet Research Fraud in Anonymous Web-Based Research: Protocol for the Development and Implementation of an Anonymous Web-Based Data Integrity Plan.
- Author
- Hohn, Kris L., Braswell, April A., and DeVita, James M.
- Subjects
DATA integrity ,INTERNET research ,DATA security ,FRAUD in science ,DATA analysis - Abstract
Background: Data integrity is a priority in any internet research study; it should be maintained to protect the safety and privacy of human participants and to maintain the validity and reliability of research findings. However, one noteworthy risk of web-based research is fraudulent respondent activity. When investigators must utilize anonymous web-based recruitment techniques to reach hidden and expanded populations, steps should be taken to safeguard the integrity of data collected. Objective: The purpose of this paper is to present a novel protocol in the form of an anonymous web-based research data integrity plan (DIP) protocol that outlines steps for securing data integrity while conducting anonymous web-based data collection. Methods: In this paper, we discuss a protocol regarding the development and implementation of a specific DIP in response to fraudulent activity in an original large-scale mixed methods study launched in April 2021. Four primary steps, each with a set of affiliated procedures, are presented: (1) defining the risks, (2) planning research protocols, (3) securing data collection and recruitment, and (4) determining enrollment. Results: Following the relaunch of a large-scale original study and implementation of the DIP protocol, preliminary analyses demonstrated no fraudulent activity. A pre-post analysis is underway to evaluate the effectiveness of the DIP strategies from February 2022 through May 2023. Conclusions: Implementing the DIP protocol could save valuable research time, provides a process to examine data critically, and enables the contribution of rigorous findings to various health fields. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
12. Experimenting with data and analysis in researching the writing practices of student teachers.
- Author
- Harrison, Michaela J.
- Subjects
WRITING processes ,STUDENT teachers ,DATA analysis ,UNDERGRADUATES ,SELF-consciousness (Awareness) ,QUALITATIVE research methodology - Abstract
Primarily methodological in its orientation, this paper offers a presentation of 'research outcomes' in ways that challenge and disrupt commonplace notions of data and analysis. In an attempt to write against the grain of conventional qualitative research practice and to experiment with alternative encounters with data and analysis, I present 'data/analysis' in the form of a play (or imagined performance) that writes into being two Deleuzo-Guattarian principles – the critique of the self-conscious 'I' and desire. The play draws on a wider study that examined the potential of writing as a tool for learning for undergraduate student teachers in England. As such, the paper also contributes to debates on the practice and purpose of writing as a method of (professional) inquiry. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
13. CHANGING TRENDS OF PREFERENCES IN MODE OF TRANSACTIONS-A PREDICTION USING ROUGH SET THEORY.
- Author
- Preethi, M.
- Subjects
ROUGH sets ,DATA mining ,DATA analysis ,DESCRIPTIVE statistics ,SET theory - Abstract
Rough Set Theory is a technique that deals with the fuzziness and uncertainty involved in decision making. Data mining is a discipline that makes an important contribution to data analysis, the discovery of new significant knowledge, and independent decision making. Rough set theory offers a feasible approach for extracting decision rules from data. The introduction of demonetisation resulted in the elimination of high-valued currency notes; it aimed to achieve the goal of a 'less cash' society. Digital transactions bring better scalability and accountability. Recently, the RBI has also released its document "Payments and Settlement Systems in India: Vision 2018", boosting electronic payments and helping India move from a cash-based to a cashless society in the long run. Against this backdrop, this paper focuses on studying people's views on the evolution of a cashless economy and their comfort level with it. The study was conducted in Chennai; data was collected with the help of a structured questionnaire and analysed using rough set theory. [ABSTRACT FROM AUTHOR]
- Published
- 2018
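The core rough-set operation behind the decision-rule extraction mentioned in the abstract above is the pair of lower and upper approximations (certain vs. possible members of a target set). A minimal sketch, with hypothetical respondent records rather than the paper's Chennai survey data:

```python
# Sketch of rough-set lower/upper approximations, the basis for extracting
# certain vs. possible decision rules. Records and attributes are hypothetical.

def indiscernibility_classes(objects, attrs):
    """Group objects that are indistinguishable on the given attributes."""
    classes = {}
    for name, record in objects.items():
        key = tuple(record[a] for a in attrs)
        classes.setdefault(key, set()).add(name)
    return list(classes.values())

def approximations(objects, attrs, target):
    """Return (lower, upper) approximations of `target` w.r.t. `attrs`."""
    lower, upper = set(), set()
    for cls in indiscernibility_classes(objects, attrs):
        if cls <= target:   # class lies wholly inside target: certain members
            lower |= cls
        if cls & target:    # class overlaps target: possible members
            upper |= cls
    return lower, upper

# Hypothetical respondents and a hypothetical target set.
respondents = {
    "r1": {"age": "young", "smartphone": "yes"},
    "r2": {"age": "young", "smartphone": "yes"},
    "r3": {"age": "old",   "smartphone": "no"},
    "r4": {"age": "old",   "smartphone": "yes"},
}
prefers_cashless = {"r1", "r4"}

lower, upper = approximations(respondents, ["age", "smartphone"], prefers_cashless)
print(sorted(lower))  # certainly cashless-preferring, given the attributes
print(sorted(upper))  # possibly cashless-preferring
```

Here r1 and r2 share identical attribute values but differ on the target, so they fall in the boundary region (in `upper` but not `lower`); that boundary is exactly the "roughness" the theory quantifies.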
14. THE ROLE AND SPECIFICS OF ACCOUNTING SYSTEM IN SMALL FARMS IN LATVIA.
- Author
- Bratka, Valda and Prauliņš, Artūrs
- Subjects
SMALL farms ,ACCOUNTING ,DATA analysis ,DECISION making in business - Abstract
This paper explores the particularities of the accounting system that serves as the main source of data for analysing and managing small farm performance in Latvia. Although the topic of accounting in small companies has already been discussed in the literature, there is still a dearth of research on the challenges faced by small agricultural holdings. The paper attempts to fill this gap by providing insight into the highly unsophisticated systems for aggregating and processing data used by some small farms. These mechanisms occasionally fail to provide reasonably accurate and up-to-date information for costing, which is an essential prerequisite for successful control over the efficiency of agricultural production. The study also highlights the contribution of SUDAT (a component of the EU Farm Accountancy Data Network in Latvia) to improving the quality of decision-making by small farms. The paper adopts a mixed-method approach by integrating the results of quantitative and qualitative data analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
15. Nonlinearity detection using new signal analysis methods for global health monitoring.
- Author
- Nouri, Y., Shariatmadar, H., and Shahabian, F.
- Subjects
PUBLIC health ,DATA analysis ,HYDRAULICS ,VELOCITY ,GENETIC algorithms - Abstract
Statistical pattern recognition has emerged as a promising and practical technique for data-based health monitoring of civil structures. This paper intends to detect nonlinearity changes resulting from damage by some simple but effective signal analysis methods. The primary idea behind these methods is to use measured time-domain vibration signals based on exploratory data analysis without applying any feature extraction. First, statistical moments and central tendency measurements on the basis of the theory of exploratory data analysis are considered as damage indicators to monitor their changes and detect any substantial variations in measured vibration signals. Subsequently, cross correlation and convolution methods are proposed to measure the similarity and overlap between the measured signals of the undamaged and damaged conditions. The main innovation of this study is the capability of the proposed signal analysis methods for implementing nonlinear damage detection without any feature extraction. Numerical and experimental models of civil structures are employed to demonstrate the effectiveness and performance of the proposed methods. Results show that nonlinearity changes caused by damage lead to reductions in the values of cross correlation and convolution methods. Moreover, some statistical criteria are applicable tools for the global structural health monitoring. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
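The cross-correlation similarity idea in the abstract above can be sketched with synthetic signals. The amplitude clipping below stands in for a hypothetical nonlinearity; it is not one of the paper's numerical or experimental models:

```python
import math

# Sketch of zero-lag normalised cross-correlation as a similarity measure
# between baseline and damaged vibration signals. Signals are synthetic;
# the clipping is a stand-in for a hypothetical nonlinear damage mechanism.

def similarity(a, b):
    """Normalised cross-correlation at zero lag, in [-1, 1]."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))
    return num / den

n = 200
baseline = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]  # 5 cycles
damaged = [max(-0.6, min(0.6, x)) for x in baseline]              # clipped copy

print(similarity(baseline, baseline))  # ~1.0: identical signals
print(similarity(baseline, damaged))   # < 1.0: nonlinearity reduces similarity
```

This mirrors the reported result that damage-induced nonlinearity reduces the cross-correlation value between undamaged and damaged measurements.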
16. Mining for Social Media: Usage Patterns of Small Businesses.
- Author
- Balan, Shilpa and Rege, Janhavi
- Subjects
SMALL business ,SOCIAL media ,DATA mining ,DATA analysis - Abstract
Background: Information can now be rapidly exchanged due to social media. Due to its openness, Twitter has generated massive amounts of data. In this paper, we apply data mining and analytics to extract the usage patterns of social media by small businesses. Objectives: The aim of this paper is to describe with an example how data mining can be applied to social media. This paper further examines the impact of social media on small businesses. The Twitter posts related to small businesses are analyzed in detail. Methods/Approach: The patterns of social media usage by small businesses are observed using IBM Watson Analytics. In this paper, we particularly analyze tweets on Twitter for the hashtag #smallbusiness. Results: It is found that the number of females posting topics related to small business on Twitter is greater than the number of males. It is also found that the number of negative posts in Twitter is relatively low. Conclusions: Small firms are beginning to understand the importance of social media to realize their business goals. For future research, further analysis can be performed on the date and time the tweets were posted. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
17. RESEARCH METHODS IN LINGUISTICS: AN OVERVIEW.
- Author
- Obeyd, Snejana
- Subjects
LINGUISTICS ,DATA analysis ,MIXED methods research ,GROUNDED theory ,DISCOURSE analysis - Abstract
The paper offers a short theoretical overview of the main research paradigms, key issues, procedures, data collection and methods of analysis applied to the study of language. A general description of qualitative, quantitative and mixed-method research is followed by a more detailed account of the three approaches. Different criteria for evaluating research are enumerated and specified, and the most useful tools for obtaining quantitative data in linguistics are pointed out: questionnaire surveys and experimental studies. The reviewed qualitative procedures leading to the generation of a set of data are ethnography, one-to-one interviews, focus group interviews, introspection, case studies, diary studies and research journals. Qualitative content analysis is a deeper level of analysis and interpretation of the underlying meaning of the data. Grounded theory has grown into the "mainstream" and many of its theoretical aspects are considered. The most significant of the discourse-analytic approaches are highlighted: Conversation analysis, Discourse analysis, Critical discourse analysis and Feminist Post-Structuralist Discourse analysis. The application of the discussed research paradigms to various research contexts is exemplified by reference to authors whose work is regularly published in SILC. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
18. Deleuze-inspired action research in the university: mobilising Deleuzian concepts to rethink research on the reflective writing practices of student teachers.
- Author
- Harrison, Michaela J.
- Subjects
ACTION research in education ,TEACHER education ,STUDENT teachers ,JOURNAL writing ,DATA analysis - Abstract
This article offers an insight into the process and potential of Deleuze-inspired action research. It draws on a classroom action research (CAR) project that critically reconceptualises practices of reflective writing in teacher education, including the widespread use of the 'professional learning journal' as a resource to facilitate reflection on practice. Students following a teacher education programme in England took part in an innovative mode of engagement with texts, including their learning journals, drawing on the Deleuzo-Guattarian notion of the text as an agent that acts outside of itself. The process was called 'implicated reading'. An example of a teaching and learning intervention, in the form of a seminar transcript, is offered as an illustration of how Deleuzian theory and philosophy can inspire and shape innovations in practice. The transcript also serves as an opportunity to reimagine the ways in which data and data analysis are conceptualised and practiced in action research (AR) projects. Data is (re)conceptualised as agentic, rather than inert or indifferent. Synthesis is privileged over analysis so that the transcript acts as a provocation to rethink the relation between theory and data, asking what is made possible when these are 'plugged into' one another to raise questions that otherwise would have remained unthought. Ultimately, the article offers a worked example of what happens when action researchers take up the challenge of working and thinking within a Deleuzian ontology that seeks to maintain the plurality and potentialities of AR in practice. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
19. Hermeneutic Exploration, Analysis and Authority: Phenomenology of Researcher's Emotions and Organizational Trust.
- Author
- Cole, Caroline, Couch, Oliver, Chase, Steven, and Clark, Murray
- Subjects
DATA analysis ,HERMENEUTICS ,ACQUISITION of data ,EMOTIONS ,REFLEXIVITY - Abstract
In this paper we focus on the design, data analysis and practical application of hermeneutic-based organizational research. Much has been written about hermeneutics as a research approach, but little about the emotions of the researcher as they analyse data in that fashion and present their conclusions to, often, a non-academic audience grounded in a positivistic business environment. In this paper we highlight the importance of the emotions of the researcher within hermeneutic design, data collection and analysis. We start from the position that hermeneutic research is emotion- and value-laden and, as researchers are part of the research and not removed from it, they must acknowledge and be reflexive about their emotions. We discuss the philosophical and practical considerations that emerge from this and how these can be dealt with. Hermeneutic frameworks are gaining in popularity in organizational research; however, few papers consider the analytical processes in any detail. There are challenges for the hermeneutic researcher when seeking to provide insight and make a difference to organizational practice where there are expectations of measurable, reproducible results. We find that there are considerations around the trust, acceptance and authority of emergent insight arising from undertaking this type of research in typically positivist business environments. We take the position that this is, in part, because the approach taken to data analysis in qualitative research is often not visible, accessible or presented in any detail. A reader seeking to understand the approach taken can often be left to assume what a researcher has done. These assumptions can become a taken-for-granted acceptance of what has been done, or assumptions about what has not been done. In this paper we demonstrate the contribution hermeneutic studies can make to organizational practice.
We suggest that researchers need to shine a light on their approach. In particular, we highlight the importance of presenting, in practical terms, what has been done and why, to provide visibility of the approach taken. We show how doing this can lend greater authority to emerging insights and facilitate organisational trust in, and acceptance of, results. [ABSTRACT FROM AUTHOR]
- Published
- 2015
20. Comparison of Distributed and Parallel NGS Data Analysis Methods based on Cloud Computing.
- Author
- Hyungil Kang and Sangsoo Kim
- Subjects
CLOUD computing ,NUCLEOTIDE sequencing ,DATA analysis ,BIG data ,INFORMATION resources management - Abstract
With the rapid growth of genomic data, new requirements have emerged that are difficult to handle with existing big data storage and analysis techniques. Regardless of the size of an organization performing genomic data analysis, it is becoming increasingly difficult for an institution to build a computing environment for storing and analyzing genomic data. Recently, cloud computing has emerged as a computing environment that meets these new requirements. In this paper, we analyze and compare existing distributed and parallel NGS (Next Generation Sequencing) analysis methods based on the cloud computing environment, as a basis for future research. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
21. Event data based network analysis (EDNA).
- Author
- Widmer, Thomas and Troeger, Vera
- Subjects
POLITICAL science ,DATA analysis ,SOCIAL networks ,INTERNATIONAL relations ,ARTIFICIAL intelligence - Abstract
This paper investigates the independently developed contributions offered by two streams of research: event data analysis and social network analysis. The first approach, event data analysis, deals with data-collecting enterprises in the field of international relations used to circumscribe and explain interstate interactions. This approach was first introduced in the seventies and has seen a second wave of strong advances since the mid-1990s, especially based on new artificial-intelligence-supported data-collecting procedures. The second approach, embedded in the theoretical developments in the field of domestic and comparative politics, is the so-called social network analysis approach. The last two decades have witnessed various innovations in the methods used to analyze data gathered through standardized questionnaires or based on written evidence in a comparative setting. In this context, one of the most recent innovations is the analysis of dynamic networks. This contribution demonstrates that the combination of these two approaches within the event data based network analysis (EDNA) framework has three central advantages. On a theoretical level, our approach to EDNA allows a more stringent, reasonable application of the theoretical hypothesis at hand in various fields of political science. From a methodological perspective, we show that EDNA has notable advantages, especially in its explanatory power, compared to the approaches used so far in both schools. On a practical level, we show how EDNA can be used to support a sound early warning system that allows for efficient analysis better suited to understanding the complexity of the world system. [ABSTRACT FROM AUTHOR]
- Published
- 2004
22. Okul Yöneticiliği Alanında Yapılan Bilimsel Çalışmaların Kavramsal ve Yöntemsel Analizi [Conceptual and Methodological Analysis of Scientific Studies in the Field of School Administration].
- Author
- BOZAN, Serdar and ÖZTÜRK, Sevim
- Subjects
SCHOOL administrators ,TEACHERS ,JUDGMENT sampling ,QUALITATIVE research ,DATA analysis - Abstract
Copyright of Inonu University Journal of the Faculty of Education (INUJFE) is the property of Inonu University Journal of the Faculty of Education and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission.
- Published
- 2023
- Full Text
- View/download PDF
23. Using the Nominal Group Technique: how to analyse across multiple groups.
- Author
-
McMillan, Sara, Kelly, Fiona, Sav, Adem, Kendall, Elizabeth, King, Michelle, Whitty, Jennifer, and Wheeler, Amanda
- Subjects
BRAINSTORMING ,CONSUMER attitudes ,HOSPITAL pharmacies ,HEALTH outcome assessment ,RESEARCH funding ,STATISTICS ,VOTING ,PILOT projects ,DATA analysis ,THEMATIC analysis ,DATA analysis software ,MEDICAL coding ,DESCRIPTIVE statistics ,METHODOLOGY - Abstract
The nominal group technique (NGT) is a method to elicit healthcare priorities. Yet, there is variability on how to conduct the NGT, and limited guidance on how to analyse a diverse sample of multiple groups. This paper addresses some of this ambiguity, and explores whether different approaches to analysis provide the same outcome/s. Conceptual papers and empirical studies were identified via PubMed and informed an adapted version of the NGT. Twenty-six nominal groups were conducted, which provided in-depth knowledge on how to best conduct this method. Pilot group data were used to compare different analysis methods and to explore how this impacted on reported outcomes. Data analyses for large data-sets are complex; thematic analysis is needed to be able to conduct across group comparisons of participant priorities. Consideration should be given not just to the strength, i.e. sum of votes, or relative importance of the priority, but to the voting frequency, i.e. the popularity of the idea amongst participants; our case study demonstrated that this can affect priority rankings for those ideas with the same score. As a case study, this paper provides practical information on analysis for complex data sets. Researchers need to consider more than one analysis process to ensure that the results truly reflect participant priorities. A priority that has a high score may not necessarily reflect its popularity within the group; the voting frequency may also need to be considered. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
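The ranking issue raised in the abstract above, a priority's strength (sum of votes) versus its voting frequency (popularity among participants), can be sketched in a few lines of Python; the ideas and vote counts below are hypothetical, not the study's data:

```python
# Three hypothetical NGT priorities, each with the votes it received.
# All three reach the same total score, but from different numbers of voters.
votes = {
    "longer consultations": [5, 4, 5],      # three participants voted
    "medication reviews":   [7, 7],         # two participants voted
    "after-hours access":   [3, 3, 4, 4],   # four participants voted
}

def rank_priorities(votes):
    """Rank ideas by total score, breaking ties by voting frequency."""
    rows = [(idea, sum(v), len(v)) for idea, v in votes.items()]
    # Sort by score first, then by frequency (popularity) for tied scores.
    rows.sort(key=lambda r: (r[1], r[2]), reverse=True)
    return rows

for idea, score, freq in rank_priorities(votes):
    print(f"{idea}: score={score}, frequency={freq}")
```

All three ideas score 14, so a score-only analysis cannot separate them; considering frequency, as the paper recommends, ranks "after-hours access" first.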
24. Programmed method: developing a toolset for capturing and analyzing tweets.
- Author
-
Borra, Erik and Rieder, Bernhard
- Subjects
DISCOURSE analysis ,MICROBLOGS ,SOCIAL media research ,DATA analysis ,COMPUTER software - Abstract
Purpose – The purpose of this paper is to introduce the Digital Methods Initiative Twitter Capture and Analysis Toolset, a toolset for capturing and analyzing Twitter data. Instead of just presenting a technical paper detailing the system, however, the authors argue that the type of data used for, as well as the methods encoded in, computational systems have epistemological repercussions for research. The authors thus aim at situating the development of the toolset in relation to methodological debates in the social sciences and humanities.
Design/methodology/approach – The authors review the possibilities and limitations of existing approaches to capture and analyze Twitter data in order to address the various ways in which computational systems frame research. The authors then introduce the open-source toolset and put forward an approach that embraces methodological diversity and epistemological plurality.
Findings – The authors find that design decisions and more general methodological reasoning can and should go hand in hand when building tools for computational social science or digital humanities.
Practical implications – Besides methodological transparency, the software provides robust and reproducible data capture and analysis, and interlinks with existing analytical software. Epistemic plurality is emphasized by taking into account how Twitter structures information, by allowing for a number of different sampling techniques, by enabling a variety of analytical approaches or paradigms, and by facilitating work at the micro, meso, and macro levels.
Originality/value – The paper opens up critical debate by connecting tool design to fundamental interrogations of methodology and its repercussions for the production of knowledge. The design of the software is inspired by exchanges and debates with scholars from a variety of disciplines, and the attempt to propose a flexible and extensible tool that accommodates a wide array of methodological approaches is directly motivated by the desire to keep computational work open for various epistemic sensibilities. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
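As a minimal illustration of the kind of macro-level analysis such a capture toolset supports, the sketch below counts hashtags in already-captured tweet records; the record layout and the tweets are invented, and this is not the toolset's actual schema or API:

```python
# Hypothetical captured tweet records; a real capture would store many fields.
from collections import Counter
import re

tweets = [
    {"id": 1, "text": "Opening keynote now #dmi2014 #methods"},
    {"id": 2, "text": "Great panel on tool design #methods"},
    {"id": 3, "text": "Slides are up #dmi2014"},
]

def hashtag_counts(tweets):
    """Count hashtags across a captured tweet set (case-insensitive)."""
    counter = Counter()
    for t in tweets:
        counter.update(tag.lower() for tag in re.findall(r"#(\w+)", t["text"]))
    return counter

print(hashtag_counts(tweets).most_common())
```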
25. Improving the interpretation of 'less than' values in environmental monitoring.
- Author
-
Gardner, Mike
- Subjects
ENVIRONMENTAL monitoring ,MEASUREMENT ,GUIDELINES ,ENVIRONMENTAL impact analysis ,MAXIMUM likelihood statistics ,DATA analysis - Abstract
This paper reviews the approaches used to deal with the interpretation of measurements reported as 'less than' a stated reporting limit. The principal current methodologies are examined and their shortcomings discussed. Recent key papers on the subject are summarised. It is concluded that the lack of easy-to-use alternative methods has led to the continued use of substitution methods that are acknowledged to be biased. With the aim of promoting a more technically sound approach to dealing with 'less than' data, a supplementary spreadsheet tool is supplied to provide the reader with ready introductory access to a simple way to apply maximum likelihood methods. Recommendations and simple guidelines for better practice are provided. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
26. A Review of Application Challenges of Digital Forensics.
- Author
-
Okereafor, Kenneth and Djehaiche, Rania
- Subjects
COMPUTER engineering ,APPLICATION software ,DATA analysis ,ACQUISITION of data ,CYBERTERRORISM ,FORENSIC medicine ,FORENSIC sciences - Abstract
The growing preference for automation and digital transformation over semi-manual operations in the corporate world has led to an exponential rise in the use of computer technology, internet and web assets in everyday living, resulting in significant behavioural adjustments, particularly in how humans communicate with each other and interact with the environment. Unfortunately, digital growth has also given rise to different forms of cyber criminality in industry, government and academia. With many cyberattacks becoming more and more sophisticated, it is becoming increasingly difficult to trace cybersecurity breaches without first establishing the accurate mechanism for data collection and analysis offered by digital forensics. In the absence of reliable data analysis, the scope of digital forensic operations required to respond to modern cybersecurity breaches can become significantly challenging, costly and open-ended. This paper reviews the major challenges faced by organizations in performing effective digital forensic operations. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
27. Big Data Analytics for Improved Care Delivery in the Healthcare Industry.
- Author
-
Thangarasu, Gunasekar and Kayalvizhi Subramanian
- Subjects
BIG data ,RESEARCH & development ,MEDICAL care ,MEDICAL research ,DATA analysis - Abstract
Big data analytics plays a pivotal role in healthcare services and research, facilitating better service to patients. It provides tools to accumulate, manage, and analyze the structured and unstructured data produced by healthcare systems. Recently, the use of big data analytics in the healthcare industry has increased, assisting the processes of disease diagnosis and care delivery. However, the adoption and research development of big data analysis in the healthcare industry remain slow owing to fundamental problems inherent in the big data paradigm. This study addresses these problems, focusing on upcoming and promising areas of medical research, and proposes a novel big data analytics approach using Apache Spark. The proposed approach will improve care delivery in the healthcare industry. Big data analytics can continually evaluate clinical data in order to improve the effective practices of physicians and patient care. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
28. GIS SUPPORT FOR PUBLIC TRANSPORT DATA ANALYSIS.
- Author
-
Floková, Ludmila, Kodym, Oldřich, and Létavková, Dagmar
- Subjects
GEOGRAPHIC information systems ,TRANSPORTATION ,DATA analysis ,TARIFF - Abstract
The paper focuses on a methodology for preprocessing transactional data from the public transport system of a region, an administrative unit of the Czech Republic. The aim is to present synthetic information in a GIS. The necessary data sources are defined and evaluated. The target group of users are those who plan the development of the public transport system. To make proper decisions, they need information covering the state of each bus or other means of transport (number of passengers, categorized by tariff, between any stops of the bus line) and the activity at each bus stop (number of passengers who get off, get on, or transfer). Different kinds of data presentation are discussed and compared to achieve maximum efficiency for users. The geographic distribution of this kind of information can bring new quality to public transport system evaluation and development. Some ideas for extrapolation to individual transport are discussed as well. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
29. Analysis of Research Studies Published in Journal of Korean Critical Care Nursing : 2008-2013.
- Author
-
Youn-Jung, Son, Jiyeon, Kang, Hyo-Jeong, Song, Young-Rye, Park, Yun-Mi, Lee, Jin-Hee, Park, and Minju, Kim
- Subjects
CINAHL database ,EXPERIMENTAL design ,INTENSIVE care nursing ,INTENSIVE care units ,RESEARCH methodology ,NURSING research ,NURSING literature ,PARADIGMS (Social sciences) ,SERIAL publications ,STATISTICS ,QUALITATIVE research ,INSTITUTIONAL review boards ,DATA analysis ,LITERATURE reviews ,QUANTITATIVE research ,RETROSPECTIVE studies - Abstract
Purpose: The purpose of this study was to identify the trends in studies published in the Journal of Korean Critical Care Nursing (JKCCN) from 2008 to 2013. Methods: A total of 65 studies published between 2008 and 2013 were reviewed using criteria developed by the researchers. Results: Approximately 36% of the studies were conducted with patients. The intensive care unit (ICU) was the most popular study setting. Among 59 research papers, 42.4% were approved by an institutional review board (IRB). Quantitative studies accounted for 78.6%, while qualitative studies accounted for 4.6%. The research designs for the quantitative studies were survey (52.3%), quasi-experimental (16.9%), and so on. There was no randomized controlled trial. The most frequently used method of nursing intervention was education. In addition, "nurse" and "ICU" were the most commonly used keywords. Conclusion: Considering the low rate of IRB approval, more stringent application of research ethics is necessary to improve the quality of JKCCN. In addition, more randomized controlled trials should be encouraged to support evidence-based practice in critical care. [ABSTRACT FROM AUTHOR]
- Published
- 2013
30. A New Decision Tree Mechanism for Big Data Analytics Using C4.5.
- Author
-
Anusha, P. and Gopi, A. Peda
- Subjects
- *
BIG data , *DATA analysis , *MACHINE learning , *DESCRIPTIVE statistics , *MACHINE theory , *DATA mining - Abstract
Big data is one of the fastest-rising technology trends, with the ability to considerably change the way production organizations use user behaviour, analysing it and transforming it into valuable insights. Decision trees can be used efficiently for analysing such data. In this paper we apply the C4.5 algorithm, which uses information gain as its splitting criterion. It can work with unstructured data and can handle attributes with categorical or numerical values. To handle continuous values it generates a threshold and then divides the attribute into values above the threshold and values equal to or below the threshold. The C4.5 algorithm can also easily handle missing values. C4.5 is one of the most classic and effective classification algorithms, but when it is used in mass calculations its efficiency is very low. This paper also gives insights into the rate of accuracy it provides when an XsA dataset contains noisy data, missing data and a large amount of data. [ABSTRACT FROM AUTHOR]
- Published
- 2018
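The threshold mechanism for continuous attributes described in the abstract above can be sketched as follows (toy data; not the paper's implementation):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(values, labels, threshold):
    """Information gain of splitting at `threshold` into <= t and > t groups."""
    left = [l for v, l in zip(values, labels) if v <= threshold]
    right = [l for v, l in zip(values, labels) if v > threshold]
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

# Toy continuous attribute with class labels.
ages = [22, 25, 30, 35, 40, 50]
bought = ["no", "no", "no", "yes", "yes", "yes"]

# C4.5-style candidate thresholds: midpoints between consecutive values.
candidates = [(a + b) / 2 for a, b in zip(ages, ages[1:])]
best_t = max(candidates, key=lambda t: info_gain(ages, bought, t))
print("best threshold:", best_t, "gain:", info_gain(ages, bought, best_t))
```

Here the midpoint 32.5 separates the classes perfectly, so its gain equals the full entropy of the label set (1 bit).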
31. Study of Honeypot Technology for Virtual Space Monitoring - Parsing the conpot.log file.
- Author
-
Cangea, Otilia
- Subjects
HONEYPOTS (Network security) ,CYBERTERRORISM ,LABORATORIES ,DATA analysis ,BIG data - Abstract
Copyright of Petroleum - Gas University of Ploiesti Bulletin, Technical Series is the property of Petroleum - Gas University of Ploiesti.
- Published
- 2018
32. The National Health and Nutrition Examination Survey as a Tool to Teach Data Analysis to Public Health Students.
- Author
-
Briggs, Virginia G.
- Subjects
HEALTH & Nutrition Examination Survey ,PUBLIC health education ,DATA analysis ,PUBLIC health - Abstract
Graduates of Master of Public Health programs may lack appropriate skills in data analysis and would benefit from practice with research data. Datasets that contain health information relevant to student interests, and that are of an appropriate size for class use, can be difficult to locate. The National Health and Nutrition Examination Survey is a study that collects both survey and health examination information from a national sample every 2 years. I present a sample of this dataset, with examples of how to use it for human health related questions. Instructions for how to access and create additional, customized datasets are also provided. Instructors may consider investigating this rich data source and providing students with subsets of these data for class assignments, projects and master's theses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. APPLICATION OF INTELLIGENT ALGORITHMS AND BIG DATA ANALYSIS IN FILM AND TELEVISION CREATION.
- Author
-
WEIWEI WU
- Subjects
BIG data ,MOTION picture audiences ,TELEVISION broadcasting ,DATA analysis ,ARTIFICIAL intelligence ,ALGORITHMS ,SMART television devices ,MACHINE learning - Abstract
With the rapid development of social media, people can access a large amount of data in a short period of time, and big data technology has emerged. With the vigorous development of cloud computing and big data, the method of mining audience interests through large amounts of data to guide film and television creation has attracted increasing attention from experts and scholars. In order to understand the current situation of film and television drama creation in China and offer suggestions for the industry's shortcomings, this article analyzes the application of intelligent algorithms for traffic prediction models and big data analysis in film and television creation. Machine learning algorithms predict the potential audience of a movie or TV series from historical data, providing a decision-making basis for investors and producers. The system also utilizes artificial intelligence technology to select suitable actors for characters based on their degree of match with the characters and their past work performance, improving the scientific rigor and accuracy of casting. The article applies intelligent algorithms to big data processing to improve the accuracy of data processing. The application of these technologies helps to improve production efficiency and quality, reduce costs and risks, and inject new impetus into the sustainable development of the film and television industry. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. The absent, the hidden and the obscured: reflections on 'dark matter' in qualitative research.
- Author
-
Weiner-Levy, Naomi and Popper-Giveon, Ariela
- Subjects
QUALITATIVE research ,REFLEXIVITY ,METHODOLOGY ,WRITING ,DATA analysis ,REPORT writing - Abstract
Qualitative research literature generally ignores the voids that are created and the materials that are suppressed during data analysis and the writing phase. Qualitative studies are usually based on observations and interviews that hold an immense amount of data. These are transformed to a few condensed papers at the final stage. During this process, many of the findings and insights are omitted. This study focuses on the important but neglected topic of data omitted from final research reports by examining two specific aspects of research: (1) reflexivity, its pretensions notwithstanding, that may often suppress and conceal more than it presents and reveals and (2) relevant findings omitted from final reports despite their marked effect on research. We maintain that these suppressed and obscured materials, the 'dark matter' of qualitative research, have a marked effect on the research and significantly affect the findings and their structure even if they are not included in the final report. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
35. Multivariable data analysis of a cold rolling control system to minimise defects.
- Author
-
Takami, Kourosh, Mahmoudi, Jafar, Dahlquist, Erik, and Lindenmo, Magnus
- Subjects
DATA analysis ,AUTOMATION ,CONTROL theory (Engineering) ,PARAMETER estimation ,PRINCIPAL components analysis ,ROLLING-mills ,STATISTICAL correlation ,MULTIVARIATE analysis - Abstract
This paper focuses on the application of principal component analysis (PCA) to thoroughly analyse and interpret multidimensional data from a cold rolling process. The analysis includes the effects of variables on the final properties of strips in a cold rolling mill. Unscrambler software was used to analyse and identify hidden variables. Variable correlations were also used to derive correlations between the control parameters. The results of this research will be used to improve the selection of material in order to reduce the occurrence of defects in the cold rolling process and to improve the adjustment of the set points that are performed in every pass or section of the cold rolling process. The hot rolled strips that enter the cold rolling mill are made of different materials and are produced by different strip manufacturers. Some strips break during the thickness reduction process in the cold rolling mill. This paper focuses on two possible causes of breakage: non-uniform strip material properties and failures in the rolling mill process. Two types of rolled strips (those that break and those that do not break) were compared to identify causes of breakage. The results indicate that breakages are caused by material or process failures. PCA was applied to the dataset in order to identify and analyse the relationships between the variables in the process. This information was used to interpret and diagnose the process behaviour. Swarm analysis and relating observations to process behaviour were able to distinguish between different start-up conditions, and between desirable and undesirable process conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
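A generic sketch of the PCA step described in the abstract above (the study used the commercial Unscrambler package; the process data below are invented):

```python
import numpy as np

# Rows = strips, columns = invented process variables
# (e.g. thickness, tension, speed).
X = np.array([[2.0, 10.1, 1.2],
              [2.1, 10.3, 1.1],
              [1.9,  9.8, 1.3],
              [2.4, 11.0, 0.9],
              [2.3, 10.8, 1.0]])

Xc = X - X.mean(axis=0)                       # centre each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)               # variance share per component
scores = Xc @ Vt.T                            # strip coordinates in PC space

print("explained variance ratios:", np.round(explained, 3))
```

Because the three variables are strongly correlated, the first component captures most of the variance; inspecting the loadings in `Vt` is the step that reveals which process variables move together, the "hidden variables" the abstract refers to.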
36. Towards Realizable, Low-Cost Broadcast Systems for Dynamic Environments.
- Author
-
Liaskos, Christos K., Petridou, Sophia G., and Papadimitriou, Georgios I.
- Subjects
BROADCASTING industry ,WIRELESS communications ,CENTRAL processing units ,COMPUTER storage devices ,ALGORITHMS ,TELECOMMUNICATION systems - Abstract
A main design issue in a wireless data broadcasting system is to choose between push-based and pull-based logic: The former is used as a low-cost solution, while the latter is preferred when performance is of utmost importance. Therefore, the most significant advantage of a push system is the minimal cost. This fact implies that hardware limitations do exist in the case of push systems. As a consequence, every related proposed algorithm should primarily be cost-effective. This attribute, however, has been overlooked in related research. In this paper, popular broadcast scheduling approaches are tested from an implementation cost aspect, and the results render them only conditionally realizable. Moreover, a new, cost-effective, adaptivity oriented schedule constructor is proposed as a realistic, minimal-cost solution. [ABSTRACT FROM PUBLISHER]
- Published
- 2011
- Full Text
- View/download PDF
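For background, a classic low-cost push scheduler allots air time by the square-root rule: an item's broadcast frequency is proportional to the square root of its access probability. The sketch below illustrates that general idea only, not the adaptive schedule constructor the paper proposes:

```python
import math

# Invented access probabilities for three broadcast items.
access_prob = {"weather": 0.5, "news": 0.3, "stocks": 0.2}

def broadcast_shares(p):
    """Fraction of broadcast slots per item under the square-root rule."""
    roots = {k: math.sqrt(v) for k, v in p.items()}
    total = sum(roots.values())
    return {k: r / total for k, r in roots.items()}

shares = broadcast_shares(access_prob)
# Popular items are repeated more often, but less than proportionally.
print({k: round(v, 3) for k, v in shares.items()})
```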
37. Dynamics of astrocytes Ca2+ signaling: a low-cost fluorescence customized system for 2D cultures.
- Author
-
Musotto, Rosa, Wanderlingh, Ulderico, D'Ascola, Angela, Spatuzza, Michela, Catania, Maria Vincenza, De Pittà, Maurizio, and Pioggia, Giovanni
- Subjects
ASTROCYTES ,FLUORESCENCE ,FLUORESCENCE microscopy ,CUSTOMIZATION ,DATA analysis - Abstract
In an effort to help reduce the costs of fluorescence microscopy and expand the use of this valuable technique, we developed a low-cost platform capable of visualising and analysing the spatio-temporal dynamics of intracellular Ca2+ signalling in astrocytes. The created platform, consisting of a specially adapted fluorescence microscope and a data analysis procedure performed with ImageJ Fiji software and custom scripts, allowed us to detect relative changes of intracellular Ca2+ ions in astrocytes. To demonstrate the usefulness of the workflow, we applied the methodology to several in vitro astrocyte preparations, specifically immortalised human astrocyte cells and wild-type mouse cells. To demonstrate the reliability of the procedure, analyses were conducted by stimulating astrocyte activity with the agonist dihydroxyphenylglycine (DHPG), alone or in the presence of the antagonist 2-methyl-6-(phenylethynyl)pyridine (MPEP). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
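The relative-change measure such pipelines typically compute from a fluorescence trace, dF/F0 against a baseline window, can be sketched as follows; the trace values are invented, and the study's Fiji scripts are not reproduced:

```python
def delta_f_over_f0(trace, baseline_frames=5):
    """Relative fluorescence change, with F0 = mean of the first frames."""
    f0 = sum(trace[:baseline_frames]) / baseline_frames
    return [(f - f0) / f0 for f in trace]

# A flat baseline followed by a transient, e.g. after agonist stimulation.
trace = [100, 101, 99, 100, 100, 150, 180, 160, 120, 105]
dff = delta_f_over_f0(trace)
print([round(v, 2) for v in dff])
```

With F0 = 100, the peak frame at 180 reports a relative change of 0.8, i.e. an 80% rise over baseline.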
38. Electromobility in Poland Compared to the European Union - Results of Preliminary Analysis.
- Author
-
Bełch, Paulina, Szczygieł, Elżbieta, and Hajduk-Stelmachowicz, Marzena
- Subjects
ELECTRIC vehicles ,INFERENTIAL statistics ,STATISTICS ,DATA analysis - Abstract
Copyright of Research Papers of the Wroclaw University of Economics / Prace Naukowe Uniwersytetu Ekonomicznego we Wroclawiu is the property of Uniwersytet Ekonomiczny we Wroclawiu.
- Published
- 2024
- Full Text
- View/download PDF
39. Combining Channel Theory, HowNet and Extension Model to Analyze Big Data.
- Author
-
Li, Weihua and Yang, Chunyan
- Subjects
DATA analysis ,DECISION making ,SEMANTICS ,INFORMATION sharing ,ELECTRONIC data processing - Abstract
Because the diversity of unstructured data has brought new challenges to big data analysis, this paper proposes to combine Channel theory, HowNet and the extension model to improve big data analysis capability. The paper proposes a new method of processing big data, based on ideas from Channel theory and the structure of HowNet, in order to overcome the semantic conflicts of big data. In view of how difficult it is for people to analyze their big data profitably, the paper presents a case study to show the effectiveness of our method. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
40. Personality and Text: Quantitative Psycholinguistic Analysis of a Stylistically Differentiated Czech Text.
- Author
-
Kučera, Dalibor, Haviger, Jiří, and Havigerová, Jana M.
- Subjects
- *
QUANTITATIVE research , *EXAMINATIONS , *AGRAMMATISM , *PERSONALITY , *DATA analysis , *PSYCHOLINGUISTICS , *PSYCHODIAGNOSTICS - Abstract
The paper focuses on the possibilities of applying quantitative linguistic methods in the context of psychological disciplines, especially psychodiagnostics. It presents the theoretical basis for this approach, as well as documentation and selected results of the Quantitative Psycholinguistic Analysis of the Formal Parameters of the Text (QPA-FPT) research. The research sample consisted of 76 undergraduates who wrote two types of texts according to criteria given in advance: a stylistically formal text (apology letter) and a stylistically informal text (letter from holiday). In addition, results of psychodiagnostic tests (STAI-X, KṢAT, SSI and PSSI) were collected. During data analysis, all the texts were computationally processed and described by means of 48 linguistic characteristics (parameters of text). The outcomes of the analysis were then compared with the test results, and significant correlations were identified. The research results include a number of key findings, pointing out the influence of a given text type on its morphological structure, but also potential links among linguistic features (i.e. text parameters) and specific personality characteristics of the writer, e.g. between the quantity of verbs, emotional skills and the overall score in the SSI test, and between the incidence of punctuation and the distrustful-paranoid scale of the PSSI test. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
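A few formal text parameters of the kind QPA-FPT computes can be sketched without a tagger; the parameter selection below is illustrative, not the study's 48-parameter set:

```python
import re

def text_parameters(text):
    """Compute a handful of simple formal parameters of a text."""
    words = re.findall(r"\w+", text, flags=re.UNICODE)
    punctuation = re.findall(r"[.,;:!?]", text)
    return {
        "word_count": len(words),
        "type_token_ratio": len({w.lower() for w in words}) / len(words),
        "punctuation_per_word": len(punctuation) / len(words),
        "mean_word_length": sum(len(w) for w in words) / len(words),
    }

formal = "Dear Sir, I sincerely apologize for the delay; it will not happen again."
params = text_parameters(formal)
print(params)
```

Comparing such parameter vectors between the formal and informal text of the same writer is the kind of contrast the study then correlates with psychodiagnostic scores.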
41. Development of an information-analytical system for the analysis and monitoring of climatic and ecological changes in the environment: Part 1.
- Author
-
Duisebekova, K.S., Kozhamzharova, D.K., Rakhmetulayeva, S.B., Umarov, F.A., and Aitimov, M. Zh.
- Subjects
SYSTEM analysis ,CLIMATE change ,ENVIRONMENTAL monitoring ,AIR quality indexes ,SYSTEMS development - Abstract
The paper presents an information system designed to monitor the state of the atmosphere of a settlement by calculating indices of atmospheric pollution. In addition, this information-analytical system stores the recorded data in a database, which provides an overview of the complex index of air pollution for the last 6 and 12 months. By analyzing the data obtained, it is possible to make forecasts for the near future, as well as to make environmentally important decisions to improve the state of the environment. The results of this work have a wide range of applications; in particular, they may be used to support management decisions on environmental issues and the construction of wireless networks for the transmission and reception of data from remote sources. The results can also be applied to collect and analyze big data in the interests of ensuring environmentally safe living conditions for the population. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
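One common form of a complex air pollution index is a sum of concentration-to-standard ratios, each raised to a hazard-class exponent; the formula, constants and readings below are assumptions for illustration, not taken from the paper:

```python
# Assumed exponents by pollutant hazard class (1 = most hazardous).
HAZARD_EXPONENT = {1: 1.7, 2: 1.3, 3: 1.0, 4: 0.9}

def complex_pollution_index(readings):
    """readings: list of (mean_concentration, standard_MPC, hazard_class)."""
    return sum((c / mpc) ** HAZARD_EXPONENT[cls] for c, mpc, cls in readings)

# Hypothetical monthly means for a settlement (units must match the standard).
readings = [
    (0.08, 0.05, 3),   # NO2
    (0.60, 0.15, 3),   # particulate matter
    (0.04, 0.05, 3),   # SO2
]
print(round(complex_pollution_index(readings), 2))
```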
42. Single-case experimental designs: Reflections on conduct and analysis.
- Author
-
Manolov, Rumen, Gast, David L., Perdices, Michael, and Evans, Jonathan J.
- Subjects
SINGLE subject research ,EXPERIMENTAL design ,METHODOLOGY ,DATA analysis ,CLINICAL medicine - Abstract
In this editorial discussion we reflect on the issues addressed by, and arising from, the papers in this special issue on Single-Case Experimental Design (SCED) study methodology. We identify areas of consensus and disagreement regarding the conduct and analysis of SCED studies. Despite the long history of application of SCEDs in studies of interventions in clinical and educational settings, the field is still developing. There is an emerging consensus on methodological quality criteria for many aspects of SCEDs, but disagreement on what are the most appropriate methods of SCED data analysis. Our aim is to stimulate this ongoing debate and highlight issues requiring further attention from applied researchers and methodologists. In addition we offer tentative criteria to support decision-making in relation to the selection of analytical techniques in SCED studies. Finally, we stress that large-scale interdisciplinary collaborations, such as the current Special Issue, are necessary if SCEDs are going to play a significant role in the development of the evidence base for clinical practice. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
43. Design and data analysis case-controlled study in clinical research.
- Author
-
Thomas, Sanjeev V., Suresh, Karthik, and Suresh, Geetha
- Subjects
CLINICAL medicine research ,RESEARCH methodology ,DATA analysis ,PREDICTIVE tests ,CONTROL groups ,CASE-control method - Abstract
Clinicians during their training period and practice are often called upon to conduct studies to explore the association between certain exposures and disease states, or between interventions and outcomes. More often they need to interpret the results of research data published in the medical literature. Case-control studies are one of the most frequently used study designs for these purposes. This paper explains the basic features of case-control studies, the rationale behind applying a case-control design, with appropriate examples, and the limitations of this design. Analysis of sensitivity and specificity, along with a template to calculate various ratios, is explained with user-friendly tables and calculations in this article. The interpretation of some laboratory results requires sound knowledge of the various risk ratios and positive or negative predictive values for correct identification and unbiased analysis. A major advantage of case-control studies is that they are small and retrospective, and so they are more economical than cohort studies and randomized controlled trials. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
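The ratios the paper tabulates for a 2x2 diagnostic table can be sketched as follows (the counts are invented):

```python
#                 disease+   disease-
#   test+             a          b
#   test-             c          d
def two_by_two_stats(a, b, c, d):
    """Standard ratios computed from a 2x2 diagnostic table."""
    return {
        "sensitivity": a / (a + c),    # positives detected among the diseased
        "specificity": d / (b + d),    # negatives among the disease-free
        "ppv": a / (a + b),            # positive predictive value
        "npv": d / (c + d),            # negative predictive value
        "odds_ratio": (a * d) / (b * c),
    }

stats = two_by_two_stats(a=40, b=10, c=20, d=30)
print(stats)
```

With these counts the odds ratio is (40x30)/(10x20) = 6.0, the kind of association measure a case-control design can estimate even though it samples on outcome rather than exposure.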
44. UMA ANÁLISE DA APLICAÇÃO DO ESTUDO DE CASO EM PESQUISAS NO CAMPO DA ESTRATÉGIA [An Analysis of the Application of the Case Study in Research in the Field of Strategy].
- Author
-
Di Francesco Kich, Juliane Ines and Pereira, Maurício Fernandes
- Subjects
CASE studies ,DATA analysis ,CASE method (Teaching) ,FIELD research ,DESCRIPTIVE statistics ,LONGITUDINAL method - Abstract
Copyright of Revista Pretexto is the property of Revista Pretexto.
- Published
- 2013
45. Deriving Viewpoints for Enterprise Architecture Analysis.
- Author
-
Santana, Alixandre, Kreimeyer, Matthias, Simon, Daniel, Fischbach, Kai, and de Moura, Hermano
- Subjects
BUSINESS enterprises ,ARCHITECTURE ,NETWORK analysis (Communication) ,EMPIRICAL research ,DATA analysis - Abstract
The power of visual analysis of Enterprise Architecture (EA) models tends to diminish when we deal with large and complex viewpoints. However, we can still extract useful information from them. In this work, we apply ideas from Design Structure Matrix (DSM) theory to derive enterprise architecture viewpoints from primary models already available at organizations. In order to do so, we propose four derivation mechanisms. Subsequently, we apply network analysis metrics to those new viewpoints in three empirical datasets. With our suggested approach, it was possible to derive cross-layer viewpoints, amplifying the analysis possibilities for experts through the creation of "non-standard/implicit" visualizations of the enterprise architecture, which have proven useful to those experts. [ABSTRACT FROM AUTHOR]
- Published
- 2018
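The DSM idea behind deriving implicit, cross-layer viewpoints can be sketched with a boolean matrix product. This is illustrative only; the paper's four derivation mechanisms are not reproduced here, and the model data below are hypothetical.

```python
# Sketch of cross-layer viewpoint derivation: if A records which
# applications support which business processes, and B records which
# processes realize which goals, the boolean product A.B yields an
# implicit application-to-goal viewpoint not present in either model.

apps = ["CRM", "ERP"]
procs = ["Sales", "Billing"]
goals = ["Revenue"]

# A[i][j] = 1 if application i supports process j (hypothetical data).
A = [[1, 0],
     [1, 1]]
# B[j][k] = 1 if process j realizes goal k (hypothetical data).
B = [[1],
     [1]]

# Boolean matrix product: derived application-to-goal dependencies.
derived = [[int(any(A[i][j] and B[j][k] for j in range(len(procs))))
            for k in range(len(goals))]
           for i in range(len(apps))]

for i, app in enumerate(apps):
    for k, goal in enumerate(goals):
        if derived[i][k]:
            print(f"{app} -> {goal}")
```

Network metrics (degree, centrality) can then be computed on the derived matrix as on any other dependency structure.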
46. Combinatorial analysis on spatial information statistics for the karst water environment in Guiyang, China.
- Author
-
Wang, Zhongmei, Zhu, Lijun, Yang, Ruidong, Yang, Shengyuan, Ding, Jianping, and Yang, Genlan
- Subjects
COMBINATORICS ,ENVIRONMENTAL engineering ,GROUNDWATER ecology ,SPATIAL analysis (Statistics) ,DATA analysis ,WATER pollution - Abstract
The karst groundwater system is extremely vulnerable and easily contaminated by human activities. To understand the spatial distribution of contaminants in the groundwater of karst urban areas and the contributors to the contamination, this paper employs spatial information statistics theory and methods to analyze the karst groundwater environment in Guiyang City. Based on karst groundwater quality data collected at 61 detection points in the research area over the last three years, we produced Kriging-based isoline maps for several ions in the karst groundwater, such as SO₄²⁻, Fe, Mn, and F⁻, and analyzed and evaluated the spatial distribution, extension, and variation of these four types of ions on the basis of the isoline maps. The results of the analysis show that the anomaly areas of SO₄²⁻, Fe, Mn, F⁻, and other ions are mainly located in Baba'ao, Mawangmiao, and Sanqiao in northwestern Guiyang City, as well as in its downtown area, owing to original non-point source pollution and contamination caused by human activities (industrial and domestic pollution). [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
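The Kriging interpolation behind isoline maps like those described above can be sketched as follows. This is a minimal ordinary-kriging illustration with a simple linear variogram and hypothetical sample data; the paper's actual variogram model and measurements are not reproduced.

```python
# Minimal ordinary kriging: estimate a value at an unsampled location as
# a weighted average of nearby samples, with weights from the kriging
# system built on a variogram gamma(h). Linear variogram used for brevity.
import numpy as np

def ordinary_kriging(points, values, target, gamma=lambda h: h):
    """Estimate the value at `target` from sampled (point, value) data."""
    n = len(points)
    pts = np.asarray(points, dtype=float)
    # Pairwise distances between sample points.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    # Kriging system: [[Gamma, 1], [1^T, 0]] . [w, mu] = [gamma(d_i0), 1]
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = gamma(d)
    K[n, n] = 0.0
    rhs = np.ones(n + 1)
    rhs[:n] = gamma(np.linalg.norm(pts - np.asarray(target, float), axis=-1))
    w = np.linalg.solve(K, rhs)[:n]  # kriging weights (sum to 1)
    return float(w @ np.asarray(values, float))

# Hypothetical concentration samples at four detection points.
pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
vals = [10.0, 12.0, 11.0, 15.0]
print(ordinary_kriging(pts, vals, (0.5, 0.5)))  # symmetric case: the mean, 12.0
```

Evaluating the estimator on a grid and contouring the result gives the isoline map; kriging is an exact interpolator, so estimates at the detection points reproduce the measured values.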
47. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.
- Author
-
Magnusson, Bertil, Ossowicki, Haakan, Rienitz, Olaf, and Theodorsson, Elvar
- Subjects
PATHOLOGICAL laboratories ,QUALITY control ,DATA analysis ,PHYSICS laboratories ,UNCERTAINTY ,BIOLOGICAL variation - Abstract
Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. Both measurement and diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and the uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for creatinine concentrations below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. The diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for creatinine concentrations below 100 μmol/L and 14% for concentrations above 100 μmol/L. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
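The two-component combination described in the abstract above can be sketched as follows. The structure (within-laboratory reproducibility combined with a bias component, expanded with coverage factor k = 2) follows the Nordtest approach the abstract names; the input values below are hypothetical, not the paper's creatinine data.

```python
# Sketch: expanded measurement uncertainty from two main components,
#   u(bias) = sqrt(RMS_bias^2 + u(Cref)^2)
#   u_c     = sqrt(u_Rw^2 + u(bias)^2)
#   U       = k * u_c   (k = 2 for ~95% coverage)
import math

def expanded_uncertainty(u_rw, rms_bias, u_cref, k=2.0):
    """Combine within-lab reproducibility u_Rw with the bias component."""
    u_bias = math.sqrt(rms_bias**2 + u_cref**2)
    u_c = math.sqrt(u_rw**2 + u_bias**2)
    return k * u_c

# Hypothetical inputs: within-lab reproducibility 4 umol/L, RMS of bias
# from EQA rounds 3 umol/L, uncertainty of the reference value 2 umol/L.
U = expanded_uncertainty(u_rw=4.0, rms_bias=3.0, u_cref=2.0)
print(round(U, 2))  # 2*sqrt(16 + 9 + 4) = 2*sqrt(29) ~ 10.77
```

Diagnostic uncertainty for a difference between two patient samples would additionally fold in within-subject biological variation for each sample.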
48. A Comparison of Subgroup Analyses in Grant Applications and Publications.
- Author
-
Boonacker, Chantal W. B., Hoes, Arno W., van Liere-Visser, Karen, Schilder, Anne G. M., and Rovers, Maroeska M.
- Subjects
ANALYSIS of variance ,BIBLIOMETRICS ,CLASSIFICATION ,CLINICAL trials ,DIAGNOSIS ,ENDOWMENTS ,MEDICAL protocols ,PATIENTS ,PROGNOSIS ,GRANT writing ,REPORT writing ,STATISTICS ,DECISION making in clinical medicine ,DATA analysis ,RANDOMIZED controlled trials ,HUMAN research subjects ,STANDARDS - Abstract
In this paper, the authors compare subgroup analyses as outlined in grant applications with those in the related publications. Grants awarded by the Netherlands Organization for Health Research and Development (ZonMw) from 2001 onward that were finalized before March 1, 2010, were studied. Of the 79 grant proposals, 50 (63%) were intervention studies, 18 (23%) were diagnostic studies, and 6 (8%) were prognostic studies. Subgroups were mentioned in 49 (62%) grant applications and in 53 (67%) publications. In 20 of the 79 projects (25%), the publications were completely in agreement with the grant proposal; that is, subgroups that were prespecified in the grant proposal were reported, and no new subgroup analyses were introduced in the publications. Of the 149 prespecified subgroups, 46 (31%) were reported in the final report or scientific publications, and 143 of the 189 (76%) reported subgroups were based on post-hoc findings. For 77% of the subgroup analyses in the publications, there was no mention of whether they were prespecified or post hoc. Justification for subgroup analyses and methods for studying subgroups were rarely reported. The authors conclude that there is a large discrepancy between grant applications and final publications regarding subgroup analyses. Both the non-reporting of prespecified subgroup analyses and the reporting of post-hoc subgroup analyses are common. More guidance is clearly needed. [ABSTRACT FROM PUBLISHER]
- Published
- 2011
- Full Text
- View/download PDF
49. Reliability analysis of maintenance data for complex medical devices.
- Author
-
Taghipour, Sharareh, Banjevic, Dragan, and Jardine, Andrew K. S.
- Subjects
MEDICAL equipment ,RELIABILITY (Personality trait) ,MATHEMATICAL optimization ,DRUG infusion pumps ,DATA analysis - Abstract
This paper proposes a method for the statistical analysis of maintenance data for complex medical devices with censoring and missing information. It presents a classification of the different types of failures and establishes policies for analyzing data at the system and component levels, taking the failure types into account. The results of this analysis can be used as basic assumptions in the development of a maintenance/inspection optimization model. As a case study, we present the reliability analysis of a general infusion pump from a hospital. Copyright © 2010 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
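Handling censored records, a central concern in the abstract above, can be sketched with a Kaplan-Meier survival estimate. This is an illustrative sketch, not the paper's actual model; the data below are hypothetical, with censored entries representing devices retired or replaced before failing.

```python
# Kaplan-Meier estimate of device survival from maintenance records.
# Each record is (time, event): event=True for an observed failure,
# False for a censored observation (device left the risk set unfailed).

def kaplan_meier(records):
    """Return [(time, S(time))] evaluated at each observed failure time."""
    n_at_risk = len(records)
    survival, s = [], 1.0
    for t, event in sorted(records):
        if event:
            s *= 1.0 - 1.0 / n_at_risk  # step down only at failures
            survival.append((t, s))
        n_at_risk -= 1  # failures AND censorings both leave the risk set
    return survival

# Hypothetical infusion-pump records, times in months.
data = [(2, True), (3, False), (5, True), (7, True)]
print(kaplan_meier(data))  # [(2, 0.75), (5, 0.375), (7, 0.0)]
```

The key point is that the censored record at month 3 still contributes to the risk set up to its censoring time, rather than being discarded or treated as a failure.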
50. On the identification and analysis of Skype traffic.
- Author
-
Molnár, Sándor and Perényi, Marcell
- Subjects
COMPUTER software ,COMMUNICATIONS industries ,DATA encryption ,ALGORITHMS ,AUTOMATION ,COMPUTER networks ,DATA analysis ,PEER-to-peer architecture (Computer networks) - Abstract
Skype applies strong encryption to provide secure communication inside the whole Skype network. It also uses several techniques to conceal its traffic and protocol. As a consequence, traditional port-based or payload-based identification of Skype traffic cannot be applied. In this paper, after an overview of the Skype P2P system, its network entities, and its operation, we introduce novel algorithms to detect several types of communications (primarily voice calls) that the Skype client initiates toward dedicated servers of the Skype network and other peers. The common point of these algorithms is that all of them are based only on packet headers and the extracted flow-level information; we do not need information from packet payloads. The identification methods allow us to discover logged-on Skype users and their voice calls. The whole identification process is scripted in Transact-SQL; it can thus be executed automatically on a prerecorded (offline) data set. We present identification results, analysis, and a comparison of data sets captured in mobile and fixed networks, and we validate the algorithms in both network types. Copyright © 2010 John Wiley & Sons, Ltd. After investigating the operation of the Skype peer-to-peer overlay network and performing traffic measurements, we proposed heuristic methods based on flow dynamics to identify Skype traffic. Besides packet headers and the extracted flow-level information, the identification algorithms rely on the observable behavior of the Skype protocol. We performed traffic analysis in fixed and mobile network environments, including ADSL and 3G/HSDPA mobile networks. The results include daily profiles of Skype calls and user activity, characteristic properties of Skype voice calls, and findings about the Skype codec. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
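The general approach the abstract describes (header-only flow aggregation plus flow-dynamics heuristics) can be sketched as follows. This is NOT the paper's actual detection algorithm: the grouping key, packet-rate band, and size threshold below are hypothetical illustrations of the idea.

```python
# Sketch: group packets into flows by 5-tuple using header fields only,
# then flag flows whose packet rate and sizes resemble a constant-rate
# voice codec. Thresholds are illustrative, not Skype's real signature.
from collections import defaultdict

def flows_from_headers(packets):
    """packets: iterable of (src, dst, sport, dport, proto, ts, size)."""
    flows = defaultdict(list)
    for src, dst, sport, dport, proto, ts, size in packets:
        flows[(src, dst, sport, dport, proto)].append((ts, size))
    return flows

def looks_like_voice(flow, min_pkts=50, rate_lo=20, rate_hi=60, max_avg=300):
    """Heuristic: a long-lived flow of small packets arriving steadily in a
    voice-typical packet-rate band (packets per second)."""
    if len(flow) < min_pkts:
        return False
    times = sorted(t for t, _ in flow)
    duration = times[-1] - times[0]
    if duration <= 0:
        return False
    rate = len(flow) / duration
    avg_size = sum(s for _, s in flow) / len(flow)
    return rate_lo <= rate <= rate_hi and avg_size <= max_avg

# Synthetic candidate flow: ~33 small packets/s for about 3 seconds.
pkts = [("10.0.0.1", "10.0.0.2", 4000, 5000, "udp", i * 0.03, 120)
        for i in range(100)]
f = flows_from_headers(pkts)
print(any(looks_like_voice(fl) for fl in f.values()))  # True
```

The point of the sketch is the constraint the paper emphasizes: nothing here reads a payload byte, so the method survives encryption.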