288 results
Search Results
2. Doing place through data: Proliferation, profiling and the perils of portrayal in local climate action.
- Author
- Knox, Hannah
- Abstract
Building on work which has shown the role of digital technologies in reframing environmental relations, this paper explores ethnographically how environmental data is reconfiguring the concept of place. The paper takes as its focus an action-research project within a UK-based, citizen-oriented initiative called Newtown Energy Futures, in which we sought to enfold climate and energy data into a social-justice-informed attempt at climate action. By exploring how the project used data as an invitation for citizens to engage with and participate in local infrastructural and environmental dynamics, the paper sheds light on how environmental data came to participate in the making of place and in doing so raised questions about how to rebuild the socio-material relations through which 'a sense of place' might be reproduced. As climate and energy data increasingly demand that places become enrolled into environmental projects, our findings suggest that data enables place to emerge as a 'socio-technical potentiality', an observation that has implications for both engagement with, and the study of, data and place. In practical terms, we suggest that this refiguration of place has the effect of creating hopeful trajectories for change, whilst also posing difficult questions about the limits of participation in a data-infused form of place-based politics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Theorizing globalized production and digitalization: Towards a re-centering of value.
- Author
- Foster, Christopher
- Subjects
- GLOBAL value chains, GLOBAL production networks, DIGITAL technology, HIGH technology industries
- Abstract
Digital and data-driven technologies are having substantial impacts on global production, with growing analysis within established frameworks such as Global Value Chains (GVC) and Global Production Networks (GPN). Given the claims, however, that digitalization is leading to transformations in the patterns of production and labor, further theoretical work is needed to consider how these frameworks fit with evolving dynamics. Beginning with critiques that mainstream GVC/GPN have poorly theorized the concept of value, the paper argues that a re-centering of value is crucial for improved understanding of digitalization. To do this, broader debates in the literature on the digital economy—on rent and surplus value—are reviewed. These debates provide an expanded perspective of value including a broader understanding of forms of techno-economic rent and the growing debates on heterogeneous forms of labor, shaping production. A stronger orientation towards value within mainstream GVC/GPN studies can absorb some of these ideas, but considering the evolving forms, conventional notions of governance and upgrading may be less viable. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Starting with the archive: principles for prospective collaborative research.
- Author
- Thomson, Rachel and Berriman, Liam
- Subjects
- PUBLISHING, HEALTH services accessibility, ACQUISITION of data, HUMAN services programs, QUALITATIVE research, INTERPROFESSIONAL relations, RESEARCH funding, ACTION research, ARCHIVES, LONGITUDINAL method, VIDEO recording
- Abstract
What are participants and researchers agreeing to when they consent to having data archived, and what do they imagine the future life of their data to be? In this paper, we reflect on a project that deliberately started rather than ended with the archive. The Everyday Childhoods project invited children and their families to take part in the creation of an open access public archive documenting everyday childhoods using a range of multimedia data. Families and researchers were invited into the archive, encouraged to imagine different kinds of secondary use and to speak directly to future users of their data through short films and postcards. This paper raises questions concerning the place of the archive in different disciplinary traditions; the roles of researcher and archivist in safekeeping, gatekeeping and caring for data collections; and the place of qualitative longitudinal research as a site of innovation within a new data landscape. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
5. (De)constructing machines as critical technical practice.
- Author
- Soon, Winnie and Velasco, Pablo R
- Subjects
- SOCIOTECHNICAL systems, TEACHING methods, RESEARCH personnel, MACHINERY, SOCIAL structure, MACHINE learning
- Abstract
This paper discusses the role of technology under the framework of Critical Technical Practice specifically in the form of constructing artefacts and deconstructing tools in order to produce what Philip Agre would describe as 'reflexive work of critique' (Agre, 1997:155). By presenting the activities and methods used in the teaching and shaping of undergraduate courses, this paper aims to show how technical objects, such as data, datasets, application programming interfaces and machine learning models, can be considered as discursive subjects, demonstrating pedagogical understanding across fields. The courses operate in the humanities tradition and take critical technical practice as a didactic approach, insofar as software and data are understood and manipulated on an instrumental level, while encouraging critical engagement and embodied reflection that bridge the technical and social/cultural domains. Within this pedagogical approach, critical is not only understood as a paradigm of rationality or quantitative, data-driven argumentation, but as adopting a critical position – that is, to research and reflect on the social structures and cultural phenomena entangled with digital objects, bodies, tools, methods and software production. By embracing work-in-progress and reflexive exploration, we aim to extend the notion of critical technical practice by unfolding how (de)constructing machines can be achieved beyond thinking of technology as neutral instrumentalisation. The challenge is how to find a balance, not only as researchers but as educators, unfolding aspects of both formality and functionality as well as questioning and understanding technology at a discursive and critical level. We argue that learning technical practice in an educational setting is not an end, but rather a means to question existing technological structures and create further changes in socio-technical systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Using Sports Data to Advance Management Research: A Review and a Guide for Future Studies.
- Author
- Fonti, Fabio, Ross, Jan-Michael, and Aversa, Paolo
- Subjects
- SPORTS, MANAGEMENT, RESEARCH methodology, DATA, COLLEGE basketball teams, MANAGEMENT philosophy, GRAND Prix racing, EXPERIMENTAL design
- Abstract
Sports contexts are increasingly used in management research to test and develop theory and explore managerially relevant phenomena. This growth in publications is likely driven by a series of advantages that sports data offers to management researchers. However, such positive features are not a panacea, as several drawbacks are also associated with leveraging sports data, which can limit their usefulness for management scholars. In this paper, we aim to provide management researchers with guidance on how to harness the advantages and avoid the drawbacks of using sports contexts. To do so, we identify and review 249 papers published over the last 50 years that used sports data to advance management theories and shed light on managerial phenomena. After outlining how these works contributed to the growth of several key conversations in management research, we discuss the advantages of using sports data by outlining how they can advance management research both conceptually (e.g., theory building and radical theorizing) and empirically (e.g., triangulation and replication). We then discuss the potential drawbacks of research using sports data and suggest ways to compensate for them. We close by outlining several new directions in which scholars can leverage sports data to further advance management research. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
7. Absence, multiplicity and the boundaries of research? Reflections on online asynchronous focus groups.
- Author
- Estrada-Jaramillo, Ana, Michael, Mike, and Farrimond, Hannah
- Subjects
- FOCUS groups, CONGENITAL, hereditary, & infantile syphilis, WOMEN, QUALITATIVE research, HEALTH attitudes, COVID-19 pandemic
- Abstract
During the COVID-19 pandemic, Online Asynchronous Focus Groups (OAFG) through WhatsApp were conducted to explore women's experiences in the context of Congenital Syphilis prevention in Colombia. This paper discusses issues raised by the OAFGs (not least in relation to face-to-face focus groups). After a review of the literature on online and offline focus groups, there is a consideration of some key features of our OAFGs. In particular, we note how silence, presence, attention, continuity and multiplicity manifested in our OAFGs. We suggest that rather than regarding OAFGs as inferior to the 'gold standard' of face-to-face focus groups, our OAFGs raise important questions about our assumptions about focus group methodology. For instance, what counts as participant engagement, what comprises 'useful' social data, and what constitutes the boundaries of a focus group all emerge as critical issues. We go on to reflect on some of the implications of these issues for the fruitfulness of OAFG methods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
8. Using Old Data: When Is It Appropriate?
- Author
- Ketchen Jr., David J., Roccapriore, Ashley Y., and Connelly, Brian L.
- Subjects
- DATA, RESEARCH methodology, ACQUISITION of data, ARCHIVAL resources, DISCLOSURE
- Abstract
Researchers and gatekeepers lack clarity about the circumstances under which using old data to test hypotheses is appropriate or inappropriate. In response to this complex issue, we first define what makes data "old." We then suggest that using old data is justifiable (a) when examining a past event, (b) when recent data are not available, or (c) when the data were collected painstakingly. Scholars should avoid using old data if none of these conditions exist. Further, authors should be forthcoming about the age of their data and, in the case of a rejected journal submission, update the data whenever possible. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
9. Data Power in Material Contexts: Introduction.
- Author
- Kennedy, Helen and Bates, Jo
- Subjects
- NATIONALISM & television, DIGITAL media, BIG data, TECHNOLOGY & scholarship, DATA analysis
- Abstract
This short piece introduces the special issue of Television & New Media (TVNM) on data power in material contexts, which brings together papers which analyze the operations of data power across a range of real-world domains. It highlights the increasing connectedness of digital data tracking, aggregation, and analytics across domains that include and move beyond media, as data are increasingly combined and shared across diverse digital spaces. Thus, it connects media and communications scholarship concerned with datafication to debates in other related and overlapping fields, as part of the larger project of building data studies as an interdisciplinary and critical field. It briefly introduces the papers in the special issue, all of which constitute detailed empirical investigations that ground the study of data power in specific, material contexts. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
10. Drawing Data Together: Inscriptions, Asylum, and Scripts of Security.
- Author
- Perret, Sarah and Aradau, Claudia
- Subjects
- POLITICAL refugees, SCHOLARLY method, BORDER security, INSCRIPTIONS, SCRIPTS
- Abstract
Data have become a vital device of border governance and security. Recent scholarship on the datafication of borders and migration at the intersection of science and technology studies and critical security studies has privileged concepts attuned to messiness, contingency, and friction such as data assemblages and infrastructures. This paper proposes to revisit and expand the analytical vocabulary of script analysis to understand what comes to count as data, what forms of data come to matter, and how "drawing data together" reconfigures power and agency at Europe's borders. Empirically, we analyze controversies about the practices of asylum decision-making and age assessment in Greece. We show that the agency of "users" is unequally distributed through anticipations of subscription and dis-inscription, while asylum seekers are conscripted within security scripts that restrict their agency. Moreover, as a multiplicity of inscriptions are produced, migrants' claims can be disqualified through circumscriptions of data and ascriptions of expertise. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Attending to data: Exploring the use of attendance data within the datafied school.
- Author
- Selwyn, Neil, Pangrazio, Luci, and Cumbo, Bronwyn
- Subjects
- SCHOOL attendance, SCHOOL administration, EDUCATIONAL planning, SECONDARY schools, SECONDARY education
- Abstract
Contemporary schooling is seen to be altering significantly in light of a combined 'digitisation' and 'datafication' of key processes. This paper examines the nature and conditions of the datafied school by exploring how a relatively prosaic and longstanding school metric (student attendance data) is being produced and used in digital form. Drawing on empirical data taken from in-depth qualitative studies in three contrasting Australian secondary schools, the paper considers 'anticipatory', 'analytical' and 'administrative' aspects of how digitally-mediated attendance data is produced, used and imagined by school staff. Our findings foreground a number of constraints, compromises and inconsistencies that are usually glossed over in enthusiasms for 'data-driven' education. It is argued that these findings highlight the messy realities of schools' current relationships with digital data, and the broader logics of school datafication. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
12. Algorithmic Surveillance in the Gig Economy: The Organization of Work through Lefebvrian Conceived Space.
- Author
- Newlands, Gemma
- Subjects
- WORK structure, DIGITAL mapping, GIG economy, TEMPORARY employment, VIRTUAL reality, POWER (Social sciences), SOCIOLOGY of work
- Abstract
Workplace surveillance is traditionally conceived of as a dyadic process, with an observer and an observee. In this paper, I discuss the implications of an emerging form of workplace surveillance: surveillance with an algorithmic, as opposed to human, observer. Situated within the on-demand food-delivery context, I draw upon Henri Lefebvre's spatial triad to provide in-depth conceptual examination of how platforms rely on conceived space, namely the virtual reality generated by data capture, while neglecting perceived and lived space in the form of the material embodied reality of workers. This paper offers a two-fold contribution. First, it applies Henri Lefebvre's spatial triad to the techno-centric digital cartography used by platform-mediated organisations, assessing spatial power dynamics and opportunities for resistance. Second, this paper advances organisational research into workplace surveillance in situations where the observer and decision-maker can be a non-human agent. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
13. From People to Objects: The digital transformation of fields.
- Author
- Alaimo, Cristina
- Subjects
- DIGITAL transformation, DIGITAL technology, INTERNET advertising, COLLECTIVE action, ALGORITHMS, AUTOMATION
- Abstract
Digital technologies are reconfiguring organizations and their environments. Activities are increasingly distributed across fields and coordinated by data, algorithms and machines. This paper investigates data objects (objects made of data structured and aggregated under a specific template) and their role in structuring fields and field practices. It studies programmatic advertising, an automated bidding process with hundreds of participants, whereby media spaces are auctioned in real time as individual users browse online content. To work on such a large scale, programmatic advertising must standardize existing field knowledge into data and coordinate collective action through objects, algorithms and technologies. This study shows how data, data objects and their infrastructures are involved in transforming the links between institutions and practice. The makeup and functioning of data and data objects reorient existing cognitive, normative and regulative structures, constrain the rules of engagement of actors and enable field-level autonomous interaction. The datafication of knowledge and automation of practices exposed in this paper call for a thorough rethinking of existing approaches to the concepts of organizations and fields. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
14. Sketchbook archaeology: Bodies multiple and the archives they create.
- Author
- Novak, Shannon A
- Subjects
- NOTEBOOKS, ARCHIVES, PRESBYTERIAN Church, WRITING processes, ARCHAEOLOGY
- Abstract
Archaeological bodies and their afflictions have multiplied in recent years, along with the specialists who study them. The result is a cascade of data, much of it difficult to reconcile. I argue that variable enactments of disease, rather than reflecting an epistemological disconnect or difference in scale, engender ontological gaps. To pursue these malleable matters, I trace the proliferation of "cancer" from the Spring Street Presbyterian Church burial vaults (1820–1850) in Manhattan. To explore the struggles involved in making many things one, I consider emergent multiplicities of this "disease" within specialists' laboratories, archival records, and the writing process. Rather than force these different cancers to cohere, or make one "win" based on disciplinary domain (science/humanities) or hierarchy of substance (bone/paper), I rely on Stengers's (2018) ecology of partial connects. The outcome is not a rubric of knowledge gained, but a sketchbook of lessons learned with bodies multiple along the way. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
15. Europeanizing the Danish School through National Testing: Standardized Assessment Scales and the Anticipation of Risky Populations.
- Subjects
- NATIONAL competency-based educational tests, STANDARDIZED tests, EXAMINATIONS, FOREIGN students, INFORMATION economy, MONETARY incentives, DYNAMIC testing
- Abstract
This paper explores "the peopling of Europe through data practices" in relation to standardized testing of students in Denmark. Programme for International Student Assessment (PISA) is a central component of Danish and European education infrastructures. In Denmark, mediocre PISA results spurred the introduction of national testing. With inspiration from Michel Foucault's notion of biopolitics, this paper analyzes how complementary Danish national test assessment scales make up population objects and student subjects and how these scales are aligned with European and transnational standards. A norm scale, standardized against the European Credit Transfer System (ECTS) grading scale, enacts a population whose performance can be tracked over time. A criteria scale introduces categories describing skills and enacts a moving student subject whose progression can be tracked. This paper argues that the three assessment scales enact the student population as bound to the nation and as simultaneously constituted in relation to transnational European categories and imaginaries of competition. As part of this, this paper discusses how the national test and PISA are used to single out students of non-European background, anticipated to be low PISA achievers and nonparticipants in a European knowledge economy. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
16. "We're building their data": Labor, alienation, and idiocy in the smart city.
- Author
- Attoh, Kafui, Wells, Katie, and Cullen, Declan
- Subjects
- SMART cities, URBAN growth, INTELLECTUAL disabilities, SUSTAINABLE development, LABOR, ELECTROCHROMIC windows
- Abstract
In 2017, Uber Technologies Inc. launched a new service called Uber Movement. Designed by a team of 10 engineers, the new service provided a select number of cities access to Uber's vast trove of transportation data. One of the first cities to partner with Uber on this initiative was Washington, DC. Playing directly to the city's longstanding "smart city" aspirations, the initiative was greeted warmly by city officials eager to market the region as a symbol of data-driven urban growth and smart technology. Largely missing from this response, however, was any mention of Uber drivers themselves. Over the course of the paper, and drawing on 40 interviews conducted with Washington, DC-based Uber drivers, we examine the labor conditions that we argue are central to the production of Uber's smart data. Beyond placing labor more centrally in critiques of the smart city, the paper suggests that the experience of Uber drivers offers us a window into the type of smart city on offer. As we argue, the city that emerges from our interviews is less a city defined by data-driven growth than a city defined by alienation and isolation. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
17. Financial geography III: Research strategies, designs, methods and data.
- Author
- Wójcik, Dariusz
- Subjects
- GEOGRAPHY, EXPERIMENTAL design
- Abstract
I review research strategies, designs, methods and data in financial geography, by focussing on 449 articles published in 2001–2020. The analysis shows considerable methodological diversity and originality, contributing to geography and studies of finance. Over time, qualitative strategies, case study design and interviews as a method and data source are growing, while quantitative strategy, longitudinal design, regression methods and the use of government data are declining. The analysis helps identify gaps, opportunities and challenges, including the need for more methodological transparency, more in-depth qualitative and quantitative approaches supporting causal analysis, and a more ambitious historical and geographical scope. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
18. A Feature Extraction & Selection Benchmark for Structural Health Monitoring.
- Author
- Buckley, Tadhg, Ghosh, Bidisha, and Pakrashi, Vikram
- Subjects
- STRUCTURAL health monitoring, FEATURE selection, SUPERVISED learning, SIGNAL processing, STATISTICAL learning, FEATURE extraction
- Abstract
There are a large number of time domain, frequency domain and time-frequency signal processing methods available for univariate feature extraction. However, there is no consensus in structural health monitoring (SHM) on which feature, or feature sets, are best suited for the identification, localisation and prognosis of damage. This paper attempts to address this problem by providing a comprehensive benchmark of feature selection & reduction methods applied to an extensive set of univariate features. These univariate features are extracted using multiple statistical, temporal and spectral methods from the benchmark S101 and Z24 bridge datasets. These datasets contain labelled accelerometer recordings from full-scale bridges as they are progressively subjected to multiple damage scenarios. To identify the minimal set of features that best distinguishes between the multiple damage states, a supervised machine learning approach is used in combination with multiple feature selection methods. The ability of these reduced feature sets to distinguish between damage states is benchmarked using the prediction performance of the classification models, with the training and test sets obtained through stratified k-fold cross-validation. The results obtained show that reduced sets of univariate features, extracted from a single accelerometer sensor, are capable of accurately distinguishing between multiple classes of healthy and damaged states. This work provides a benchmark for SHM practitioners and researchers alike for the choice, comparison and validation of feature extraction and feature selection methods across a wide range of systems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
19. Infrastructuring European migration and border control: The logistics of registration and identification at Moria hotspot.
- Subjects
- BORDER security, RECORDS management, ELECTRONIC data processing, LOGISTICS, IDENTIFICATION, IMAGE registration
- Abstract
This paper examines Moria hotspot in Greece as a logistical site which fulfills two different functions within the European migration and border regime. It locates, contains, and sorts individuals locally at the external borders of the EU and creates, inserts, and processes data for controlling people on the move. Based on ethnographic fieldwork in Greece, including interviews with local administrators from the Registration and Identification Service, Médecins du Monde, Frontex and Hellenic Police and a collection of internal and publicly available planning, policy, and management documents and handbooks, the paper scrutinizes how the movement of both migrants and data is organized at the site. By developing an analytic lens of logistics, it outlines a specific mode of infrastructuring which aligns staff from different organizations with databases, devices, and migrants all in one place and organizes mundane practices such as filling out forms, taking fingerprints, signing, and entering datasets along a chain. In that way, the hotspot is able to locate, sort, and detain those who arrive at the hardened EU border and to create a data infrastructure for controlling, monitoring, and governing further movement by processing data through the bureaucratic channels of the EU's transnational control assemblages. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
20. The Datafication of Intimacy: Mobile Dating Apps, Dependency, and Everyday Life.
- Author
- De Ridder, Sander
- Subjects
- MOBILE apps, INTIMACY (Psychology), EVERYDAY life, YOUNG adults, MOBILE libraries
- Abstract
Mobile dating apps are familiar in everyday life. Their data-driven operations offer algorithmically organized archives of people. This paper aims to offer a reflection on the datafication of intimacy, focusing on the social knowledge mobile dating apps produce on the building of close human connections. Drawing on interviews with young adults, I rely on an existential media analysis, exploring struggles with and around mobile dating apps. I argue that the datafication of intimacy is a particular way of experiencing intimacy, going beyond the socio-technological functions of mobile dating apps. I show how the datafication of intimacy is a mathematical mind-set characterized by commercialization and rationalization (predictability, controllability, convenience), building a relationship of interdependency between a data economy and intimacy. I conclude that this interdependency is an emotionally experienced, existential burden for people; it demands reflection on how data-driven technology has become environmental to building close human connections. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
21. Sacred Excess: Organizational Ignorance in an Age of Toxic Data.
- Author
- Schwarzkopf, Stefan
- Subjects
- OVERPRODUCTION
- Abstract
Actors in data-intensive industries at times deliberately induce and reproduce organizational ignorance by engaging in over-production of data. This observation leads the paper to make two claims. First, members of these industries fetishize data excess not in order to reduce, but in order to reproduce and stabilize organizational ignorance. Second, in this process of fetishization, organizational ignorance gives rise to forms of collective effervescence similar to that found in totemistic religions. This effervescence allows organizational actors to draw defining lines around that which is marked as awe-inspiring, dangerous and off-limits, namely the sacred. In reviewing organizational ignorance from the perspective of the sacred, this paper proposes that, paradoxically, contemporary forms of data creation allow companies and industries to organize themselves around ignorance as opposed to the promise of knowledge and insight. The paper uses this theoretical proposal in order to outline the contours of an alternative ontology of organizational ignorance, one that understands this phenomenon in terms of excessive presence of data and information. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
22. Editorial.
- Author
- Gulson, Kalervo N.
- Subjects
- DATA, UNIVERSITIES & colleges
- Abstract
An introduction is presented which discusses articles within the issue on topics including the role of practitioner research in universities and the use of data in materialist work.
- Published
- 2018
- Full Text
- View/download PDF
23. Antitrust, Big Tech, and Democracy: A Research Agenda.
- Author
- Robertson, Viktoria H. S. E.
- Subjects
- POLITICAL debates, ANTITRUST law, DEMOCRACY, ECONOMICS
- Abstract
In the twenty-first century, voter choice and the broader political debate are within the reach of those that can access and channel the vast streams of user data that are generated online. How digital platforms utilize personal user data to influence the outcome of democratic processes has become a central issue that liberal democracies must confront. The paper explores whether competition law has a role to play when it comes to addressing this intersection of Big Tech, data, and democracy. It first sets out the democratic roots of competition or antitrust law in the United States and the European Union. From these, the paper deduces that competition law cannot remain inactive when it comes to maintaining a democratic society in the face of the abilities of Big Tech to influence democratic processes and outcomes. The paper then goes a step further and asks what role competition law could play in this regard. Should democratic values simply be reflected in the procedural set-up of antitrust law, or is there a role for democratic values in the substantive provisions as well? And if so, does antitrust law's focus on keeping market power in check suffice to fulfill its role in a democratic society, or does this role require the law to specifically target antidemocratic market behavior as anticompetitive harm? In navigating these questions, the paper contributes to the ongoing debate on political antitrust and sets out an ambitious research agenda on how to carry this discussion forward. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
24. The material politics of mobile virtual reality: Oculus, data, and the technics of sensemaking.
- Author
- Egliston, Ben and Carter, Marcus
- Subjects
- VIRTUAL reality, MEDIA studies, NEW product development, PRACTICAL politics
- Abstract
This paper contributes to an increasing preoccupation in media studies with Mobile Virtual Reality (MVR) – a form of 'wireless' VR, where all necessary sensing componentry is built into the system's headset and controllers. Our analysis focuses on the Quest series of devices, offered by Facebook-owned VR company Oculus. Through the philosophy of Gilbert Simondon, we argue that the Quest represents a 'concretisation' of VR – of VR becoming internally coherent and synergistic, enabling its mobility and use in varied contexts. We suggest that this process of concretisation is what enables it to generate vast amounts of data with the potential for use by Facebook's advertising arm and in future product development. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
25. Banking on exclusion: Data disclosure and geographies of UK personal lending markets.
- Author
-
Henry, N., Pollard, J., Sissons, P., Ferreira, J., and Coombes, M.
- Subjects
BRITISH politics & government ,DATA ,PERSONAL loans ,MARKETS ,SPATIAL analysis (Statistics) - Abstract
In 2013, the UK Government announced that seven of the nation’s largest banks had agreed to publish their lending data at the local level across Great Britain. The release of such area based lending data has been welcomed by advocacy groups and policy makers keen to better understand and remedy geographies of financial exclusion. This paper makes three contributions to debates about financial exclusion. First, it provides the first exploratory spatial analysis of the personal lending data made available; it scrutinises the parameters and robustness of the dataset and evaluates the extent to which the data increase transparency in UK personal lending markets. Second, it uses the data to provide a geographical overview of patterns of personal lending across Great Britain. Third, it uses this analysis to revisit the analytical and political limitations of ‘open data’ in addressing the relationship between access to finance and economic marginalisation. Although a binary policy imaginary of ‘inclusion-exclusion’ has historically driven advocacy for data disclosure, recent literatures on financial exclusion generate the need for more complex and variegated understandings of economic marginalisation. The paper questions the relationship between transparency and data disclosure, the policy push for financial inclusion, and patterns of indebtedness and economic marginalisation in a world where ‘fringe finance’ has become mainstream. Drawing on these literatures, this analysis suggests that data disclosure, and the transparency it affords, is a necessary but not sufficient tool in understanding the distributional implications of variegated access to credit. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
26. Mapping an emergent field of 'computational education policy': Policy rationalities, prediction and data in the age of Artificial Intelligence.
- Author
-
Gulson, Kalervo N. and Webb, P. Taylor
- Subjects
EDUCATION ,ARTIFICIAL intelligence ,DECISION making ,EDUCATION policy ,LEARNING ,EDUCATIONAL technology ,STUDENTS - Abstract
Contemporary education policy involves the integration of novel forms of data and the creation of new data platforms, in addition to the infusion of business principles into school governance networks, and the intensification of socio-technical relations. In this paper, we examine how 'computational rationality' may be understood as an intensification of an instrumental set of logics in educational governance and decision making, and/or as opening up new explorations around the uncertainty and incompleteness of policy. We suggest that policy rationalities focused on prediction, transparency and data provide the conditions of possibility for Artificial Intelligence to be integrated into, and intensify aspects of, what we term 'computational education policy'. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
27. A gold mine, but still no Klondike: Nordic register data in health inequalities research.
- Author
-
Van Der Wel, Kjetil A., Östergren, Olof, Lundberg, Olle, Korhonen, Kaarina, Martikainen, Pekka, Andersen, Anne-Marie Nybo, and Urhoj, Stine Kjaer
- Subjects
COMMUNICATION ,DOCUMENTATION ,HEALTH services accessibility ,HEALTH status indicators ,INTERPROFESSIONAL relations ,MEDICAL research ,PUBLIC health ,RESEARCH ethics ,INFORMATION resources ,SOCIOECONOMIC factors ,ACCESS to information ,HEALTH equity ,PSYCHOLOGY of Research personnel - Abstract
Aims: Future research on health inequality relies on data that cover life-course exposure, different birth cohorts and variation in policy contexts. Nordic register data have long been celebrated as a 'gold mine' for research, and fulfil many of these criteria. However, access to and use of such data are hampered by a number of hurdles and bottlenecks. We present and discuss the experiences of an ongoing Nordic consortium from the process of acquiring register data on socio-economic conditions and health in Denmark, Finland, Norway and Sweden. Methods: We compare experiences of data-acquisition processes from a researcher's perspective in the four countries and discuss the comparability of register data and the modes of collaboration available to researchers, given the prevailing ethical and legal restrictions. Results: The application processes we experienced were time-consuming, and decision structures were often fragmented. We found substantial variation between the countries in terms of processing times, costs and the administrative burden on the researcher. The agencies concerned differed in policy and practice, which influenced both how and when data were delivered. These discrepancies present a challenge to comparative research. Conclusions: We conclude that there are few signs of harmonisation, as called for by previous policy documents and research papers. Ethical vetting needs to be centralised both within and between countries in order to improve data access. Institutional factors that seem to facilitate access to register data at the national level include single storage environments for health and social data, simplified ethical vetting and user guidance. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
28. Controlling response dependence in the measurement of change using the Rasch model.
- Author
-
Andrich, David
- Subjects
RASCH models ,ITEM response theory ,PARAMETERS (Statistics) ,DATA ,LOGIC - Abstract
The advantages of using person location estimates from the Rasch model over raw scores for the measurement of change using a common test include the linearization of scores and the automatic handling of statistical properties of repeated measurements. However, the application of the model requires that the responses to the items are statistically independent in the sense that the specific responses to the items on the first time of testing do not affect the responses at a second time. This requirement implies that the responses to the items at both times of assessment are governed only by the invariant location parameters of the items at the two times of testing and the location parameters of each person each time. A specific form of dependence that is pertinent when the same items are used is when the observed response to an item at the second time of testing is affected by the response to the same item at the first time, a form of dependence which has been referred to as response dependence. This paper presents the logic of applying the Rasch model to quantify, control and remove the effect of response dependence in the measurement of change when the same items are used on two occasions. The logic is illustrated with four sets of simulation studies with dichotomous items and with a small example of real data. It is shown that the presence of response dependence can reduce the evidence of change, a reduction which may impact interpretations at the individual, research, and policy levels. [ABSTRACT FROM AUTHOR]
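The dichotomous Rasch model at the heart of this abstract, and the response-dependence mechanism it describes, can be sketched with a toy simulation. The item locations, the 0.5-logit change and the dependence magnitude below are illustrative assumptions, not values from the paper:

```python
import math
import random

def rasch_prob(theta, b):
    """Probability of a correct response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def simulate_change(n_persons, items, change, dep, seed=0):
    """Simulate two testing occasions with the same items.

    At time 2 each person's location shifts by `change`. With response
    dependence `dep` > 0, an item answered correctly at time 1 becomes
    easier (and one answered incorrectly becomes harder) at time 2,
    pulling time-2 responses toward the time-1 responses.
    Returns the mean raw-score change (time 2 minus time 1).
    """
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_persons):
        theta1 = rng.gauss(0.0, 1.0)
        theta2 = theta1 + change
        s1 = s2 = 0
        for b in items:
            x1 = 1 if rng.random() < rasch_prob(theta1, b) else 0
            # response dependence: time-2 item location shifts toward x1
            b2 = b - dep if x1 == 1 else b + dep
            x2 = 1 if rng.random() < rasch_prob(theta2, b2) else 0
            s1 += x1
            s2 += x2
        diffs.append(s2 - s1)
    return sum(diffs) / len(diffs)

items = [-1.0, -0.5, 0.0, 0.5, 1.0]
no_dep = simulate_change(5000, items, change=0.5, dep=0.0)
with_dep = simulate_change(5000, items, change=0.5, dep=1.0)
```

With a shared seed, both runs see the same persons and time-1 responses, so the comparison isolates the dependence effect: the observed score change with dependence is smaller, echoing the paper's finding that response dependence can reduce the evidence of change.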
- Published
- 2018
- Full Text
- View/download PDF
29. Missing the Millennium Development Goal targets for water and sanitation in urban areas.
- Author
-
SATTERTHWAITE, DAVID
- Subjects
SUSTAINABLE urban development ,URBAN planning ,DRINKING water analysis ,INTERNATIONAL cooperation - Abstract
This paper reviews progress towards the Millennium Development Goals (MDGs) for water and sanitation in urban areas. Drawing on UN data, it shows the disastrous performance of many low- and middle-income nations in relation to the goal of halving the proportion without drinking water sources piped on premises and improved sanitation between 1990 and 2015. It also describes how even such a poor performance is actually understating the problem because of deficiencies in the data available. For water, there are no data sources with global coverage on who has "sustainable access to safe drinking water" (what the MDGs specify). UN statistics record whether households have drinking water sources piped on premises, but this does not necessarily mean the water is safe to drink or that there is a regular, reliable supply (what is implied by sustainable access). For what is termed "improved" or "basic" sanitation, the bar is set too low in the quality of provision needed in urban areas, so large numbers of urban dwellers said to have improved or basic sanitation still lack sanitation that greatly reduces health risks. The paper emphasizes that assessments of provision for water and sanitation need to make allowances for different contexts; what can work well in rural contexts does not do so in large and dense urban agglomerations. The paper ends with a discussion of what the experience with the MDGs for water and sanitation implies for the Sustainable Development Goals. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
30. More than a number: The telephone and the history of digital identification.
- Author
-
Holt, Jennifer and Palm, Michael
- Subjects
TELEPHONE numbers ,COMMUNICATION infrastructure ,ELECTRONIC surveillance ,DATA privacy ,SOCIAL exchange ,POPULAR culture ,PRIVACY - Abstract
This article examines the telephone's entangled history within contemporary infrastructural systems of 'big data', identity and, ultimately, surveillance. It explores the use of telephone numbers, keypads and wires to offer new perspective on the imbrication of telephonic information, interface and infrastructure within contemporary surveillance regimes. The article explores telephone exchanges as arbiters of cultural identities, keypads as the foundation of digital transactions and wireline networks as enacting the transformation of citizens and consumers into digital subjects ripe for commodification and surveillance. Ultimately, this article argues that telephone history – specifically the histories of telephone numbers and keypads as well as infrastructure and policy in the United States – continues to inform contemporary practices of social and economic exchange as they relate to consumer identity, as well as to current discourses about surveillance and privacy in a digital age. This article is based on a paper presented at the Media in Transition symposium (Utrecht, June 28, 2018), in the Industries and Infrastructures panel organised by Judith Keilbach. Also published in this issue of ECS are Amanda D. Lotz, 'Unpopularity and cultural power in the age of Netflix: new questions for cultural studies' approaches to television texts' and Vicki Mayer, 'From peat to Google power: communications infrastructures and structures of feeling in Groningen.' [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
31. A Data Partitioning Method for Parallel Digital Terrain Analysis.
- Author
-
Wanfeng Dou, Yan Li, and Shoushuai Miao
- Subjects
DIGITAL elevation models ,COMPUTER simulation ,COMPUTER systems ,ELECTRONIC systems ,DATA - Abstract
Parallel computing over data-intensive workloads is an effective way to achieve high-performance computation on massive datasets. This paper studies data partitioning and scheduling methods geared to a parallel-computation strategy in which data blocks are distributed in sequence. Based on the characteristics of data-intensive computation, the paper introduces the concepts of data granularity, return granularity and saturation, and builds a parallel computing and scheduling model for sequential data distribution on these concepts. Because starting and stopping data distribution incurs overhead, the distribution of one data block can be overlapped with the computation of another. Since the total computation time does not decrease monotonically as the number of data blocks grows, an optimal block count exists. By analysing the slope algorithm of Digital Terrain Analysis (DTA), the paper derives the optimal data partitioning and the best number of computing nodes. The experimental results are largely consistent with the theoretical analysis. [ABSTRACT FROM AUTHOR]
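The trade-off this abstract describes, where overlapping block distribution with computation first helps and then per-block overhead dominates, can be illustrated with a toy pipelined timing model. The model and all cost parameters below are invented for illustration; the paper's granularity/saturation model is more detailed:

```python
def pipelined_time(total, n_blocks, t_dist, t_comp, overhead):
    """Toy model of overlapped distribution and computation.

    Data of size `total` is split into `n_blocks` equal blocks. Each
    block incurs a fixed startup `overhead`; the first block's
    distribution cannot be overlapped, after which distribution of
    block k+1 runs concurrently with computation of block k.
    """
    block = total / n_blocks
    first_dist = block * t_dist
    # each steady-state stage is bounded by the slower of distribute/compute
    steady = (n_blocks - 1) * block * max(t_dist, t_comp)
    last_comp = block * t_comp
    return n_blocks * overhead + first_dist + steady + last_comp

def best_block_count(total, t_dist, t_comp, overhead, max_blocks=64):
    """Exhaustively search for the block count minimising total time."""
    times = {n: pipelined_time(total, n, t_dist, t_comp, overhead)
             for n in range(1, max_blocks + 1)}
    return min(times, key=times.get)

# Finer splitting increases overlap but multiplies startup overhead,
# so an interior optimum exists, mirroring the paper's optimal value.
n_star = best_block_count(total=1000.0, t_dist=1.0, t_comp=2.0, overhead=40.0)
```

With these parameters the total time reduces to 40n + 1000/n + 2000, which is minimised at n = 5 blocks; a single undivided block pays the full, unoverlapped distribution cost.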
- Published
- 2015
- Full Text
- View/download PDF
32. Cloud geographies.
- Author
-
Amoore, Louise
- Subjects
CLOUD computing ,GEOPOLITICS ,BORDER security ,DRONE warfare ,ALGORITHMS - Abstract
The architecture of cloud computing is becoming ever more closely intertwined with geopolitics – from the sharing of intelligence data, to border controls, immigration decisions, and drone strikes. Developing an analogy with the cloud chamber of early twentieth century particle physics, this paper explores the geography of the cloud in cloud computing. It addresses the geographical character of cloud computing across two distinct paradigms. The first, ‘Cloud I’ or a geography of cloud forms, is concerned with the identification and spatial location of data centres where the cloud is thought to materialize. Here the cloud is understood within a particular history of observation, one where the apparently abstract and obscure world can be brought into vision and rendered intelligible. In the second variant, ‘Cloud II’ or the geography of a cloud analytic, the cloud is a bundle of experimental algorithmic techniques acting upon the threshold of perception itself. Like the cloud chamber of the twentieth century, contemporary cloud computing is concerned with rendering perceptible and actionable that which would otherwise be beyond the threshold of human observation. The paper proposes three elements of correlative cloud reasoning, suggesting their significance for our geopolitical present: condensing traces; discovering patterns; and archiving the future. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
33. Data challenges for public libraries: African perspectives and the social context of knowledge.
- Author
-
Lynch, Renee, Young, Jason C., Jowaisas, Chris, Rothschild, Chris, Garrido, Maria, Sam, Joel, and Boakye-Achampong, Stanley
- Subjects
PUBLIC libraries ,SOCIAL context ,LIBRARY administration ,DATA libraries ,INDUSTRIAL capacity ,BIBLIOTHERAPY - Abstract
This article sheds light on the collection and use of data by libraries in sixteen countries across Africa. It highlights the challenges that librarians and library organizations face in gathering, analyzing, and presenting data of various types for self-advocacy. In this study, qualitative data from a meeting of library representatives was analyzed to identify main challenges including: data integrity in terms of completeness, accuracy, credibility, and relevancy; infrastructure; capacity; local investment in libraries; time; and participation of data collectors and respondents. Implications for those collecting data on African libraries as well as those supporting the use of data in these contexts are discussed. The purpose of this paper is not to feed into representations of African libraries as chronically under-resourced and lacking in capacity, but rather, to constructively engage with first-hand accounts of how librarians are experiencing and navigating barriers in order to offer potential avenues forward for the field. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
34. Epidemiology of sexually transmitted infections in global indigenous populations: data availability and gaps.
- Author
-
Minichiello, Victor, Rahman, Saifur, and Hussain, Rafat
- Subjects
SEXUALLY transmitted diseases ,EPIDEMIOLOGY ,MORTALITY ,PUBLIC health ,DATA - Abstract
Socioeconomic and health disadvantage is widespread within and across indigenous communities in the world, leading to differentials in morbidity and mortality between indigenous and non-indigenous populations. Sexually transmitted infections (STIs), including HIV/AIDS, among indigenous populations are an emerging public health concern. The focus of this paper is on examining STI epidemiology in indigenous communities in various parts of the world, utilizing a range of data sources. Most of the STI research on global indigenous communities has concentrated on developed countries, neglecting more than half the world’s indigenous people in the developing countries. This has resulted in major gaps in data at the global level for STIs and HIV/AIDS among indigenous populations. Available data show that the prevalence of STIs is increasing among indigenous communities and, in several instances, the rates of these infections are higher than among non-indigenous populations. However, HIV still remains low when compared with the rates of other STIs. The paper argues that there is an urgent need to collect more comprehensive and reliable data at the global level across various indigenous communities. There is also an opportunity to reverse current trends in STIs through innovative, evidence-based and culturally appropriate targeted sexual health programmes. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
35. Dangerous data: Analytics and information behaviour in the commercial world.
- Author
-
Tredinnick, Luke
- Subjects
BUSINESS intelligence ,DATA quality ,DECISION making ,VISUALIZATION - Abstract
Data has become an increasingly important component in contemporary business operations, epitomised by the rise of the Business Intelligence system, data analytics, and data visualisations. It has been associated with increased productivity and the development of new business opportunities. But the use of data is sometimes also associated with poor decision-making, either because of the quality of the data on which decisions are made, or because of the ways in which that data is used. This paper explores the problem of dangerous data in commercial contexts: those situations where the use of data contributes to worse outcomes. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
36. Structuring a Team-Based Approach to Coding Qualitative Data.
- Author
-
Giesen, Lindsay and Roeser, Allison
- Subjects
SCHOOL food ,VIDEO coding ,REFERENCE sources ,SPORTS forecasting ,DATA analysis ,QUALITATIVE research ,DATA - Abstract
Improvements to qualitative data analysis software (QDAS) have both facilitated and complicated the qualitative research process. This technology allows us to work with a greater volume of data than ever before, but the increased volume of data frequently requires a large team to process and code. This paper presents insights on how to successfully structure and manage a team of staff in coding qualitative data. We draw on our experience in team-based coding of 154 interview transcripts for a study of school meal programs. The team consisted of four coders, three senior reviewers, and a lead analyst and external qualitative methodologist who shepherded the coding process together. Lessons learned from this study include: 1) establish a strong and supportive management structure; 2) build skills gradually by breaking training and coding into "bite-sized" pieces; and 3) develop detailed reference materials to guide your coding team. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
37. "Value-adding" Analysis: Doing More With Qualitative Data.
- Author
-
Eakin, Joan M. and Gladstone, Brenda
- Subjects
QUALITATIVE research ,REFLEXIVITY ,DATA - Abstract
Much qualitative research produces little new knowledge. We argue that this is largely due to deficits of analysis. Researchers too seldom venture beyond cataloguing data into pre-existing concepts and scouting for "themes," and fail to exploit the distinctive powers of insight of qualitative methodology. The paper introduces a "value-adding" approach to qualitative analysis that aims to extend and enrich researchers' analytic interpretive practices and enhance the worth of the knowledge generated. We outline key features of this form of analysis, including how it is constituted by principles of interpretation, contextualization, criticality, and the "creative presence" of the researcher. Using concrete examples from our own research, we describe some analytic "devices" that can free up and stretch a researcher's analytic capacities, including putting reflexivity to work, treating everything as data, reading data for what is invisible, anomalous and "gestalt," engaging in "generative" coding, deploying heuristics for theorizing, and recognizing writing as a key analytic activity. We argue that at its core, value-adding analysis is a scientific craft rather than a scientific formula, a creative assemblage of reality rather than a procedural determination of it. The researcher is the primary generative and synthesizing mechanism for transforming empirically observed data into the key products of qualitative research—concepts, accounts and explanations. The ultimate value of value-adding analysis resides in its ability to generate new knowledge, including not just the "discovery" of things heretofore unknown but also the re-conceptualization of what is already known, and, importantly, the reframing and reconstitution of the research problem. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
38. A new joint screening method for right-censored time-to-event data with ultra-high dimensional covariates.
- Author
-
Liu, Yi, Chen, Xiaolin, and Li, Gang
- Subjects
CENSORING (Statistics) ,STATISTICS ,BREAST cancer ,STATISTICAL models ,DATA ,RESEARCH funding - Abstract
In an ultra-high dimensional setting with a huge number of covariates, variable screening is useful for dimension reduction before applying a more refined method for model selection and statistical analysis. This paper proposes a new sure joint screening procedure for right-censored time-to-event data based on a sparsity-restricted semiparametric accelerated failure time model. Our method, referred to as Buckley-James assisted sure screening (BJASS), consists of an initial screening step using a sparsity-restricted least-squares estimate based on a synthetic time variable and a refinement screening step using a sparsity-restricted least-squares estimate with the Buckley-James imputed event times. The refinement step may be repeated several times to obtain more stable results. We show that with any fixed number of refinement steps, the BJASS procedure retains all important variables with probability tending to 1. Simulation results are presented to illustrate its performance in comparison with some marginal screening methods. Real data examples are provided using diffuse large-B-cell lymphoma (DLBCL) data and breast cancer data. We have implemented the BJASS method in Matlab and made it available to readers through GitHub https://github.com/yiucla/BJASS . [ABSTRACT FROM AUTHOR]
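As background to this abstract, the simplest form of the "sure screening" idea that BJASS builds on can be sketched as follows. This shows plain marginal correlation screening on uncensored toy data; it is not the paper's sparsity-restricted joint estimate, its synthetic time variable, or its Buckley-James imputation:

```python
import math
import random

def marginal_screen(X, y, keep):
    """Rank covariates by absolute marginal correlation with y; keep the top `keep`.

    This is the elementary marginal-screening baseline that joint
    procedures such as BJASS are designed to improve upon.
    """
    n = len(y)
    p = len(X[0])
    ybar = sum(y) / n
    sy = math.sqrt(sum((v - ybar) ** 2 for v in y))
    scores = []
    for j in range(p):
        col = [row[j] for row in X]
        xbar = sum(col) / n
        sx = math.sqrt(sum((v - xbar) ** 2 for v in col))
        cov = sum((col[i] - xbar) * (y[i] - ybar) for i in range(n))
        scores.append(abs(cov / (sx * sy)) if sx > 0 and sy > 0 else 0.0)
    order = sorted(range(p), key=lambda j: -scores[j])
    return sorted(order[:keep])

# Toy data: only covariates 0 and 1 drive the (log) event time.
rng = random.Random(1)
n, p = 200, 50
X = [[rng.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [2.0 * row[0] - 1.5 * row[1] + rng.gauss(0, 0.5) for row in X]
selected = marginal_screen(X, y, keep=5)
```

The screen reliably retains the two active covariates here; the "sure screening" property formalised in the paper is the guarantee that all important variables survive with probability tending to 1.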
- Published
- 2020
- Full Text
- View/download PDF
39. The Spike-and-Slab Lasso regression modeling with compositional covariates: An application on Brazilian children malnutrition data.
- Author
-
Louzada, Francisco, Shimizu, Taciana KO, and Suzuki, Adriano K
- Subjects
BRAZILIANS ,MALNUTRITION in children ,REGRESSION analysis ,MALNUTRITION ,PARAMETER estimation ,DATA - Abstract
There are considerable challenges in analyzing large-scale compositional data. In this paper, we introduce the Spike-and-Slab Lasso linear regression in the presence of compositional covariates for parameter estimation and variable selection. We consider the well-known isometric log-ratio (ilr) coordinates to avoid misleading statistical inference. The separable and non-separable (adaptive) Spike-and-Slab Lasso penalties are compared to verify the advantages of each approach. The proposed method is illustrated on simulated data and on real Brazilian child malnutrition data. [ABSTRACT FROM AUTHOR]
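The isometric log-ratio (ilr) transform this abstract relies on can be sketched in a few lines. The pivot basis used below is one common convention among several valid orthonormal bases, and the Spike-and-Slab Lasso regression itself is not shown:

```python
import math

def ilr(x):
    """Isometric log-ratio coordinates of a composition x (positive parts).

    Uses the pivot (Helmert-type) basis: the i-th coordinate contrasts
    the geometric mean of the first i parts against part i+1. A D-part
    composition maps to D-1 unconstrained real coordinates, on which an
    ordinary penalized regression can then be fitted.
    """
    d = len(x)
    z = []
    for i in range(1, d):
        gm = math.exp(sum(math.log(v) for v in x[:i]) / i)
        z.append(math.sqrt(i / (i + 1)) * math.log(gm / x[i]))
    return z

# Compositions are scale-invariant: multiplying all parts by a constant
# leaves the ilr coordinates unchanged, so closure to 1 is harmless.
comp = [0.2, 0.3, 0.5]
z1 = ilr(comp)
z2 = ilr([10 * v for v in comp])
```

Working in ilr coordinates is what avoids the misleading inference the abstract mentions: the raw parts are constrained to a simplex, while the ilr coordinates live in ordinary Euclidean space.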
- Published
- 2020
- Full Text
- View/download PDF
40. Analysis on trial.
- Author
-
Dolan, Angelina and Ayland, Catherine
- Subjects
QUALITATIVE research ,COMPUTER assisted research ,COMPUTER software ,DATA analysis ,RESEARCH methodology ,DATA - Abstract
This paper describes work carried out in order to assess whether the approach taken to the analysis of qualitative data impacts upon the findings and their consequent interpretation. Three different approaches were used to analyse the same dataset -- a Holistic and Interpretive approach, a Cut and Paste approach and Computer Assisted Qualitative Data Analysis Software (CAQDAS). We discuss the differences between the approaches themselves, their output, the relative costs of adopting the three different approaches and the implications of the work. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
41. A critical exploration of face-to-face interviewing vs. computer-mediated interviewing.
- Author
-
Curasi, Carolyn Folkman
- Subjects
INTERVIEWING ,AMERICAN business enterprises ,STATISTICAL sampling ,INTERNET ,DATA - Abstract
Since the early 1990s, the internet has dominated the attention of the media, academics and business organizations. It has the potential to revolutionize the collection of primary and secondary data, although much more research is needed to learn how to better harness its strengths. This project compares depth interviews collected online with depth interviews conducted face-to-face. Advantages and disadvantages are highlighted, as well as suggested strategies for successfully collecting online data. Major points are illustrated using data from a project in which both data collection techniques were employed. The online interview dataset included some of the strongest and some of the weakest interviews in the investigation. This paper argues that under some conditions online depth interviews can provide a useful complement to the traditional face-to-face interview. The sampling-frame problem of nonrepresentativeness, endemic in quantitative online data collection, does not arise if the researcher is conducting an interpretive investigation. When the researcher's goal is not to quantify or generalise but instead to better understand a particular population, online data collection can complement other datasets, allow data triangulation and strengthen the trustworthiness of the findings. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
42. Making Exceptions Exceptional: A Cross-Methodological Review and Future Research Agenda.
- Author
-
Hymer, Christina B. and Smith, Anne D.
- Subjects
DATA ,QUALITATIVE research ,QUANTITATIVE research ,MANAGEMENT periodicals ,OUTLIERS (Statistics) - Abstract
"Exceptions" refers to data obtained from a nontraditional context and/or data that emerge during data analysis that substantially deviate from other data present within a study. Both qualitative and quantitative research acknowledge exceptions; however, approaches for handling and discussing exceptions vary across these two perspectives and are rarely integrated. We provide a two-decade review of exception usages across 930 empirical articles in six leading management journals. Through our review, we identify two types of exceptions: planned and emergent. "Planned exceptions" describes unique data or phenomenon used to motivate a study design. "Emergent exceptions" describes nonconforming data that arise during data analysis. We review on-diagonal and off-diagonal patterns in exception uses across qualitative and quantitative research, pointing to varied ways that exceptions are used to further management theory. Based on insights gleaned from our review, we provide suggestions for researchers in handling exceptions across different phases of the research process: study design, data analysis, and findings presentation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. Data sharing in orthodontic research.
- Author
-
Papageorgiou, Spyridon N. and Cobourne, Martyn T.
- Subjects
ORTHODONTICS ,DATA ,MEDICAL ethics ,CLINICAL trials ,DENTISTRY - Abstract
The article discusses the aim of orthodontic research, which is the dissemination of new data to the wider community. According to the International Committee of Medical Journal Editors, there is an ethical duty to properly share data created through clinical research, particularly in the case of interventional clinical trials, in which patients put themselves at risk.
- Published
- 2018
- Full Text
- View/download PDF
44. In Search of a Problem: Mapping Controversies over NHS (England) Patient Data with Digital Tools.
- Author
-
Moats, David and McFall, Liz
- Subjects
HISTORY of science ,NATIONAL health services ,CONCEPT mapping ,EMERGENCY medical services communication systems ,HISTORY of technology ,TERRAIN mapping ,OBJECT tracking (Computer vision) - Abstract
There is a long history in science and technology studies (STS) of tracking problematic objects, such as controversies, matters of concern, and issues, using various digital tools. But what happens when public problems do not play out in these familiar ways? In this paper, we will think through the methodological implications of studying "problems" in relation to recent events surrounding the sharing of patient data in the National Health Service in the United Kingdom. When a data sharing agreement called care.data was announced in 2013, nearly 1.5 million citizens chose to opt out. Yet, in subsequent years, there has been little evidence of a robust public mobilising around data sharing. We will attempt to track this elusive 'non problem' using some digital tools developed in STS for the purpose of mapping issues and problem definitions within science. Although we find these digital tools are unable to capture the "problem," the process of searching helps us map the terrain of the case and forces us to consider wider definitions. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
45. Discovering the Language of Data: Personal Pattern Languages and the Social Construction of Meaning from Big Data.
- Author
-
Erwin, Kim, Bond, Maggee, and Jain, Aashika
- Subjects
BIG data ,VISUAL programming languages (Computer science) ,HUMAN beings ,DATA ,TECHNOLOGY - Abstract
This paper attempts to address two issues relevant to the sense-making of Big Data. First, it presents a case study for how a large dataset can be transformed into both a visual language and, in effect, a 'text' that can be read and interpreted by human beings. The case study comes from direct observation of graduate students at the IIT Institute of Design who investigated task-switching behaviours, as documented by productivity software on a single user's laptop and a smart phone. Through a series of experiments with the resulting dataset, the team effects a transformation of that data into a catalogue of visual primitives — a kind of iconic alphabet — that allow others to 'read' the data as a corpus and, more provocatively, suggest the formation of a personal pattern language. Second, this paper offers a model for human-technical collaboration in the sense-making of data, as demonstrated by this and other teams in the class. Current sense-making models tend to be data- and technology-centric, and increasingly presume data visualization as a primary point of entry of humans into Big Data systems. This alternative model proposes that meaningful interpretation of data emerges from a more elaborate interplay between algorithms, data and human beings. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
46. Two-part models with stochastic processes for modelling longitudinal semicontinuous data: Computationally efficient inference and modelling the overall marginal mean.
- Author
-
Yiu, Sean and Tom, Brian D. M.
- Subjects
STOCHASTIC processes ,RANDOM effects model ,REGRESSION analysis ,DATA ,GAUSSIAN distribution - Abstract
Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicate model fitting. Thus, only non-standard, computationally intensive procedures based on simulating the marginal likelihood have so far been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and for when it is of interest to directly model the overall marginal mean. The methodology is applied to a psoriatic arthritis data set concerning functional disability. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
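The computational idea in the abstract above, namely replacing a high-dimensional integral over a latent Gaussian process with a single multivariate normal CDF evaluation, can be illustrated with a small sketch. This is not the authors' implementation; the AR(1) correlation, thresholds, and dimension are invented for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical illustration: a latent Gaussian process observed at T visits.
# The probability P(Z_1 <= c_1, ..., Z_T <= c_T) is a T-dimensional integral
# over the latent process, but it is exactly the CDF of a multivariate
# normal, which SciPy evaluates efficiently rather than by brute-force
# simulation of the whole process.
T = 5
rho = 0.6  # assumed AR(1)-type correlation between visits
mu = np.zeros(T)
Sigma = rho ** np.abs(np.subtract.outer(np.arange(T), np.arange(T)))
c = np.full(T, 0.5)  # assumed visit-specific thresholds

# One efficient CDF call replaces a high-dimensional integration.
p_joint = multivariate_normal(mean=mu, cov=Sigma).cdf(c)
```

A naive alternative would simulate many paths of the latent process and count how often all thresholds are respected; the CDF evaluation gives the same quantity without the simulation noise.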
47. Evidence for dose and dose rate effects in human and animal radiation studies.
- Author
-
Little, M. P.
- Subjects
- *
RADIATION doses , *EXTRAPOLATION , *CANCER , *TISSUES , *MICE , *DATA , *ANIMAL experimentation , *DOSE-response relationship (Radiation) , *RADIATION carcinogenesis , *RELATIVE medical risk - Abstract
For stochastic effects such as cancer, linear-quadratic models of dose are often used to extrapolate from the experience of the Japanese atomic bomb survivors to estimate risks from low doses and low dose rates. The low dose extrapolation factor (LDEF), which consists of the ratio of the low dose slope (as derived via fitting a linear-quadratic model) to the slope of the straight line fitted to a specific dose range, is used to derive the degree of overestimation (if LDEF > 1) or underestimation (if LDEF < 1) of low dose risk by linear extrapolation from effects at higher doses. Likewise, a dose rate extrapolation factor (DREF) can be defined, consisting of the ratio of the low dose slopes at high and low dose rates. This paper reviews a variety of human and animal data for cancer and non-cancer endpoints to assess evidence for curvature in the dose response (i.e. LDEF) and modifications of the dose response by dose rate (i.e. DREF). The JANUS mouse data imply that LDEF is approximately 0.2-0.8 and DREF is approximately 1.2-2.3 for many tumours following gamma exposure, with corresponding figures of approximately 0.1-0.9 and 0.0-0.2 following neutron exposure. This paper also cursorily reviews human data which allow direct estimates of low dose and low dose rate risk. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
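The LDEF defined in the abstract above is a simple ratio of slopes, and a toy numerical sketch makes the mechanics concrete. The linear-quadratic coefficients and dose range below are invented, not taken from the JANUS data.

```python
import numpy as np

# Toy illustration: excess risk following a linear-quadratic dose response,
# risk(D) = alpha*D + beta*D^2, with assumed coefficients.
alpha, beta = 0.05, 0.02
doses = np.linspace(0.0, 4.0, 9)        # the "specific dose range" (Gy)
risk = alpha * doses + beta * doses**2  # noise-free for clarity

# Slope of the straight line fitted over that dose range
# (least squares through the origin).
s_linear = np.sum(doses * risk) / np.sum(doses**2)

# LDEF as defined in the abstract: ratio of the low dose slope from the
# linear-quadratic fit (alpha) to the fitted straight-line slope. With
# upward curvature (beta > 0) the straight line is steeper than alpha,
# so the ratio falls below 1 in this toy example.
ldef = alpha / s_linear
```

Swapping in downward curvature (beta < 0) would push the ratio the other way, which is the sense in which the factor diagnoses over- or underestimation by linear extrapolation.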
48. Special Issue on Spatial Methods for Health Policy Research.
- Author
-
Neelon, Brian and Lawson, Andrew B
- Subjects
SPATIAL analysis (Statistics) ,DATA - Abstract
An introduction is presented in which the editors discuss various reports within the issue, on topics including a bivariate probit model for joint spatial analysis of areal-referenced binary data, MacNab's exploration of identifiability issues, and approaches to the problem of missing data in spatial analysis.
- Published
- 2014
- Full Text
- View/download PDF
49. Certainties and Uncertainties from Using a Selection of Data to Predict Concert Hall Preference.
- Author
-
Skålevik, Magne
- Subjects
DATA ,CONCERT halls ,EVALUATION ,STATISTICAL correlation ,RESEARCH - Abstract
Over the past hundred years or so, many researchers have explored the possible correlation between physical properties of concert halls and listeners' assessment of the acoustics of the same halls. And we are still searching and researching. This author has previously shown how some sets of room acoustical parameters can, with their appropriate qualifying criteria, be used to explain the subjective ranking of a selection of halls from Beranek's rank ordering of 58 halls. A set of five listening aspects in ISO-3382 seems to be important, but trials with even more physical quantities have provided more explanation potential. A critical limitation in the research turned out to be the lack of a sufficient amount of subjective AND objective data, leading to the launch of an online concert hall acoustics rating survey. In this paper, the latest results from this author's investigation are presented, featuring a demonstration of how the size of the selected data affects the statistical uncertainties in such results. Remaining uncertainty in the prediction method naturally leads to a "safety first" policy with strict acceptance limits for the objective data. As a consequence, many appreciated halls would not be recommended for replication. These and other consequences need to be discussed in further work. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
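The point in the abstract above about sample size driving statistical uncertainty can be sketched with the standard Fisher z confidence interval for a correlation coefficient. This is a generic illustration, not the author's method; the correlation value and hall counts are invented.

```python
import math

def corr_ci(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for a correlation r
    observed on a selection of n halls (Fisher z approximation)."""
    z = math.atanh(r)            # Fisher z-transform of r
    se = 1.0 / math.sqrt(n - 3)  # standard error of z
    lo, hi = z - z_crit * se, z + z_crit * se
    return math.tanh(lo), math.tanh(hi)

# A correlation of 0.7 estimated from 10 halls versus all 58 halls:
ci_small = corr_ci(0.7, 10)
ci_large = corr_ci(0.7, 58)
```

The interval from 10 halls is much wider than the one from 58, which is the kind of size-dependent uncertainty the paper demonstrates for selections of objective and subjective data.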
50. Merging the Religious Congregations and Membership Studies: A Data File for Documenting American Religious Change.
- Author
-
Bacon, Rachel, Finke, Roger, and Jones, Dale
- Subjects
CHURCH membership ,RELIGIOUS statistics ,LONGITUDINAL method ,RELIGIOUS diversity ,SCHISM - Abstract
The decennial religious congregations and membership studies are a popular data source for analyzing local religious composition and diversity, but several methodological challenges hinder merging the datasets for longitudinal analyses. In this paper, we introduce strategies for addressing four of the most serious challenges: religious mergers and schisms, changes in membership standards within certain groups, missing data and changes in county boundaries. In doing so we successfully merge the 1980, 1990, 2000 and 2010 collections and build new longitudinal datasets of congregational and membership counts at the state and county levels. These changes increase religious group representation from 48 to 76, reduce bias from missing data, allow for the more reliable inclusion of 20-23 million adherents in each year, and improve overall ease of use. We also document instances when corrections were not possible and alert readers to the limitations of the merged files when measuring change among certain groups. The new longitudinal files are accessible from theARDA.com. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
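One of the merging challenges named in the abstract above, changes in county boundaries, can be sketched as a crosswalk applied before counts are compared across years. The FIPS codes and adherent counts below are assumptions for illustration (the old-to-new mapping echoes the Dade to Miami-Dade renaming), not values from the merged files.

```python
# Hypothetical sketch: re-keying one census year's county-level adherent
# counts onto current county codes via a crosswalk, summing any counties
# that were merged, so that decennial collections can be compared.
crosswalk = {"12025": "12086"}  # assumed old FIPS -> current FIPS

counts_1990 = {"12025": 50000, "12099": 20000}  # invented counts
counts_2010 = {"12086": 65000, "12099": 25000}

def harmonize(counts, crosswalk):
    """Map counts onto current county codes, summing merged counties."""
    out = {}
    for fips, n in counts.items():
        key = crosswalk.get(fips, fips)
        out[key] = out.get(key, 0) + n
    return out

h1990 = harmonize(counts_1990, crosswalk)
```

After harmonizing, both years are keyed on the same county codes, so change in adherents per county can be computed directly.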