13 results for "Paige, Nong"
Search Results
2. Public comfort with the use of ChatGPT and expectations for healthcare.
- Author: Jodyn Platt, Paige Nong, Renée Smiddy, Reema Hamasha, Gloria Carmona Clavijo, Joshua E. Richardson, and Sharon L. R. Kardia
- Published: 2024
- Full Text: View/download PDF
3. Public perspectives on the use of different data types for prediction in healthcare.
- Author: Paige Nong, Julia Adler-Milstein, Sharon L. R. Kardia, and Jodyn Platt
- Published: 2024
- Full Text: View/download PDF
4. Applying anti-racist approaches to informatics: a new lens on traditional frames.
- Author: Jodyn Platt, Paige Nong, Beza Merid, Minakshi Raj, Elizabeth Cope, Sharon L. R. Kardia, and Melissa Creary
- Published: 2023
- Full Text: View/download PDF
5. Clinical algorithms, racism, and 'fairness' in healthcare: A case of bounded justice
- Author: Sarah El-Azab and Paige Nong
- Subjects: General Works
- Abstract:
To date, attempts to address racially discriminatory clinical algorithms have largely focused on fairness and the development of models that “do no harm.” While the push for fairness is rooted in a desire to avoid or ameliorate health disparities, it generally neglects the role of racism in shaping health outcomes and does little to repair harm to patients. These limitations necessitate reconceptualizing how clinical algorithms should be designed and employed in pursuit of racial justice and health equity. A useful lens for this work is bounded justice, a concept and research analytic proposed by Melissa Creary to guide multidisciplinary health equity interventions. We describe how bounded justice offers a lens for (1) articulating the deep injustices embedded in the datasets, methodologies, and sociotechnical infrastructure underlying design and implementation of clinical algorithms and (2) envisioning how these algorithms can be redesigned to contribute to larger efforts that not only address current inequities but also redress the historical mistreatment of communities of color by biomedical institutions. Thus, the aim of this article is two-fold. First, we apply the bounded justice analytic to fairness and clinical algorithms by describing structural constraints on health equity efforts such as medical device regulatory frameworks, race-based medicine, and racism in data. We then reimagine how clinical algorithms could function as a reparative technology to support justice and empower patients in the healthcare system.
- Published: 2023
- Full Text: View/download PDF
6. Learning about COVID-19: sources of information, public trust, and contact tracing during the pandemic
- Author: Philip S. Amara, Jodyn E. Platt, Minakshi Raj, and Paige Nong
- Subjects: Contact tracing, COVID-19, Information sources, Misinformation, Public trust, Public aspects of medicine, RA1-1270
- Abstract:
Objective: To assess the association between public attitudes, beliefs, and information seeking about the COVID-19 pandemic and willingness to participate in contact tracing in Michigan.
Methods: Using data from the quarterly Michigan State of the State survey conducted in May 2020 (n = 1000), we conducted multiple regression analyses to identify factors associated with willingness to participate in COVID-19 contact tracing efforts.
Results: Perceived threat of the pandemic to personal health (B = 0.59, p =
- Published: 2022
- Full Text: View/download PDF
7. Policy Preferences Regarding Health Data Sharing Among Patients With Cancer: Public Deliberations
- Author: Minakshi Raj, Kerry Ryan, Philip Sahr Amara, Paige Nong, Karen Calhoun, M Grace Trinidad, Daniel Thiel, Kayte Spector-Bagdady, Raymond De Vries, Sharon Kardia, and Jodyn Platt
- Subjects: Neoplasms. Tumors. Oncology. Including cancer and carcinogens, RC254-282
- Abstract:
Background: Precision health offers the promise of advancing clinical care in data-driven, evidence-based, and personalized ways. However, complex data sharing infrastructures, for-profit (commercial) and nonprofit partnerships, and systems for data governance have been created with little attention to the values, expectations, and preferences of patients about how they want to be engaged in the sharing and use of their health information. We solicited patient opinions about institutional policy options using public deliberation methods to address this gap.
Objective: We aimed to understand the policy preferences of current and former patients with cancer regarding the sharing of health information collected in the contexts of health information exchange and commercial partnerships and to identify the values invoked and perceived risks and benefits of health data sharing considered by the participants when formulating their policy preferences.
Methods: We conducted 2 public deliberations, including predeliberation and postdeliberation surveys, with patients who had a current or former cancer diagnosis (n=61). Following informational presentations, the participants engaged in facilitated small-group deliberations to discuss and rank policy preferences related to health information sharing, such as the use of a patient portal, email or SMS text messaging, signage in health care settings, opting out of commercial data sharing, payment, and preservation of the status quo. The participants ranked their policy preferences individually, as small groups by mutual agreement, and then again individually in the postdeliberation survey.
Results: After deliberation, the patient portal was ranked as the most preferred policy choice. The participants ranked no change in status quo as the least preferred policy option by a wide margin. Throughout the study, the participants expressed concerns about transparency and awareness, convenience, and accessibility of information about health data sharing. Concerns about the status quo centered around a lack of transparency, awareness, and control. Specifically, the patients were not aware of how, when, or why their data were being used and wanted more transparency in these regards as well as greater control and autonomy around the use of their health data. The deliberations suggested that patient portals would be a good place to provide additional information about data sharing practices but that over time, notifications should be tailored to patient preferences.
Conclusions: Our study suggests the need for increased disclosure of health information sharing practices. Describing health data sharing practices through patient portals or other mechanisms personalized to patient preferences would minimize the concerns expressed by patients about the extent of data sharing that occurs without their knowledge. Future research and policies should identify ways to increase patient control over health data sharing without reducing the societal benefits of data sharing.
- Published: 2023
- Full Text: View/download PDF
8. Public Deliberation Process on Patient Perspectives on Health Information Sharing: Evaluative Descriptive Study
- Author: Minakshi Raj, Kerry Ryan, Paige Nong, Karen Calhoun, M Grace Trinidad, Raymond De Vries, Melissa Creary, Kayte Spector-Bagdady, Sharon L. R. Kardia, and Jodyn Platt
- Subjects: Neoplasms. Tumors. Oncology. Including cancer and carcinogens, RC254-282
- Abstract:
Background: Precision oncology is one of the fastest-developing domains of personalized medicine and is one of many data-intensive fields. Policy for health information sharing that is informed by patient perspectives can help organizations align practice with patient preferences and expectations, but many patients are largely unaware of the complexities of how and why clinical health information is shared.
Objective: This paper evaluates the process of public deliberation as an approach to understanding the values and preferences of current and former patients with cancer regarding the use and sharing of health information collected in the context of precision oncology.
Methods: We conducted public deliberations with patients who had a current or former cancer diagnosis. A total of 61 participants attended 1 of 2 deliberative sessions (session 1, n=28; session 2, n=33). Study team experts led two educational plenary sessions, and trained study team members then facilitated discussions with small groups of participants. Participants completed pre- and postdeliberation surveys measuring knowledge, attitudes, and beliefs about precision oncology and data sharing. Following informational sessions, participants discussed, ranked, and deliberated two policy-related scenarios in small groups and in a plenary session. In the analysis, we evaluate our process of developing the deliberative sessions, the knowledge gained by participants during the process, and the extent to which participants reasoned with complex information to identify policy preferences.
Results: The deliberation process was rated highly by participants. Participants felt they were listened to by their group facilitator, that their opinions were respected by their group, and that the process that led to the group’s decision was fair. Participants demonstrated improved knowledge of health data sharing policies between pre- and postdeliberation surveys, especially regarding the roles of physicians and health departments in health information sharing. Qualitative analysis of reasoning revealed that participants recognized complexity, made compromises, and engaged with trade-offs, considering both individual and societal perspectives related to health data sharing.
Conclusions: The deliberative approach can be valuable for soliciting the input of informed patients on complex issues such as health information sharing policy. Participants in our two public deliberations demonstrated that giving patients information about a complex topic like health data sharing and the opportunity to reason with others and discuss the information can help garner important insights into policy preferences and concerns. Data on public preferences, along with the rationale for information sharing, can help inform policy-making processes. Increasing transparency and patient engagement is critical to ensuring that data-driven health care respects patient autonomy and honors patient values and expectations.
- Published: 2022
- Full Text: View/download PDF
9. Discrimination, trust, and withholding information from providers: Implications for missing data and inequity
- Author: Paige Nong, Alicia Williamson, Denise Anthony, Jodyn Platt, and Sharon Kardia
- Subjects: Public aspects of medicine, RA1-1270, Social sciences (General), H1-99
- Abstract:
Quality care requires collaborative communication, information exchange, and decision-making between patients and providers. Complete and accurate data about patients and from patients are especially important as high volumes of data are used to build clinical decision support tools and inform precision medicine initiatives. However, systematically missing data can bias these tools and threaten their effectiveness. Data completeness relies in many ways on patients being comfortable disclosing information to their providers without prohibitive concerns about security or privacy. Patients are likely to withhold information in the context of low trust relationships with providers, but it is unknown how experiences of discrimination in the healthcare system also relate to non-disclosure. In this study, we assess the relationship between withholding information from providers, experiences of discrimination, and multiple types of patient trust. Using a nationally representative sample of US adults (n = 2,029), weighted logistic regression modeling indicated a statistically significant relationship between experiences of discrimination and withholding information from providers (OR 3.7; CI [2.6–5.2], p
- Published: 2022
- Full Text: View/download PDF
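Note: the abstract above reports odds ratios from weighted logistic regression on a nationally representative survey. As an illustration only, the following minimal Python sketch (statsmodels) shows how a survey-weighted logistic regression and the corresponding odds ratios with confidence intervals might be computed. The synthetic data, variable names (withheld_info, discrimination, trust_provider, weight), and resulting estimates are assumptions for demonstration and do not reproduce the study's data or findings.

# Illustrative sketch on synthetic data; not the authors' analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Synthetic survey-style data with a per-respondent weight.
df = pd.DataFrame({
    "discrimination": rng.binomial(1, 0.25, n),   # reported discrimination (0/1)
    "trust_provider": rng.uniform(1, 5, n),       # trust in provider (1-5 scale)
    "weight": rng.uniform(0.5, 2.0, n),           # survey weight
})
# Simulate the outcome (withholding information from providers) from a logistic model.
logit_p = -1.0 + 1.3 * df["discrimination"] - 0.4 * df["trust_provider"]
df["withheld_info"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Weighted logistic regression: binomial GLM with frequency weights.
X = sm.add_constant(df[["discrimination", "trust_provider"]])
model = sm.GLM(df["withheld_info"], X,
               family=sm.families.Binomial(),
               freq_weights=np.asarray(df["weight"])).fit()

# Exponentiated coefficients give odds ratios with 95% confidence intervals.
ci = model.conf_int()
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(ci[0]),
    "CI_high": np.exp(ci[1]),
})
print(odds_ratios.round(2))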
10. Early experiences with patient generated health data: health system and patient perspectives.
- Author: Julia Adler-Milstein and Paige Nong
- Published: 2019
- Full Text: View/download PDF
11. How Academic Medical Centers Govern AI Prediction Tools in the Context of Uncertainty and Evolving Regulation.
- Author: Paige Nong, Reema Hamasha, Karandeep Singh, Julia Adler-Milstein, and Jodyn Platt
- Subjects: Medical centers, Artificial intelligence, Machine learning, Medical care
- Abstract:
Prediction tools driven by artificial intelligence (AI) and machine learning are becoming increasingly integrated into health care delivery in the United States. However, organizational approaches to the governance of AI tools are highly varied. There is growing recognition of the need for evidence on best governance practices and multilayered oversight that could provide appropriate guardrails at the organizational and federal levels to address the unique dimensions of AI prediction tools. We sought to qualitatively characterize salient dimensions of AI-enabled predictive model governance at U.S. academic medical centers (AMCs). We analyzed how AMCs in the United States currently govern predictive models and consider the implications. A total of 17 individuals from 13 AMCs across the country participated in interviews. Half-hour to 1-hour interviews were conducted via Zoom from October 2022 to January 2023. The interview guide focused on the capacity, governance, regulation, and evaluation of AI-driven predictive models. Analysis of interview data was inductive. The research team wrote memos throughout the process of interviewing and analysis. We identified three governance phenotypes: well-defined, emerging, and interpersonal. In the well-defined governance phenotype, health systems have explicit, comprehensive procedures for the review and evaluation of AI and predictive models. In the emerging governance phenotype, systems are in the process of adjusting or adapting previously established approaches for clinical decision support or electronic health records (EHR) to govern AI. In health systems using interpersonal or individual-driven governance approaches, an individual is tasked with making decisions about model implementation without consistent evaluation requirements. We found that the influence of EHR vendors is an important consideration for those tasked with governance at AMCs, given concerns about regulatory gaps and the need for model evaluation. Even well-resourced AMCs are struggling to effectively identify, manage, and mitigate the myriad potential problems and pitfalls related to the implementation of predictive AI tools. The range of governance structures that we identified indicates a need for additional guidance, both regulatory and otherwise, for health systems as prediction and AI proliferate. Rather than concentrating responsibility for governance within organizations, multiple levels of governance that include the industry and regulators would better promote quality care and patient safety. This sort of structure would also provide desired guidance and support to the individuals tasked with governing these tools.
- Published: 2024
- Full Text: View/download PDF
12. Do people have an ethical obligation to share their health information? Comparing narratives of altruism and health information sharing in a nationally representative sample.
- Author: Minakshi Raj, Raymond De Vries, Paige Nong, Sharon L. R. Kardia, and Jodyn E. Platt
- Subjects: Medicine, Science
- Abstract:
Background: With the emergence of new health information technologies, health information can be shared across networks, with or without patients' awareness and/or their consent. It is often argued that there can be an ethical obligation to participate in biomedical research, motivated by altruism, particularly when risks are low. In this study, we explore whether altruism contributes to the belief that there is an ethical obligation to share information about one's health as well as how other health care experiences, perceptions, and concerns might be related to belief in such an obligation.
Methods: We conducted an online survey using the National Opinion Research Center's (NORC) probability-based, nationally representative sample of U.S. adults. Our final analytic sample included complete responses from 2069 participants. We used multivariable logistic regression to examine how altruism, together with other knowledge, attitudes, and experiences contribute to the belief in an ethical obligation to allow health information to be used for research.
Results: We find in multivariable regression that general altruism is associated with a higher likelihood of belief in an ethical obligation to allow one's health information to be used for research (OR = 1.22, SE = 0.14, p = 0.078). Trust in the health system and in care providers are both associated with a significantly higher likelihood of believing there is an ethical obligation to allow health information to be used (OR = 1.48, SE = 0.76, p
Conclusions: Belief that there is an ethical obligation to allow one's health information to be used for research is shaped by altruism and by one's experience with, and perceptions of, health care and by general concerns about the use of personal information. Altruism cannot be assumed and researchers must recognize the ways encounters with the health care system influence (un)willingness to share one's health information.
- Published: 2020
- Full Text: View/download PDF
13. Current Use And Evaluation Of Artificial Intelligence And Predictive Models In US Hospitals.
- Author: Paige Nong, Julia Adler-Milstein, Nate C. Apathy, A Jay Holmgren, and Jordan Everson
- Subjects: United States, Humans, Machine Learning, Electronic Health Records, Forecasting, Artificial Intelligence, Hospitals
- Abstract:
Effective evaluation and governance of predictive models used in health care, particularly those driven by artificial intelligence (AI) and machine learning, are needed to ensure that models are fair, appropriate, valid, effective, and safe, or FAVES. We analyzed data from the 2023 American Hospital Association Annual Survey Information Technology Supplement to identify how AI and predictive models are used and evaluated for accuracy and bias in hospitals. Hospitals use AI and predictive models to predict health trajectories or risks for inpatients, identify high-risk outpatients to inform follow-up care, monitor health, recommend treatments, simplify or automate billing procedures, and facilitate scheduling. We found that 65 percent of US hospitals used predictive models, and 79 percent of those used models from their electronic health record developer. Sixty-one percent of hospitals that used models evaluated them for accuracy using data from their health system (local evaluation), but only 44 percent reported local evaluation for bias. Hospitals that developed their own predictive models, had high operating margins, and were health system members were more likely to report local evaluation. Policy and programs that provide technical support, tools to assess FAVES principles, and educational resources would help ensure that all hospitals can use predictive models safely and prevent a new organizational digital divide in AI.
- Published: 2025
- Full Text: View/download PDF