15 results on '"Barton-Maclaren T"'
Search Results
2. Paving the way for application of next generation risk assessment to safety decision-making for cosmetic ingredients
- Author
-
Dent, M. P., Vaillancourt, E., Thomas, R. S., Carmichael, P. L., Ouedraogo, G., Kojima, H., Barroso, J., Ansell, J., Barton-Maclaren, T. S., Bennekou, Susanne Hougaard, Boekelheide, K., Ezendam, J., Field, J., Fitzpatrick, S., Hatao, M., Kreiling, R., Lorencini, M., Mahony, C., Montemayor, B., Mazaro-Costa, R., Oliveira, J., Rogiers, V., Smegal, D., Taalman, R., Tokura, Y., Verma, R., Willett, C., and Yang, C.
- Abstract
Next generation risk assessment (NGRA) is an exposure-led, hypothesis-driven approach that has the potential to support animal-free safety decision-making. However, significant effort is needed to develop and test the in vitro and in silico (computational) approaches that underpin NGRA to enable confident application in a regulatory context. A workshop was held in Montreal in 2019 to discuss where effort needs to be focussed and to agree on the steps needed to ensure safety decisions made on cosmetic ingredients are robust and protective. Workshop participants explored whether NGRA for cosmetic ingredients can be protective of human health, and reviewed examples of NGRA for cosmetic ingredients. From the limited examples available, it is clear that NGRA is still in its infancy, and further case studies are needed to determine whether safety decisions are sufficiently protective and not overly conservative. Seven areas were identified to help progress application of NGRA, including further investments in case studies that elaborate on scenarios frequently encountered by industry and regulators, including those where a 'high risk' conclusion would be expected. These will provide confidence that the tools and approaches can reliably discern differing levels of risk. Furthermore, frameworks to guide performance and reporting should be developed.
- Published
- 2021
3. Toxicity testing in the 21st century: progress in the past decade and future perspectives
- Author
-
Krewski, D., Andersen, M. E., Tyshenko, M. G., Krishnan, K., Hartung, T., Boekelheide, K., Wambaugh, J. F., Jones, D., Whelan, M., Thomas, R., Yauk, C., Barton-Maclaren, T., and Cote, I.
- Published
- 2019
- Full Text
- View/download PDF
4. Improving confidence in (Q)SAR predictions under Canada’s Chemicals Management Plan – a chemical space approach
- Author
-
Kulkarni, S. A., Benfenati, E., and Barton-Maclaren, T. S.
- Published
- 2016
- Full Text
- View/download PDF
5. High Throughput Read-Across for Screening a Large Inventory of Related Structures by Balancing Artificial Intelligence/Machine Learning and Human Knowledge.
- Author
-
Yang C, Rathman JF, Mostrag A, Ribeiro JV, Hobocienski B, Magdziarz T, Kulkarni S, and Barton-Maclaren T
- Subjects
- Humans, Machine Learning, Quantitative Structure-Activity Relationship, Risk Assessment, Artificial Intelligence, Reading
- Abstract
Read-across is an in silico method applied in chemical risk assessment for data-poor chemicals. The read-across outcomes for repeated-dose toxicity end points include the no-observed-adverse-effect level (NOAEL) and estimated uncertainty for a particular category of effects. We have previously developed a new paradigm for estimating NOAELs based on chemoinformatics analysis and experimental study qualities from selected analogues, not relying on quantitative structure-activity relationships (QSARs) or rule-based SAR systems, which are not well-suited to end points for which the underpinning data are weakly grounded in specific chemical-biological interactions. The central hypothesis of this approach is that similar compounds have similar toxicity profiles and, hence, similar NOAEL values. Analogue quality (AQ) quantifies the suitability of an analogue candidate for reading across to the target by considering similarity from structure, physicochemical, ADME (absorption, distribution, metabolism, excretion), and biological perspectives. Biological similarity is based on experimental data; assay vectors derived from aggregations of ToxCast/Tox21 data are used to derive machine learning (ML) hybrid rules that serve as biological fingerprints to capture target-analogue similarity relevant to specific effects of interest, for example, hormone receptors (ER/AR/THR). Once one or more analogues have been qualified for read-across, a decision theory approach is used to estimate confidence bounds for the NOAEL of the target. The confidence interval is dramatically narrowed when analogues are constrained to biologically related profiles. Although this read-across process works well for a single target with several analogues, it can become unmanageable when, for example, screening multiple targets (e.g., virtual screening library) or handling a parent compound having numerous metabolites. To this end, we have established a digitalized framework to enable the assessment of a large number of substances, while still allowing for human decisions for filtering and prioritization. This workflow was developed and validated through a use case of a large set of bisphenols and their metabolites.
- Published
- 2023
- Full Text
- View/download PDF
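The read-across workflow summarized in entry 5 estimates a target chemical's NOAEL from analogues weighted by analogue quality (AQ), with confidence bounds from a decision-theory step. As a rough illustration only (not the authors' implementation), the Python sketch below computes an AQ-weighted NOAEL estimate in log10 space with a crude spread-based interval; all analogue names, NOAEL values, AQ scores, and the coverage factor are hypothetical.
```python
import math

# Hypothetical analogues for a data-poor target: (name, NOAEL mg/kg-bw/day, analogue-quality score 0-1).
# AQ is meant to reflect structural, physicochemical, ADME, and biological (e.g., ToxCast-based) similarity.
analogues = [
    ("analogue_A", 15.0, 0.9),
    ("analogue_B", 40.0, 0.7),
    ("analogue_C", 10.0, 0.5),
]

# Work in log10 space, since NOAELs span orders of magnitude.
logs = [math.log10(noael) for _, noael, _ in analogues]
weights = [aq for _, _, aq in analogues]

wmean = sum(w * x for w, x in zip(weights, logs)) / sum(weights)
# Weighted standard deviation as a crude spread measure (stand-in for the decision-theoretic bound).
wvar = sum(w * (x - wmean) ** 2 for w, x in zip(weights, logs)) / sum(weights)
wsd = math.sqrt(wvar)

k = 2.0  # coverage factor; purely illustrative, a real bound would be derived, not fixed
lower, upper = 10 ** (wmean - k * wsd), 10 ** (wmean + k * wsd)
print(f"Point estimate: {10 ** wmean:.1f} mg/kg-bw/day")
print(f"Approximate confidence bounds: {lower:.1f} - {upper:.1f} mg/kg-bw/day")
```
Restricting the analogue set to biologically similar compounds, as the abstract describes, would show up here as a smaller spread and hence a tighter interval.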
6. In vitro transcriptomic analyses reveal pathway perturbations, estrogenic activities, and potencies of data-poor BPA alternative chemicals.
- Author
-
Matteo G, Leingartner K, Rowan-Carroll A, Meier M, Williams A, Beal MA, Gagné M, Farmahin R, Wickramasuriya S, Reardon AJF, Barton-Maclaren T, Christopher Corton J, Yauk CL, and Atlas E
- Subjects
- Humans, Estrone, Gene Expression Profiling, MCF-7 Cells, Estrogens adverse effects, Estrogens pharmacology, Benzhydryl Compounds toxicity, Estrogen Receptor alpha metabolism, Transcriptome
- Abstract
Since initial regulatory action in 2010 in Canada, bisphenol A (BPA) has been progressively replaced by structurally related alternative chemicals. Unfortunately, many of these chemicals are data-poor, limiting toxicological risk assessment. We used high-throughput transcriptomics to evaluate potential hazards and compare potencies of BPA and 15 BPA alternative chemicals in cultured breast cancer cells. MCF-7 cells were exposed to BPA and 15 alternative chemicals (0.0005-100 µM) for 48 h. TempO-Seq (BioSpyder Inc) was used to examine global transcriptomic changes and estrogen receptor alpha (ERα)-associated transcriptional changes. Benchmark concentration (BMC) analysis was conducted to identify 2 global transcriptomic points of departure: (1) the lowest pathway median gene BMC and (2) the 25th lowest rank-ordered gene BMC. ERα activation was evaluated using a published transcriptomic biomarker and an ERα-specific transcriptomic point of departure was derived. Genes fitting BMC models were subjected to upstream regulator and canonical pathway analysis in Ingenuity Pathway Analysis. Biomarker analysis identified BPA and 8 alternative chemicals as ERα active. Global and ERα transcriptomic points of departure produced highly similar potency rankings with bisphenol AF as the most potent chemical tested, followed by BPA and bisphenol C. Further, BPA and transcriptionally active alternative chemicals enriched similar gene sets associated with increased cell division and cancer-related processes. These data provide support for future read-across applications of transcriptomic profiling for risk assessment of data-poor chemicals and suggest that several BPA alternative chemicals may cause hazards at similar concentrations to BPA., (© His Majesty the King in Right of Canada, as represented by the Minister of Health, 2022.)
- Published
- 2023
- Full Text
- View/download PDF
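Entry 6 derives two global transcriptomic points of departure from gene-level benchmark concentrations (BMCs): the lowest pathway median gene BMC and the 25th lowest rank-ordered gene BMC. The sketch below illustrates those two summary statistics on an invented, undersized gene set; it is not the study's pipeline, and the pathway names, BMC values, and the fallback used when fewer than 25 genes fit a model are all assumptions for illustration.
```python
from statistics import median

# Hypothetical gene-level benchmark concentrations (BMCs, in µM) grouped by pathway.
gene_bmcs_by_pathway = {
    "cell_cycle":         [0.8, 1.2, 2.5, 3.0, 4.1],
    "estrogen_signaling": [0.3, 0.6, 0.9, 5.0],
    "oxidative_stress":   [2.0, 2.2, 6.5, 8.0, 9.1],
}

# POD 1: lowest pathway median gene BMC.
pathway_medians = {p: median(b) for p, b in gene_bmcs_by_pathway.items()}
pod_pathway = min(pathway_medians.values())

# POD 2: 25th lowest rank-ordered gene BMC. This toy gene set is too small,
# so fall back to the highest available BMC purely to keep the example runnable.
all_bmcs = sorted(b for bmcs in gene_bmcs_by_pathway.values() for b in bmcs)
pod_gene_rank = all_bmcs[24] if len(all_bmcs) >= 25 else all_bmcs[-1]

print(f"Lowest pathway median gene BMC: {pod_pathway:.2f} µM")
print(f"25th lowest rank-ordered gene BMC (or highest available): {pod_gene_rank:.2f} µM")
```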
7. Quantitative in vitro to in vivo extrapolation of genotoxicity data provides protective estimates of in vivo dose.
- Author
-
Beal MA, Audebert M, Barton-Maclaren T, Battaion H, Bemis JC, Cao X, Chen C, Dertinger SD, Froetschl R, Guo X, Johnson G, Hendriks G, Khoury L, Long AS, Pfuhler S, Settivari RS, Wickramasuriya S, and White P
- Subjects
- Animals, Humans, Mutation, Risk Assessment, Mutagenicity Tests methods, DNA Damage, Mutagens toxicity
- Abstract
Genotoxicity assessment is a critical component in the development and evaluation of chemicals. Traditional genotoxicity assays (i.e., mutagenicity, clastogenicity, and aneugenicity) have been limited to dichotomous hazard classification, while other toxicity endpoints are assessed through quantitative determination of points-of-departures (PODs) for setting exposure limits. The more recent higher-throughput in vitro genotoxicity assays, many of which also provide mechanistic information, offer a powerful approach for determining defined PODs for potency ranking and risk assessment. In order to obtain relevant human dose context from the in vitro assays, in vitro to in vivo extrapolation (IVIVE) models are required to determine what dose would elicit a concentration in the body demonstrated to be genotoxic using in vitro assays. Previous work has demonstrated that application of IVIVE models to in vitro bioactivity data can provide PODs that are protective of human health, but there has been no evaluation of how these models perform with in vitro genotoxicity data. Thus, the Genetic Toxicology Technical Committee, under the Health and Environmental Sciences Institute, conducted a case study on 31 reference chemicals to evaluate the performance of IVIVE application to genotoxicity data. The results demonstrate that for most chemicals considered here (20/31), the PODs derived from in vitro data and IVIVE are health protective relative to in vivo PODs from animal studies. PODs were also protective by assay target: mutations (8/13 chemicals), micronuclei (9/12), and aneugenicity markers (4/4). It is envisioned that this novel testing strategy could enhance prioritization, rapid screening, and risk assessment of genotoxic chemicals., (© 2022 His Majesty the King in Right of Canada and The Authors. Environmental and Molecular Mutagenesis published by Wiley Periodicals LLC on behalf of Environmental Mutagen Society. Reproduced with the permission of the Minister of Health Canada. This article has been contributed to by U.S. Government employees and their work is in the public domain in the USA.)
- Published
- 2023
- Full Text
- View/download PDF
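The IVIVE step in entry 7 asks what administered dose would produce an internal concentration equal to an in vitro genotoxicity point of departure. A minimal reverse-dosimetry sketch, assuming linear kinetics and a steady-state plasma concentration predicted for a unit dose (e.g., from an HTTK-style model), is shown below; the numbers and function name are hypothetical.
```python
# Minimal reverse-dosimetry sketch of the IVIVE step described in entry 7.
# Assumption (not from the paper): linear toxicokinetics, so the administered
# equivalent dose (AED) scales the in vitro point of departure (POD) by the
# steady-state plasma concentration produced by a unit oral dose.

def administered_equivalent_dose(pod_in_vitro_uM: float, css_uM_per_mg_kg_day: float) -> float:
    """AED (mg/kg-bw/day) producing a plasma concentration equal to the in vitro POD."""
    return pod_in_vitro_uM / css_uM_per_mg_kg_day

pod_in_vitro = 5.0   # µM, e.g., a benchmark concentration from an in vitro genotoxicity assay
css_unit_dose = 1.8  # µM plasma concentration predicted for a 1 mg/kg-bw/day dose

aed = administered_equivalent_dose(pod_in_vitro, css_unit_dose)
print(f"IVIVE-derived point of departure: {aed:.2f} mg/kg-bw/day")
```
Upper-percentile estimates of the steady-state concentration are commonly used so that the resulting dose is conservative, consistent with the 95th-percentile approach described in entry 14 below.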
8. Establishing a quantitative framework for regulatory interpretation of genetic toxicity dose-response data: Margin of exposure case study of 48 compounds with both in vivo mutagenicity and carcinogenicity dose-response data.
- Author
-
Chepelev N, Long AS, Beal M, Barton-Maclaren T, Johnson G, Dearfield KL, Roberts DJ, van Benthem J, and White P
- Subjects
- Animals, Humans, Mutagenicity Tests methods, Mutagenesis, DNA Damage, Rodentia, Mutagens toxicity, Carcinogens toxicity
- Abstract
Quantitative relationships between carcinogenic potency and mutagenic potency have been previously examined using a benchmark dose (BMD)-based approach. We extended those analyses by using human exposure data for 48 compounds to calculate carcinogenicity-derived and genotoxicity-derived margin of exposure values (MOEs) that can be used to prioritize substances for risk management. MOEs for 16 of the 48 compounds were below 10,000, and consequently highlighted for regulatory concern. Of these, 15 were highlighted using genotoxicity-derived (micronucleus [MN] dose-response data) MOEs. A total of 13 compounds were highlighted using carcinogenicity-derived MOEs; 12 compounds were overlapping. MOEs were also calculated using transgenic rodent (TGR) mutagenicity data. For 10 of the 12 compounds examined using TGR data, the results similarly revealed that mutagenicity-derived MOEs yield regulatory decisions that correspond with those based on carcinogenicity-derived MOEs. The effect of benchmark response (BMR) on MOE determination was also examined. Reinterpretation of the analyses using a BMR of 50% indicated that four out of 15 compounds prioritized using MN-derived MOEs based on a default BMR of 5% would have been missed. The results indicate that regulatory decisions based on in vivo genotoxicity dose-response data would be consistent with those based on carcinogenicity dose-response data; in some cases, genotoxicity-based decisions would be more conservative. Going forward, and in the absence of carcinogenicity data, in vivo genotoxicity assays (MN and TGR) can be used to effectively prioritize substances for regulatory action. Routine use of the MOE approach necessitates the availability of reliable human exposure estimates, and consensus regarding appropriate BMRs for genotoxicity endpoints., (© 2022 His Majesty the King in Right of Canada and The Authors. Environmental and Molecular Mutagenesis published by Wiley Periodicals LLC on behalf of Environmental Mutagen Society. Reproduced with the permission of the Minister of Health.)
- Published
- 2023
- Full Text
- View/download PDF
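Entry 8 prioritizes substances by comparing a genotoxicity- or carcinogenicity-derived benchmark dose (BMD) against a human exposure estimate: a margin of exposure (MOE) below 10,000 is highlighted for regulatory concern. The toy screen below simply divides a BMD by an exposure estimate and applies that threshold; chemical names and values are invented.
```python
# Margin-of-exposure (MOE) screen along the lines described in entry 8:
# MOE = benchmark dose / estimated human exposure; MOE < 10,000 is flagged.

MOE_THRESHOLD = 10_000

chemicals = [
    # (name, genotoxicity-derived BMD in mg/kg-bw/day, estimated human exposure in mg/kg-bw/day)
    ("chem_X", 1.5, 0.0002),
    ("chem_Y", 0.8, 0.000005),
]

for name, bmd, exposure in chemicals:
    moe = bmd / exposure
    flag = "prioritize" if moe < MOE_THRESHOLD else "lower concern"
    print(f"{name}: MOE = {moe:,.0f} -> {flag}")
```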
9. Principles and Procedures for Assessment of Acute Toxicity Incorporating In Silico Methods.
- Author
-
Zwickl CM, Graham J, Jolly R, Bassan A, Ahlberg E, Amberg A, Anger LT, Barton-Maclaren T, Beilke L, Bellion P, Brigo A, Cronin MTD, Custer L, Devlin A, Burleigh-Flayers H, Fish T, Glover K, Glowienke S, Gromek K, Jones D, Karmaus A, Kemper R, Piparo EL, Madia F, Martin M, Masuda-Herrera M, McAtee B, Mestre J, Milchak L, Moudgal C, Mumtaz M, Muster W, Neilson L, Patlewicz G, Paulino A, Roncaglioni A, Ruiz P, Suarez D, Szabo DT, Valentin JP, Vardakou I, Woolley D, and Myatt G
- Abstract
Acute toxicity in silico models are being used to support an increasing number of application areas including (1) product research and development, (2) product approval and registration as well as (3) the transport, storage and handling of chemicals. The adoption of such models is being hindered, in part, because of a lack of guidance describing how to perform and document an in silico analysis. To address this issue, a framework for an acute toxicity hazard assessment is proposed. This framework combines results from different sources including in silico methods and in vitro or in vivo experiments. In silico methods that can assist the prediction of in vivo outcomes (i.e., LD50) are analyzed concluding that predictions obtained using in silico approaches are now well-suited for reliably supporting assessment of LD50-based acute toxicity for the purpose of GHS classification. A general overview is provided of the endpoints from in vitro studies commonly evaluated for predicting acute toxicity (e.g., cytotoxicity/cytolethality as well as assays targeting specific mechanisms). The increased understanding of pathways and key triggering mechanisms underlying toxicity and the increased availability of in vitro data allow for a shift away from assessments solely based on endpoints such as LD50, to mechanism-based endpoints that can be accurately assessed in vitro or by using in silico prediction models. This paper also highlights the importance of an expert review of all available information using weight-of-evidence considerations and illustrates, using a series of diverse practical use cases, how in silico approaches support the assessment of acute toxicity.
- Published
- 2022
- Full Text
- View/download PDF
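Entry 9 notes that in silico LD50 predictions are now considered suitable for supporting GHS acute oral toxicity classification. The sketch below maps a predicted oral LD50 onto the commonly cited GHS category bands; verify the cut-offs against the current GHS revision before relying on them, and note that the chemicals and predicted values are hypothetical.
```python
# Map a predicted oral LD50 (mg/kg body weight) to a GHS acute oral toxicity category,
# using the commonly cited cut-off bands; confirm against the current GHS text before use.

def ghs_acute_oral_category(ld50_mg_per_kg: float) -> str:
    if ld50_mg_per_kg <= 5:
        return "Category 1"
    if ld50_mg_per_kg <= 50:
        return "Category 2"
    if ld50_mg_per_kg <= 300:
        return "Category 3"
    if ld50_mg_per_kg <= 2000:
        return "Category 4"
    if ld50_mg_per_kg <= 5000:
        return "Category 5 (optional)"
    return "Not classified"

for chem, predicted_ld50 in [("chem_A", 42.0), ("chem_B", 1800.0), ("chem_C", 7200.0)]:
    print(f"{chem}: predicted LD50 = {predicted_ld50} mg/kg -> {ghs_acute_oral_category(predicted_ld50)}")
```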
10. Innovation in regulatory approaches for endocrine disrupting chemicals: The journey to risk assessment modernization in Canada.
- Author
-
Barton-Maclaren TS, Wade M, Basu N, Bayen S, Grundy J, Marlatt V, Moore R, Parent L, Parrott J, Grigorova P, Pinsonnault-Cooper J, and Langlois VS
- Subjects
- Animals, Ecosystem, Endocrine System, Risk Assessment methods, Toxicity Tests, Endocrine Disruptors analysis, Endocrine Disruptors toxicity
- Abstract
Globally, regulatory authorities grapple with the challenge of assessing the hazards and risks to human and ecosystem health that may result from exposure to chemicals that disrupt the normal functioning of endocrine systems. The rapidly increasing number of chemicals in commerce, coupled with the reliance on traditional, costly animal experiments for hazard characterization (often with limited sensitivity to many important mechanisms of endocrine disruption), presents ongoing challenges for chemical regulation. The consequence is that few chemicals have sufficient data to assess endocrine toxicity, and hence few have thorough hazard characterization. To address this challenge, regulatory assessment of endocrine disrupting chemicals (EDCs) is benefiting from a revolution in toxicology that focuses on New Approach Methodologies (NAMs) to more rapidly identify, prioritize, and assess the potential risks from exposure to chemicals using novel, more efficient, and more mechanistically driven methodologies and tools. Incorporated into Integrated Approaches to Testing and Assessment (IATA) and guided by conceptual frameworks such as Adverse Outcome Pathways (AOPs), emerging approaches focus initially on molecular interactions between the test chemical and potentially vulnerable biological systems instead of the need for animal toxicity data. These new toxicity testing methods can be complemented with in silico and computational toxicology approaches, including those that predict chemical kinetics. Coupled with exposure data, these will inform risk-based decision-making approaches. Canada is part of a global network collaborating on building confidence in the use of NAMs for regulatory assessment of EDCs. Herein, we review current approaches to EDC regulation globally (mainly from the perspective of human health), provide a perspective on how advances in regulatory testing and assessment can be applied, and discuss the promises and challenges faced in adopting these novel approaches to minimize risks due to EDC exposure in Canada and worldwide., (Copyright © 2021. Published by Elsevier Inc.)
- Published
- 2022
- Full Text
- View/download PDF
11. Paving the way for application of next generation risk assessment to safety decision-making for cosmetic ingredients.
- Author
-
Dent MP, Vaillancourt E, Thomas RS, Carmichael PL, Ouedraogo G, Kojima H, Barroso J, Ansell J, Barton-Maclaren TS, Bennekou SH, Boekelheide K, Ezendam J, Field J, Fitzpatrick S, Hatao M, Kreiling R, Lorencini M, Mahony C, Montemayor B, Mazaro-Costa R, Oliveira J, Rogiers V, Smegal D, Taalman R, Tokura Y, Verma R, Willett C, and Yang C
- Subjects
- Risk Assessment, Animal Testing Alternatives methods, Consumer Product Safety standards, Cosmetics standards
- Abstract
Next generation risk assessment (NGRA) is an exposure-led, hypothesis-driven approach that has the potential to support animal-free safety decision-making. However, significant effort is needed to develop and test the in vitro and in silico (computational) approaches that underpin NGRA to enable confident application in a regulatory context. A workshop was held in Montreal in 2019 to discuss where effort needs to be focussed and to agree on the steps needed to ensure safety decisions made on cosmetic ingredients are robust and protective. Workshop participants explored whether NGRA for cosmetic ingredients can be protective of human health, and reviewed examples of NGRA for cosmetic ingredients. From the limited examples available, it is clear that NGRA is still in its infancy, and further case studies are needed to determine whether safety decisions are sufficiently protective and not overly conservative. Seven areas were identified to help progress application of NGRA, including further investments in case studies that elaborate on scenarios frequently encountered by industry and regulators, including those where a 'high risk' conclusion would be expected. These will provide confidence that the tools and approaches can reliably discern differing levels of risk. Furthermore, frameworks to guide performance and reporting should be developed., (Copyright © 2021 The Authors. Published by Elsevier Inc. All rights reserved.)
- Published
- 2021
- Full Text
- View/download PDF
12. 4,5,6,7-Tetrabromo-2,3-dihydro-1,1,3-trimethyl-3-(2,3,4,5-tetrabromophenyl)-1H-indene (OBTMPI): Levels in humans and in silico toxicological profiles.
- Author
-
Das D, Kulkarni S, Barton-Maclaren T, and Zhu J
- Abstract
Limited human exposure and toxicity data are currently available for 4,5,6,7-Tetrabromo-2,3-dihydro-1,1,3-trimethyl-3-(2,3,4,5-tetrabromophenyl)-1H-indene (OBTMPI), a flame retardant often used for high-temperature applications of various polymer materials. Levels of OBTMPI in a cohort population that includes children and their co-residing parents (n = 217) in Canada were determined. The detection frequency of OBTMPI in the samples was 22.6%. OBTMPI levels were in general at sub- to low ng/g lipid weight levels, with a 95th percentile of 15.6 ng/g lipid weight. Compared to an earlier study conducted in 2008-2009 in the same region, results from this study show an increase in both detection frequency and concentration of OBTMPI. In silico toxicity predictions using Multicase CaseUltra and Leadscope Model Applier suggested that OBTMPI, and its possible metabolites in humans, while unlikely to be carcinogenic or mutagenic, exhibit some estrogen antagonist, androgen antagonist and estrogen binding capability reflective of possible endocrine disrupting properties., Competing Interests: Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper., (Crown Copyright © 2021. Published by Elsevier Ltd. All rights reserved.)
- Published
- 2021
- Full Text
- View/download PDF
13. Internationalization of read-across as a validated new approach method (NAM) for regulatory toxicology.
- Author
-
Rovida C, Barton-Maclaren T, Benfenati E, Caloni F, Chandrasekera PC, Chesné C, Cronin MTD, De Knecht J, Dietrich DR, Escher SE, Fitzpatrick S, Flannery B, Herzler M, Hougaard Bennekou S, Hubesch B, Kamp H, Kisitu J, Kleinstreuer N, Kovarich S, Leist M, Maertens A, Nugent K, Pallocca G, Pastor M, Patlewicz G, Pavan M, Presgrave O, Smirnova L, Schwarz M, Yamada T, and Hartung T
- Subjects
- Animal Testing Alternatives, Animals, Humans, Internationality, Toxicology methods, Computer Simulation, Hazardous Substances toxicity, Reproducibility of Results, Risk Assessment, Toxicology legislation & jurisprudence
- Abstract
Read-across (RAx) translates available information from well-characterized chemicals to a substance for which there is a toxicological data gap. The OECD is working on case studies to probe general applicability of RAx, and several regulations (e.g., EU-REACH) already allow this procedure to be used to waive new in vivo tests. The decision to prepare a review on the state of the art of RAx as a tool for risk assessment for regulatory purposes was taken during a workshop with international experts in Ranco, Italy in July 2018. Three major issues were identified that need optimization to allow a higher regulatory acceptance rate of the RAx procedure: (i) the definition of similarity of source and target, (ii) the translation of biological/toxicological activity of source to target in the RAx procedure, and (iii) how to deal with issues of ADME that may differ between source and target. The use of new approach methodologies (NAM) was discussed as one of the most important innovations to improve the acceptability of RAx. At present, NAM data may be used to confirm chemical and toxicological similarity. In the future, the use of NAM may be broadened to fully characterize the hazard and toxicokinetic properties of RAx compounds. Concerning available guidance, documents on Good Read-Across Practice (GRAP) and on best practices to perform and evaluate the RAx process were identified. In particular, the RAx guidance being developed by the European Commission’s H2020 project EU-ToxRisk, together with many external partners with regulatory experience, is presented here.
- Published
- 2020
- Full Text
- View/download PDF
14. Utility of In Vitro Bioactivity as a Lower Bound Estimate of In Vivo Adverse Effect Levels and in Risk-Based Prioritization.
- Author
-
Paul Friedman K, Gagne M, Loo LH, Karamertzanis P, Netzeva T, Sobanski T, Franzosa JA, Richard AM, Lougee RR, Gissi A, Lee JJ, Angrish M, Dorne JL, Foster S, Raffaele K, Bahadori T, Gwinn MR, Lambert J, Whelan M, Rasenberg M, Barton-Maclaren T, and Thomas RS
- Subjects
- Drug-Related Side Effects and Adverse Reactions, Humans, No-Observed-Adverse-Effect Level, Hazardous Substances toxicity, Risk Assessment methods
- Abstract
Use of high-throughput, in vitro bioactivity data in setting a point-of-departure (POD) has the potential to accelerate the pace of human health safety evaluation by informing screening-level assessments. The primary objective of this work was to compare PODs based on high-throughput predictions of bioactivity, exposure predictions, and traditional hazard information for 448 chemicals. PODs derived from new approach methodologies (NAMs) were obtained for this comparison using the 50th (POD_NAM,50) and the 95th (POD_NAM,95) percentile credible interval estimates for the steady-state plasma concentration used in in vitro to in vivo extrapolation of administered equivalent doses. Of the 448 substances, 89% had a POD_NAM,95 that was less than the traditional POD (POD_traditional) value. For the 48 substances for which POD_traditional < POD_NAM,95, the POD_NAM and POD_traditional were typically within a factor of 10 of each other, and there was an enrichment of chemical structural features associated with organophosphate and carbamate insecticides. When POD_traditional < POD_NAM,95, it did not appear to result from an enrichment of POD_traditional based on a particular study type (eg, developmental, reproductive, and chronic studies). Bioactivity:exposure ratios, useful for identification of substances with potential priority, demonstrated that high-throughput exposure predictions were greater than the POD_NAM,95 for 11 substances. When compared with threshold of toxicological concern (TTC) values, the POD_NAM,95 was greater than the corresponding TTC value 90% of the time. This work demonstrates the feasibility, and continuing challenges, of using in vitro bioactivity as a protective estimate of POD in screening-level assessments via a case study., (Published by Oxford University Press on behalf of the Society of Toxicology 2019. This work is written by US Government employees and is in the public domain in the US.)
- Published
- 2020
- Full Text
- View/download PDF
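Entry 14 screens chemicals with a bioactivity:exposure ratio (BER), comparing the NAM-based point of departure (POD_NAM,95) against a high-throughput exposure prediction, and checks whether the NAM-based POD is protective relative to the traditional POD. The sketch below reproduces only that arithmetic on invented records; it is not the published workflow or data.
```python
# Bioactivity:exposure ratio (BER) sketch in the spirit of entry 14:
# BER = POD_NAM,95 / upper-bound exposure estimate; smaller ratios suggest higher priority.

records = [
    # (name, POD_NAM,95 mg/kg-bw/day, POD_traditional mg/kg-bw/day, upper-bound exposure mg/kg-bw/day)
    ("chem_1", 0.05, 0.50, 0.00001),
    ("chem_2", 2.00, 0.80, 0.00500),
]

for name, pod_nam95, pod_trad, exposure in records:
    ber = pod_nam95 / exposure
    protective = pod_nam95 <= pod_trad  # NAM-based POD at or below the traditional POD
    print(f"{name}: BER = {ber:,.0f}; NAM POD protective of traditional POD: {protective}")
```
Small BERs flag candidates for closer review, mirroring the risk-based prioritization use described in the abstract.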
15. Emerging technologies for food and drug safety.
- Author
-
Slikker W Jr, de Souza Lima TA, Archella D, de Silva JB Junior, Barton-Maclaren T, Bo L, Buvinich D, Chaudhry Q, Chuan P, Deluyker H, Domselaar G, Freitas M, Hardy B, Eichler HG, Hugas M, Lee K, Liao CD, Loo LH, Okuda H, Orisakwe OE, Patri A, Sactitono C, Shi L, Silva P, Sistare F, Thakkar S, Tong W, Valdez ML, Whelan M, and Zhao-Wong A
- Subjects
- Animals, Drug Evaluation, Preclinical, Humans, Legislation, Drug, Legislation, Food, Risk Assessment, Toxicity Tests, Drug-Related Side Effects and Adverse Reactions, Food Safety
- Abstract
Emerging technologies are playing a major role in the generation of new approaches to assess the safety of both foods and drugs. However, the integration of emerging technologies in the regulatory decision-making process requires rigorous assessment and consensus amongst international partners and research communities. To that end, the Global Coalition for Regulatory Science Research (GCRSR) in partnership with the Brazilian Health Surveillance Agency (ANVISA) hosted the seventh Global Summit on Regulatory Science (GSRS17) in Brasilia, Brazil on September 18-20, 2017 to discuss the role of new approaches in regulatory science with a specific emphasis on applications in food and medical product safety. The global regulatory landscape concerning the application of new technologies was assessed in several countries worldwide. Challenges and issues were discussed in the context of developing an international consensus for objective criteria in the development, application and review of emerging technologies. The need for advanced approaches to allow for faster, less expensive and more predictive methodologies was elaborated. In addition, the strengths and weaknesses of each new approach were discussed. Finally, the need for standards and reproducible approaches was reviewed to enhance the application of the emerging technologies to improve food and drug safety. The overarching goal of GSRS17 was to provide a venue where regulators and researchers meet to develop collaborations addressing the most pressing scientific challenges and facilitate the adoption of novel technical innovations to advance the field of regulatory science., (Published by Elsevier Inc.)
- Published
- 2018
- Full Text
- View/download PDF