122 results for "James NT"
Search Results
2. Board Meeting Effectiveness and Bank Performance: Evidence from Sub-Saharan Africa
- Authors
- Lawrence Asare Boadi, Philomena Dadzie, Sampson Narteh-Yoe, James Ntiamoah Doku, Richard Takyi Opoku, and Kofi Afriyie Nyamekye
- Subjects
Board Meeting, Bank, Performance, GMM, sub-Saharan Africa, Business, HF5001-6182, Economics as a science, HB71-74
- Abstract
The main objective of the study was to evaluate the effectiveness of board meetings and its relationship to bank performance in sub-Saharan Africa. The study adopted a quantitative research design and collected secondary data from the annual reports of 33 banks in sub-Saharan Africa covering the period 2011–2020. To achieve its research objective, the study employed the two-step GMM regression model to examine the relationship between board meeting effectiveness and bank performance (a generic two-step GMM sketch follows this record). The analysis found a positive and statistically significant relationship between the effectiveness of board meetings and the financial performance of banks. The results suggest that improving the efficiency of board meetings can be a viable strategy for boosting bank performance, with significant ramifications for bank governance in sub-Saharan Africa.
- Published
- 2023
- Full Text
- View/download PDF
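The two-step GMM estimator named in this abstract can be illustrated with a minimal, self-contained sketch. This is a generic linear-IV version of the estimator family, not the authors' dynamic-panel (system GMM) specification, and all data and parameter values below are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=(n, 3))                       # instruments
v = rng.normal(size=n)
x = z @ np.array([0.6, 0.3, 0.2]) + v             # regressor
u = 0.8 * v + rng.normal(size=n)                  # x is endogenous: shares noise with u
y = 1.5 * x + u                                   # true coefficient: 1.5

X = np.column_stack([np.ones(n), x])              # design matrix with intercept
Z = np.column_stack([np.ones(n), z])              # instrument matrix

def gmm_linear(y, X, Z, W):
    """Closed-form linear GMM: minimize the quadratic form of sample moments Z'(y - Xb)."""
    A = X.T @ Z @ W @ Z.T @ X
    b = X.T @ Z @ W @ Z.T @ y
    return np.linalg.solve(A, b)

# Step 1: provisional weight matrix (2SLS weighting).
beta1 = gmm_linear(y, X, Z, np.linalg.inv(Z.T @ Z / n))

# Step 2: re-weight with the inverse of the estimated moment covariance.
u_hat = y - X @ beta1
S = (Z * u_hat[:, None] ** 2).T @ Z / n           # heteroskedasticity-robust
beta2 = gmm_linear(y, X, Z, np.linalg.inv(S))
print(beta2)                                      # approximately [0, 1.5]
```

The second step is what makes the estimator "two-step": the first-step residuals supply an efficient weight matrix for the final estimate.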
3. Optimum birth interval (36–48 months) may reduce the risk of undernutrition in children: A meta-analysis
- Authors
- James Ntambara, Wendi Zhang, Anni Qiu, Zhounan Cheng, and Minjie Chu
- Subjects
birth interval, undernutrition, underweight, stunting, wasting, Nutrition. Foods and food supply, TX341-641
- Abstract
Background: Although some studies have highlighted short birth interval as a risk factor for adverse child nutrition outcomes, the question of whether and to what extent a long birth interval leads to better nutritional outcomes in children remains unclear. Methods: In this quantitative meta-analysis, we evaluate the relationship between different birth interval groups and child nutrition outcomes, including underweight, wasting, and stunting. Results: Forty-six studies with a total of 898,860 children were included in the study. Compared with a short birth interval of
- Published
- 2023
- Full Text
- View/download PDF
4. Tetrandrine Treatment May Improve Clinical Outcome in Patients with COVID-19
- Authors
- Shiyin Chen, Yiran Liu, Juan Ge, Jianzhong Yin, Ting Shi, James Ntambara, Zhounan Cheng, Minjie Chu, and Hongyan Gu
- Subjects
COVID-19, SARS-CoV-2, tetrandrine, clinical outcome, traditional Chinese medicine, Medicine (General), R5-920
- Abstract
Background and Objectives: The COVID-19 pandemic continues worldwide, and no definitively effective treatment exists. Chinese medicine is considered the recommended treatment for COVID-19 in China. This study aimed to examine the effectiveness of tetrandrine, a compound originally derived from Chinese medicine, in treating COVID-19. Materials and Methods: A total of 60 patients with a diagnosis of COVID-19 from Daye Hospital of Chinese Medicine, categorized into three types (mild, moderate, severe), were included in this study. Demographics, medical history, treatment, and outcomes were collected. We defined two main groups according to clinical outcome: improvement and recovery. All underlying factors, including clinical outcomes, were assessed in the full cohort of COVID-19 patients and in the moderate-type patients. Results: Across all 60 patients, there were significant differences in clinical outcome by treatment with antibiotics, tetrandrine, and arbidol (p < 0.05). When the comparison was limited to the moderate type, treatment with tetrandrine further increased the recovery rate (p = 0.007). However, the difference disappeared, and no association was indicated between clinical outcome and treatment with or without antibiotics (p = 0.224) or arbidol (p = 0.318) in the moderate-type patients. In both all-type and moderate-type patients, tetrandrine improved the rate of improvement in cough and fatigue by day 7 (p < 0.05). Conclusions: Tetrandrine may improve clinical outcome in COVID-19 patients and could be a promising natural antiviral agent for the prevention and treatment of COVID-19.
- Published
- 2022
- Full Text
- View/download PDF
5. PAPR reduction in LTE network using both peak windowing and clipping techniques
- Authors
- Richard Musabe, Mafrebo B. Lionel, Victoire Mugongo Ushindi, Mugisha Atupenda, James Ntaganda, and Gaurav Bajpai
- Subjects
PAPR, Clipping techniques, Peak windowing, OFDM, LTE, Electrical engineering. Electronics. Nuclear engineering, TK1-9971, Information technology, T58.5-58.64
- Abstract
Multicarrier orthogonal frequency division multiplexing (OFDM) modulation is a solution for the high-speed, secure data transmission requirements of 4G technologies. A high peak-to-average power ratio (PAPR) is one major drawback of OFDM systems. Prior research describes several PAPR reduction techniques, notably peak windowing and clipping, and the aim of this paper is to use these techniques to reduce PAPR. The simulations use quadrature amplitude modulation (QAM) with additive white Gaussian noise (AWGN) as the channel condition (a minimal clipping sketch follows this record). The simulation results show that clipping with a threshold level of 0.7 reduces PAPR by about 8 dB, while peak windowing with a Kaiser window reduces PAPR by about 11 dB.
- Published
- 2019
- Full Text
- View/download PDF
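The clipping technique in this abstract is simple enough to sketch end to end. The sketch below generates one OFDM symbol, measures its PAPR, and clips amplitude peaks; the subcarrier count, 16-QAM constellation, and the interpretation of the 0.7 threshold (here, a fraction of the peak amplitude) are illustrative assumptions, and the paper's threshold definition may differ. Peak windowing would additionally smooth the clipped samples (e.g., with a Kaiser window) to limit spectral regrowth.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sc = 64                                        # number of subcarriers (assumed)
levels = rng.integers(0, 4, size=(2, n_sc))      # 16-QAM: 4 amplitude levels per axis
qam = (2 * levels[0] - 3) + 1j * (2 * levels[1] - 3)

x = np.fft.ifft(qam) * np.sqrt(n_sc)             # time-domain OFDM symbol

def papr_db(sig):
    """Peak-to-average power ratio in dB."""
    p = np.abs(sig) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Clip any sample whose magnitude exceeds the threshold, preserving its phase.
thresh = 0.7 * np.abs(x).max()
clipped = np.where(np.abs(x) > thresh, thresh * x / np.abs(x), x)

print(f"PAPR before: {papr_db(x):.1f} dB, after clipping: {papr_db(clipped):.1f} dB")
```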
6. Frailty and delayed graft function in kidney transplant recipients.
- Authors
- Garonzik-Wang JM, Govindan P, Grinnan JW, Liu M, Ali HM, Chakraborty A, Jain V, Ros RL, James NT, Kucirka LM, Hall EC, Berger JC, Montgomery RA, Desai NM, Dagher NN, Sonnenday CJ, Englesbe MJ, Makary MA, Walston JD, and Segev DL
- Published
- 2012
7. Factors associated with older persons' physical health in rural Uganda.
- Authors
- Fred Maniragaba, Abel Nzabona, John Bosco Asiimwe, Emmanuel Bizimungu, John Mushomi, James Ntozi, and Betty Kwagala
- Subjects
Medicine, Science
- Abstract
Introduction: The proportion of older persons in developing countries is increasing, with no clear evidence of improvement in physical health. The aim of this paper was to examine the factors associated with older persons' physical health in rural Uganda. Methods: This paper is based on a cross-sectional study of 912 older persons aged 60 years and older across four major regions of Uganda. The study was conceptualized based on the World Health Organization Quality of Life-BREF (WHOQOL-BREF). Analysis was done at three levels: frequency distributions were generated to describe the background characteristics of respondents, cross-tabulations were done to determine associations between the dependent variable and each of the independent variables, and ordinal logistic regression was used to determine the predictors of physical health. Results: The likelihood of good physical health was higher among older persons (OPs) who controlled their household assets (OR = 3.64; CI = 1.81-7.30) or whose household assets were controlled by their spouses (OR = 4.44; CI = 1.91-10.32), relative to those whose household assets were controlled by their children. The likelihood of good physical health was also higher among those who engaged in physical activities (OR = 2.28; CI = 1.52-3.43) compared to those who did not. Conclusion: The findings have various policy implications, including creating an enabling environment and building the capacity of older persons to remain in control of their household assets. Interventions that deepen sensitization of older persons about the importance of physical exercise could be a viable strategy for improving their physical health.
- Published
- 2019
- Full Text
- View/download PDF
8. Predictors of quality of life of older persons in rural Uganda: A cross sectional study [version 2; peer review: 1 approved, 2 approved with reservations]
- Authors
- Fred Maniragaba, Betty Kwagala, Emmanuel Bizimungu, Stephen Ojiambo Wandera, and James Ntozi
- Subjects
Medicine, Science
- Abstract
Background: Little is known about the quality of life of older persons (OPs) in Uganda in particular, and Africa in general. This study examined factors associated with the quality of life of older persons in rural Uganda. Methods: We performed a cross-sectional survey of 912 older persons from the four regions of Uganda. Data were analyzed at the univariate, bivariate, and multivariate levels, where ordinal logistic regression was applied. Results: Older persons in the northern (OR=0.39; CI=0.224-0.711) and western (OR=0.33; CI=0.185-0.594) regions had poorer quality of life relative to those in the central region. Those who were HIV positive had poorer quality of life (OR=0.45; CI=0.220-0.928) compared to those who were HIV negative. In contrast, living in a permanent house predicted good quality of life (OR=2.04; CI=1.391-3.002). Older persons whose household assets were controlled by their spouses had higher odds of good quality of life (OR=2.06; CI=1.032-4.107) relative to those whose assets were controlled by their children. Conclusion: Interventions mitigating the impact of HIV and AIDS on quality of life should target older persons. The government of Uganda should consider improving housing conditions for older persons in rural areas.
- Published
- 2018
- Full Text
- View/download PDF
9. Studies on the histochemical demonstration of myosin ATPase activity in rabbit intrafusal muscle fibers
- Author
- James Nt
- Subjects
Histology, Myosin ATPase activity, Chemistry, Rabbit, Anatomy, Cell biology
- Published
- 1973
10. Prevalence and risk factors for self-reported non-communicable diseases among older Ugandans: a cross-sectional study
- Authors
- Stephen Ojiambo Wandera, Betty Kwagala, and James Ntozi
- Subjects
Africa, Uganda, chronic diseases, non-communicable diseases, elderly, Public aspects of medicine, RA1-1270
- Abstract
Background: There is limited evidence about the prevalence of and risk factors for non-communicable diseases (NCDs) among older Ugandans. This article therefore investigates the prevalence of self-reported NCDs and their associated risk factors using a nationally representative sample. Design: We conducted a secondary analysis of the 2010 Uganda National Household Survey (UNHS) using a weighted sample of 2,382 older people. We generated frequency distributions for descriptive statistics, used Pearson chi-square tests to identify associations between self-reported NCDs and selected explanatory variables, and finally fitted multivariable complementary log-log regressions to estimate the risk factors for self-reported NCDs among older people in Uganda. Results: About 2 in 10 (23%) older persons reported at least one NCD, including hypertension (16%), diabetes (3%), and heart disease (9%). Among all older people, reporting NCDs was more common among those aged 60-69 and 70-79; Muslims; and Pentecostals and Seventh Day Adventists (SDAs). In addition, the likelihood of reporting NCDs was higher among older persons who depended on remittances and earned wages, owned a bicycle, were sick in the last 30 days, were disabled, or were women. Conversely, the odds of reporting NCDs were lower for those who were relatives of household heads and those who were poor. Conclusions: In Uganda, self-reported NCDs were associated with advanced age, being a woman, having a disability, ill health in the past 30 days, being rich, dependence on remittances and wage earnings, being Muslim, Pentecostal, or SDA, and household headship. The Ministry of Health should prevent and manage NCDs by creating public awareness and improving the supply of essential drugs for these conditions. Finally, specialised surveillance studies of older people are needed to monitor the trends and patterns of NCDs over time.
- Published
- 2015
- Full Text
- View/download PDF
11. Prevalence and correlates of disability among older Ugandans: evidence from the Uganda National Household Survey
- Authors
- Stephen O. Wandera, James Ntozi, and Betty Kwagala
- Subjects
disability, socio-economic vulnerability, older people, non-communicable diseases, Uganda, Public aspects of medicine, RA1-1270
- Abstract
Background: Nationally representative evidence on the burden and determinants of disability among older people in sub-Saharan Africa in general, and Uganda in particular, is limited. Objective: The aim of this study was to estimate the prevalence and investigate the correlates of disability among older people in Uganda. Design: We conducted secondary analysis of data from a sample of 2,382 older persons from the Uganda National Household Survey. Disability was operationalized as either: 1) having a lot of difficulty on any one question; 2) being unable to perform on any one question; or, 3) having some difficulty with two of the six domains. We used frequency distributions for description, chi-square tests for initial associations, and multivariable logistic regressions to assess the associations. Results: A third of the older population was disabled. Among all older persons, disability was associated with advancement in age (OR=4.91, 95% CI: 3.38–7.13), rural residence (0.56, 0.37–0.85), living alone (1.56, 1.07–2.27), separated or divorced (1.96, 1.31–2.94) or widowed (1.86, 1.32–2.61) marital status, households’ dependence on remittances (1.48, 1.10–1.98), ill health (2.48, 1.95–3.15), and non-communicable diseases (NCDs) (1.81, 0.80–2.33). Gender was not associated with disability among older persons. Conclusions: Disability was associated with advancement in age, rural residence, living alone, divorced/separated/widowed marital status, dependence on remittances, ill health, and NCDs. Interventions to improve health and functioning of older people need to focus on addressing social inequalities and on the early preventive interventions and management of NCDs in old age in Uganda.
- Published
- 2014
- Full Text
- View/download PDF
12. Comparison of Three DNA Extraction Methods on Bone and Blood Stains up to 43 Years Old and Amplification of Three Different Gene Sequences
- Authors
- Cattaneo C, Craig OE, James NT, and Sokol RJ
- Abstract
Extraction of amplifiable DNA from degraded human material in the forensic context remains a problem, and maximization of yield and elimination of inhibitors of the polymerase chain reaction (PCR) are important issues which rarely feature in comparative studies. The present work used PCR amplification of three DNA sequences (HLA DPB1, amelogenin and mitochondrial) to assess the efficiency of three methods for extracting DNA (sodium acetate, magnetic beads and glass-milk) from 32 skeletal samples and 25 blood stains up to 43 years old. The results, analyzed using multivariate statistics, confirmed that the extraction method was crucial to the subsequent detection of amplification products; the glass-milk protocol performed better than sodium acetate, which in turn was better than magnetic beads. Successful amplification also depended on the gene sequence, with multiple-copy mitochondrial sequences performing best; among the single-copy sequences, however, the longer HLA DPB1 (327 bp) was detected just as often as the shorter amelogenin (106/112 bp). Amplification products were obtained more frequently from blood stains than bone, perhaps reflecting differences inherent in the material, and from younger compared with older specimens, though a plateau seemed to occur after 10 years. PCR inhibitors were more frequent in sodium acetate extracts.
- Published
- 1997
- Full Text
- View/download PDF
13. Socio-cultural inhibitors to use of modern contraceptive techniques in rural Uganda: a qualitative study
- Authors
- Allen Kabagenyi, Alice Reid, James Ntozi, and Lynn Atuyambe
- Subjects
uganda, contraceptive use, cultural inhibitions, beliefs, practices and family planning, Medicine
- Abstract
INTRODUCTION: Family planning is a cost-effective strategy for reducing maternal and child morbidity and mortality rates. Yet in Uganda, the contraceptive prevalence rate is only 30% among married women, in conjunction with a persistently high fertility rate of 6.2 children per woman. These demographic indicators have contributed to a high population growth rate of over 3.2% annually. This study examines the role of socio-cultural inhibitions in the use of modern contraceptives in rural Uganda. METHODS: This was a qualitative study conducted in 2012 among men aged 15-64 and women aged 15-49 in the districts of Mpigi and Bugiri in rural Uganda. Eighteen selected focus group discussions (FGDs), each internally homogeneous, and eight in-depth interviews (IDIs) were conducted among men and women. Data were collected on sociocultural beliefs and practices, barriers to modern contraceptive use, and perceptions of and attitudes to contraceptive use. All interviews were tape recorded, translated, and transcribed verbatim. All transcripts were coded, arranged into categories, and analyzed using a latent content analysis approach with the support of ATLAS.ti qualitative software. Suitable quotations were used to provide in-depth explanations of the findings. RESULTS: Three themes central to hindering the uptake of modern contraceptives emerged: (i) persistence of socio-cultural beliefs and practices promoting births (such as polygamy, extending the family lineage, replacement of the dead, gender-based violence, power relations, and twin myths); (ii) continued reliance on traditional family planning practices; and (iii) misconceptions and fears about modern contraception. CONCLUSION: Sociocultural expectations and values attached to marriage, women, and child bearing remain an impediment to using family planning methods. The study suggests a need to eradicate the cultural beliefs and practices that hinder people from using contraceptives, as well as a need to scale up family planning services and sensitization at the grassroots.
- Published
- 2016
- Full Text
- View/download PDF
14. Colocalization of SCD1 and DGAT2: implying preference for endogenous monounsaturated fatty acids in triglyceride synthesis
- Authors
- Weng Chi Man, Makoto Miyazaki, Kiki Chu, and James Ntambi
- Subjects
stearoyl-coenzyme A desaturase 1, acyl-coenzyme A:diacylglycerol acyltransferase 2, fluorescence resonance energy transfer, Biochemistry, QD415-436
- Abstract
Stearoyl-coenzyme A desaturase (SCD) is an endoplasmic reticulum (ER) protein that catalyzes the Δ9-cis desaturation of saturated fatty acids. Mice with targeted disruption of SCD1 (Scd1−/−) show a significant reduction in the tissue content of triglycerides, suggesting that monounsaturated fatty acids endogenously synthesized by SCD1 are important for triglyceride synthesis. Acyl-coenzyme A:diacylglycerol acyltransferase (DGAT) catalyzes the final reaction in the synthesis of triglycerides, and the lack of DGAT2, one of the two DGAT isoforms, results in an almost complete loss of tissue triglycerides. We hypothesize that SCD1 participates in triglyceride synthesis by providing a more accessible pool of monounsaturated fatty acids through substrate channeling. In this study, we test whether SCD1 is proximal to DGAT2 using colocalization by confocal microscopy, coimmunoprecipitation, and fluorescence resonance energy transfer, with HeLa cells as the model system. All of the results suggest that SCD1 and DGAT2 are located very close to each other in the ER, an important criterion for substrate channeling. By performing subcellular fractionation of mouse livers, we also show, for the first time, that SCD is present in the mitochondria-associated membrane.
- Published
- 2006
- Full Text
- View/download PDF
15. Enhanced Photocatalytic Properties and Photoinduced Crystallization of TiO2-Fe2O3 Inverse Opals Fabricated by Atomic Layer Deposition.
- Authors
- Hedrich C, James NT, Maragno LG, de Lima V, González SYG, Blick RH, Zierold R, and Furlan KP
- Abstract
The use of solar energy for photocatalysis holds great potential for sustainable pollution reduction. Titanium dioxide (TiO2) is a benchmark material, effective under ultraviolet light but limited in visible light utilization, restricting its application in solar-driven photocatalysis. Previous studies have shown that semiconductor heterojunctions and nanostructuring can broaden TiO2's photocatalytic spectral range. Semiconductor heterojunctions are interfaces formed between two different semiconductor materials that can be engineered; type II heterojunctions in particular facilitate charge separation, and they can be obtained by combining TiO2 with, for example, iron(III) oxide (Fe2O3). Nanostructuring in the form of 3D inverse opals (IOs) has demonstrated increased TiO2 light absorption efficiency by tailoring light-matter interactions through the photonic crystal structure, specifically the photonic stopband, which can give rise to a slow photon effect hypothesized to enhance the generation of free charges. This work addresses both effects simultaneously, through the synthesis of TiO2-Fe2O3 IOs via multilayer atomic layer deposition (ALD) and the characterization of their photocatalytic activities. Our results reveal that the complete functionalization of TiO2 IOs with Fe2O3 increases photocatalytic activity through the slow photon effect and semiconductor heterojunction formation. We systematically explore the influence of Fe2O3 thickness on photocatalytic performance; a maximum photocatalytic rate constant of 1.38 ± 0.09 h⁻¹ is observed for a 252 nm template TiO2-Fe2O3 bilayer IO consisting of 16 nm TiO2 and 2 nm Fe2O3. Further tailoring the performance by overcoating with additional TiO2 layers enhances photoinduced crystallization and tunes the photocatalytic properties. These findings highlight the potential of TiO2-Fe2O3 IOs for efficient water pollutant removal and the importance of precise nanostructuring and heterojunction engineering in advancing photocatalytic technologies.
- Published
- 2024
- Full Text
- View/download PDF
16. Catamenial Haemorrhagic Pleural Effusion Caused by Thoracic Endometriosis, which was Controlled by Surgery Undertaken after Failed Medical Management.
- Authors
- Hassanein MFK, Herminie V, James NT, and Chetty D
- Abstract
Haemorrhagic pleural effusion can be a challenging diagnosis that requires thorough investigation and sometimes a multidisciplinary team of physicians to reach the underlying aetiology. Causes include pulmonary malignancy, pulmonary infections, connective tissue diseases, asbestos exposure, intra-abdominal conditions such as pancreatitis and ovarian tumours, cardiovascular disorders such as ruptured aneurysms and pulmonary infarction, and other miscellaneous causes. One such cause is endometriosis in the thoracic cavity. Endometriosis is a chronic illness in which endometrial tissue occurs outside the endometrium; implantation of endometrial tissue in the thoracic cavity is rare, with only a few cases described. This case report details a 30-year-old nulligravida suspected of having thoracic endometriosis following a history of catamenial dyspnoea and associated pleural effusion. The diagnosis was confirmed through histopathological study of tissue obtained via thoracoscopic surgery. The endometrial tissue was excised, and the patient then continued medical treatment with progestins and gonadotrophin-releasing hormone (GnRH) agonists. Following therapy, the index patient was asymptomatic. A multidisciplinary approach, involving both medical and surgical specialities, is often needed in the diagnosis and management of thoracic endometriosis. Minimally invasive surgery is the gold standard of diagnosis, allowing direct visualisation of implants and nodules, and should be followed by medical treatment to reduce the risk of recurrence; medical therapy alone is associated with higher rates of recurrence. Physicians must maintain a high degree of suspicion, as thoracic endometriosis is a disease that can often be missed. Learning Points: Thoracic endometriosis syndrome is a rare but significant cause of haemorrhagic pleural effusion in women of childbearing age. Diagnosis and treatment can be challenging, and a multidisciplinary approach has been found to improve outcomes.
- Published
- 2024
- Full Text
- View/download PDF
17. Addressing Multiple Detection Limits with Semiparametric Cumulative Probability Models.
- Authors
- Tian Y, Li C, Tu S, James NT, Harrell FE, and Shepherd BE
- Abstract
Detection limits (DLs), where a variable cannot be measured outside of a certain range, are common in research, and DLs may vary across study sites or over time. Most approaches to handling DLs in response variables implicitly make strong parametric assumptions about the distribution of data outside the DLs. We propose a new approach to dealing with multiple DLs based on a widely used ordinal regression model, the cumulative probability model (CPM). The CPM is a rank-based, semiparametric linear transformation model that can handle mixed distributions of continuous and discrete outcome variables. These features are key for analyzing data with DLs because, while observations inside DLs are continuous, those outside DLs are censored and generally put into discrete categories. With a single lower DL, CPMs assign values below the DL the lowest rank (a toy illustration of this rank-based robustness follows this record). With multiple DLs, the CPM likelihood can be modified to appropriately distribute probability mass. We demonstrate the use of CPMs with DLs via simulations and a data example. This work is motivated by a study investigating factors associated with HIV viral load 6 months after starting antiretroviral therapy in Latin America; 56% of observations are below lower DLs that vary across study sites and over time.
- Published
- 2024
- Full Text
- View/download PDF
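The key property the abstract relies on, that a CPM uses only the ordering of the response, can be shown with a toy example: because observations below a single lower DL are tied at the lowest rank, any placeholder value below the DL yields the same rank-based association. This illustrates the principle only, not the authors' modified likelihood for multiple DLs; the data are simulated.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
x = rng.normal(size=300)
y = np.exp(1 + 0.5 * x + rng.normal(size=300))   # skewed outcome, e.g. a viral load

dl = np.quantile(y, 0.4)                         # 40% of values fall below the DL
y_a = np.where(y < dl, dl / 2, y)                # one placeholder for censored values
y_b = np.where(y < dl, 1e-6, y)                  # a very different placeholder

# Rank-based association is identical either way; a parametric mean is not.
print(spearmanr(x, y_a)[0], spearmanr(x, y_b)[0])
print(y_a.mean(), y_b.mean())                    # the means disagree
```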
18. Variant-based heritability assessment of dexmedetomidine and fentanyl clearance in pediatric patients.
- Authors
- Shannon ML, Muhammad A, James NT, Williams ML, Breeyear J, Edwards T, Mosley JD, Choi L, Kannankeril P, and Van Driest S
- Subjects
- Humans, Child, Fentanyl, Genome-Wide Association Study, Bayes Theorem, Dexmedetomidine
- Abstract
Despite complex pathways of drug disposition, clinical pharmacogenetic predictors currently rely on only a few high-effect variants. Quantification of the polygenic contribution to variability in drug disposition is necessary to prioritize target drugs for pharmacogenomic approaches and guide analytic methods. Dexmedetomidine and fentanyl, often used in postoperative care of pediatric patients, have high rates of inter-individual variability in dosing requirements. Analyzing previously generated population pharmacokinetic parameters, we used Bayesian hierarchical mixed modeling to measure narrow-sense (additive) heritability (h²SNP) of dexmedetomidine and fentanyl clearance in children and identify the relative contributions of small, moderate, and large effect-size variants to h²SNP. We used genome-wide association studies (GWAS) to identify variants contributing to variation in dexmedetomidine and fentanyl clearance, followed by functional analyses to identify associated pathways. For dexmedetomidine, median clearance was 33.0 L/h (interquartile range [IQR] 23.8-47.9 L/h) and h²SNP was estimated to be 0.35 (90% credible interval 0.00-0.90), with 45% of h²SNP attributed to large-, 32% to moderate-, and 23% to small-effect variants. The fentanyl cohort had a median clearance of 8.2 L/h (IQR 4.7-16.7 L/h), with an estimated h²SNP of 0.30 (90% credible interval 0.00-0.84). Large-effect variants accounted for 30% of h²SNP, whereas moderate- and small-effect variants accounted for 37% and 33%, respectively. As expected, given small sample sizes, no individual variants or pathways were significantly associated with dexmedetomidine or fentanyl clearance by GWAS. We conclude that clearance of both drugs is highly polygenic, motivating the future use of polygenic risk scores to guide appropriate dosing of dexmedetomidine and fentanyl.
- Published
- 2023
- Full Text
- View/download PDF
19. Burnout: A predictor of oral health impact profile among Nigerian early career doctors.
- Authors
- Ogunsuji OO, Adebayo O, Kanmodi KK, Fagbule OF, Adeniyi AM, James NT, Yahya AI, Salihu MO, Babarinde T, Olaopa O, Selowo T, Enebeli UU, and Ishaya DG
- Subjects
- Humans, Quality of Life psychology, Cross-Sectional Studies, Fluorides, Surveys and Questionnaires, Burnout, Psychological, Oral Health, Mouth Diseases
- Abstract
Associations have been reported between oral health disorders and burnout, stress, and mental health. Given these reported associations and the current prevalence of burnout amongst Nigerian doctors, exploring the role of burnout in the oral health of Nigerian doctors is timely. This study aims to determine the relationship between burnout and oral health-related quality of life (OHRQoL) amongst Early Career Doctors (ECDs) in Nigeria, while also identifying the role other possible predictors play in this relationship. This was a cross-sectional study conducted amongst Nigerian ECDs as part of the Challenges of Residency Training in Nigeria (CHARTING) II project. A total of 632 ECDs were recruited across thirty-one tertiary hospitals in the 6 geopolitical zones of the country using a multistage cluster sampling technique. A self-administered, paper-based, semi-structured questionnaire was given to each participant who consented. The tools used to assess burnout and OHRQoL were the Copenhagen Burnout Inventory (CBI) and the Oral Health Impact Profile (OHIP-14), respectively. Independent-samples t-tests, ANOVA, and multiple linear regression were used to draw inferences from the data collected. The overall mean OHIP-14 score of all participants was 11.12 (±9.23). The scores for the 3 dimensions of burnout were below 50%, with CBI-Personal Burnout having the highest score of 49.96 (±19.15). Significant positive correlations (p < 0.001) were found between OHIP-14 and all the dimensions of burnout: as burnout scores increased, there was a corresponding increase in OHIP scores and thus poorer OHRQoL. The regression model shows that the predictors of OHIP were CBI-PB (p = 0.003), use of fluoride paste (p = 0.039), use of tobacco (p = 0.005), and being a denture user (p = 0.047). This study shows a positive correlation between burnout and the OHIP of ECDs: as burnout increased, OHIP increased, implying poorer oral health-related quality of life amongst ECDs. The use of fluoride toothpaste, tobacco, and dentures are other factors that could affect the OHIP of ECDs.
- Published
- 2023
- Full Text
- View/download PDF
20. Population pharmacokinetic analysis of dexmedetomidine in children using real-world data from electronic health records and remnant specimens.
- Authors
- James NT, Breeyear JH, Caprioli R, Edwards T, Hachey B, Kannankeril PJ, Keaton JM, Marshall MD, Van Driest SL, and Choi L
- Subjects
- Adult, Child, Electronic Health Records, Glucuronosyltransferase genetics, Humans, Hypnotics and Sedatives, Models, Biological, Cardiac Surgical Procedures, Dexmedetomidine
- Abstract
Aims: Our objectives were to perform a population pharmacokinetic analysis of dexmedetomidine in children using remnant specimens and electronic health records (EHRs) and to explore the impact of patient characteristics and pharmacogenetics on dexmedetomidine clearance. Methods: Dexmedetomidine dosing and patient data were gathered from EHRs and combined with opportunistically sampled remnant specimens. Population pharmacokinetic models were developed using nonlinear mixed-effects modelling. Stage 1 developed a model without genotype variables; Stage 2 added pharmacogenetic effects. Results: Our final study population included 354 post-cardiac surgery patients aged 0-22 years (median 16 months). The data were best described with a 2-compartment model with allometric scaling for weight and a Hill maturation function for age (a sketch of this clearance structure follows this record). Population parameter estimates and 95% confidence intervals were 27.3 L/h (24.0-31.1 L/h) for total clearance, 161 L (139-187 L) for central compartment volume of distribution, 26.0 L/h (22.5-30.0 L/h) for intercompartmental clearance, and 7,903 L (5,617-11,119 L) for peripheral compartment volume of distribution. The estimate for the postmenstrual age at which 50% of adult clearance is achieved was 42.0 weeks (41.5-42.5 weeks), and the Hill coefficient estimate was 7.04 (6.99-7.08). Genotype was not statistically or clinically significant. Conclusion: Our study demonstrates the use of real-world EHR data and remnant specimens to perform a population pharmacokinetic analysis and investigate covariate effects in a large paediatric population. Weight and age were important predictors of clearance. We did not find evidence for pharmacogenetic effects of UGT1A4 or UGT2B10 genotype or CYP2A6 risk score.
- Published
- 2022
- Full Text
- View/download PDF
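The clearance structure reported in this abstract (allometric weight scaling plus a Hill maturation function of postmenstrual age) can be written out directly. The parameter values come from the abstract; the exact parameterization, including the 70 kg reference weight and 0.75 allometric exponent, is the conventional form for this structure and is an assumption here.

```python
CL_POP = 27.3    # L/h, population total clearance (from the abstract)
PMA50 = 42.0     # weeks, postmenstrual age at 50% of mature clearance
HILL = 7.04      # Hill coefficient

def clearance(weight_kg: float, pma_weeks: float) -> float:
    """Individual dexmedetomidine clearance (L/h) under the sketched model."""
    size = (weight_kg / 70.0) ** 0.75                               # allometric scaling
    maturation = pma_weeks**HILL / (PMA50**HILL + pma_weeks**HILL)  # sigmoidal maturation
    return CL_POP * size * maturation

# Example: a 10 kg infant at 60 weeks PMA vs. a 70 kg, fully matured patient.
print(clearance(10, 60), clearance(70, 200))
```

The steep Hill coefficient (about 7) implies clearance matures rapidly around 42 weeks postmenstrual age, consistent with weight and age being the dominant covariates.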
21. Building longitudinal medication dose data using medication information extracted from clinical notes in electronic health records.
- Authors
- McNeer E, Beck C, Weeks HL, Williams ML, James NT, Bejan CA, and Choi L
- Subjects
- Drug Therapy, Humans, Information Storage and Retrieval methods, Algorithms, Electronic Health Records, Natural Language Processing, Pharmaceutical Preparations administration & dosage
- Abstract
Objective: To develop an algorithm for building longitudinal medication dose datasets using information extracted from clinical notes in electronic health records (EHRs). Materials and Methods: We developed an algorithm that converts medication information extracted using natural language processing (NLP) into a usable format and builds longitudinal medication dose datasets (a simplified illustration of this step follows this record). We evaluated the algorithm on 2 medications extracted from clinical notes of Vanderbilt's EHR and externally validated the algorithm using clinical notes from the MIMIC-III clinical care database. Results: For the evaluation using Vanderbilt's EHR data, the performance of our algorithm was excellent; F1-measures were ≥0.98 for both dose intake and daily dose. For the external validation using MIMIC-III, the algorithm achieved F1-measures ≥0.85 for dose intake and ≥0.82 for daily dose. Discussion: Our algorithm addresses the challenge of building longitudinal medication dose data using information extracted from clinical notes. Overall performance was excellent, but the algorithm can perform poorly when incorrect information is extracted by NLP systems. Although it performed reasonably well when applied to the external data source, its performance was worse due to differences in the way the drug information was written. The algorithm is implemented in the R package "EHR," and the extracted data from Vanderbilt's EHRs, along with the gold standards, are provided so that users can reproduce the results and help improve the algorithm. Conclusion: Our algorithm for building longitudinal dose data provides a straightforward way to use EHR data for medication-based studies. The external validation results suggest its potential applicability to other systems.
- Published
- 2021
- Full Text
- View/download PDF
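The data-building step the abstract describes can be caricatured in a few lines: take dose mentions extracted from notes and expand them into a per-day dose series, carrying each regimen forward until the next mention. Column names and the carry-forward rule here are illustrative assumptions and do not reflect the API of the authors' R package "EHR".

```python
import pandas as pd

# Hypothetical NLP output: one row per extracted dose mention.
mentions = pd.DataFrame({
    "patient_id": [1, 1, 1],
    "note_date": pd.to_datetime(["2020-01-01", "2020-01-10", "2020-02-01"]),
    "dose_mg": [100.0, 150.0, 150.0],
    "freq_per_day": [2, 2, 1],
})
mentions["daily_dose_mg"] = mentions["dose_mg"] * mentions["freq_per_day"]

def build_daily(df: pd.DataFrame) -> pd.DataFrame:
    """Expand a patient's mentions into one row per day, carrying doses forward."""
    df = df.sort_values("note_date").set_index("note_date")
    days = pd.date_range(df.index.min(), df.index.max(), freq="D")
    return df["daily_dose_mg"].reindex(days).ffill().rename_axis("date").reset_index()

daily = mentions.groupby("patient_id").apply(build_daily)
print(daily.head())
```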
22. Development of a System for Postmarketing Population Pharmacokinetic and Pharmacodynamic Studies Using Real-World Data From Electronic Health Records.
- Authors
- Choi L, Beck C, McNeer E, Weeks HL, Williams ML, James NT, Niu X, Abou-Khalil BW, Birdwell KA, Roden DM, Stein CM, Bejan CA, Denny JC, and Van Driest SL
- Subjects
- Adolescent, Adult, Aged, Analgesics, Opioid pharmacokinetics, Databases, Factual statistics & numerical data, Female, Humans, Male, Middle Aged, Product Surveillance, Postmarketing methods, Young Adult, Data Interpretation, Statistical, Electronic Health Records statistics & numerical data, Fentanyl pharmacokinetics, Lamotrigine pharmacokinetics, Product Surveillance, Postmarketing statistics & numerical data, Tacrolimus pharmacokinetics
- Abstract
Postmarketing population pharmacokinetic (PK) and pharmacodynamic (PD) studies can be useful to capture patient characteristics affecting PK or PD in real-world settings. These studies require longitudinally measured dose, outcomes, and covariates in large numbers of patients; however, prospective data collection is cost-prohibitive. Electronic health records (EHRs) can be an excellent source for such data, but there are challenges, including accurate ascertainment of drug dose. We developed a standardized system to prepare datasets from EHRs for population PK/PD studies. Our system handles a variety of tasks involving data extraction from clinical text using a natural language processing algorithm, data processing, and data building. Applying this system, we performed a fentanyl population PK analysis, resulting in parameter estimates comparable to a prior study. This new system makes the EHR data extraction and preparation process more efficient and accurate and provides a powerful tool to facilitate postmarketing population PK/PD studies using information available in EHRs.
- Published
- 2020
- Full Text
- View/download PDF
23. REACT: A paraprofessional training program for first responders-A pilot study.
- Authors
- Marks MR, Bowers C, DePesa NS, Trachik B, Deavers FE, and James NT
- Subjects
- Adult, Female, Humans, Male, Middle Aged, Pilot Projects, Program Evaluation, Emergency Responders education, Emergency Responders psychology, Health Knowledge, Attitudes, Practice, Mental Health
- Abstract
The purpose of this study was to evaluate a newly designed peer support training program for first responders titled Recognize, Evaluate, Advocate, Coordinate, and Track (REACT). REACT was developed in partnership with public safety agencies to address the need to promote psychological health, resulting in a program that uses a train-the-trainer methodology to address primary prevention of stress injuries. REACT was an all-day training consisting of four modules, each featuring instruction and practice. Six public safety agencies totaling 30 individuals (76.9% from four fire departments, 23.1% from two emergency communication centers) participated in REACT. The primary outcomes were knowledge and training-related self-efficacy; secondary outcomes included general self-efficacy, resilience, and improved attitudes and expectations. A peer-support model using a train-the-trainer methodology is a promising approach for promoting psychological health.
- Published
- 2017
- Full Text
- View/download PDF
24. Sleep assessments for a mild traumatic brain injury trial in a military population.
- Authors
- Walker JM, James NT, Campbell H, Wilson SH, Churchill S, and Weaver LK
- Subjects
- Actigraphy, Adult, Cataplexy etiology, Female, Humans, Male, Middle Aged, Narcolepsy diagnosis, Narcolepsy etiology, Narcolepsy physiopathology, Polysomnography, Post-Concussion Syndrome therapy, Restless Legs Syndrome etiology, Sleep Apnea, Obstructive diagnosis, Sleep Apnea, Obstructive etiology, Sleep Initiation and Maintenance Disorders drug therapy, Sleep Initiation and Maintenance Disorders etiology, Sleep Initiation and Maintenance Disorders physiopathology, Stress Disorders, Post-Traumatic complications, Surveys and Questionnaires, Brain Concussion complications, Military Personnel, Sleep Initiation and Maintenance Disorders diagnosis
- Abstract
Baseline sleep characteristics were explored for 71 U.S. military service members with mild traumatic brain injury (mTBI) enrolled in a post-concussive syndrome clinical trial. The Pittsburgh Sleep Quality Index (PSQI), a sleep diary, several disorder-specific questionnaires, actigraphy, and a polysomnographic nap were collected. Almost all (97%) reported ongoing sleep problems. The mean global PSQI score was 13.5 (SD=3.8) and 87% met insomnia criteria. Sleep maintenance efficiency was 79.1% by PSQI, 82.7% by sleep diary, and 90.5% by actigraphy; total sleep time was 288, 302, and 400 minutes, respectively. There was no correlation between actigraphy and the subjective questionnaires. Overall, 70% met hypersomnia conditions, 70% were at high risk for obstructive sleep apnea (OSA), 32% were symptomatic for restless legs syndrome, and 6% reported cataplexy. Nearly half (44%) reported coexisting insomnia, hypersomnia, and high OSA risk. Participants with post-traumatic stress disorder (PTSD) had higher PSQI scores and increased OSA risk. Older participants and those with higher aggression, anxiety, or depression also had increased OSA risk. The results confirm poor sleep quality in mTBI, with insomnia, hypersomnia, and OSA risk higher than previously reported, and imply that sleep disorders in mTBI may be underdiagnosed or exacerbated by comorbid PTSD.
- Published
- 2016
25. Extraction Time of Kidneys From Deceased Donors and Impact on Outcomes.
- Authors
- Osband AJ, James NT, and Segev DL
- Subjects
- Adult, Cadaver, Delayed Graft Function epidemiology, Female, Follow-Up Studies, Glomerular Filtration Rate, Graft Rejection epidemiology, Graft Survival, Humans, Incidence, Kidney Function Tests, Male, Prognosis, Registries, Risk Factors, Transplantation, Homologous, Cold Ischemia methods, Delayed Graft Function etiology, Graft Rejection etiology, Kidney Failure, Chronic surgery, Kidney Transplantation adverse effects, Postoperative Complications, Tissue and Organ Procurement methods
- Abstract
Cold ischemia time (from flush to out-of-ice) and warm ischemia time (from out-of-ice to reperfusion) are known to impact delayed graft function (DGF) rates and long-term allograft survival following deceased donor kidney transplantation. We propose an additional ischemia time, extraction time, beginning with aortic cross-clamp and perfusion/cooling of the kidneys, and ending with removal of the kidneys and placement on ice on the backtable. During this time the kidneys rewarm, suffering an additional ischemic insult, which may impair transplant function. We measured extraction times of 576 kidneys recovered and transplanted locally between January 2006 and December 2008, then linked to Scientific Registry of Transplant Recipients (SRTR) data for outcomes. Extraction time ranged from 14 to 123 min, with a mean of 44.7 min. In SRTR-adjusted analyses, longer extraction time and DGF were statistically associated (odds ratio [OR] = 1.19 per 5 min beyond 60 min, 95% confidence interval [CI] 1.02-1.39, p = 0.03). Up to 60 min of extraction time, DGF incidence was 27.8%; by 120 min it doubled to nearly 60%. Although not statistically significant (OR = 1.19, 95% CI 0.96-1.49, p = 0.11), primary nonfunction rate also rose dramatically to nearly 20% by 120 min extraction time. Extraction time is a novel and important factor to consider when evaluating a deceased donor kidney offer and when strategizing personnel for kidney recovery.
- Published
- 2016
- Full Text
- View/download PDF
26. Center-level variation in the development of delayed graft function after deceased donor kidney transplantation.
- Authors
- Orandi BJ, James NT, Hall EC, Van Arendonk KJ, Garonzik-Wang JM, Gupta N, Montgomery RA, Desai NM, and Segev DL
- Subjects
- Adult, Aged, Female, Humans, Logistic Models, Male, Middle Aged, Tissue Donors, Delayed Graft Function etiology, Kidney Transplantation adverse effects
- Abstract
Background: Patient-level risk factors for delayed graft function (DGF) have been well described. However, the Organ Procurement and Transplantation Network definition of DGF is based on dialysis in the first week, which is subject to center-level practice patterns. It remains unclear if there are center-level differences in DGF and if measurable center characteristics can explain these differences. Methods: Using the 2003 to 2012 Scientific Registry of Transplant Recipients data, we developed a hierarchical (multilevel) model to determine the association between center characteristics and DGF incidence after adjusting for known patient risk factors and to quantify residual variability across centers after adjustment for these factors. Results: Of 82,143 deceased donor kidney transplant recipients, 27.0% developed DGF, with a range across centers of 3.2% to 63.3%. A center's proportion of preemptive transplants (odds ratio [OR], 0.83 per 5% increment; 95% confidence interval [95% CI], 0.74-0.93; P = 0.001) and kidneys with longer than 30 hr of cold ischemia time (CIT) (OR, 0.95 per 5% increment; 95% CI, 0.92-0.98; P = 0.001) were associated with less DGF. A center's proportion of donation after cardiac death donors (OR, 1.12 per 5% increment; 95% CI, 1.03-1.17; P < 0.001) and imported kidneys (OR, 1.06 per 5% increment; 95% CI, 1.03-1.10; P < 0.001) were associated with more DGF. After patient-level and center-level adjustments, only 41.8% of centers had DGF incidences consistent with the national median and 28.2% had incidences above the national median. Conclusion: Significant heterogeneity in DGF incidences across centers, even after adjusting for patient-level and center-level characteristics, calls into question the generalizability and validity of the current DGF definition. Enhanced understanding of center-level variability and improving the definition of DGF accordingly may improve DGF's utility in clinical care and as a surrogate endpoint in clinical trials.
- Published
- 2015
- Full Text
- View/download PDF
27. Choosing the order of deceased donor and living donor kidney transplantation in pediatric recipients: a Markov decision process model.
- Authors
- Van Arendonk KJ, Chow EK, James NT, Orandi BJ, Ellison TA, Smith JM, Colombani PM, and Segev AD
- Subjects
- Adolescent, Adult, Age Factors, Child, Computer Simulation, Eligibility Determination, Female, Graft Survival, HLA Antigens immunology, Histocompatibility, Humans, Isoantibodies blood, Kidney Transplantation adverse effects, Kidney Transplantation mortality, Male, Markov Chains, Middle Aged, Multivariate Analysis, Proportional Hazards Models, Registries, Reoperation, Risk Factors, Stochastic Processes, Time Factors, Treatment Outcome, United States, Waiting Lists, Young Adult, Decision Support Techniques, Donor Selection, Kidney Transplantation methods, Living Donors supply & distribution
- Abstract
Background: Most pediatric kidney transplant recipients eventually require retransplantation, and the most advantageous timing strategy regarding deceased and living donor transplantation in candidates with only 1 living donor remains unclear. Methods: A patient-oriented Markov decision process model was designed to compare, for a given patient with 1 living donor, living-donor-first transplantation followed if necessary by deceased donor retransplantation versus deceased-donor-first transplantation followed if necessary by living donor (if still able to donate) or deceased donor (if not) retransplantation (a toy Markov cohort sketch follows this record). Based on Scientific Registry of Transplant Recipients data, the model accounts for waitlist, graft, and patient survival, sensitization, the increased risk of graft failure seen during late adolescence, and differential deceased donor waiting times based on pediatric priority allocation policies. Based on national cohort data, the model also accounts for aging or disease development leading to ineligibility of the living donor over time. Results: Given a set of candidate and living donor characteristics, the Markov model provides the expected patient survival over a time horizon of 20 years. For the most highly sensitized patients (panel reactive antibody > 80%), a deceased-donor-first strategy was advantageous, but for all other patients (panel reactive antibody < 80%), a living-donor-first strategy was recommended. Conclusions: This Markov model illustrates how patients, families, and providers can be given information and predictions regarding the most advantageous use of deceased donor versus living donor transplantation for pediatric recipients.
- Published
- 2015
- Full Text
- View/download PDF
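The decision comparison in this abstract can be miniaturized as a Markov cohort model. The three states, the annual transition probabilities, and the reduction of the full decision process (waitlist dynamics, sensitization, donor aging) to fixed transition matrices are all hypothetical, chosen only to show how expected survival over a 20-year horizon is computed and compared between strategies.

```python
import numpy as np

STATES = ["functioning graft", "dialysis", "dead"]

def expected_years_alive(P: np.ndarray, horizon: int = 20) -> float:
    """Expected years spent alive, starting with a functioning graft."""
    dist = np.array([1.0, 0.0, 0.0])          # everyone starts transplanted
    years = 0.0
    for _ in range(horizon):
        years += dist[0] + dist[1]            # alive in either non-dead state
        dist = dist @ P                       # advance the cohort one year
    return years

# Hypothetical annual transition matrices (rows: from-state, columns: to-state).
P_living_first = np.array([[0.95, 0.04, 0.01],    # living-donor grafts fail less often
                           [0.30, 0.62, 0.08],    # retransplantation from dialysis
                           [0.00, 0.00, 1.00]])
P_deceased_first = np.array([[0.92, 0.06, 0.02],
                             [0.35, 0.57, 0.08],  # living donor held in reserve
                             [0.00, 0.00, 1.00]])

for name, P in [("living-donor-first", P_living_first),
                ("deceased-donor-first", P_deceased_first)]:
    print(f"{name}: {expected_years_alive(P):.1f} expected years alive over 20")
```

The published model layers patient-specific survival, sensitization, and donor-eligibility dynamics onto this same basic machinery.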
28. Loss of pediatric kidney grafts during the "high-risk age window": insights from pediatric liver and simultaneous liver-kidney recipients.
- Authors
- Van Arendonk KJ, King EA, Orandi BJ, James NT, Smith JM, Colombani PM, Magee JC, and Segev DL
- Subjects
- Adolescent, Age Factors, Child, Child, Preschool, Female, Humans, Incidence, Infant, Male, Outcome Assessment, Health Care, Registries, Retrospective Studies, Risk Assessment, Young Adult, Graft Rejection epidemiology, Kidney Transplantation statistics & numerical data, Liver Transplantation statistics & numerical data, Transplant Recipients
- Abstract
Pediatric kidney transplant recipients experience a high-risk age window of increased graft loss during late adolescence and early adulthood that has been attributed primarily to sociobehavioral mechanisms such as nonadherence. An examination of how this age window affects recipients of other organs may inform the extent to which sociobehavioral mechanisms are to blame or whether kidney-specific biologic mechanisms may also exist. Graft loss risk across current recipient age was compared between pediatric kidney (n = 17,446), liver (n = 12,161) and simultaneous liver-kidney (n = 224) transplants using piecewise-constant hazard rate models. Kidney graft loss during late adolescence and early adulthood (ages 17-24 years) was significantly greater than during ages <17 (aHR = 1.79, 95%CI = 1.69-1.90, p < 0.001) and ages >24 (aHR = 1.11, 95%CI = 1.03-1.20, p = 0.005). In contrast, liver graft loss during ages 17-24 was no different than during ages <17 (aHR = 1.03, 95%CI = 0.92-1.16, p = 0.6) or ages >24 (aHR = 1.18, 95%CI = 0.98-1.42, p = 0.1). In simultaneous liver-kidney recipients, a trend towards increased kidney compared to liver graft loss was observed during ages 17-24 years. Late adolescence and early adulthood are less detrimental to pediatric liver grafts compared to kidney grafts, suggesting that sociobehavioral mechanisms alone may be insufficient to create the high-risk age window and that additional biologic mechanisms may also be required.
- Published
- 2015
- Full Text
- View/download PDF
29. National trends over 25 years in pediatric kidney transplant outcomes.
- Authors
- Van Arendonk KJ, Boyarsky BJ, Orandi BJ, James NT, Smith JM, Colombani PM, and Segev DL
- Subjects
- Adolescent, Child, Child, Preschool, Female, Graft Survival, Humans, Infant, Kidney Transplantation mortality, Male, Survival Rate, Time Factors, Treatment Outcome, United States, Kidney Transplantation trends
- Abstract
Objective: To investigate changes in pediatric kidney transplant outcomes over time and potential variations in these changes between the early and late posttransplant periods and across subgroups based on recipient, donor, and transplant characteristics. Methods: Using multiple logistic regression and multivariable Cox models, graft and patient outcomes were analyzed in 17,446 pediatric kidney-only transplants performed in the United States between 1987 and 2012. Results: Ten-year patient and graft survival rates were 90.5% and 60.2%, respectively, after transplantation in 2001, compared with 77.6% and 46.8% after transplantation in 1987. Primary nonfunction and delayed graft function occurred in 3.3% and 5.3%, respectively, of transplants performed in 2011, compared with 15.4% and 19.7% of those performed in 1987. Adjusted for recipient, donor, and transplant characteristics, these improvements corresponded to a 5% decreased hazard of graft loss, 5% decreased hazard of death, 10% decreased odds of primary nonfunction, and 5% decreased odds of delayed graft function with each more recent year of transplantation. Graft survival improvements were lower in adolescent and female recipients, those receiving pretransplant dialysis, and those with focal segmental glomerulosclerosis. Patient survival improvements were higher in those with elevated peak panel reactive antibody. Both patient and graft survival improvements were most pronounced in the first posttransplant year. Conclusions: Outcomes after pediatric kidney transplantation have improved dramatically over time for all recipient subgroups, especially for highly sensitized recipients. Most improvement in graft and patient survival has come in the first year after transplantation, highlighting the need for continued progress in long-term outcomes.
- Published
- 2014
- Full Text
- View/download PDF
30. Immunosuppression regimen and the risk of acute rejection in HIV-infected kidney transplant recipients.
- Authors
- Locke JE, James NT, Mannon RB, Mehta SG, Pappas PG, Baddley JW, Desai NM, Montgomery RA, and Segev DL
- Subjects
- Adolescent, Adult, Aged, Antilymphocyte Serum metabolism, Calcineurin Inhibitors, Female, Graft Survival, HIV Infections immunology, Humans, Kidney Failure, Chronic complications, Male, Middle Aged, Multivariate Analysis, Registries, Risk, Sirolimus chemistry, Treatment Outcome, Young Adult, Graft Rejection, HIV Infections complications, Immunosuppression Therapy methods, Immunosuppressive Agents therapeutic use, Kidney Failure, Chronic therapy, Kidney Transplantation methods
- Abstract
Background: Kidney transplantation (KT) is the treatment for end-stage renal disease in appropriate HIV-positive individuals. However, acute rejection (AR) rates are over twice those of HIV-negative recipients. Methods: To better understand optimal immunosuppression for HIV-positive KT recipients, we studied associations between immunosuppression regimen, AR at 1 year, and survival in 516 HIV-positive and 93,027 HIV-negative adult kidney-only recipients using Scientific Registry of Transplant Recipients data from 2003 to 2011. Results: Consistent with previous reports, HIV-positive patients had twofold higher risk of AR (adjusted relative risk [aRR], 1.77; 95% confidence interval [CI], 1.45-2.2; P<0.001) than their HIV-negative counterparts as well as a higher risk of graft loss (adjusted hazard ratio, 1.51; 95% CI, 1.18-1.94; P=0.001), but these differences were not seen among patients receiving antithymocyte globulin (ATG) induction (aRR for AR, 1.16; 95% CI, 0.41-3.35, P=0.77; adjusted hazard ratio for graft loss, 1.54; 95% CI, 0.73-3.25; P=0.26). Furthermore, HIV-positive patients receiving ATG induction had a 2.6-fold lower risk of AR (aRR, 0.39; 95% CI, 0.18-0.87; P=0.02) than those receiving no antibody induction. Conversely, HIV-positive patients receiving sirolimus-based therapy had a 2.2-fold higher risk of AR (aRR, 2.15; 95% CI, 1.20-3.86; P=0.01) than those receiving calcineurin inhibitor-based regimens. Conclusion: These findings support a role for ATG induction, and caution against the use of sirolimus-based maintenance therapy, in HIV-positive individuals undergoing KT.
- Published
- 2014
- Full Text
- View/download PDF
31. Order of donor type in pediatric kidney transplant recipients requiring retransplantation.
- Authors
- Van Arendonk KJ, James NT, Orandi BJ, Garonzik-Wang JM, Smith JM, Colombani PM, and Segev DL
- Subjects
- Adolescent, Child, Child, Preschool, Female, Graft Survival, Humans, Living Donors, Male, Reoperation, Kidney Transplantation, Tissue Donors
- Abstract
Background: Living-donor kidney transplantation (KT) is encouraged for children with end-stage renal disease due to superior long-term graft survival compared with deceased-donor KT. Despite this, there has been a steady decrease in the use of living-donor KT for pediatric recipients. Due to their young age at transplantation, most pediatric recipients eventually require retransplantation, and the optimal order of donor type is not clear., Methods: Using the Scientific Registry of Transplant Recipients, we analyzed first and second graft survival among 14,799 pediatric (<18 years old) recipients undergoing KT between 1987 and 2010., Results: Living-donor grafts had longer survival compared with deceased-donor grafts, similarly among both first (adjusted hazard ratio [aHR], 0.78; 95% confidence interval [CI], 0.73-0.84; P<0.001) and second (aHR, 0.74; 95% CI, 0.64-0.84; P<0.001) transplants. Living-donor second grafts had longer survival compared with deceased-donor second grafts, similarly after living-donor (aHR, 0.68; 95% CI, 0.56-0.83; P<0.001) and deceased-donor (aHR, 0.77; 95% CI, 0.63-0.95; P=0.02) first transplants. Cumulative graft life of two transplants was similar regardless of the order of deceased-donor and living-donor transplantation., Conclusions: Deceased-donor KT in pediatric recipients followed by living-donor retransplantation does not negatively impact the living-donor graft survival advantage and provides similar cumulative graft life compared with living-donor KT followed by deceased-donor retransplantation. Clinical decision-making for pediatric patients with healthy, willing living donors should consider these findings in addition to the risk of sensitization, aging of the living donor, and deceased-donor waiting times.
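The "cumulative graft life" metric above is essentially total functioning-graft time across a patient's first two transplants, compared by donor-type order. A small pandas sketch of that bookkeeping, assuming a hypothetical file with one row per transplant:

```python
# Cumulative graft life across the first two transplants, grouped by the
# order of donor types. Column names are hypothetical.
import pandas as pd

tx = pd.read_csv("pediatric_tx_history.csv")  # one row per transplant

first_two = tx[tx["tx_number"].isin([1, 2])].sort_values(["patient_id", "tx_number"])
per_patient = first_two.groupby("patient_id").agg(
    cumulative_graft_years=("graft_years", "sum"),
    donor_order=("donor_type", "->".join),  # e.g. "deceased->living"
)
print(per_patient.groupby("donor_order")["cumulative_graft_years"].median())
```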
- Published
- 2013
- Full Text
- View/download PDF
32. Practice patterns and outcomes in retransplantation among pediatric kidney transplant recipients.
- Author
-
Van Arendonk KJ, Garonzik Wang JM, Deshpande NA, James NT, Smith JM, Montgomery RA, Colombani PM, and Segev DL
- Subjects
- Adolescent, Age Factors, Child, Child, Preschool, Female, Humans, Incidence, Kidney Transplantation ethnology, Male, Patient Selection, Racial Groups, Reoperation statistics & numerical data, Retrospective Studies, Socioeconomic Factors, Survival Rate, Time Factors, Treatment Outcome, Young Adult, Graft Rejection epidemiology, Kidney Transplantation mortality, Kidney Transplantation statistics & numerical data, Practice Patterns, Physicians' trends, Transplantation
- Abstract
Background: More than 25% of pediatric kidney transplants are lost within 7 years, necessitating dialysis or retransplantation. Retransplantation practices and the outcomes of repeat transplantations, particularly among those with early graft loss, are not clear., Methods: We examined retransplantation practice patterns and outcomes in 14,799 pediatric (ages <18 years) patients between 1987 and 2010. Death-censored graft survival was analyzed using extended Cox models and retransplantation using competing risks regression., Results: After the first graft failure, 50.4% underwent retransplantation and 12.1% died within 5 years; after the second graft failure, 36.1% underwent retransplantation and 15.4% died within 5 years. Prior preemptive transplantation and graft loss after 5 years were associated with increased rates of retransplantation. Graft loss before 5 years, older age, non-Caucasian race, public insurance, and increased panel-reactive antibody were associated with decreased rates of retransplantation. First transplants had lower risk of graft loss compared with second (adjusted hazard ratio [aHR], 0.72; 95% confidence interval [CI], 0.64-0.80; P<0.001), third (aHR, 0.62; 95% CI, 0.49-0.78; P<0.001), and fourth (aHR, 0.44; 95% CI, 0.24-0.78; P=0.005) transplants. However, among patients receiving two or more transplants (conditioned on having lost a first transplant), second graft median survival was 8.5 years despite a median survival of 4.5 years for the first transplant. Among patients receiving three or more transplants, third graft median survival was 7.7 years despite median survivals of 2.1 and 3.1 years for the first and second transplants., Conclusions: Among pediatric kidney transplant recipients who experience graft loss, racial and socioeconomic disparities exist with regard to retransplantation, and excellent graft survival can be achieved with retransplantation despite poor survival of previous grafts.
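Retransplantation and death compete after graft loss, which is why the study used competing risks regression. A sketch of the descriptive counterpart, cumulative incidence via the Aalen-Johansen estimator in lifelines; this stands in for, and is not, the Fine-Gray regression the study used. Event coding and names are hypothetical:

```python
# Cumulative incidence of retransplantation after graft loss, treating death
# as a competing event (Aalen-Johansen estimator).
# Hypothetical event coding: 0=censored, 1=retransplant, 2=death.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("graft_failures.csv")  # hypothetical file

ajf = AalenJohansenFitter()
ajf.fit(df["years_since_graft_loss"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())  # P(retransplanted by t), death competing
```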
- Published
- 2013
- Full Text
- View/download PDF
33. Age at graft loss after pediatric kidney transplantation: exploring the high-risk age window.
- Author
-
Van Arendonk KJ, James NT, Boyarsky BJ, Garonzik-Wang JM, Orandi BJ, Magee JC, Smith JM, Colombani PM, and Segev DL
- Subjects
- Adolescent, Adult, Age Factors, Child, Child, Preschool, Female, Humans, Male, Models, Statistical, Multivariate Analysis, Risk Factors, Treatment Outcome, Young Adult, Graft Survival, Kidney Transplantation adverse effects, Time Factors
- Abstract
Background and Objective: The risk of graft loss after pediatric kidney transplantation increases during late adolescence and early adulthood, but the extent to which this phenomenon affects all recipients is unknown. This study explored interactions between recipient factors and this high-risk age window, searching for a recipient phenotype that may be less susceptible during this detrimental age interval., Design, Setting, Participants, & Measurements: With use of Scientific Registry of Transplant Recipients data from 1987 to 2010, risk of graft loss across recipient age was quantified using a multivariable piecewise-constant hazard rate model with time-varying coefficients for recipient risk factors., Results: Among 16,266 recipients, graft loss during ages ≥17 and <24 years was greater than that for both 3-17 years (adjusted hazard ratio [aHR], 1.61; P<0.001) and ≥24 years (aHR, 1.28; P<0.001). This finding was consistent across age at transplantation, sex, race, cause of renal disease, insurance type, pretransplant dialysis history, previous transplant, peak panel-reactive antibody (PRA), and type of induction immunosuppression. The high-risk window was seen in both living-donor and deceased-donor transplant recipients, at all levels of HLA mismatch, regardless of centers' pediatric transplant volume, and consistently over time. The relationship between graft loss risk and donor type, PRA, transplant history, insurance type, and cause of renal disease was diminished upon entry into the high-risk window., Conclusions: No recipient subgroups are exempt from the dramatic increase in graft loss during late adolescence and early adulthood, a high-risk window that modifies the relationship between typical recipient risk factors and graft loss.
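A piecewise-constant hazard model like the one named above can be fit as a Poisson GLM on episode-split person-time with a log person-time offset; that equivalence is standard. A sketch under the assumption that follow-up has already been split into age intervals, with hypothetical column names:

```python
# Piecewise-exponential (piecewise-constant hazard) model: Poisson GLM on
# episode-split data, with the abstract's high-risk 17-24 age band.
# Assumes one row per recipient per age interval; names hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("age_split_episodes.csv")  # events, person_years, age_band, ...

res = smf.glm(
    "events ~ C(age_band, Treatment('3-17')) + living_donor + female",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()
# exponentiated age-band coefficients are rate ratios vs. ages 3-17;
# the 17-24 band plays the role of the reported aHR of 1.61
print(np.exp(res.params))
```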
- Published
- 2013
- Full Text
- View/download PDF
34. Living unrelated renal transplantation: a good match for the pediatric candidate?
- Author
-
Van Arendonk KJ, Orandi BJ, James NT, Segev DL, and Colombani PM
- Subjects
- Adolescent, Age Factors, Child, Child, Preschool, Female, Follow-Up Studies, Humans, Infant, Infant, Newborn, Male, Registries, Survival Analysis, Treatment Outcome, United States, Graft Survival, Kidney Failure, Chronic surgery, Kidney Transplantation methods, Living Donors, Unrelated Donors
- Abstract
Background/purpose: Living donor kidney transplantation is encouraged for children with end-stage renal disease given the superior survival of living donor grafts, but pediatric candidates are also given preference for kidneys from younger deceased donors., Methods: Death-censored graft survival of pediatric kidney-only transplants performed in the U.S. between 1987-2012 was compared across living related (LRRT) (n=7741), living unrelated (LURT) (n=618), and deceased donor renal transplants (DDRT) (n=8945) using Kaplan-Meier analysis, multivariable Cox proportional hazards models, and matched controls analysis., Results: As expected, HLA mismatch was greater among LURT compared to LRRT (p<0.001). Unadjusted graft survival was lower, particularly long-term, for LURT compared to LRRT (p=0.009). However, LURT graft survival was still superior to DDRT graft survival, even when compared only to deceased donors under age 35 (p=0.002). The difference in graft survival between LURT and LRRT was not seen when adjusting for HLA mismatch, year of transplantation, and donor and recipient characteristics using a Cox model (aHR=1.04, 95% CI: 0.87-1.24, p=0.7) or matched controls (HR=1.02, 95% CI: 0.82-1.27, p=0.9)., Conclusion: Survival of LURT grafts is superior to grafts from younger deceased donors and equivalent to LRRT grafts when adjusting for other factors, most notably differences in HLA mismatch., (Copyright © 2013 Elsevier Inc. All rights reserved.)
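For the unadjusted comparisons above, a Kaplan-Meier and log-rank sketch in lifelines, with invented file and column names:

```python
# Unadjusted graft-survival comparison of two donor groups via Kaplan-Meier
# estimates and a log-rank test. Names hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("pediatric_grafts.csv")
lurt = df[df["donor_group"] == "LURT"]
ddrt = df[df["donor_group"] == "DDRT"]

for name, grp in [("LURT", lurt), ("DDRT", ddrt)]:
    kmf = KaplanMeierFitter().fit(grp["years"], grp["graft_lost"], label=name)
    print(name, kmf.median_survival_time_)

result = logrank_test(lurt["years"], ddrt["years"],
                      event_observed_A=lurt["graft_lost"],
                      event_observed_B=ddrt["graft_lost"])
print(result.p_value)  # the abstract reports p=0.002 vs. donors under age 35
```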
- Published
- 2013
- Full Text
- View/download PDF
35. The aggressive phenotype revisited: utilization of higher-risk liver allografts.
- Author
-
Garonzik-Wang JM, James NT, Arendonk KJV, Gupta N, Orandi BJ, Hall EC, Massie AB, Montgomery RA, Dagher NN, Singer AL, Cameron AM, and Segev DL
- Subjects
- Adult, Aged, Cluster Analysis, End Stage Liver Disease diagnosis, Graft Survival, Humans, Liver Function Tests, Middle Aged, Phenotype, Regression Analysis, Risk Factors, Severity of Illness Index, Tissue Donors, Transplantation, Homologous, End Stage Liver Disease surgery, Liver Transplantation methods, Tissue and Organ Procurement, Transplants supply & distribution
- Abstract
Organ shortage has led to increased utilization of higher-risk liver allografts. In kidneys, aggressive center-level use of one type of higher-risk graft clustered with aggressive use of other types. In this study, we explored center-level behavior in liver utilization. We aggregated national liver transplant recipient data between 2005 and 2009 to the center level, assigning each center an aggressiveness score based on relative utilization of higher-risk livers. Aggressive centers had significantly more patients reaching high MELDs (RR 2.19, 2.33 and 2.28 for number of patients reaching MELD>20, MELD>25 and MELD>30, p<0.001), a higher organ shortage ratio (RR 1.51, 1.60 and 1.51 for number of patients reaching MELD>20, MELD>25 and MELD>30 divided by number of organs recovered at the OPO, p<0.04), and were clustered within various geographic regions, particularly regions 2, 3 and 9. Median MELD at transplant was similar between aggressive and nonaggressive centers, but average annual transplant volume was significantly higher at aggressive centers (RR 2.27, 95% CI 1.47-3.51, p<0.001). In cluster analysis, there were no obvious phenotypic patterns among centers with intermediate levels of aggressiveness. In conclusion, high wait-list disease severity, geographic differences in organ availability, and transplant volume are the main factors associated with the aggressive utilization of higher-risk livers., (© Copyright 2013 The American Society of Transplantation and the American Society of Transplant Surgeons.)
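A sketch of the center-level aggregation and clustering workflow described above; the study's actual scoring rules are not reproduced here, and the file, columns, and risk definitions are hypothetical:

```python
# Center-level aggregation, a crude "aggressiveness" score from the share of
# higher-risk grafts used, and hierarchical clustering of centers.
import pandas as pd
from scipy.cluster.hierarchy import fcluster, linkage

tx = pd.read_csv("liver_tx_2005_2009.csv")  # hypothetical recipient-level file

centers = tx.groupby("center_id").agg(
    volume=("recipient_id", "count"),
    share_dcd=("donor_dcd", "mean"),            # donation after cardiac death
    share_older_donor=("donor_over_70", "mean"),
)
risk_cols = ["share_dcd", "share_older_donor"]
z = (centers[risk_cols] - centers[risk_cols].mean()) / centers[risk_cols].std()
centers["aggressiveness"] = z.mean(axis=1)  # crude score: mean standardized usage
centers["cluster"] = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
print(centers.sort_values("aggressiveness", ascending=False).head())
```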
- Published
- 2013
- Full Text
- View/download PDF
36. Exploring walking path quality as a factor for urban elementary school children's active transport to school.
- Author
-
Curriero FC, James NT, Shields TM, Gouvis Roman C, Furr-Holden CD, Cooley-Strickland M, and Pollack KM
- Subjects
- Baltimore, Child, Female, Geographic Information Systems, Humans, Male, Models, Statistical, Social Environment, Urban Population, Environment Design, Schools, Social Problems, Walking statistics & numerical data
- Abstract
Background: Path quality has not been well studied as a correlate of active transport to school. We hypothesize that for urban-dwelling children the environment between home and school is at least as important as the environment immediately surrounding their homes and/or schools when exploring walking-to-school behavior., Methods: Tools from spatial statistics and geographic information systems (GIS) were applied to an assessment of street blocks to create a walking path quality measure based on physical and social disorder (termed "incivilities") for each child. Path quality was included in a multivariate regression analysis of walking-to-school status for a sample of 362 children., Results: The odds ratio of walking to school associated with path quality was 0.88 (95% CI: 0.72-1.07), which, although not statistically significant, is in the direction supporting our hypothesis. The odds ratio for home street block incivility suggests a counterintuitive effect (OR = 1.10, 95% CI: 1.08-1.19)., Conclusions: Results suggest that urban children living in communities characterized by higher incivilities are more likely to walk to school, potentially placing them at risk for adverse health outcomes because of exposure to high-incivility areas along their route. Results also support the importance of including path quality when exploring the influence of the environment on walking-to-school behavior.
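A minimal sketch of the multivariate logistic model described above, with the path-quality and home-block incivility measures as covariates; all names are hypothetical:

```python
# Logistic regression of walking-to-school status on a path-quality
# (incivility) measure plus home-block incivility. Names hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("walk_to_school.csv")  # hypothetical analysis file

res = smf.logit(
    "walks_to_school ~ path_incivility + home_block_incivility + age + female",
    data=df,
).fit()
print(np.exp(res.params))      # odds ratios (abstract: 0.88 for path quality)
print(np.exp(res.conf_int()))  # 95% CIs on the odds-ratio scale
```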
- Published
- 2013
- Full Text
- View/download PDF
37. Live donor champion: finding live kidney donors by separating the advocate from the patient.
- Author
-
Garonzik-Wang JM, Berger JC, Ros RL, Kucirka LM, Deshpande NA, Boyarsky BJ, Montgomery RA, Hall EC, James NT, and Segev DL
- Subjects
- Aged, Female, Health Education, Humans, Linear Models, Male, Middle Aged, Outcome Assessment, Health Care, Prospective Studies, Communication Barriers, Kidney Transplantation, Living Donors, Patient Advocacy education, Tissue and Organ Procurement methods
- Abstract
Background: Lack of education and reluctance to initiate a conversation about live donor kidney transplantation is a common barrier to finding a donor. Although transplant candidates are often hesitant to discuss their illness, friends or family members are often eager to spread awareness and are empowered by advocating for the candidates. We hypothesized that separating the advocate from the patient is important in identifying live donors., Methods: We developed an intervention to train a live donor champion (LDC; a friend, family member, or community member willing to advocate for the candidate) for this advocacy role. We compared outcomes of 15 adult kidney transplant candidates who had no prospective donors and underwent the LDC intervention with 15 matched controls from our waiting list., Results: Comfort in initiating a conversation about transplantation increased over time for LDCs. Twenty-five potential donors contacted our center on behalf of LDC participants; four participants achieved live donor kidney transplantation and three additional participants have donors in evaluation, compared with zero among matched controls (P < 0.001)., Conclusions: Transplant candidates are ill equipped to seek live donors; by separating the advocate from the patient, understandable concerns about initiating conversations are reduced.
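At this sample size (15 per arm), an exact test is a reasonable choice for comparing transplant counts between arms. A sketch with Fisher's exact test; the abstract does not specify its test, and its p-value may come from a different comparison, so this is purely illustrative:

```python
# Fisher's exact test on the 2x2 table of live-donor transplants achieved
# (4 of 15 LDC participants vs. 0 of 15 matched controls, per the abstract).
from scipy.stats import fisher_exact

table = [[4, 11],   # LDC arm: transplanted, not transplanted
         [0, 15]]   # matched controls
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(p)
```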
- Published
- 2012
- Full Text
- View/download PDF
38. Pregnancy outcomes of liver transplant recipients: a systematic review and meta-analysis.
- Author
-
Deshpande NA, James NT, Kucirka LM, Boyarsky BJ, Garonzik-Wang JM, Cameron AM, Singer AL, Dagher NN, and Segev DL
- Subjects
- Female, Humans, Pregnancy, Liver Failure epidemiology, Liver Failure surgery, Liver Transplantation statistics & numerical data, Pregnancy Complications epidemiology, Pregnancy Outcome epidemiology
- Abstract
Approximately 14,000 women of reproductive age are currently living in the United States after liver transplantation (LT), and another 500 undergo LT each year. Although LT improves reproductive function in women with advanced liver disease, the associated pregnancy outcomes and maternal-fetal risks have not been quantified in a broad manner. To obtain more generalizable inferences, we performed a systematic review and meta-analysis of articles that were published between 2000 and 2011 and reported pregnancy-related outcomes for LT recipients. Eight of 578 unique studies met the inclusion criteria, and these studies represented 450 pregnancies in 306 LT recipients. The post-LT live birth rate [76.9%, 95% confidence interval (CI) = 72.7%-80.7%] was higher than the live birth rate for the US general population (66.7%) but was similar to the post-kidney transplantation (KT) live birth rate (73.5%). The post-LT miscarriage rate (15.6%, 95% CI = 12.3%-19.2%) was lower than the miscarriage rate for the general population (17.1%) but was similar to the post-KT miscarriage rate (14.0%). The rates of pre-eclampsia (21.9%, 95% CI = 17.7%-26.4%), cesarean section delivery (44.6%, 95% CI = 39.2%-50.1%), and preterm delivery (39.4%, 95% CI = 33.1%-46.0%) were higher than the rates for the US general population (3.8%, 31.9%, and 12.5%, respectively) but lower than the post-KT rates (27.0%, 56.9%, and 45.6%, respectively). Both the mean gestational age and the mean birth weight were significantly greater (P < 0.001) for LT recipients versus KT recipients (36.5 versus 35.6 weeks and 2866 versus 2420 g). Although pregnancy after LT is feasible, the complication rates are relatively high and should be considered during patient counseling and clinical decision making. More case and center reports are necessary so that information on post-LT pregnancy outcomes and complications can be gathered to improve the clinical management of pregnant LT recipients. Continued reporting to active registries is highly encouraged at the center level., (Copyright © 2012 American Association for the Study of Liver Diseases.)
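A sketch of pooling an event rate across studies, using a fixed-effect inverse-variance average on the logit scale; this is a simplified stand-in for the study's meta-analytic method, and the inputs are illustrative, not the study's data:

```python
# Pooled proportion with 95% CI via inverse-variance weighting on the
# logit scale. Event counts below are illustrative placeholders.
import numpy as np

events = np.array([30, 55, 12, 80])   # e.g. live births per study
totals = np.array([40, 70, 16, 110])

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)  # variance of each study's logit
w = 1 / var

pooled_logit = np.sum(w * logit) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
ci = pooled_logit + np.array([-1.96, 1.96]) * se

def inv_logit(x):
    return 1 / (1 + np.exp(-x))

print(inv_logit(pooled_logit), inv_logit(ci))  # pooled rate and 95% CI
```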
- Published
- 2012
- Full Text
- View/download PDF
39. Center-level factors and racial disparities in living donor kidney transplantation.
- Author
-
Hall EC, James NT, Garonzik Wang JM, Berger JC, Montgomery RA, Dagher NN, Desai NM, and Segev DL
- Subjects
- Adolescent, Adult, Aged, Cohort Studies, Donor Selection, Female, Humans, Incidence, Kidney Failure, Chronic diagnosis, Kidney Failure, Chronic surgery, Kidney Transplantation statistics & numerical data, Logistic Models, Male, Middle Aged, Multivariate Analysis, Needs Assessment, Risk Factors, Treatment Outcome, United States, Young Adult, Black or African American statistics & numerical data, Healthcare Disparities trends, Kidney Failure, Chronic ethnology, Kidney Transplantation ethnology, Living Donors statistics & numerical data, White People statistics & numerical data
- Abstract
Background: On average, African Americans attain living donor kidney transplantation (LDKT) at decreased rates compared with their non-African American counterparts. However, center-level variations in this disparity or the role of center-level factors is unknown., Study Design: Observational cohort study., Setting & Participants: 247,707 adults registered for first-time kidney transplants from 1995-2007 as reported by the Scientific Registry of Transplant Recipients., Predictors: Patient-level factors (age, sex, body mass index, insurance status, education, blood type, and panel-reactive antibody level) were adjusted for in all models. The association of center-level characteristics (number of candidates, transplant volume, LDKT volume, median time to transplant, percentage of African American candidates, percentage of prelisted candidates, and percentage of LDKT) and degree of racial disparity in LDKT was quantified., Outcomes: Hierarchical multivariate logistic regression models were used to derive center-specific estimates of LDKT attainment in African American versus non-African American candidates., Results: Racial parity was not seen at any of the 275 transplant centers in the United States. At centers with the least racial disparity, African Americans had 35% lower odds of receiving LDKT; at centers with the most disparity, African Americans had 76% lower odds. Higher percentages of African American candidates (interaction term, 0.86; P = 0.03) and prelisted candidates (interaction term, 0.80; P = 0.001) at a given center were associated with increased racial disparity at that center. Higher rates of LDKT (interaction term, 1.25; P < 0.001) were associated with less racial disparity., Limitations: Some patient-level factors are not captured, including a given patient's pool of potential donors. Geographic disparities in deceased donor availability might affect LDKT rates. Center-level policies and practices are not captured., Conclusions: Racial disparity in attainment of LDKT exists at every transplant center in the country. Centers with higher rates of LDKT attainment for all races had less disparity; these high-performing centers might provide insights into policies that might help address this disparity., (Copyright © 2012 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.)
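The interaction terms reported above test whether a center-level characteristic modifies the race-LDKT association. A simplified, non-hierarchical stand-in for that idea, using a plain logistic model with an interaction; all names are hypothetical:

```python
# Race x center-characteristic interaction in a logistic model of LDKT
# attainment -- a simplified stand-in for the hierarchical model above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ldkt_candidates.csv")  # hypothetical candidate-level file

res = smf.logit(
    "got_ldkt ~ african_american * center_pct_prelisted + age + blood_type_o",
    data=df,
).fit()
# exp(interaction coef) < 1 means the disparity widens as the center factor
# rises (abstract: interaction term 0.80 for percent prelisted candidates)
print(np.exp(res.params["african_american:center_pct_prelisted"]))
```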
- Published
- 2012
- Full Text
- View/download PDF
40. Outcomes of ABO-incompatible kidney transplantation in the United States.
- Author
-
Montgomery JR, Berger JC, Warren DS, James NT, Montgomery RA, and Segev DL
- Subjects
- Adult, Cohort Studies, Female, Follow-Up Studies, Graft Rejection immunology, Humans, Kidney Failure, Chronic epidemiology, Living Donors, Male, Middle Aged, Prognosis, Retrospective Studies, Risk Factors, Survival Rate, United States epidemiology, ABO Blood-Group System immunology, Blood Group Incompatibility immunology, Kidney Failure, Chronic surgery, Kidney Transplantation immunology, Kidney Transplantation mortality
- Abstract
Background: ABO incompatible (ABOi) kidney transplantation is an important modality to facilitate living donor transplant for incompatible pairs. To date, reports of the outcomes from this practice in the United States have been limited to single-center studies., Methods: Using the Scientific Registry of Transplant Recipients, we identified 738 patients who underwent live-donor ABOi kidney transplantation between January 1, 1995, and March 31, 2010. These were compared with matched controls that underwent ABO compatible live-donor kidney transplantation. Subgroup analyses among ABOi recipients were performed according to donor blood type, recipient blood type, and transplant center ABOi volume., Results: When compared with ABO compatible-matched controls, long-term patient survival of ABOi recipients was not significantly different between the cohorts (P=0.2). However, graft loss was significantly higher, particularly in the first 14 days posttransplant (subhazard ratio, 2.34; 95% confidence interval, 1.43-3.84; P=0.001), with little to no difference beyond day 14 (subhazard ratio, 1.28; 95% confidence interval, 0.99-1.54; P=0.058). In subgroup analyses among ABOi recipients, no differences in survival were seen by donor blood type, recipient blood type, or transplant center ABOi volume., Conclusions: These results support the use and dissemination of ABOi transplantation when a compatible live donor is not available, but caution that the highest period of risk is immediately posttransplant.
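The day-14 split above can be implemented by episode-splitting each graft's follow-up at 14 days so early and late effects are estimated separately. Note the study reports subhazard ratios from a competing-risks model; this sketch uses a plain Cox model purely to illustrate the early/late split, with hypothetical names:

```python
# Episode-split follow-up at day 14, then period-specific Cox models.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("aboi_grafts.csv")  # days, graft_lost, aboi (1/0); hypothetical

rows = []
for r in df.itertuples():
    if r.days <= 14:
        rows.append({"start": 0, "stop": r.days, "event": r.graft_lost,
                     "aboi": r.aboi, "period": "early"})
    else:  # contributes censored time to the early period, then a late episode
        rows.append({"start": 0, "stop": 14, "event": 0,
                     "aboi": r.aboi, "period": "early"})
        rows.append({"start": 14, "stop": r.days, "event": r.graft_lost,
                     "aboi": r.aboi, "period": "late"})
split = pd.DataFrame(rows)

for period, grp in split.groupby("period"):
    cph = CoxPHFitter().fit(grp[["start", "stop", "event", "aboi"]],
                            duration_col="stop", event_col="event",
                            entry_col="start")  # entry_col handles late entry
    print(period, cph.hazard_ratios_["aboi"])
```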
- Published
- 2012
- Full Text
- View/download PDF
41. The aggressive phenotype: center-level patterns in the utilization of suboptimal kidneys.
- Author
-
Garonzik-Wang JM, James NT, Weatherspoon KC, Deshpande NA, Berger JA, Hall EC, Montgomery RA, and Segev DL
- Subjects
- Aged, Female, Follow-Up Studies, Humans, Male, Middle Aged, Phenotype, Retrospective Studies, Time Factors, Graft Survival, Kidney Failure, Chronic surgery, Kidney Transplantation statistics & numerical data, Tissue Donors, Tissue and Organ Procurement statistics & numerical data, Waiting Lists
- Abstract
Despite the fact that suboptimal kidneys have worse outcomes, differences in waiting times and wait-list mortality have led to variations in the use of these kidneys. It is unknown whether aggressive center-level use of one type of suboptimal graft clusters with aggressive use of other types of suboptimal grafts, and what center characteristics are associated with an overall aggressive phenotype. United Network for Organ Sharing (UNOS) data from 2005 to 2009 for adult kidney transplant recipients was aggregated to the center level. An aggressiveness score was assigned to each center based on usage of suboptimal grafts. Deceased-donor transplant volume correlated with aggressiveness in lower volume, but not higher volume centers. Aggressive centers were mostly found in regions 2 and 9. Aggressiveness was associated with wait-list size (RR 1.69, 95% CI 1.20-2.34, p = 0.002), organ shortage (RR 2.30, 95% CI 1.57-3.37, p < 0.001) and waiting times (RR 1.75, 95% CI 1.20-2.57, p = 0.004). No centers in single-center OPOs were classified as aggressive. In cluster analysis, the most aggressive centers were aggressive in all metrics and vice versa; however, centers with intermediate aggressiveness had phenotypic patterns in their usage of suboptimal kidneys. In conclusion, wait-list size, waiting times, geographic region and OPO competition seem to be driving factors in center-level aggressiveness., (© 2011 The American Society of Transplantation and the American Society of Transplant Surgeons.)
- Published
- 2012
- Full Text
- View/download PDF
42. Effect of eliminating priority points for HLA-B matching on racial disparities in kidney transplant rates.
- Author
-
Hall EC, Massie AB, James NT, Garonzik Wang JM, Montgomery RA, Berger JC, and Segev DL
- Subjects
- Aged, Cohort Studies, Female, Humans, Male, Middle Aged, Retrospective Studies, Black or African American, HLA-B Antigens, Healthcare Disparities standards, Kidney Transplantation statistics & numerical data, Tissue and Organ Procurement standards, White People
- Abstract
Background: African Americans have lower rates of obtaining a deceased donor kidney transplant (DDKT) compared with their white counterparts. One proposed mechanism is differential HLA distributions between African Americans and whites. In May 2003, the United Network for Organ Sharing/Organ Procurement and Transplantation Network changed kidney allocation policy to eliminate priority based on HLA-B matching in an effort to address this disparity. The objective of this study was to quantify the effect of the change in policy regarding priority points for HLA-B matching., Study Design: Observational cohort study., Setting & Participants: A cohort of 178,902 patients registered for a DDKT between January 2000 and August 2009., Factors: African Americans versus whites before and after the policy change. Cox models were adjusted for age, sex, diabetes, dialysis type, insurance status, education, panel-reactive antibody level, and blood type., Outcomes: Adjusted relative rates (aRRs) of deceased donor kidney transplant for African Americans compared with whites., Measurements: Time from initial active wait listing to DDKT, censored for living donor kidney transplant and death., Results: Before the policy change, African Americans had 37% lower rates of DDKT (aRR, 0.63; 95% CI, 0.60-0.65; P < 0.001). After the policy change, African Americans had 23% lower rates of DDKT (aRR, 0.77; 95% CI, 0.76-0.79; P < 0.001). There was a 23% reduction in the disparity between African Americans and whites after the policy change (interaction aRR, 1.23; 95% CI, 1.18-1.29; P < 0.001)., Limitations: As an observational study, findings could have been affected by residual confounding or other changes in practice patterns., Conclusions: Racial disparity in rates of DDKT was decreased by the HLA-B policy change, but parity was not achieved. There are unaddressed factors in kidney allocation that lead to continued disparity on the kidney transplant waiting list., (Copyright © 2011 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.)
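The policy effect above is quantified as a race-by-era interaction in a model of time to DDKT. A sketch of that interaction in a Cox model, assuming censoring for LDKT and death is already encoded in the event indicator; all names are hypothetical:

```python
# Race x era interaction in a Cox model of time from listing to DDKT.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ddkt_waitlist.csv")  # hypothetical registration-level file
df["aa_post"] = df["african_american"] * df["post_policy"]

cph = CoxPHFitter().fit(
    df[["wait_years", "got_ddkt", "african_american", "post_policy",
        "aa_post", "age", "pra"]],
    duration_col="wait_years",
    event_col="got_ddkt",  # 0 = censored at LDKT or death, per the abstract
)
# exp(coef) for aa_post > 1 indicates the racial gap narrowed after the
# policy change (abstract: interaction aRR 1.23)
print(np.exp(cph.params_["aa_post"]))
```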
- Published
- 2011
- Full Text
- View/download PDF
43. Late graft loss among pediatric recipients of DCD kidneys.
- Author
-
Van Arendonk KJ, James NT, Locke JE, Montgomery RA, Colombani PM, and Segev DL
- Subjects
- Adolescent, Adult, Age Factors, Brain Death, Chi-Square Distribution, Child, Delayed Graft Function etiology, Female, Graft Rejection etiology, Humans, Kaplan-Meier Estimate, Kidney Transplantation mortality, Male, Patient Selection, Proportional Hazards Models, Registries, Risk Assessment, Risk Factors, Time Factors, Tissue and Organ Procurement, Treatment Outcome, United States, Young Adult, Donor Selection, Graft Survival, Kidney Transplantation adverse effects, Tissue Donors supply & distribution
- Abstract
Background and Objectives: Kidney transplantation from donors after cardiac death (DCD) provides similar graft survival to donors after brain death (DBD) in adult recipients. However, outcomes of DCD kidneys in pediatric recipients remain unclear, primarily because of limited sample sizes., Design, Setting, Participants, & Measurements: We identified 137 pediatric (<18 years old) recipients of DCD kidneys between 1994 and 2010 using Scientific Registry of Transplant Recipients data and compared outcomes with 6059 pediatric recipients of DBD kidneys during the same time period, accounting for donor, recipient, and transplant characteristics using time-varying Cox regression and matched controls. Long-term follow-up (4 years or beyond) was available for 31 DCD recipients., Results: Pediatric recipients of DCD kidneys experienced a significantly higher rate of delayed graft function (22.0% versus 12.3%; P = 0.001), although lower than reported delayed graft function rates of DCD grafts in adults. Although DCD and DBD graft survival was equal in the early postoperative period, graft loss among pediatric recipients of DCD kidneys exceeded their DBD counterparts starting 4 years after transplantation. This effect was statistically significant in a multivariate Cox model (hazard ratio = 2.03; 95% confidence interval, 1.21 to 3.39; P = 0.007) and matched-controls analysis (hazard ratio = 2.36; 95% confidence interval, 1.11 to 5.03; P = 0.03)., Conclusions: A significant increase in DCD graft loss starting 4 years after transplantation motivates a cautious approach to the use of DCD kidneys in children, in whom long-term graft survival is of utmost importance.
- Published
- 2011
- Full Text
- View/download PDF
44. Pregnancy outcomes in kidney transplant recipients: a systematic review and meta-analysis.
- Author
-
Deshpande NA, James NT, Kucirka LM, Boyarsky BJ, Garonzik-Wang JM, Montgomery RA, and Segev DL
- Subjects
- Adult, Cesarean Section, Female, Gestational Age, Humans, Infant, Newborn, Infant, Premature, Middle Aged, Pregnancy, Pregnancy Complications epidemiology, Kidney Transplantation, Pregnancy Outcome
- Abstract
Approximately 50,000 women of reproductive age in the United States are currently living after kidney transplantation (KT), and another 2800 undergo KT each year. Although KT improves reproductive function in women with ESRD, studies of post-KT pregnancies are limited to a few voluntary registry analyses and numerous single-center reports. To obtain more generalizable inferences, we performed a systematic review and meta-analysis of articles published between 2000 and 2010 that reported pregnancy-related outcomes among KT recipients. Of 1343 unique studies, 50 met inclusion criteria, representing 4706 pregnancies in 3570 KT recipients. The overall post-KT live birth rate of 73.5% (95%CI 72.1-74.9) was higher than the general US population (66.7%); similarly, the overall post-KT miscarriage rate of 14.0% (95%CI 12.9-15.1) was lower (17.1%). However, complications of preeclampsia (27.0%, 95%CI 25.2-28.9), gestational diabetes (8.0%, 95%CI 6.7-9.4), Cesarean section (56.9%, 95%CI 54.9-58.9) and preterm delivery (45.6%, 95%CI 43.7-47.5) were higher than the general US population (3.8%, 3.9%, 31.9% and 12.5%, respectively). Pregnancy outcomes were more favorable in studies with lower mean maternal ages; obstetrical complications were higher in studies with shorter mean interval between KT and pregnancy. Although post-KT pregnancy is feasible, complications are relatively high and should be considered in patient counseling and clinical decision making., (©2011 The Authors Journal compilation © 2011 The American Society of Transplantation and the American Society of Transplant Surgeons.)
- Published
- 2011
- Full Text
- View/download PDF
45. Influence of caregiver and provider communication on symptom days and medication use for inner-city children with asthma.
- Author
-
Butz A, Kub J, Donithan M, James NT, Thompson RE, Bellin M, Tsoukleris M, and Bollinger ME
- Subjects
- Black or African American statistics & numerical data, Anti-Asthmatic Agents therapeutic use, Asthma ethnology, Child, Drug Utilization, Emergency Service, Hospital statistics & numerical data, Female, Health Personnel, Hospitalization statistics & numerical data, Humans, Logistic Models, Longitudinal Studies, Male, Medical Assistance statistics & numerical data, Severity of Illness Index, Asthma therapy, Caregivers, Health Education methods, Professional-Family Relations, Urban Population statistics & numerical data
- Abstract
Background: Effective pediatric guideline-based asthma care requires the caregiver to accurately relay the child's symptom frequency, pattern of rescue and controller medication use, and level of asthma control to the child's primary care clinician., Objective: This study evaluated the longitudinal effects of a caregiver-clinician asthma communication education intervention (ACE) relative to an asthma education control group (CON) on symptom days and controller medication use in inner-city children with asthma., Participants and Methods: 231 inner-city children with asthma, recruited from urban pediatric emergency departments (EDs) and community practices, were followed for 12 months. Data included number of symptom days and nights, ED visits, hospitalizations, presence of limited activity, and controller medication use over 12 months. Pharmacy records were used to calculate controller-to-total asthma medication ratios as a proxy of appropriate controller medication use. Multivariate logistic regression models were used to identify factors associated with number of symptom days and nights over the past 30 days at the 12-month follow-up., Results: Most caregivers rated the communication with their child's clinician as high. Unadjusted and adjusted rates of symptom days and nights did not differ by group at follow-up. ACE children tended toward a higher controller-to-total medication ratio at 12 months as compared to CON children (mean ratio: ACE, 0.54 [SD 0.3]; CON, 0.45 [SD 0.4]; p = .07). Activity limitation due to asthma and persistent asthma severity were the only factors significantly associated with reporting any symptom day within the past 30 days, adjusting for treatment group, number of oral corticosteroid courses and number of clinician visits in the last 6 months, seasonality, insurance type, and controller-to-total asthma medication ratio covariates., Conclusion: A home-based caregiver asthma communication educational intervention was not associated with decreased symptom days. However, a trend was noted toward higher controller-to-total medication ratios in the intervention group. Inner-city caregivers of children with asthma may require a health systems approach to help convey the child's asthma health information to their clinician.
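The controller-to-total medication ratio above is a simple proxy computed from pharmacy fills. A small pandas sketch of that computation, with hypothetical file and column names:

```python
# Controller-to-total asthma medication ratio per child from pharmacy fills.
import pandas as pd

fills = pd.read_csv("pharmacy_fills.csv")  # child_id, drug_class; hypothetical

ratio = (
    fills.assign(is_controller=fills["drug_class"].eq("controller"))
         .groupby("child_id")["is_controller"]
         .mean()  # controller fills / total asthma fills
         .rename("controller_ratio")
)
print(ratio.describe())  # abstract: mean 0.54 (ACE) vs. 0.45 (CON)
```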
- Published
- 2010
- Full Text
- View/download PDF
46. Health status, physical disability, and obesity among adult Mississippians with chronic joint symptoms or doctor-diagnosed arthritis: findings from the Behavioral Risk Factor Surveillance System, 2003.
- Author
-
James NT, Miller CW, Fos PJ, Zhang L, Wall P, and Welch C
- Subjects
- Activities of Daily Living, Adult, Age Distribution, Aged, Arthralgia complications, Arthralgia ethnology, Arthritis complications, Arthritis ethnology, Disability Evaluation, Female, Humans, Male, Middle Aged, Mississippi epidemiology, Obesity complications, Obesity epidemiology, Obesity ethnology, Quality of Life, Sex Distribution, Social Class, Arthralgia epidemiology, Arthritis epidemiology, Behavioral Risk Factor Surveillance System
- Abstract
Introduction: The purpose of this study was to analyze 2003 Mississippi Behavioral Risk Factor Surveillance System (BRFSS) data to describe the health of Mississippians with arthritis or chronic joint pain. For this study, we made statistical estimates of the extent of arthritis burden among the respondents and delineated measurable differences in sociodemographic factors, health status, and the prevalence of associated risk factors. Our findings compare health-related quality of life, physical activity, and key demographic characteristics and obesity rates, controlling for differences among the subgroups by age, sex, educational attainment, income, and race/ethnicity., Methods: Respondents to Mississippi's 2003 BRFSS were assigned to 1 of 5 distinct and mutually exclusive subgroups: 1) those with intermittent joint symptoms (IJS), 2) those with chronic joint symptoms (CJS), 3) those with doctor-diagnosed arthritis without CJS (DDA-CJS), 4) those with doctor-diagnosed arthritis with chronic joint symptoms (DDA+CJS), and 5) those with no joint symptoms (NJS). To determine the prevalence of arthritis and the continuum of disease progression, we compared the health-related quality of life, physical activity, and obesity of the respondents., Results: Respondents with DDA+CJS were older than those with NJS (mean age, 57.1 years vs 38.7 years); they were more likely to be female (60.5% vs 51.7%), to have a high school diploma or less education (59.3% vs 45.4%), to be in fair to poor health (odds ratio [OR], 10.0), to be physically inactive (OR, 2.7), and to be overweight or obese (OR, 2.5)., Conclusion: Health status, physical disability, and weight control may be substantially improved through heightened levels of physical activity. However, in spite of the potential for marked improvement, adult Mississippians, especially those clients with DDA+CJS, remain reluctant to commit to exercise regimens. Findings from this study suggest a need to encourage Mississippians with DDA+CJS to engage in some regular physical activity, which could reduce the damaging effects of disease and improve their health. Increasing the health care resources earmarked for arthritis self-help and physical activity programs is one potential avenue to address the problem.
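BRFSS estimates like those above are survey-weighted. A sketch of a weighted prevalence by subgroup, assuming a hypothetical design-weight column; this illustrates the estimate only, not BRFSS's full variance machinery:

```python
# Weighted prevalence of fair-to-poor health by joint-symptom subgroup.
import pandas as pd

brfss = pd.read_csv("ms_brfss_2003.csv")  # hypothetical extract with weights

prev = (
    brfss.assign(w_fair_poor=brfss["fair_poor_health"] * brfss["wt"])
         .groupby("subgroup")[["w_fair_poor", "wt"]]
         .sum()
)
prev["weighted_prevalence"] = prev["w_fair_poor"] / prev["wt"]
print(prev["weighted_prevalence"])
```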
- Published
- 2008
47. Sociodemographic and health-related determinants of breast and cervical cancer screening behavior, 2005.
- Author
-
Welch C, Miller CW, and James NT
- Subjects
- Adult, Aged, Behavioral Risk Factor Surveillance System, Breast Neoplasms diagnosis, Confidence Intervals, Female, Humans, Middle Aged, Odds Ratio, Risk Assessment, Socioeconomic Factors, United States epidemiology, Uterine Cervical Neoplasms diagnosis, Women's Health, Attitude to Health, Breast Neoplasms prevention & control, Health Behavior, Mass Screening statistics & numerical data, Patient Acceptance of Health Care statistics & numerical data, Uterine Cervical Neoplasms prevention & control
- Abstract
Objectives: To identify sociodemographic and health-related determinants of Breast and Cervical Cancer Screening behaviors and evaluate progress toward Healthy People 2010 cancer-related objectives., Design: The Behavioral Risk Factor Surveillance System 2005 data served as the numerical predicate for identifying or validating sociodemographic and health-related quality of life predictors, or both, and for determining any relative progress., Setting/participants: Eleven U.S. states (n = 27,625 women)., Main Outcome Measures: Determinants of Breast and Cervical Cancer Screening and assessment of progress toward Healthy People 2010 objectives 3-11 and 3-13., Results: Nine significant predictors of annual Breast and Cervical Cancer Screening (reported as odds ratios) were identified through regression analysis: adequate health care coverage, nonsmoking, age between 40 and 64 years, age greater than or equal to 65 years, no activity limitations, Black, non-Hispanic race, income greater than or equal to $35K, current exercise performance, and no risk for high blood cholesterol. Also, Healthy People 2010 objective 3-11 was not met; however, objective 3-13 was exceeded by 2.0%., Conclusions: The national health initiatives appear to benefit select American women (overall declining mortality rates from breast and cervical cancer); however, there seems to be a negative economy of scale with respect to age: as age increases, Breast and Cervical Cancer Screening declines and morbidity/mortality increases. Given this disparity, as of 2005, related Healthy People 2010 objectives remain unrealized.
- Published
- 2008
- Full Text
- View/download PDF
48. The impact of Hurricane Katrina upon older adult nurses: an assessment of quality of life and psychological distress in the aftermath.
- Author
-
James NT, Miller CW, Nugent K, Welch C, Cabanna M, and Vincent S
- Subjects
- Adult, Age Factors, Aged, Cross-Sectional Studies, Cyclonic Storms, Female, Humans, Male, Middle Aged, Mississippi, Odds Ratio, Young Adult, Disasters, Nurses psychology, Quality of Life psychology, Stress Disorders, Post-Traumatic psychology
- Abstract
The primary purpose of the current study was to evaluate the impact of Hurricane Katrina upon older nurses using cross-sectional data from 291 respondents. Collected data served as the numerical predicate for the evaluation of quality of life and psychological distress among nurses who were affected by Hurricane Katrina. While the focus of the present study was upon older nurses, cross-sectional data are also reported for the full sample. Predictors of Katrina's impact upon older nurses were identified through multinomial regression analyses and included the physical function subscale (OR=0.954), the fatigue subscale (OR=0.961), the arousal subscale (OR=4.190), average to poor health (OR=2.040), married (OR=2.769), and the MSPSS (OR=0.780). Significant associations between age and storm impact (F=10.707, p=.001), depression (F=15.782, p<.001), social support (F=5.869, p=.016), health status (F=29.004, p<.001), anxiety (F=5.583, p=.019), and posttraumatic distress disorder (F=.032, p=.46) remained after adjustment for other risk factors. These associations, as reflected in their respective mean scores, indicated that older nurses experienced greater storm impact (2.880 vs. 2.511), more depressive symptoms (11.250 vs. 9.080), greater anxiety (77.800 vs. 75.430), greater posttraumatic distress (72.830 vs. 70.860), and lower health status (68.891 vs. 73.569). Accordingly, a more robust public policy paradigm for addressing the growing labor shortages in the medical community is needed. Heightened Congressional interest and increased resourcing are required in order to effect the necessary programmatic, educational, and institutional remediation. Furthermore, given the increasing role of older nurses in the workplace, extensive studies are needed to evaluate their status and the independent risk factors for sustaining quality of life and psychological well-being among these contributors to health care.
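A sketch of a multinomial model of storm-impact category, mirroring the multinomial regression named above; the outcome coding, predictors, and file are hypothetical:

```python
# Multinomial logit of an integer-coded storm-impact category
# (e.g., 0=low, 1=moderate, 2=high). Names hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("katrina_nurses.csv")

X = sm.add_constant(df[["physical_function", "fatigue", "arousal", "married"]])
res = sm.MNLogit(df["impact_category"], X).fit()
print(np.exp(res.params))  # per-category odds ratios vs. the reference level
```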
- Published
- 2007
49. Pain disability among older adults with arthritis.
- Author
-
James NT, Miller CW, Brown KC, and Weaver M
- Subjects
- Adult psychology, Aged psychology, Aged, 80 and over psychology, Health Status, Humans, Pain Measurement, Regression Analysis, Self-Assessment, United States, Arthritis complications, Disability Evaluation, Disabled Persons psychology, Pain psychology
- Abstract
Objective: The principal objective was to examine pain disability (the degree to which chronic pain interferes with daily activities) among older adults with arthritis. Specifically, answers to two research questions were sought: (a) does psychological distress reliably predict pain disability; and (b) do certain theoretically important host, sociodemographic, and health-related factors reliably predict pain disability?, Method: Descriptive, univariate, and multivariate regression analyses were employed to assess key psychosocial, disease, and host factors in a sample (N = 141) of adults with arthritis aged 50 years or older., Results: The resultant regression model accounted for 63.7% (60.0% adjusted) of the variance and was significant at p < .01. Psychological distress, overall health, disease activity, and disease self-efficacy were found to predict pain disability., Discussion: Sample members with greater pain disability experienced heightened psychological distress, poorer perceptions of their overall health, more surgeries, higher unemployment, more intense disease activity, longer disease duration, and lower disease self-efficacy.
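A sketch of a linear model producing the kind of variance-explained figure reported above (R-squared and adjusted R-squared); predictors and file are hypothetical:

```python
# OLS model of pain disability with R-squared reporting. Names hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("arthritis_pain.csv")

res = smf.ols(
    "pain_disability ~ psych_distress + overall_health + disease_activity + self_efficacy",
    data=df,
).fit()
print(res.rsquared, res.rsquared_adj)  # abstract: 63.7% (60.0% adjusted)
```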
- Published
- 2005
- Full Text
- View/download PDF
50. Measurement of red blood cell-bound C3b and C3d using an enzyme-linked direct antiglobulin test.
- Author
-
Bellamy JD, Booker DJ, James NT, Stamps R, and Sokol RJ
- Abstract
Complement has a complex role in immune-mediated red blood cell (RBC) destruction and usually induces extravascular hemolysis of C3b-coated RBCs by erythrophagocytosis and by acting synergistically with cell-bound immunoglobulins. A sensitive two-stage enzyme-linked direct antiglobulin test (ELDAT) was developed and used to measure RBC-bound C3b and C3d in 120 healthy adult individuals and in 60 patients suffering from a variety of conditions, including warm- and cold-type autoimmune hemolytic anemia, neoplasia, and collagen diseases. The results were compared with those of standard agglutination tests employing polyclonal and monoclonal antiglobulin reagents. Small amounts of C3b and C3d were detected on RBCs of the healthy individuals only by the ELDAT and probably reflected the continuing low-grade activation of complement necessary for the maintenance of homeostasis of a variety of physiological systems. The quantity did not vary with age or gender. In the patients, increased amounts of RBC-bound C3b and C3d were relatively common and probably resulted from autoantibody activity, immune complexes, and nonspecific adsorption. There was no association between positive ELDAT results and the presence of active hemolysis. The ELDAT was far more sensitive than the agglutination tests for detecting RBC-bound C3b, and also for C3d when the monoclonal reagent was employed.
- Published
- 1997