26 results for "Graviss EA"
Search Results
2. Modern Outcomes After Liver Retransplantation: A Single-center Experience.
- Author
-
Connor AA, Saharia A, Mobley CM, Hobeika MJ, Victor DW 3rd, Kodali S, Brombosz EW, Graviss EA, Nguyen DT, Moore LW, Gaber AO, and Ghobrial RM
- Subjects
- Humans, Reoperation adverse effects, Retrospective Studies, Risk Factors, Graft Survival, Liver Transplantation adverse effects, Liver Diseases
- Abstract
Background: The need for liver retransplantation (reLT) has increased proportionally with greater numbers of liver transplants (LTs) performed, use of marginal donors, degree of recipient preoperative liver dysfunction, and longer survival after LT. However, outcomes following reLT have been historically regarded as poor., Methods: To evaluate reLT in modern recipients, we retrospectively examined our single-center experience. Analysis included 1268 patients undergoing single LT and 68 patients undergoing reLT from January 2008 to December 2021., Results: Pre-LT mechanical ventilation, body mass index at LT, donor-recipient ABO incompatibility, early acute rejection, and length of hospitalization were associated with increased risk of needing reLT following index transplant. Overall and graft survival outcomes in the reLT cohort were equivalent to those after single LT. Mortality after reLT was associated with Kidney Donor Profile Index, national organ sharing at reLT, and LT donor death by anoxia and blood urea nitrogen levels. Survival after reLT was independent of the interval between initial LT and reLT, intraoperative packed red blood cell use, cold ischemia time, and preoperative mechanical ventilation, all previously linked to worse outcomes., Conclusions: These data suggest that reLT is currently a safer option for patients with liver graft failure, with comparable outcomes to primary LT., Competing Interests: The authors declare no funding or conflicts of interest., (Copyright © 2023 Wolters Kluwer Health, Inc. All rights reserved.)
- Published
- 2023
- Full Text
- View/download PDF
3. Differences in Myocardial Remodeling and Tissue Characteristics in Chronic Isolated Aortic and Mitral Regurgitation.
- Author
-
Malahfji M, Kitkungvan D, Senapati A, Nguyen DT, El-Tallawi C, Tayal B, Debs D, Crudo V, Graviss EA, Reardon MJ, Quinones M, Zoghbi WA, and Shah DJ
- Subjects
- Humans, Middle Aged, Aged, Cicatrix, Contrast Media, Gadolinium, Hypertrophy, Ventricular Remodeling, Mitral Valve Insufficiency diagnosis, Aortic Valve Insufficiency diagnostic imaging
- Abstract
Background: The left ventricular hemodynamic load differs between aortic regurgitation (AR) and primary mitral regurgitation (MR). We used cardiac magnetic resonance to compare left ventricular remodeling patterns, systemic forward stroke volume, and tissue characteristics between patients with isolated AR and isolated MR. Methods: We assessed remodeling parameters across the spectrum of regurgitant volume. Left ventricular volumes and mass were compared against normal values for age and sex. We calculated forward stroke volume (planimetered left ventricular stroke volume minus regurgitant volume) and derived a cardiac magnetic resonance-based systemic cardiac index. We assessed symptom status according to remodeling patterns. We also evaluated the prevalence of myocardial scarring using late gadolinium enhancement imaging, and the extent of interstitial expansion via extracellular volume fraction. Results: We studied 664 patients (240 AR, 424 primary MR), median age of 60.7 (49.5-69.9) years. AR led to more pronounced increases in ventricular volume and mass compared with MR across the spectrum of regurgitant volume (P<0.001). In ≥moderate regurgitation, AR patients had a higher prevalence of eccentric hypertrophy (58.3% versus 17.5% in MR; P<0.001), whereas MR patients most often had normal geometry (56.7%), followed by myocardial thinning with a low mass/volume ratio (18.4%). The patterns of eccentric hypertrophy and myocardial thinning were more common in symptomatic AR and MR patients (P<0.001). Systemic cardiac index remained unchanged across the spectrum of AR, whereas it progressively declined with increasing MR volume. Patients with MR had a higher prevalence of myocardial scarring and higher extracellular volume with increasing regurgitant volume (P value for trend <0.001), whereas both were unchanged across the spectrum of AR (P=0.24 and 0.42, respectively). Conclusions: Cardiac magnetic resonance identified significant heterogeneity in remodeling patterns and tissue characteristics at matched degrees of AR and MR. Further research is needed to examine whether these differences impact reverse remodeling and clinical outcomes after intervention.
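For context on the Methods above, the two derived quantities can be written out explicitly. The subtraction is stated in the abstract; indexing the systemic output to heart rate and body surface area follows the usual convention and is assumed here rather than taken from the paper:

\[
SV_{\text{forward}} = SV_{\text{LV, planimetered}} - \mathrm{RVol}, \qquad
CI_{\text{systemic}} = \frac{SV_{\text{forward}} \times \mathrm{HR}}{\mathrm{BSA}}
\]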
- Published
- 2023
- Full Text
- View/download PDF
4. Outcomes of Kidney Donors With Impaired Fasting Glucose.
- Author
-
Hebert SA, Murad DN, Nguyen DT, Graviss EA, Adrogue HE, Matas AJ, and Ibrahim HN
- Subjects
- Blood Glucose, Fasting, Glomerular Filtration Rate, Glucose, Humans, Risk Factors, Kidney Transplantation adverse effects, Living Donors
- Abstract
Background: Many kidney donor candidates with impaired fasting glucose (IFG) and all candidates with diabetes are currently excluded from kidney donation, fearing the development of an accelerated course of diabetic kidney disease in the remaining kidney., Methods: We studied mortality, proteinuria, and end-stage kidney disease (ESKD) in 8280 donors who donated between 1963 and 2007 according to donation fasting plasma glucose (FPG): <100 mg/dL (n = 6204), 100-125 mg/dL (n = 1826), and ≥126 mg/dL (n = 250)., Results: Donors with IFG and those with FPG ≥126 mg/dL were older, less likely to be non-Hispanic White, had a higher body mass index, and were more likely to be related to their recipient. After 15.7 ± 10.5 y from donation to study close, 4.4% died, 29.4% developed hypertension, 13.8% developed proteinuria, and 41 (0.5%) developed ESKD. In both the logistic and Cox models, IFG was associated with a higher diabetes risk (adjusted hazard ratio [aHR], 1.65; 95% confidence interval [CI], 1.18-2.30) and hypertension (aHR, 1.35; 95% CI, 1.10-1.65; P = 0.003 for both), but not higher risk of proteinuria or ESKD. The multivariable risk of mortality in donors with ≥126 mg/dL was higher than the 2 other groups, but risks of proteinuria, cardiovascular disease, and reduced estimated glomerular filtration rate were similar to those with FPG <126 mg/dL. Three cases of ESKD developed in the 250 donors with FPG ≥126 mg/dL at 18.6 ± 10.3 y after donation (aHR, 5.36; 95% CI, 1.0-27.01; P = 0.04)., Conclusions: Donors with IFG and the majority of donors with ≥126 mg/dL do well and perhaps should not be routinely excluded from donation., Competing Interests: The authors declare no conflicts of interest., (Copyright © 2021 Wolters Kluwer Health, Inc. All rights reserved.)
- Published
- 2022
- Full Text
- View/download PDF
5. Pre-transplant T-cell Clonality: An Observational Study of a Biomarker for Prediction of Sepsis in Liver Transplant Recipients.
- Author
-
Jones SL, Moore LW, Li XC, Mobley CM, Fields PA, Graviss EA, Nguyen DT, Nolte Fong J, Saharia A, Hobeika MJ, McMillan RR, Victor DW 3rd, Minze LJ, Gaber AO, and Ghobrial RM
- Subjects
- Aged, Biomarkers, Female, Humans, Male, Middle Aged, Predictive Value of Tests, Preoperative Period, Sepsis immunology, Clonal Hematopoiesis immunology, Liver Transplantation, Receptors, Antigen, T-Cell immunology, Sepsis diagnosis
- Abstract
Objective: This study investigated the ability of pre-transplant T-cell clonality to predict sepsis after liver transplant (LT). Summary Background Data: Sepsis is a leading cause of death in LT recipients. Currently, no biomarkers predict sepsis before clinical symptoms manifest. Methods: Between December 2013 and March 2018, our institution performed 478 LTs. After exclusions (eg, patients with marginal donor livers, autoimmune disorders, nonabdominal multi-organ transplants, and liver retransplantations), 180 consecutive LT recipients were enrolled. T-cell characterization was assessed within 48 hours before LT (immunoSEQ Assay, Adaptive Biotechnologies, Seattle, WA). Sepsis-2 and Sepsis-3 cases, defined by presence of acute infection plus ≥2 SIRS criteria, or clinical documentation of sepsis, were identified by chart review. Receiver-operating characteristic analyses determined the optimal T-cell repertoire clonality cutoff for predicting post-LT sepsis. Kaplan-Meier and Cox proportional hazards modeling assessed outcome-associated prognostic variables. Results: Patients with baseline T-cell repertoire clonality ≥0.072 were 3.82 (1.25, 11.40; P = 0.02) and 2.40 (1.00, 5.75; P = 0.049) times more likely to develop sepsis at 3 and 12 months post-LT, respectively, compared with recipients with lower (<0.072) clonality. T-cell repertoire clonality was the only predictor of sepsis at 3 months post-LT in multivariate analysis (C-statistic, 0.75). With adequate treatment, 12-month post-LT survival was equivalent between the two groups (93.4% vs 96.2%; P = 0.41). Conclusions: T-cell repertoire clonality is a novel biomarker predictor of sepsis before the development of clinical symptoms. Early sepsis monitoring and management may reduce post-LT mortality. These findings have implications for developing sepsis-prevention protocols in transplantation and potentially other populations. Competing Interests: The authors report no conflicts of interest. (Copyright © 2021 The Author(s). Published by Wolters Kluwer Health, Inc.)
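The abstract reports an optimal clonality cutoff of 0.072 from receiver-operating characteristic (ROC) analysis but does not say which optimality criterion was used. A minimal sketch, assuming Youden's J statistic and hypothetical per-patient data (not study data), could look like this:

```python
import numpy as np
from sklearn.metrics import roc_curve

def optimal_cutoff(clonality: np.ndarray, sepsis: np.ndarray) -> float:
    """Return the clonality threshold maximizing Youden's J = sensitivity + specificity - 1."""
    # sepsis: 0/1 outcome labels; clonality: continuous score per patient
    fpr, tpr, thresholds = roc_curve(sepsis, clonality)
    return float(thresholds[np.argmax(tpr - fpr)])

# Hypothetical usage with made-up values:
# cutoff = optimal_cutoff(np.array([0.03, 0.09, 0.15, 0.05]), np.array([0, 1, 1, 0]))
```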
- Published
- 2021
- Full Text
- View/download PDF
6. Kidney Transplant Recipients Rarely Show an Early Antibody Response Following the First COVID-19 Vaccine Administration.
- Author
-
Yi SG, Knight RJ, Graviss EA, Moore LW, Nguyen DT, Ghobrial RM, Gaber AO, and Huang HJ
- Subjects
- Biomarkers blood, COVID-19 etiology, COVID-19 immunology, Case-Control Studies, Humans, Immunocompromised Host, Kidney Failure, Chronic complications, Kidney Failure, Chronic immunology, Postoperative Complications immunology, Treatment Outcome, Antibodies, Viral blood, COVID-19 prevention & control, COVID-19 Vaccines immunology, Kidney Transplantation, Postoperative Complications prevention & control, SARS-CoV-2 immunology
- Abstract
Competing Interests: The authors declare no funding or conflicts of interest.
- Published
- 2021
- Full Text
- View/download PDF
7. Endothelial Dysfunction-related Neurological Bleeds with Continuous Flow-Left Ventricular Assist Devices Measured by Digital Thermal Monitor.
- Author
-
Ali A, Uribe C, Araujo-Gutierrez R, Cruz-Solbes AS, Marcos-Abdala HG, Youker KA, Guha A, Torre-Amione G, Nguyen DT, Graviss EA, Cooke JP, and Bhimaraj A
- Subjects
- Aged, Cross-Sectional Studies, Female, Heart Failure physiopathology, Humans, Male, Middle Aged, Vasodilation physiology, Endothelium, Vascular physiology, Heart-Assist Devices adverse effects, Hemorrhage etiology
- Abstract
Endothelial dysfunction has been demonstrated in patients with Continuous Flow-Left Ventricular Assist Devices (CF-LVADs), but an association with adverse events has not been shown. We used a noninvasive, operator-independent device called VENDYS® to assess vasodilatory function based on digital thermal measurements after release of a brachial artery occlusion in ambulatory patients with CF-LVAD (n = 56). Aortic valve opening and pulse perception were also documented before the test. Median duration of CF-LVAD support was 438 days. The VENDYS® test generates a vascular reactivity index (VRI). Outcomes for the CF-LVAD patients were compared between VRI < 1 and VRI ≥ 1. The difference in bleeding events between the groups was driven primarily by neurologic bleeds. Multivariate analysis showed that VRI < 1 correlated with future bleeding events (HR: 5.56; P = 0.01). The C-statistic with the VRI dichotomized as above was 0.82. There was a trend toward worse survival in patients with poor endothelial function. Endothelial vasodilatory dysfunction measured by a simple test utilizing digital thermal monitoring can predict adverse bleeding events in patients with CF-LVADs. Competing Interests: Disclosure: Drs. Guha and Bhimaraj have consulting agreements with Abbott, the current maker of Heartmate-II LVADs. The other authors have no conflicts of interest to report. (Copyright © ASAIO 2020.)
- Published
- 2021
- Full Text
- View/download PDF
8. Relationship of LVEF and Myocardial Scar to Long-Term Mortality Risk and Mode of Death in Patients With Nonischemic Cardiomyopathy.
- Author
-
Klem I, Klein M, Khan M, Yang EY, Nabi F, Ivanov A, Bhatti L, Hayes B, Graviss EA, Nguyen DT, Judd RM, Kim RJ, Heitner JF, and Shah DJ
- Subjects
- Adult, Aged, Cardiomyopathies mortality, Cardiomyopathies physiopathology, Female, Heart Diseases physiopathology, Humans, Male, Middle Aged, Prospective Studies, Risk Factors, Survival Analysis, Ventricular Dysfunction, Left mortality, Ventricular Dysfunction, Left pathology, Cardiomyopathies complications, Heart Diseases etiology, Ventricular Function, Left physiology
- Abstract
Background: Nonischemic cardiomyopathy is a leading cause of reduced left ventricular ejection fraction (LVEF) and is associated with high mortality risk from progressive heart failure and arrhythmias. Myocardial scar on cardiovascular magnetic resonance imaging is increasingly recognized as a risk marker for adverse outcomes; however, left ventricular dysfunction remains the basis for determining a patient's eligibility for primary prophylaxis with implantable cardioverter-defibrillator. We investigated the relationship of LVEF and scar with long-term mortality and mode of death in a large cohort of patients with nonischemic cardiomyopathy. Methods: This study is a prospective, longitudinal outcomes registry of 1020 consecutive patients with nonischemic cardiomyopathy who underwent clinical cardiovascular magnetic resonance imaging for the assessment of LVEF and scar at 3 centers. Results: During a median follow-up of 5.2 (interquartile range, 3.8, 6.6) years, 277 (27%) patients died. On survival analysis, LVEF ≤35% and scar were strongly associated with all-cause (log-rank test P=0.002 and P<0.001, respectively) and cardiac death (P=0.001 and P<0.001, respectively). Whereas scar was strongly related to sudden cardiac death (SCD; P=0.001), there was no significant association between LVEF ≤35% and SCD risk (P=0.57). On multivariable analysis including established clinical factors, LVEF and scar are independent risk markers of all-cause and cardiac death. The addition of LVEF provided incremental prognostic value but insignificant discrimination improvement by C-statistic for all-cause and cardiac death, and no incremental prognostic value for SCD. Conversely, scar extent demonstrated significant incremental prognostic value and discrimination improvement for all 3 end points. On net reclassification analysis, the addition of LVEF resulted in no significant improvement for all-cause death (11.0%; 95% CI, -6.2% to 25.9%), cardiac death (9.8%; 95% CI, -5.7% to 29.3%), or SCD (7.5%; 95% CI, -41.2% to 42.9%). Conversely, the addition of scar extent resulted in significant reclassification improvement of 25.5% (95% CI, 11.7% to 41.0%) for all-cause death, 27.0% (95% CI, 11.6% to 45.2%) for cardiac death, and 40.6% (95% CI, 10.5% to 71.8%) for SCD. Conclusions: Myocardial scar and LVEF are both risk markers for all-cause and cardiac death in patients with nonischemic cardiomyopathy. However, whereas myocardial scar has strong and incremental prognostic value for SCD risk stratification, LVEF has no incremental prognostic value over clinical measures. Scar assessment should be incorporated into patient selection criteria for primary prevention implantable cardioverter-defibrillator placement.
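As a reminder of what the reclassification percentages in the Results represent, the net reclassification improvement (NRI) for adding a marker to a baseline model is conventionally defined as below; whether the authors used the categorical or the category-free (continuous) variant is not stated in the abstract:

\[
\mathrm{NRI} = \big[P(\mathrm{up}\mid \mathrm{event}) - P(\mathrm{down}\mid \mathrm{event})\big] + \big[P(\mathrm{down}\mid \mathrm{nonevent}) - P(\mathrm{up}\mid \mathrm{nonevent})\big]
\]

where "up" and "down" denote movement to a higher or lower predicted-risk category after the marker is added.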
- Published
- 2021
- Full Text
- View/download PDF
9. The Impact of HIV Infection on TB Disparities Among US-Born Black and White Tuberculosis Patients in the United States.
- Author
-
Marks SM, Katz DJ, Davidow AL, Pagaoa MA, Teeter LD, and Graviss EA
- Subjects
- Black People, Female, Humans, Male, Odds Ratio, Racial Groups, United States epidemiology, White People, Black or African American, HIV Infections complications, HIV Infections epidemiology, Health Status Disparities, Ill-Housed Persons, Tuberculosis epidemiology
- Abstract
Background/objectives: US-born non-Hispanic black persons (blacks) (12% of the US population) accounted for 41% of HIV diagnoses during 2008-2014. HIV infection significantly increases TB and TB-related mortality. TB rate ratios were 6 to 7 times as high in blacks versus US-born non-Hispanic whites (whites) during 2013-2016. We analyzed a sample of black and white TB patients to assess the impact of HIV infection on TB racial disparities., Methods: In total, 552 black and white TB patients with known HIV/AIDS status were recruited from 10 US sites in 2009-2010. We abstracted data from the National TB Surveillance System, medical records, and death certificates and interviewed 477 patients. We estimated adjusted odds ratios (AORs) with 95% confidence intervals (CIs) for associations of TB with HIV infection, late HIV diagnosis (≤3 months before or any time after TB diagnosis), and mortality during TB treatment., Results: Twenty-one percent of the sample had HIV/AIDS infection. Blacks (AOR = 3.4; 95% CI, 1.7-6.8) and persons with recent homelessness (AOR = 2.5; 95% CI, 1.5-4.3) had greater odds of HIV infection than others. The majority of HIV-infected/TB patients were diagnosed with HIV infection 3 months or less before (57%) or after (4%) TB diagnosis. Among HIV-infected/TB patients, blacks had similar percentages to whites (61% vs 57%) of late HIV diagnosis. Twenty-five percent of HIV-infected/TB patients died, 38% prior to TB diagnosis and 62% during TB treatment. Blacks did not have significantly greater odds of TB-related mortality than whites (AOR = 1.1; 95% CI, 0.6-2.1)., Conclusions: Black TB patients had greater HIV prevalence than whites. While mortality was associated with HIV infection, it was not significantly associated with black or white race.
- Published
- 2020
- Full Text
- View/download PDF
10. Delayed Implantation of Pumped Kidneys Decreases Renal Allograft Futility in Combined Liver-Kidney Transplantation.
- Author
-
Lunsford KE, Agopian VG, Yi SG, Nguyen DTM, Graviss EA, Harlander-Locke MP, Saharia A, Kaldas FM, Mobley CM, Zarrinpar A, Hobeika MJ, Veale JL, Podder H, Farmer DG, Knight RJ, Danovitch GM, Gritsch HA, Li XC, Ghobrial RM, Busuttil RW, and Gaber AO
- Subjects
- Aged, Allografts immunology, Allografts supply & distribution, Cold Ischemia instrumentation, Cold Ischemia methods, Cold Ischemia statistics & numerical data, End Stage Liver Disease complications, Feasibility Studies, Female, Graft Rejection immunology, Graft Rejection prevention & control, Graft Survival immunology, Humans, Kidney immunology, Kidney Transplantation ethics, Kidney Transplantation methods, Kidney Transplantation statistics & numerical data, Liver Transplantation ethics, Liver Transplantation methods, Liver Transplantation statistics & numerical data, Male, Medical Futility ethics, Middle Aged, Organ Preservation instrumentation, Organ Preservation statistics & numerical data, Perfusion instrumentation, Perfusion methods, Perfusion statistics & numerical data, Renal Insufficiency etiology, Renal Insufficiency surgery, Retrospective Studies, Time Factors, Time-to-Treatment statistics & numerical data, Transplantation, Homologous adverse effects, Transplantation, Homologous ethics, Transplantation, Homologous methods, Treatment Outcome, End Stage Liver Disease surgery, Graft Rejection epidemiology, Kidney Transplantation adverse effects, Liver Transplantation adverse effects, Organ Preservation methods
- Abstract
Background: Combined liver-kidney transplantation (CLKT) improves survival for liver transplant recipients with renal dysfunction; however, the tenuous perioperative hemodynamic and metabolic milieu in high-acuity CLKT recipients increases delayed graft function and kidney allograft failure. We sought to analyze whether delayed KT through pumping would improve kidney outcomes following CLKT. Methods: A retrospective analysis (University of California Los Angeles [n = 145], Houston Methodist Hospital [n = 79]) was performed in all adults receiving CLKT at 2 high-volume transplant centers from February 2004 to January 2017, and recipients were analyzed for patient and allograft survival as well as renal outcomes following CLKT. Results: A total of 63 patients (28.1%) underwent delayed implantation of pumped kidneys during CLKT (dCLKT) and 161 patients (71.9%) received early implantation of nonpumped kidneys during CLKT (eCLKT). Most recipients were high acuity, with a median biologic Model for End-Stage Liver Disease (MELD) score of 35 for dCLKT and 34 for eCLKT (P = ns). Pretransplant, dCLKT recipients had longer intensive care unit stays, were more often intubated, and had greater vasopressor use. Despite this, dCLKT exhibited improved 1-, 3-, and 5-year patient and kidney survival (P = 0.02) and decreased length of stay (P = 0.001), kidney allograft failure (P = 0.012), and dialysis duration (P = 0.031). This reduced kidney allograft futility (death or continued need for hemodialysis within 3 mo posttransplant) for dCLKT (6.3%) compared with eCLKT (19.9%) (P = 0.013). Conclusions: Delayed implantation of pumped kidneys is associated with improved patient and renal allograft survival and decreased hospital length of stay despite longer kidney cold ischemia. These data should inform the ethical debate as to the futility of performing CLKT in high-acuity recipients.
- Published
- 2020
- Full Text
- View/download PDF
11. Examining the Relationship and Prognostic Implication of Diabetic Status and Extracellular Matrix Expansion by Cardiac Magnetic Resonance.
- Author
-
Khan MA, Yang EY, Nguyen DT, Nabi F, Hinojosa J, Jabel M, Nagueh SF, Graviss EA, and Shah DJ
- Subjects
- Adult, Aged, Diabetic Cardiomyopathies mortality, Diabetic Cardiomyopathies pathology, Diabetic Cardiomyopathies physiopathology, Female, Fibrosis, Heart Failure mortality, Heart Failure pathology, Heart Failure physiopathology, Humans, Male, Middle Aged, Predictive Value of Tests, Prognosis, Stroke Volume, Ventricular Function, Left, Diabetes Mellitus, Type 2 diagnosis, Diabetes Mellitus, Type 2 mortality, Diabetic Cardiomyopathies diagnostic imaging, Extracellular Matrix pathology, Heart Failure diagnostic imaging, Magnetic Resonance Imaging, Cine, Myocardium pathology, Prediabetic State diagnosis, Prediabetic State mortality
- Abstract
Background: Although not fully understood, diabetes mellitus is thought to be associated with cardiac fibrosis and stiffness due to alteration of the myocardial extracellular matrix. Newer cardiac magnetic resonance techniques may be able to identify extracellular matrix expansion by measuring extracellular volume fraction (ECV). We used cardiac magnetic resonance to evaluate the association of alteration in the extracellular matrix with diabetic status and its implications for incident heart failure events and all-cause mortality. Methods: We studied 442 patients who underwent comprehensive contrast cardiac magnetic resonance to assess cardiac morphology and function, left ventricular replacement fibrosis, and pre-post contrast T1 mapping to quantify ECV. The cohort did not have coexisting pathologies associated with ECV alteration. We categorized our final cohort based on diabetic status using criteria from the American Diabetes Association. Subsequent heart failure hospitalization and all-cause death were ascertained. Results: Our patients were predominantly white, with a median age of 57 years; 48% were men. Compared with nondiabetic patients, diabetes mellitus was significantly associated with elevated ECV after adjusting for clinical and imaging covariates: β coefficient 1.33 (95% CI, 0.22-2.44); P=0.02. Over a median follow-up of 24.5 (interquartile range, 14.8-33.4) months, 52 deaths and 24 heart failure events occurred. Patients with diabetes mellitus and elevated ECV had the worst outcomes compared with patients with diabetes mellitus and normal ECV or nondiabetics. Elevated ECV remained an independent predictor of outcomes (hazard ratio, 3.31 [95% CI, 1.93-5.67]; P<0.001) after adjusting for covariates. Conclusions: Elevated ECV is an independent predictor of mortality among patients with diabetes mellitus and may have an additive effect with diabetes mellitus on outcomes. ECV may represent a novel noninvasive biomarker to evaluate the severity of diabetic heart disease.
- Published
- 2020
- Full Text
- View/download PDF
12. Weight Gain After Simultaneous Kidney and Pancreas Transplantation.
- Author
-
Knight RJ, Islam AK, Pham C, Graviss EA, Nguyen DT, Moore LW, Kagan A, Sadhu AR, Podder H, and Gaber AO
- Subjects
- Adult, Diabetes Mellitus, Type 2 metabolism, Diabetes Mellitus, Type 2 mortality, Female, Follow-Up Studies, Glycated Hemoglobin analysis, Graft Rejection immunology, Graft Rejection prevention & control, Graft Survival, Humans, Immunosuppressive Agents adverse effects, Insulin blood, Kidney Transplantation methods, Male, Metabolic Syndrome blood, Metabolic Syndrome diagnosis, Metabolic Syndrome etiology, Middle Aged, Obesity blood, Obesity diagnosis, Obesity etiology, Pancreas Transplantation methods, Postoperative Period, Retrospective Studies, Diabetes Mellitus, Type 2 surgery, Kidney Transplantation adverse effects, Metabolic Syndrome epidemiology, Obesity epidemiology, Pancreas Transplantation adverse effects, Weight Gain
- Abstract
Background: Excessive weight (EW) gain is common after solid organ transplantation, but there is little information concerning obesity after pancreas transplantation. The study goal was to characterize EW gain after kidney-pancreas (KP) transplantation. Methods: This was a retrospective single-center review of 100 KP recipients transplanted between September 2007 and June 2015. Results: The median percent weight gain for all recipients at 1 year posttransplant was 10% (interquartile range, 2.7%-19.3%) of baseline weight. EW gain, defined as a 1-year weight increase of 19% or more, included all recipients (n = 26) above the upper limit of the interquartile range for weight gain at 1 year. In multivariate analysis, recipient age <40 years, the use of tacrolimus/mammalian target of rapamycin immunosuppression, and an acute rejection event were independent risk factors for EW gain. At a mean follow-up of 43±23 months, there was no difference in patient or graft survival between the EW and non-EW cohorts. Although mean hemoglobin A1c levels between groups were equivalent, the EW cohort displayed significantly higher mean insulin levels and a trend toward higher C-peptide levels than the non-EW cohort. Criteria for posttransplant metabolic syndrome were met in 34.6% of the EW versus 17.6% of the non-EW cohort (P = 0.07). Conclusions: At intermediate-term follow-up, EW gain after KP transplantation was not associated with an increased risk of death or graft loss, although there was a trend toward a greater risk of posttransplant metabolic syndrome. There may be a metabolic consequence of successful pancreas transplantation that results in EW gain in a proportion of recipients, leading to an increased risk of long-term cardiovascular complications.
- Published
- 2020
- Full Text
- View/download PDF
13. Outcomes of Liver Transplantation for Hepatocellular Carcinoma Beyond the University of California San Francisco Criteria: A Single-center Experience.
- Author
-
Victor DW 3rd, Monsour HP Jr, Boktour M, Lunsford K, Balogh J, Graviss EA, Nguyen DT, McFadden R, Divatia MK, Heyne K, Ankoma-Sey V, Egwim C, Galati J, Duchini A, Saharia A, Mobley C, Gaber AO, and Ghobrial RM
- Subjects
- Ablation Techniques methods, Aged, Antineoplastic Agents therapeutic use, Carcinoma, Hepatocellular diagnosis, Carcinoma, Hepatocellular mortality, Carcinoma, Hepatocellular pathology, Chemotherapy, Adjuvant methods, Disease Progression, Disease-Free Survival, Female, Follow-Up Studies, Humans, Liver diagnostic imaging, Liver pathology, Liver Neoplasms diagnosis, Liver Neoplasms mortality, Liver Neoplasms pathology, Liver Transplantation statistics & numerical data, Male, Middle Aged, Neoadjuvant Therapy methods, Neoplasm Recurrence, Local pathology, Neoplasm Recurrence, Local prevention & control, Neoplasm Staging, Retrospective Studies, Risk Factors, Sorafenib therapeutic use, Time Factors, Tumor Burden, Carcinoma, Hepatocellular therapy, Liver Neoplasms therapy, Liver Transplantation standards, Neoplasm Recurrence, Local epidemiology, Patient Selection
- Abstract
Background: Hepatocellular carcinoma (HCC) is the most common primary malignant liver tumor. Currently, liver transplantation may be the optimal treatment for HCC in cirrhotic patients. Patient selection is currently based on tumor size. We developed a program to offer liver transplantation to selected patients with HCC outside of traditional criteria. Methods: Retrospective review of patients transplanted for HCC between April 2008 and June 2017. Patients were grouped by tumor size according to Milan, University of California San Francisco (UCSF), and outside-UCSF criteria. Patient demographics, laboratory values, and outcomes were compared. Patients radiographically outside Milan criteria were selected based on tumor control with locoregional therapy (LRT) and 9 months of stability from LRT. α-fetoprotein values were not exclusionary. Results: Two hundred twenty HCC patients were transplanted: 138 inside Milan, 23 inside UCSF, and 59 beyond UCSF criteria. Patient survival at 1, 3, and 5 years was equivalent regardless of pathologic tumor size. Waiting time to transplantation was not significantly different, averaging 344 days. Among patients beyond UCSF criteria, tumor recurrence was equivalent to that of Milan and UCSF criteria recipients who waited >9 months from LRT. Although tumor recurrence was more likely in patients beyond UCSF criteria (3% versus 9% versus 15%; P = 0.02), recurrence-free survival only trended toward significance among the groups (P = 0.053). Conclusions: Selected patients outside of traditional size criteria can be effectively transplanted with survival equivalent to patients with smaller tumors, even when pathologic tumor burden is considered. Tumor stability over time can be used to help select patients for transplantation.
- Published
- 2020
- Full Text
- View/download PDF
14. Prognostic Implications of Diffuse Interstitial Fibrosis in Asymptomatic Primary Mitral Regurgitation.
- Author
-
Kitkungvan D, Yang EY, El Tallawi KC, Nagueh SF, Nabi F, Khan MA, Nguyen DT, Graviss EA, Lawrie GM, Zoghbi WA, Bonow RO, Quinones MA, and Shah DJ
- Subjects
- Extracellular Fluid physiology, Female, Fibrosis diagnostic imaging, Fibrosis physiopathology, Humans, Magnetic Resonance Imaging, Cine methods, Male, Middle Aged, Mitral Valve Insufficiency physiopathology, Prognosis, Prospective Studies, Extracellular Fluid diagnostic imaging, Mitral Valve Insufficiency diagnostic imaging
- Published
- 2019
- Full Text
- View/download PDF
15. Myocardial Extracellular Volume Fraction Adds Prognostic Information Beyond Myocardial Replacement Fibrosis.
- Author
-
Yang EY, Ghosn MG, Khan MA, Gramze NL, Brunner G, Nabi F, Nambi V, Nagueh SF, Nguyen DT, Graviss EA, Schelbert EB, Ballantyne CM, Zoghbi WA, and Shah DJ
- Subjects
- Adult, Aged, Contrast Media pharmacology, Extracellular Space, Female, Fibrosis diagnosis, Follow-Up Studies, Gadolinium DTPA pharmacology, Humans, Male, Middle Aged, Prognosis, Prospective Studies, Reproducibility of Results, Cardiomyopathies diagnosis, Magnetic Resonance Imaging, Cine methods, Myocardium pathology
- Abstract
Background: Cardiac magnetic resonance techniques permit quantification of the myocardial extracellular volume fraction (ECV), representing a surrogate marker of reactive interstitial fibrosis, and late gadolinium enhancement (LGE), representing replacement fibrosis or scar. ECV and LGE have been independently linked with heart failure (HF) events. In deriving ECV, coronary artery disease type LGE, but not non-coronary artery disease type LGE, has been consistently excluded. We examined the associations between LGE, global ECV derived from myocardial tissue segments free of any detectable scar, and subsequent HF events., Methods: Mid short-axis T1 maps were divided into 6 cardiac segments, each classified as LGE absent or present. Global ECV was derived from only segments without LGE. ECV was considered elevated if >30%, the upper 95% bounds of a reference group without known cardiac disease (n=28). Patients were divided into 4 groups by presence of elevated ECV and of any LGE. Subsequent HF hospitalization and any death were ascertained. Their relationship with ECV was examined separately and as a composite with Cox proportional hazard models., Results: Of 1604 serial patients with T1 maps, 1255 were eligible after exclusions and followed over a median 26.3 (interquartile range, 15.9-37.5) months. Patients with elevated ECV had increased risk for death (hazard ratio [HR] 2.45 [95% CI, 1.76-3.41]), HF hospitalization (HR, 2.45 [95% CI, 1.77-3.40]), and a combined end point of both outcomes (HR, 2.46 [95% CI, 1.94-3.14]). After adjustments for covariates including LGE, the relationship persisted for death (HR, 1.82 [95% CI, 1.28-2.59]), hospitalization (HR, 1.60 [95% CI, 1.12-2.27]), and combined end points (HR, 1.73 [95% CI, 1.34-2.24])., Conclusions: ECV measures of diffuse myocardial fibrosis were associated with HF outcomes, despite exclusion of replacement fibrosis segments from their derivation and even among patients without any scar. ECV may have a synergistic role with LGE in HF risk assessment.
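The abstract does not spell out how ECV is computed from the T1 maps; the conventional formula from pre- and post-contrast T1 of myocardium and blood, scaled by hematocrit (Hct), is:

\[
\mathrm{ECV} = (1 - \mathrm{Hct}) \times
\frac{1/T1_{\mathrm{myo,post}} - 1/T1_{\mathrm{myo,pre}}}{1/T1_{\mathrm{blood,post}} - 1/T1_{\mathrm{blood,pre}}}
\]

Under this definition, the study's >30% threshold flags scar-free segments whose extracellular compartment exceeds 30% of tissue volume.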
- Published
- 2019
- Full Text
- View/download PDF
16. Risk Stratification of Patients With Current Generation Continuous-Flow Left Ventricular Assist Devices Being Bridged to Heart Transplantation.
- Author
-
Guha A, Nguyen D, Cruz-Solbes AS, Amione-Guerra J, Schutt RC, Bhimaraj A, Trachtenberg BH, Park MH, Graviss EA, Gaber O, Suarez E, Montane E, Torre-Amione G, and Estep JD
- Subjects
- Adult, Female, Heart Failure mortality, Heart Failure physiopathology, Humans, Male, Middle Aged, Prognosis, Proportional Hazards Models, Retrospective Studies, Treatment Outcome, Decision Support Systems, Clinical, Heart Failure therapy, Heart Transplantation mortality, Heart-Assist Devices
- Abstract
Patients bridged to transplant (BTT) with continuous-flow left ventricular assist devices (CF-LVADs) have increased in the past decade. Decision support tools for these patients are limited. We developed a risk score to estimate prognosis and guide decision-making. We included heart transplant recipients bridged with CF-LVADs from the United Network for Organ Sharing (UNOS) database and divided them into development (2,522 patients) and validation cohorts (1,681 patients). Univariate and multivariate Cox proportional hazards models were performed. Variables that independently predicted outcomes (age, African American race, recipient body mass index [BMI], intravenous [IV] antibiotic use, pretransplant dialysis, and total bilirubin) were assigned weight using linear transformation, and risk scores were derived. Patients were grouped by predicted posttransplant mortality: low risk (≤ 38 points), medium risk (38-41 points), and high risk (≥ 42 points). We performed Cox proportional hazards analysis on wait-listed CF-LVAD patients who were not transplanted. Score significantly discriminated survival among the groups in the development cohort (6.7, 12.9, 20.7; p = 0.001), validation cohort (6.4, 10.1, 13.6; p < 0.001), and ambulatory cohort (6.4, 11.5, 17.2; p < 0.001). We derived a left ventricular assist device (LVAD) BTT risk score that effectively identifies CF-LVAD patients who are at higher risk for worse outcomes after heart transplant. This score may help physicians weigh the risks of transplantation in patients with CF-LVAD.
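The abstract gives the risk-tier cut-points but not the point weights assigned to each predictor. The sketch below, with entirely placeholder weights, only illustrates the general pattern of an additive point score binned into the three published tiers:

```python
# Placeholder weights for illustration only; the published weights are not in the abstract.
EXAMPLE_WEIGHTS = {"age_per_decade": 2.0, "african_american": 3.0, "bmi_per_unit": 0.5,
                   "iv_antibiotics": 4.0, "pretransplant_dialysis": 5.0, "total_bilirubin": 1.5}

def total_score(predictors: dict, weights: dict = EXAMPLE_WEIGHTS) -> float:
    """Additive point score: each predictor contributes weight * value."""
    return sum(weights[name] * value for name, value in predictors.items())

def risk_tier(points: float) -> str:
    """Bin a score into the tiers reported in the abstract (low <=38, medium 38-41, high >=42)."""
    if points <= 38:
        return "low"
    if points <= 41:
        return "medium"
    return "high"
```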
- Published
- 2018
- Full Text
- View/download PDF
17. The QuantiFERON-TB Gold In-Tube Assay in Neuro-Ophthalmology.
- Author
-
Little LM, Rigi M, Suleiman A, Smith SV, Graviss EA, Foroozan R, and Lee AG
- Subjects
- Adult, Aged, Equipment Design, Female, Humans, Latent Tuberculosis microbiology, Male, Middle Aged, Mycobacterium tuberculosis isolation & purification, Reproducibility of Results, Retrospective Studies, Antigens, Bacterial analysis, Interferon-gamma Release Tests instrumentation, Latent Tuberculosis diagnosis, Mycobacterium tuberculosis immunology, Neurology methods, Ophthalmology methods, Tuberculin Test instrumentation
- Abstract
Background: Although QuantiFERON-TB Gold In-Tube (QFT-GIT) testing is regularly used to detect infection with Mycobacterium tuberculosis, its utility in a patient population with a low risk for tuberculosis (TB) has been questioned. The following is a cohort study analyzing the efficacy of QFT-GIT testing as a method for detection of active TB disease in low-risk individuals in a neuro-ophthalmologic setting., Methods: Ninety-nine patients from 2 neuro-ophthalmology centers were identified as having undergone QFT-GIT testing between January 2012 and February 2016. Patients were divided into groups of negative, indeterminate, and positive QFT-GIT results. Records of patients with positive QFT-GIT results were reviewed for development of latent or active TB, as determined by clinical, bacteriologic, and/or radiographic evidence., Results: Of the 99 cases reviewed, 18 patients had positive QFT-GIT tests. Of these 18 cases, 12 had documentation of chest radiographs or computed tomography which showed no evidence for either active TB or pulmonary latent TB infection (LTBI). Four had chest imaging which was indicative of possible LTBI. None of these 18 patients had symptoms of active TB and none developed active TB within the follow-up period., Conclusions: Based on our results, we conclude that routine testing with QFT-GIT in a low-risk cohort did not diagnose active TB infection. We do not recommend routine QFT-GIT testing for TB low-risk individuals, as discerned through patient and exposure history, ocular examination, and clinical judgment, in neuro-ophthalmology practice.
- Published
- 2017
- Full Text
- View/download PDF
18. Effect of Chlorhexidine Bathing Every Other Day on Prevention of Hospital-Acquired Infections in the Surgical ICU: A Single-Center, Randomized Controlled Trial.
- Author
-
Swan JT, Ashton CM, Bui LN, Pham VP, Shirkey BA, Blackshear JE, Bersamin JB, Pomer RM, Johnson ML, Magtoto AD, Butler MO, Tran SK, Sanchez LR, Patel JG, Ochoa RA Jr, Hai SA, Denison KI, Graviss EA, and Wray NP
- Subjects
- Academic Medical Centers, Adult, Aged, Aged, 80 and over, Catheter-Related Infections prevention & control, Chlorhexidine administration & dosage, Comorbidity, Coumarins, Female, Humans, Infection Control methods, Isocoumarins, Male, Middle Aged, Pneumonia, Ventilator-Associated prevention & control, Risk Factors, Severity of Illness Index, Surgical Wound Infection prevention & control, Time Factors, Anti-Infective Agents, Local administration & dosage, Baths methods, Chlorhexidine analogs & derivatives, Cross Infection prevention & control, Intensive Care Units organization & administration
- Abstract
Objective: To test the hypothesis that compared with daily soap and water bathing, 2% chlorhexidine gluconate bathing every other day for up to 28 days decreases the risk of hospital-acquired catheter-associated urinary tract infection, ventilator-associated pneumonia, incisional surgical site infection, and primary bloodstream infection in surgical ICU patients., Design: This was a single-center, pragmatic, randomized trial. Patients and clinicians were aware of treatment-group assignment; investigators who determined outcomes were blinded., Setting: Twenty-four-bed surgical ICU at a quaternary academic medical center., Patients: Adults admitted to the surgical ICU from July 2012 to May 2013 with an anticipated surgical ICU stay for 48 hours or more were included., Interventions: Patients were randomized to bathing with 2% chlorhexidine every other day alternating with soap and water every other day (treatment arm) or to bathing with soap and water daily (control arm)., Measurements and Main Results: The primary endpoint was a composite outcome of catheter-associated urinary tract infection, ventilator-associated pneumonia, incisional surgical site infection, and primary bloodstream infection. Of 350 patients randomized, 24 were excluded due to prior enrollment in this trial and one withdrew consent. Therefore, 325 were analyzed (164 soap and water versus 161 chlorhexidine). Patients acquired 53 infections. Compared with soap and water bathing, chlorhexidine bathing every other day decreased the risk of acquiring infections (hazard ratio = 0.555; 95% CI, 0.309-0.997; p = 0.049). For patients bathed with soap and water versus chlorhexidine, counts of incident hospital-acquired infections were 14 versus 7 for catheter-associated urinary tract infection, 13 versus 8 for ventilator-associated pneumonia, 6 versus 3 for incisional surgical site infections, and 2 versus 0 for primary bloodstream infection; the effect was consistent across all infections. The absolute risk reduction for acquiring a hospital-acquired infection was 9.0% (95% CI, 1.5-16.4%; p = 0.019). Incidences of adverse skin occurrences were similar (18.9% soap and water vs 18.6% chlorhexidine; p = 0.95)., Conclusions: Compared with soap and water, chlorhexidine bathing every other day decreased the risk of acquiring infections by 44.5% in surgical ICU patients.
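The trial abstract reports an absolute risk reduction (ARR) of 9.0% (95% CI, 1.5-16.4%) but not a number needed to treat; inverting the ARR (a derived figure, not stated by the authors) gives:

\[
\mathrm{NNT} = \frac{1}{\mathrm{ARR}} = \frac{1}{0.090} \approx 11,
\qquad \text{with bounds of roughly } \frac{1}{0.164} \approx 6 \text{ to } \frac{1}{0.015} \approx 67,
\]

that is, on the order of one hospital-acquired infection averted for every 11 or so patients bathed with chlorhexidine every other day.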
- Published
- 2016
- Full Text
- View/download PDF
19. Barriers to preemptive renal transplantation: a single center questionnaire study.
- Author
-
Knight RJ, Teeter LD, Graviss EA, Patel SJ, DeVos JM, Moore LW, and Gaber AO
- Subjects
- Adult, Data Interpretation, Statistical, Ethnicity, Female, Humans, Male, Middle Aged, Multivariate Analysis, Odds Ratio, Polycystic Kidney Diseases complications, Polycystic Kidney Diseases therapy, Referral and Consultation, Renal Dialysis, Surveys and Questionnaires, Time Factors, Health Services Accessibility, Kidney Transplantation methods
- Abstract
Background: Preemptive transplantation results in excellent patient and graft survival, yet most transplant candidates are referred for transplantation after initiation of dialysis. The goal of this study was to determine barriers to preemptive renal transplantation. Methods: A nonvalidated questionnaire was administered to prospective kidney transplant recipients to determine factors that hindered or favored referral for transplantation before the initiation of dialysis. Results: One hundred ninety-seven subjects referred for a primary renal transplant completed the questionnaire. Ninety-one subjects (46%) had been informed of preemptive transplantation before referral, and 80 (41%) were predialysis at the time of evaluation. The median time from diagnosis of renal disease to referral was 60 months (range, 2-444 months). In bivariate analysis, among other factors, knowledge of preemptive transplantation was highly associated (odds ratio = 94.69) with referral before initiation of dialysis. Given the strong association between knowledge of preemptive transplantation and predialysis referral, this variable was not included in the multivariate analysis. Using multivariate logistic regression analysis, white recipient race, referral by a transplant nephrologist, recipient employment, and the diagnosis of polycystic kidney disease were significantly associated with presentation to the pretransplant clinic before initiation of dialysis. Conclusion: The principal barrier to renal transplantation referral before dialysis was patient education regarding the option of preemptive transplantation. Factors significantly associated with referral before dialysis were the diagnosis of polycystic kidney disease, white recipient race, referral by a transplant nephrologist, and employed status. Greater effort should be applied to patient education regarding preemptive transplantation early after the diagnosis of end-stage renal disease.
- Published
- 2015
- Full Text
- View/download PDF
20. Intermediate-term graft loss after renal transplantation is associated with both donor-specific antibody and acute rejection.
- Author
-
Devos JM, Gaber AO, Teeter LD, Graviss EA, Patel SJ, Land GA, Moore LW, and Knight RJ
- Subjects
- Adult, Antibodies blood, Black People, Case-Control Studies, Cohort Studies, Female, Follow-Up Studies, Graft Rejection ethnology, Hispanic or Latino, Humans, Incidence, Kaplan-Meier Estimate, Male, Middle Aged, Multivariate Analysis, Pancreas Transplantation, Risk Factors, Time Factors, White People, Black or African American, Antibodies immunology, Graft Rejection epidemiology, Graft Rejection immunology, HLA Antigens immunology, Kidney Transplantation, Tissue Donors, Transplantation
- Abstract
Background: Renal transplant recipients with de novo DSA (dDSA) experience higher rates of rejection and worse graft survival than dDSA-free recipients. This study presents a single-center review of dDSA monitoring in a large, multi-ethnic cohort of renal transplant recipients., Methods: The authors performed a nested case-control study of adult kidney and kidney-pancreas recipients from July 2007 through July 2011. Cases were defined as dDSA-positive whereas controls were all DSA-negative transplant recipients. DSA were determined at 1, 3, 6, 9, and 12 months posttransplant, and every 6 months thereafter., Results: Of 503 recipients in the analysis, 24% developed a dDSA, of whom 73% had dDSA against DQ antigen. Median time to dDSA was 6.1 months (range 0.2-44.6 months). After multivariate analysis, African American race, kidney-pancreas recipient, and increasing numbers of human leukocyte antigen mismatches were independent risk factors for dDSA. Recipients with dDSA were more likely to suffer an acute rejection (AR) (35% vs. 10%, P<0.001), an antibody-mediated AR (16% vs. 0.3%, P<0.001), an AR ascribed to noncompliance (8% vs. 2%, P=0.001), and a recurrent AR (6% vs. 1%, P=0.002) than dDSA-negative recipients. At a median follow-up of 31 months, the death-censored actuarial graft survival of dDSA recipients was worse than the DSA-free cohort (P=0.002). Yet, for AR-free recipients, there was no difference in graft survival between cohorts (P=0.66)., Conclusions: Development of dDSA was associated with an increased incidence of graft loss, yet the detrimental effect of dDSA was limited in the intermediate term to recipients with AR.
- Published
- 2014
- Full Text
- View/download PDF
21. Cross-sectional and case-control analyses of the association of kidney function staging with adverse postoperative outcomes in general and vascular surgery.
- Author
-
Gaber AO, Moore LW, Aloia TA, Suki WN, Jones SL, Graviss EA, Knight RJ, and Bass BL
- Subjects
- Aged, Analysis of Variance, Case-Control Studies, Chi-Square Distribution, Cross-Sectional Studies, Female, Humans, Kidney Function Tests, Logistic Models, Male, Middle Aged, Predictive Value of Tests, Proportional Hazards Models, Risk Factors, United States epidemiology, Colectomy mortality, Kidney Failure, Chronic complications, Postoperative Complications mortality, Surgical Procedures, Operative mortality, Vascular Surgical Procedures mortality
- Abstract
Objective: This study aimed to assess kidney dysfunction in general surgical patients and examine the effect on postoperative mortality and morbidity. Background: An estimated 13% of the US population has chronic kidney disease (CKD), but awareness among patients and caregivers is lacking. Methods: The American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) data sets for 2005-2007 were analyzed. Preoperative kidney function was assessed by the Modification of Diet in Renal Disease (MDRD) formula for estimated glomerular filtration rate (eGFR) and staged according to National Kidney Foundation criteria. Cross-sectional analyses were performed for 30-day mortality (Cox proportional hazard) and incidence of major complications (nominal logistic regression). A case-control cohort of colectomy cases was analyzed comparing patients in the stage 4 CKD group and the no CKD group (no-CKD). Results: Sixty-four percent of evaluable patients had reduced eGFR, but eGFR was not evaluable in 28% of the surgical cases. In the 260,352 evaluable cases, the adjusted hazard ratio for 30-day mortality was 2.30 [95% confidence interval (CI), 2.11-2.51] for stage 3 CKD; 3.37 (95% CI, 3.01-3.76) for stage 4 CKD; and 3.05 (95% CI, 2.68-3.47) for stage 5 CKD compared with no-CKD (P < 0.0001). CKD was an independent risk factor for major complications after surgery [stage 3, odds ratio (OR) = 1.24 (95% CI, 1.19-1.29); stage 4, OR = 1.65 (95% CI, 1.52-1.78); and stage 5 CKD, OR = 1.40 (95% CI, 1.30-1.51); P < 0.0001]. The colectomy case-control analysis was confirmatory: 30-day mortality was increased in stage 4 CKD versus no-CKD (hazard ratio = 2.58; 95% CI, 1.13-5.92; P = 0.025). Conclusions: Renal insufficiency may be underrecognized in the general and vascular (noncardiac) surgery population, is a leading independent predictor of poor early postoperative outcomes, and should be routinely assessed in the preoperative setting.
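For reference, the 4-variable MDRD study equation named in the Methods has the form below (serum creatinine, SCr, in mg/dL; eGFR in mL/min/1.73 m²). Whether the original 186 constant or the IDMS-traceable 175 constant was applied is not stated in the abstract:

\[
\mathrm{eGFR} = 175 \times \mathrm{SCr}^{-1.154} \times \mathrm{age}^{-0.203}
\times 0.742\ [\text{if female}] \times 1.212\ [\text{if Black}]
\]

Under National Kidney Foundation staging, the stage 3, 4, and 5 CKD groups analyzed here correspond to eGFR of 30-59, 15-29, and <15 mL/min/1.73 m², respectively.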
- Published
- 2013
- Full Text
- View/download PDF
22. Foxp3+ regulatory T cells in antiretroviral-naive HIV patients.
- Author
-
Montes M, Lewis DE, Sanchez C, Lopez de Castilla D, Graviss EA, Seas C, Gotuzzo E, and White AC Jr
- Subjects
- Adult, Aged, CD4 Lymphocyte Count, Female, Flow Cytometry methods, Humans, Male, Middle Aged, Forkhead Transcription Factors immunology, HIV Infections immunology, T-Lymphocytes, Regulatory immunology
- Abstract
We characterized regulatory T cells from antiretroviral-naive HIV patients by flow cytometry. The proportion of CD4 cells positive for CD25 and Foxp3 was increased, mainly in those with CD4 cell counts less than 200 cells/µL. The total number of Foxp3-positive cells correlated with the CD4 cell count. Further studies are needed on whether Foxp3-positive cell numbers or function explain the susceptibility to autoimmune and inflammatory diseases seen in some patients with advanced HIV.
- Published
- 2006
- Full Text
- View/download PDF
23. Laparoscopic colon resection early in the learning curve: what is the appropriate setting?
- Author
-
Reichenbach DJ, Tackett AD, Harris J, Camacho D, Graviss EA, Dewan B, Vavra A, Stiles A, Fisher WE, Brunicardi FC, and Sweeney JF
- Subjects
- Female, Humans, Male, Middle Aged, North Carolina, Retrospective Studies, Texas, Treatment Outcome, Clinical Competence, Colectomy education, Colectomy methods, Colonic Diseases surgery, Internship and Residency, Laparoscopy
- Abstract
Introduction: Laparoscopic colon resection (LCR) is a safe and effective treatment of benign and malignant colonic lesions. There is little question that a steep learning curve exists for surgeons to become skilled and proficient at LCR. Because of this steep learning curve, debate exists regarding the appropriate hospital setting for LCR. We hypothesized that outcomes of LCR performed early in the learning curve at a regional medical center (New Hanover Regional Medical Center; NHRMC) and a university medical center (Baylor College of Medicine; BCM) would not be significantly different. Methods: The first 50 consecutive LCRs performed at each institution between August 2001 and December 2003 were reviewed. Age, mean body mass index (BMI), gender, history of previous abdominal surgery (PAS), operative approach [laparoscopic (LAP) versus hand/laparoscopic assisted (HAL)], conversions (Conv), operative time (OR time), pathology (benign vs. malignant), lymph nodes (LN) harvested in malignant cases, length of stay (LOS), morbidity, and mortality were obtained. Continuous data were expressed as mean ± SD. Data were analyzed by chi-square, Fisher exact test, or t test. Results: NHRMC patients were on average older, more often female, and more likely to have a history of PAS. A LAP approach was more frequently performed at BCM (86%), whereas HAL was used more frequently at NHRMC (24%). Conversions to open were similar at both institutions (12%). Benign disease accounted for the majority of operations at both institutions. In cases of malignancy, more LN were harvested at BCM. OR time and LOS were shorter at NHRMC. Complication rates were similar between institutions. There were no anastomotic leaks or deaths. Conclusions: LCR can be performed safely and with acceptable outcomes early in the learning curve at regional medical centers and university medical centers. Outcomes depend more on surgeons possessing advanced laparoscopic skills and adhering to accepted oncologic surgical principles in cases of malignancy than on the size or location of the healthcare institution.
- Published
- 2006
- Full Text
- View/download PDF
24. Incidence and risk factors for immune reconstitution inflammatory syndrome during highly active antiretroviral therapy.
- Author
-
Shelburne SA, Visnegarwala F, Darcourt J, Graviss EA, Giordano TP, White AC Jr, and Hamill RJ
- Subjects
- AIDS-Related Opportunistic Infections immunology, Adult, CD4 Lymphocyte Count, Female, HIV Infections drug therapy, HIV Infections immunology, HIV Infections virology, HIV-1 isolation & purification, Humans, Incidence, Male, Prognosis, RNA, Viral blood, Retrospective Studies, Risk Factors, Systemic Inflammatory Response Syndrome epidemiology, Texas epidemiology, Viral Load, AIDS-Related Opportunistic Infections complications, Antiretroviral Therapy, Highly Active adverse effects, Systemic Inflammatory Response Syndrome etiology
- Abstract
Background: There is little systematic information regarding the immune reconstitution inflammatory syndrome (IRIS)., Objective: To determine the incidence, risk factors, and long-term outcome of IRIS in HIV-infected patients receiving highly active antiretroviral therapy (HAART) who were coinfected with one of three common opportunistic pathogens., Design: A retrospective cohort identified through a city-wide prospective surveillance program., Methods: A retrospective chart review was performed for 180 HIV-infected patients who received HAART and were coinfected with Mycobacterium tuberculosis, Mycobacterium avium complex, or Cryptococcus neoformans between 1997 and 2000. Medical records were reviewed for baseline demographics, receipt and type of HAART, response to antiretroviral therapy, development of IRIS, and long-term outcome., Results: In this cohort, 31.7% of patients who received HAART developed IRIS. Patients with IRIS were more likely to have initiated HAART nearer to the time of diagnosis of their opportunistic infection (P < 0.001), to have been antiretroviral naive at time of diagnosis of their opportunistic infection (P < 0.001), and to have a more rapid initial fall in HIV-1 RNA level in response to HAART (P < 0.001)., Conclusions: IRIS is common among HIV-infected persons coinfected with M. tuberculosis, M. avium complex, or C. neoformans. Antiretroviral drug-naive patients who start HAART in close proximity to the diagnosis of an opportunistic infection and have a rapid decline in HIV-1 RNA level should be monitored for development of this disorder.
- Published
- 2005
- Full Text
- View/download PDF
25. Bacteremic and nonbacteremic pneumococcal pneumonia. A prospective study.
- Author
-
Musher DM, Alexandraki I, Graviss EA, Yanbeiy N, Eid A, Inderias LA, Phan HM, and Solomon E
- Subjects
- Adult, Aged, Aged, 80 and over, Anti-Bacterial Agents therapeutic use, Bacteremia physiopathology, Female, Humans, Male, Middle Aged, Pneumonia, Pneumococcal physiopathology, Prognosis, Prospective Studies, Radiography, Thoracic, Risk Factors, Bacteremia etiology, Pneumonia, Pneumococcal complications
- Abstract
We prospectively identified cases of pneumococcal pneumonia and used stringent criteria to stratify them into bacteremic and nonbacteremic cases. Although patients were distributed among racial groups in proportion to all patients seen at this medical center, the proportion of African-Americans with bacteremic disease was significantly increased. All patients had at least 1 underlying condition predisposing to pneumococcal infection, and most had several. Although the mean number of predisposing factors was greater among bacteremic patients than nonbacteremic patients, only alcohol ingestion was significantly more common. Nearly one-third of patients had substantial anemia (hemoglobin ≤10 g/dL) on admission, which may have predisposed to infection. In the case of other laboratory abnormalities, such as albumin, creatinine, and bilirubin, it was difficult to determine which abnormality might have predisposed to pneumococcal infection and which might have resulted from it. The radiologic appearance was varied. Airspace consolidation and air bronchogram on chest X-ray were highly associated with bacteremic disease, as was the presence of pleural effusion. Although the Pneumonia Patient Outcomes Research Team (PORT) risk score was a predictor of mortality, it did not help to predict the presence of bacteremia in an individual case. Most patients who died in the first week in hospital were bacteremic, and a high PORT risk score with bacteremia reliably predicted a high likelihood of a fatal outcome. Eleven patients had extrapulmonary disease, with meningitis, empyema, and septic arthritis predominating; all of these patients were bacteremic. The antibiotic susceptibility of our strains correlated well with that reported in the United States during the years of this study. The use of numerous antibiotics of different classes in many patients, especially those who were the most ill, precluded analysis of outcome based on antibiotic therapy. Only 17 patients had been vaccinated. Since nearly all patients had conditions for which pneumococcal vaccine is recommended and more than one-third had been hospitalized in the preceding 6 months, the low rate of vaccination can be regarded as a missed opportunity to administer a potentially beneficial vaccine.
- Published
- 2000
- Full Text
- View/download PDF
26. Cryptosporidiosis in Houston, Texas. A report of 95 cases.
- Author
-
Hashmey R, Smith NH, Cron S, Graviss EA, Chappell CL, and White AC Jr
- Subjects
- AIDS-Related Opportunistic Infections complications, AIDS-Related Opportunistic Infections diagnosis, Adult, Child, Cholangitis, Sclerosing etiology, Cholecystitis etiology, Female, Humans, Incidence, Infant, Male, Opportunistic Infections complications, Opportunistic Infections diagnosis, Recurrence, Risk Factors, Seasons, Texas epidemiology, Cryptosporidiosis complications, Cryptosporidiosis diagnosis, Cryptosporidiosis epidemiology
- Abstract
Cryptosporidiosis is an important cause of diarrhea. We identified 95 patients with cryptosporidiosis over a 6-year period in our county hospital system, including 9 children and 86 adults infected with the human immunodeficiency virus (HIV). Risk factors included male-to-male sexual practices and Hispanic race. Diarrhea, weight loss, and gastrointestinal complaints were the most common symptoms at presentation. Among the HIV-infected adults, 20 (23%) developed biliary tract disease. Biliary involvement was associated with low CD4 counts. Treatment with paromomycin and antimotility agents was effective in reducing diarrheal symptoms in 54 of 70 (77%) patients with the acquired immunodeficiency syndrome (AIDS), although there was a high rate of relapse. Paromomycin did not prevent the development of biliary disease. Biliary disease responded to cholecystectomy or sphincterotomy with stent placement. Though often a cause of morbidity, cryptosporidiosis was only rarely the cause of death, even among patients with HIV. Cryptosporidiosis continues to be an important medical problem even in developed countries. Current methods of prevention and treatment are suboptimal.
- Published
- 1997
- Full Text
- View/download PDF