Colonoscopic Polypectomy Reduces Mortality Due to Colorectal Cancer
The premise underlying surveillance colonoscopy is that early detection and removal of precancerous polyps (adenomas) reduces both the incidence of colorectal cancer and subsequent death. The impact of colonoscopy on colorectal cancer incidence has been examined previously, and a study published in The New England Journal of Medicine has now reported the effect of colonoscopy on mortality rates. Using data from patients who underwent colonoscopy between 1980 and 1990 as part of the National Polyp Study, Zauber and coauthors identified 2,602 patients who had adenomas removed during this 10-year period. After a median follow-up period of 15.8 years, a total of 1,246 of these patients had died from any cause, but only 12 of these deaths were due to colorectal cancer. In comparison, data from the Surveillance, Epidemiology, and End Results program predicted 25.4 deaths due to colorectal cancer in a matched general population. Thus, the standardized incidence-based mortality ratio with colonoscopic polypectomy was 0.47 (95% confidence interval [CI], 0.26–0.80), indicating that this procedure reduced colorectal cancer mortality by 53%.
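As a quick arithmetic check of the figures above, the standardized incidence-based mortality ratio is simply the observed number of colorectal cancer deaths divided by the expected number, and the percentage reduction follows from its complement:

\[
\text{SMR} = \frac{O}{E} = \frac{12}{25.4} \approx 0.47, \qquad 1 - 0.47 = 0.53 \quad (\text{a } 53\% \text{ reduction})
\]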
In addition to identifying patients from the National Polyp Study who had adenomas removed, the study by Zauber and colleagues also identified 773 patients with nonadenomatous polyps. Mortality in the two groups was similar during the first 10 years after polypectomy (relative risk, 1.2; 95% CI, 0.1–10.6). After this period, however, when surveillance of the adenoma group was no longer organized, mortality in the adenoma group rose relative to the nonadenoma group, a finding that reinforces the importance of routine surveillance colonoscopy in patients with advanced adenomas.
Vitamin D Intake Is Associated with Reduced Risk of Crohn’s Disease
Using data from the Nurses’ Health Study, Ananthakrishnan and colleagues evaluated vitamin D status and prospectively followed a large cohort of women to determine subsequent diagnoses of Crohn’s disease or ulcerative colitis. The 72,719 women evaluated in this study had completed an assessment of diet and lifestyle in 1986; 25-hydroxy vitamin D prediction scores were then developed using these data, and these scores were validated against measured plasma levels of 25-hydroxy vitamin D. Over the course of almost 1.5 million person-years of follow-up, 122 women were diagnosed with Crohn’s disease, and 123 women were diagnosed with ulcerative colitis.
When the study cohort was divided into quartiles based on vitamin D status, women in the highest quartile had a median predicted 25-hydroxy vitamin D level of 32.2 ng/mL, while women in the lowest quartile had a median predicted level of 22.3 ng/mL. Comparing these 2 groups, the researchers found that women in the highest quartile had a 46% lower risk of developing Crohn’s disease compared to women in the lowest quartile (multivariate-adjusted hazard ratio, 0.54; 95% CI, 0.30–0.99). For ulcerative colitis, the multivariate-adjusted hazard ratio was 0.65 (95% CI, 0.34–1.25), but this reduction did not reach statistical significance.
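As a point of interpretation, the percentage risk reduction implied by a hazard ratio is one minus the ratio, which is where the 46% figure for Crohn’s disease comes from:

\[
1 - \mathrm{HR}_{\mathrm{CD}} = 1 - 0.54 = 0.46, \qquad 1 - \mathrm{HR}_{\mathrm{UC}} = 1 - 0.65 = 0.35
\]

For ulcerative colitis, the 95% CI (0.34–1.25) includes 1.0, which is why the apparent 35% reduction cannot be called statistically significant.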
Unlike retrospective studies, which can be limited by reverse causation and/or recall and selection biases, this prospective study used vitamin D status assessed 10–12 years prior to diagnosis, and all cases of Crohn’s disease and ulcerative colitis were confirmed through medical record review. While further studies are needed to determine how vitamin D affects the pathogenesis of inflammatory bowel disease, this study suggests that measurement of vitamin D levels and/or vitamin D supplementation might be beneficial in patients diagnosed with or at risk for Crohn’s disease or ulcerative colitis. Results of this study were published in the March issue of Gastroenterology.
Long-Term Mortality Rate of Live Liver Donors Is Similar to Mortality in the General Population
Given the shortage of organs available for liver transplantation, transplantation of part of the liver from a live donor is increasingly being considered. However, this procedure carries risks for the donor, and some donors have died. To examine mortality rates among live liver donors, Muzaale and coauthors followed 4,111 live liver donors over a mean follow-up period of 7.6 years.
Results of this study, which was published in the February issue of Gastroenterology, showed that 7 donors died within 90 days after donation, yielding a rate of 1.7 early deaths per 1,000 donors (95% CI, 0.7–3.5). When all catastrophic events (early death or acute liver failure) were considered, the rate was 2.9 per 1,000 donors (95% CI, 1.5–5.1). Neither the risk of death nor the risk of a catastrophic event varied with the age of the recipient (adult vs pediatric) or the portion of the liver donated (left lateral segment, left lobe, or right lobe). In terms of long-term mortality, rates were similar among live liver donors, live kidney donors, and matched healthy participants from the Third National Health and Nutrition Examination Survey (1.2%, 1.2%, and 1.4% at 11 years, respectively; P=.9).
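As a rough check of the early mortality figure, assuming all 4,111 donors contribute to the denominator:

\[
\frac{7}{4{,}111} \times 1{,}000 \approx 1.7 \text{ early deaths per } 1{,}000 \text{ donors}
\]

By the same arithmetic, the catastrophic-event rate of 2.9 per 1,000 implies roughly 12 such events (2.9 × 4.111 ≈ 11.9), an inference from the reported rate rather than a figure stated in the summary.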