Diabetes, Digestive, and Kidney Diseases Extramural Research
(1) To promote extramural basic and clinical biomedical research that improves the understanding of the mechanisms underlying disease and leads to improved prevention, diagnosis, and treatment of diabetes, digestive, and kidney diseases. Programmatic areas within the National Institute of Diabetes and Digestive and Kidney Diseases include diabetes, digestive, endocrine, hematologic, liver, metabolic, nephrologic, nutrition, obesity, and urologic diseases. Specific program areas of interest include the following: (a) For diabetes, endocrine, and metabolic diseases areas: Fundamental and clinical studies of the etiology, pathogenesis, prevention, diagnosis, treatment, and cure of diabetes mellitus and its complications; Normal and abnormal function of the pituitary, thyroid, parathyroid, adrenal, and other hormone-secreting glands; Hormonal regulation of bone, adipose tissue, and liver; Fundamental aspects of signal transduction, including the action of hormones, coregulators, and chromatin remodeling proteins; Hormone biosynthesis, secretion, metabolism, and binding; Hormonal regulation of gene expression and the role(s) of selective receptor modulators as partial agonists or antagonists of hormone action; Fundamental studies relevant to metabolic disorders, including membrane structure, function, and transport phenomena and enzyme biosynthesis; and Basic and clinical studies on the etiology, pathogenesis, prevention, and treatment of inherited metabolic disorders (such as cystic fibrosis).
(b) For digestive disease and nutrition areas: Genetics and genomics of the GI tract and its diseases; Genetics and genomics of the liver and pancreas and their diseases; Genetics and genomics of nutrition; Genetics and genomics of obesity; Bariatric surgery; Clinical nutrition research; Clinical obesity research; Complications of chronic liver disease; Fatty liver disease; Genetic liver disease; HIV and the liver; Cell injury, repair, fibrosis, and inflammation in the liver; Liver cancer; Liver transplantation; Pediatric liver disease; Viral hepatitis and infectious diseases; Gastrointestinal and nutrition effects of AIDS; Gastrointestinal mucosal immunology; Gastrointestinal motility; Basic neurogastroenterology; Gastrointestinal development; Gastrointestinal epithelial biology; Gastrointestinal inflammation; Digestive diseases epidemiology and data systems; Nutritional epidemiology and data systems; Autoimmune liver disease; Bile, bilirubin, and cholestasis; Bioengineering and biotechnology related to digestive diseases, liver, nutrition, and obesity; Cell and molecular biology of the liver; Developmental biology and regeneration; Drug-induced liver disease; Gallbladder and biliary diseases; Exocrine pancreas biology and diseases; Gastrointestinal neuroendocrinology; Gastrointestinal transport and absorption; Nutrient metabolism; Pediatric clinical obesity; Clinical trials in digestive diseases; Liver clinical trials; Obesity prevention and treatment; and Obesity and eating disorders.
(c) For kidney, urologic, and hematologic diseases areas: Studies of the development, physiology, and cell biology of the kidney; Pathophysiology of the kidney; Genetics of kidney disorders; Immune mechanisms of kidney disease; Kidney disease as a complication of diabetes; Effects of drugs, nephrotoxins, and environmental toxins on the kidney; Mechanisms of kidney injury and repair; Improved diagnosis, prevention, and treatment of chronic kidney disease and end-stage renal disease; Improved approaches to maintenance dialysis therapies; Basic studies of lower urinary tract cell biology, development, physiology, and pathophysiology; Clinical studies of bladder dysfunction, incontinence, pyelonephritis, interstitial cystitis, benign prostatic hyperplasia, urolithiasis, and vesicoureteral reflux; Development of novel diagnostic tools and improved therapies, including tissue engineering strategies, for urologic disorders; Research on hematopoietic cell differentiation; Iron metabolism, including iron overload and deficiency; Structure, biosynthesis, and genetic regulation of hemoglobin; and Research on the etiology, pathogenesis, and therapeutic modalities for the anemia of inflammation and chronic diseases. (2) To encourage basic and clinical research training and career development of scientists during the early stages of their careers. The Ruth L. Kirschstein National Research Service Award (NRSA) program funds basic and clinical research training, support for career development, and the transition from postdoctoral biomedical research training to independent research related to diabetes, digestive, endocrine, hematologic, liver, metabolic, nephrologic, nutrition, obesity, and urologic diseases. (3) To expand and improve the Small Business Innovation Research (SBIR) program.
The SBIR Program aims to increase and facilitate private sector commercialization of innovations derived from Federal research and development; to enhance small business participation in Federal research and development; and to foster and encourage participation of socially and economically disadvantaged small business concerns and women-owned small business concerns in technological innovation. (4) To utilize the Small Business Technology Transfer (STTR) program. The STTR Program aims to stimulate and foster scientific and technological innovation through cooperative research and development carried out between small business concerns and research institutions; to foster technology transfer between small business concerns and research institutions; to increase private sector commercialization of innovations derived from Federal research and development; and to foster and encourage participation of socially and economically disadvantaged small business concerns and women-owned small business concerns in technological innovation.
General information about this opportunity
Last Known Status
National Institutes of Health, Department of Health and Human Services
Type(s) of Assistance Offered
B - Project Grants
Fiscal Year 2016
Estimated: Project Grants: $1,491,623,000 with 3,203 awards; NRSAs: $59,551,000 with 469 awards and 1,109 FTTPs/trainees; SBIR: $58,145,000 with 128 awards.
Actual: Project Grants: $1,489,668,000 with 3,176 awards; NRSAs: $58,286,000 with 486 awards and 1,118 FTTPs/trainees; SBIR/STTR: $58,634,000 with 115 awards.
Fiscal Year 2017
Actual: Project Grants: $1,392,981,000 with 3,152 awards; NRSAs: $58,694,000 with 471 awards and 1,092 FTTPs/trainees; SBIR: $57,919,000 with 111 awards.
Fiscal Year 2019
Enacted (estimated): Project Grants: $1,659,000,000 with 3,468 awards; NRSAs: $62,678,000 with 491 awards and 1,134 FTTPs/trainees; SBIR: $63,805,000 with 120 awards.
Fiscal Year 2020
President's Budget (estimated): Project Grants: $1,282,000,000 with 3,153 awards; NRSAs: $52,825,000 with 413 awards and 957 FTTPs/trainees; SBIR: $58,965,000 with 100 awards.
Public Health Service Act, Sections 301, 405, 428, 431, 487, 491, 493, 495, and 498, as amended; Public Laws 78-410, 99-158, 100-607, 106-554, and 107-360; 42 U.S.C. 241, as amended; 42 U.S.C. 285c-2; 42 U.S.C. 285c-5; 42 U.S.C. 288; Small Business Research and Development Enhancement Act of 1992, Public Law 102-564.
Who is eligible to apply/benefit from this assistance?
Project Grants: Universities, colleges, medical, dental and nursing schools, schools of public health, laboratories, hospitals, State and local health departments, other public or private institutions, both non-profit and for-profit, and individuals who propose to establish, expand, and improve research activities in health sciences and related fields. NRSAs: Support is provided for academic and research training only, in health and health-related areas that are periodically specified by the National Institutes of Health. To be eligible, predoctoral awardees must have completed the baccalaureate degree and postdoctoral awardees must have a professional or scientific degree (M.D., Ph.D., D.D.S., D.O., D.V.M., Sc.D., D.Eng., or equivalent domestic or foreign degree). Individuals must be nominated and sponsored by a public or nonprofit private institution having staff and facilities appropriate to the proposed research training program. All awardees must be citizens or have been admitted to the United States for permanent residence. Nonprofit domestic organizations may apply for the Institutional NRSA. SBIR and STTR grants can be awarded only to domestic small businesses that meet the following criteria: 1) Is independently owned and operated, is not dominant in the field of operation in which it is proposing, has a place of business in the United States and operates primarily within the United States or makes a significant contribution to the US economy, and is organized for profit; 2) Is (a) at least 51% owned and controlled by one or more individuals who are citizens of, or permanent resident aliens in, the United States, or (b) for SBIR only, it must be a for-profit business concern that is at least 51% owned and controlled by another for-profit business concern that is at least 51% owned and controlled by one or more individuals who are citizens of, or permanent resident aliens in, the United States. 
3) Has, including its affiliates, an average number of employees for the preceding 12 months not exceeding 500, and meets the other regulatory requirements found in 13 C.F.R. Part 121. Business concerns are generally considered to be affiliates of one another when, either directly or indirectly, (a) one concern controls or has the power to control the other, or (b) a third party (or parties) controls or has the power to control both. STTR grants require the small business concern to "partner" with a research institution in cooperative research and development. At least 40 percent of the project is to be performed by the small business concern and at least 30 percent by the research institution. In both Phase I and Phase II, the research must be performed in the U.S. and its possessions. To be eligible for funding, a grant application must be approved for scientific merit and program relevance by a scientific review group and a national advisory council.
Health professionals, graduate students, health professional students, scientists, researchers, and any nonprofit or for-profit organization, company, or institution engaged in biomedical research. Project Grants: Although no degree of education is either specified or required, nearly all successful applicants have doctoral degrees in one of the sciences or professions. NRSAs: Predoctoral awardees must have completed the baccalaureate degree, and postdoctoral awardees must have a professional or scientific degree.
Each applicant for research projects must present a research plan and furnish evidence that scientific competence, facilities, equipment, and supplies are appropriate to carry out the plan. For SBIR and STTR grants, the applicant organization (small business concern) must present in a research plan an idea that has potential for commercialization and furnish evidence that scientific competence, experimental methods, facilities, equipment, and funds requested are appropriate to carry out the plan. Individual NRSA applications for postdoctoral training must include the candidate's academic record, research experience, citizenship, institutional sponsorship, and the proposed area and plan of training. Institutional training grant applications for predoctoral and postdoctoral training must show the objectives, methodology, and resources for the research training program; the qualifications and experience of directing staff; the criteria to be used in selecting individuals for stipend support; and a detailed budget and justification for the amount of grant funds requested. For-profit organizations' costs are determined in accordance with Subpart 31.2 of the Federal Acquisition Regulations. For other grantees, costs will be determined in accordance with HHS Regulations 45 CFR Part 75, Subpart Q. Grant form PHS 398 is used to apply for SBIR and STTR Phase I, Phase II, and Phase I/Phase II Fast-Track awards.
What is the process for applying and being awarded this assistance?
Preapplication coordination is not applicable.
2 CFR 200, Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards applies to this program. Project Grants: Applications for Federal assistance must be submitted electronically through Grants.gov (http://www.grants.gov) using the SF424 Research and Related (R&R) forms and the SF424 (R&R) Application Guide. Applications may not be submitted in paper format. A registration process through Grants.gov is necessary before submission, and applicants are highly encouraged to start the process at least four weeks prior to the grant submission date. Two steps are required for on-time submission: (1) The application must be successfully received by Grants.gov no later than 5:00 p.m. local time (of the applicant institution/organization) on the submission/receipt date. (2) Applicants must complete a verification step in the eRA Commons within two business days of notification from NIH. Note: Since email can be unreliable, it is the responsibility of the applicant to periodically check on their application status in the Commons. The standard application forms, as furnished by PHS and required by 45 CFR Part 92, must be used for this program by those applicants that are State or local units of government. SBIR and STTR Grant Solicitations and the SBIR Contract Solicitation may be obtained electronically through the NIH's "Small Business Funding Opportunities" home page at www.nih.gov/grants/funding/sbir.htm. The Solicitations include submission procedures, review considerations, and grant application or contract proposal forms.
Research Grant and Training Program applications are reviewed initially for scientific merit by an appropriate review panel, composed of scientific authorities, and by the National Diabetes and Digestive and Kidney Diseases Advisory Council composed of leaders in medical science, education, and public affairs. Approved applications will compete on a merit basis for available funds. The successful applicant is sent a Notice of Grant Award. All accepted SBIR/STTR applications are evaluated for scientific and technical merit by an appropriate scientific peer review panel and by a national advisory council or board. All applications receiving a priority score compete for available SBIR/STTR set-aside funds on the basis of scientific and technical merit and commercial potential of the proposed research, program relevance, and program balance among the areas of research.
Contact the headquarters or regional location, as appropriate, for application deadlines.
Approval/Disapproval Decision Time
Project Grants: From 6 to 9 months. National Research Service Awards: From 6 to 9 months. SBIR/STTR applications: About 7-1/2 months.
A principal investigator (P.I.) may question the substantive or procedural aspects of the review of his/her application by communicating with the staff of the Institute. A description of the NIH Peer Review Appeal procedures is available on the NIH home page http://grants.nih.gov/grants/guide/notice-files/not97-232.html.
Project Grants: Renewals are determined by competitive application and review. Extensions are considered upon request. Individual NRSAs: Awards may be made for 1, 2, or 3 years. No individual may receive NIH fellowship support at the postdoctoral level for more than 3 years.
How are proposals selected?
The major elements in evaluating proposals include assessments of: (1) The scientific merit and general significance of the proposed study and its objectives; (2) the technical adequacy of the experimental design and approach; (3) the competency of the proposed investigator or group to successfully pursue the project; (4) the adequacy of the available and proposed facilities and resources; (5) the necessity of the budget components requested in relation to the proposed project; and (6) the relevance and importance to announced program objectives. The following criteria will be used in considering the scientific and technical merit of SBIR/STTR Phase I grant applications: (1) The soundness and technical merit of the proposed approach; (2) the qualifications of the proposed principal investigator, supporting staff, and consultants; (3) the technological innovation of the proposed research; (4) the potential of the proposed research for commercial application; (5) the appropriateness of the budget requested; (6) the adequacy and suitability of the facilities and research environment; and (7) where applicable, the adequacy of assurances detailing the proposed means for (a) safeguarding human or animal subjects, and/or (b) protecting against or minimizing any adverse effect on the environment. 
Phase II grant applications will be reviewed based upon the following criteria: (1) The degree to which the Phase I objectives were met and feasibility demonstrated; (2) the scientific and technical merit of the proposed approach for achieving the Phase II objectives; (3) the qualifications of the proposed principal investigator, supporting staff, and consultants; (4) the technological innovation, originality, or societal importance of the proposed research; (5) the potential of the proposed research for commercial application; (6) the reasonableness of the budget requested for the work proposed; (7) the adequacy and suitability of the facilities and research environment; and (8) where applicable, the adequacy of assurances detailing the proposed means for (a) safeguarding human or animal subjects, and/or (b) protecting against or minimizing any adverse effect on the environment.
How may assistance be used?
Project Grants provide funds for salaries, equipment, supplies, travel, and other expenses associated with scientific investigation relevant to program objectives. NRSAs are made directly to individuals for research training in specified biomedical shortage areas, or to institutions to enable them to make NRSAs to individuals selected by them. Each individual who receives an NRSA is obligated upon termination of the award to comply with certain service and payback provisions. SBIR Phase I grants (of approximately 6 months' duration) are to establish the technical merit and feasibility of a proposed research effort that may lead to a commercial product or process. Phase II grants continue the research initiated in Phase I that is likely to result in commercial products or processes. Only Phase I awardees are eligible to receive Phase II support. STTR Phase I grants (normally of 1-year duration) are to determine the scientific, technical, and commercial merit and feasibility of a proposed cooperative effort that has potential for commercial application. Phase II funding is based on the results of research initiated in Phase I and the scientific and technical merit and commercial potential of the Phase II application.
What are the requirements after being awarded this opportunity?
Records must be available for review or audit by appropriate officials of the Federal agency, pass-through entity, and Government Accountability Office (GAO). Foreign grantees are subject to the same audit requirements as for-profit (commercial) organizations.
Grantees generally must retain financial and programmatic records, supporting documents, statistical records, and all other records that are required by the terms of a grant, or may reasonably be considered pertinent to a grant, for a period of 3 years from the date the annual FSR is submitted. For awards under SNAP (other than those to foreign organizations and Federal institutions), the 3-year retention period will be calculated from the date the FSR for the entire competitive segment is submitted. Those grantees must retain the records pertinent to the entire competitive segment for 3 years from the date the FSR is submitted to NIH. Foreign organizations and Federal institutions must retain records for 3 years from the date of submission of the annual FSR to NIH. See 45 CFR 75.53 and 92.42 for exceptions and qualifications to the 3-year retention requirement (e.g., if any litigation, claim, financial management review, or audit is started before the expiration of the 3-year period, the records must be retained until all litigation, claims, or audit findings involving the records have been resolved and final action taken). Those sections also specify the retention period for other types of grant-related records, including F&A cost proposals and property records. See 45 CFR 75.48 and 92.36 for record retention and access requirements for contracts under grants. In accordance with 45 Code of Federal Regulations, Part 75.53(e), the HHS Inspector General, the U.S. Comptroller General, or any of their duly authorized representatives have the right of timely and unrestricted access to any books, documents, papers, or other records of recipients that are pertinent to awards in order to make audits, examinations, excerpts, transcripts, and copies of such documents. This right also includes timely and reasonable access to a recipient's personnel records for the purpose of interview and discussion related to such documents. 
The rights of access are not limited to the required retention period, but shall last as long as records are retained.
Other Assistance Considerations
Formula and Matching Requirements
Statutory formula is not applicable to this assistance listing.
Matching requirements are not applicable to this assistance listing.
MOE requirements are not applicable to this assistance listing.
Length and Time Phasing of Assistance
Project Grants: Awards are usually made for a 12-month period with recommendation of up to 4 years of additional support. SBIR: Normally, Phase I awards are for 6 months and Phase II awards are for 2 years. STTR: Normally, Phase I awards are for 1 year and Phase II awards are for 2 years. The Notice of Award (NoA) is the legal document issued to notify the grantee that an award has been made and that funds may be requested from the designated HHS payment system or office. An NoA is issued for the initial budget period. If subsequent budget periods are also approved, the NoA will include a reference to those budgetary commitments. Funding for subsequent budget periods is generally provided in annual increments following the annual assessment of progress. This funding is also contingent on the availability of funds. The NoA includes all applicable terms of award either by reference or specific statements. It provides contact information for the assigned program officer and grants management specialist. The grantee accepts an NIH award and its associated terms and conditions by drawing or requesting funds from the Payment Management System, or, for foreign awardees, upon the endorsement of a check from the US Treasury.
Who do I contact about this opportunity?
Regional or Local Office
Project Grants: Dr. Philip Smith, Acting Director, Division of Diabetes, Endocrinology, and Metabolic Diseases, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, 2 Democracy Plaza, Room 6037, 6707 Democracy Blvd., Bethesda, MD 20892-2560. Telephone: (301) 594-8816; Dr. Stephen James, Director, Division of Digestive Diseases and Nutrition, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, 2 Democracy Plaza, Room 6029, 6707 Democracy Blvd., Bethesda, MD 20892-2560. Telephone: (301) 594-7680; Dr. Robert Star, Director, Division of Kidney, Urologic and Hematologic Diseases, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, 2 Democracy Plaza, Room 6119, 6707 Democracy Blvd., Bethesda, MD 20892-2560. Telephone: (301) 496-6325. Grants Management Contact: Mr. Robert Pike, Chief Grants Management Officer, Grants Management Branch, Division of Extramural Activities, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, 2 Democracy Plaza, Room 7333, 6707 Democracy Blvd., Bethesda, MD 20892. Telephone: (301) 594-8854. Use the same numbers for FTS.
31 Center Drive, Room 9A34
Bethesda, MD 20892 US
(Project Grants) FY 16: $1,606,588,000.00; FY 17: $1,649,244,000.00; FY 18: $1,604,805,000.00; FY 19 est.: $1,785,519,000.00; FY 20 est.: $1,393,396,000.00. The amounts above are the total Project Grants, NRSA, and SBIR/STTR awards. The Project Grants and NRSA awards include Type 1 Diabetes funds and exclude TAPS. The SBIR/STTR awards include Type 1 Diabetes funds.
Range and Average of Financial Assistance
Project Grants: range $1,000 to $30,000,000; average $455,000. NRSAs: range $8,000 to $802,000; average $128,000. SBIR: range $30,000 to $1,928,000; average $530,000.
Regulations, Guidelines and Literature
Project Grants: 42 CFR 52; 42 CFR 66; 42 CFR 74; 45 CFR 75; 45 CFR 92. Administration Policy Directive No. 65-01 (47 Fed. Reg. 52966 et seq. (1982)), as amended by Policy Directive No. 65-01.1 (48 Fed. Reg. 38794 et seq. (1983)). Grants will be available under the authority of and administered in accordance with the NIH Grants Policy Statement, http://grants.nih.gov/grants/policy/nihgps_2003/; the Omnibus Solicitation of the Public Health Service for SBIR Grant and Cooperative Agreement Applications; and the Omnibus Solicitation of the National Institutes of Health for STTR Grant Applications.
Examples of Funded Projects
Fiscal Year 2016
Division of Diabetes, Endocrinology, and Metabolic Diseases Projects: Diabetes Control and Complications Trial (DCCT)/Epidemiology of Diabetes Interventions and Complications (EDIC) Study Research Group. Intensive Diabetes Treatment and Cardiovascular Outcomes in Type 1 Diabetes: The DCCT/EDIC Study 30-Year Follow-up. Diabetes Care 39: 686-693, 2016. AND Diabetes Control and Complications Trial (DCCT)/Epidemiology of Diabetes Interventions and Complications (EDIC) Study Research Group. Mortality in Type 1 Diabetes in the DCCT/EDIC Versus the General Population. Diabetes Care 39: 1378-1383, 2016. Intensive Blood Glucose Management for Those with Type 1 Diabetes Preserves Heart Health and Reduces Risk of Early Mortality: A long-term NIDDK study reports that keeping blood glucose (sugar) as close to normal as possible for an average of 6.5 years early in the course of type 1 diabetes reduces cardiovascular (heart and blood vessel) disease and can reduce mortality to rates close to those seen in people of similar age in the general population. The landmark Diabetes Control and Complications Trial (DCCT) began in 1983. The DCCT randomly assigned half its participants to an intensive blood glucose management regimen designed to keep blood glucose levels as close to normal as safely possible, and half to the less intensive conventional treatment at the time. When DCCT ended in 1993, it was clear that intensive management had significantly reduced eye, nerve, and kidney complications, but at that time the participants were too young to determine their rates of cardiovascular disease. All DCCT participants were taught the intensive management regimen and invited to join the Epidemiology of Diabetes Interventions and Complications (EDIC) study. EDIC continued to monitor participants’ health, and overall blood glucose management has since been similar in both DCCT treatment groups. 
To study the long-term effects of the different treatments tested in the DCCT, researchers examined differences in cardiovascular problems, which can take many years to develop, between the former intensive and conventional treatment groups. After an impressive average 30-year follow-up, DCCT/EDIC researchers found that those who practiced intensive blood glucose management during the DCCT still had significantly reduced cardiovascular disease compared to those who did not, despite having similar blood glucose management for 20 years after the DCCT ended. Compared to the former conventional treatment group, the former intensive management group had a 30-percent reduced incidence of cardiovascular disease and 32 percent fewer major cardiovascular events (such as non-fatal heart attack, stroke, or death from cardiovascular disease) after 30 years of follow-up. These results were similar for both men and women who participated in the studies. However, the beneficial effects of intensively managing blood glucose during the DCCT appeared to be wearing off over time. For example, after 20 years of follow-up, DCCT/EDIC researchers reported that the former intensive treatment group had a 42-percent reduced risk of cardiovascular disease compared to the former conventional treatment group. After 30 years of follow-up, that number had fallen to 30 percent. Even with this reduction in protection, these new data show that a finite period of near-normal blood glucose management early in the course of type 1 diabetes can have beneficial effects on cardiovascular health for up to 30 years. Historically, those with type 1 diabetes have had a higher mortality rate than the general population. Previous DCCT/EDIC analyses compared intensive versus conventional blood glucose management and showed that those in the former intensive treatment group had reduced mortality compared with that of the former conventional treatment group. 
Now, mortality in the DCCT/EDIC study from its inception through 2014 was compared to 2013 national mortality data. Researchers found that overall mortality when both DCCT/EDIC treatment groups were combined was no greater than what would be expected in the general U.S. population. However, they found that the mortality rate in the former conventional treatment group was 31 percent higher than that seen in the general population. While the former intensive treatment group’s mortality rate was below that in the general population, the difference was not statistically significant. Researchers also found participants’ long-term blood glucose control affected mortality rates, and those who had worse control had correspondingly worse mortality rates. This effect of blood glucose control on lifespan was more pronounced among women than among men. In general, these results suggest that the increased mortality historically seen in those with type 1 diabetes can be reduced or eliminated through careful management of blood glucose. Overall, these findings add to DCCT/EDIC’s decades of evidence demonstrating how people with type 1 diabetes can dramatically increase their chances of living long, healthy lives by practicing early, intensive blood glucose management. Zhou K, Yee SW, Seiser EL,…Pearson ER. Variation in the glucose transporter gene SLC2A2 is associated with glycemic response to metformin. Nat Genet 48: 1055-1059, 2016. Variation in a Glucose Transporter Affects Response to the Type 2 Diabetes Drug, Metformin: New research indicates that a common variation in the gene encoding a protein that allows glucose (sugar) to move in and out of cells has a surprising impact on the effectiveness of the first-line anti-diabetes medication metformin. Metformin is a very widely used, safe, and helpful treatment for type 2 diabetes, but it is more effective in some people than in others, and scientists are trying to understand why. 
An international consortium of investigators looked at genomic variation in over 13,000 volunteers of varying ancestry who were taking metformin. They found that a common variation in the gene for a glucose transporter protein, GLUT2, had a significant impact on metformin effectiveness. (The gene encoding GLUT2 is known as SLC2A2.) Before treatment, people with two copies of a version of the gene (designated "C") typically had somewhat worse blood glucose control, as detected by higher levels of HbA1c, a marker for glucose levels. Yet, these individuals had slightly better (lower) HbA1c when taking a standard dose of metformin than did people with two copies of the other version ("T") of the GLUT2-encoding gene. This effect was most pronounced in people who were obese, but was also seen in those who were not. People with one copy of each version had an intermediate response to metformin. GLUT2 allows glucose to move passively in and out of cells in the liver, an organ with a critical role in regulating blood glucose levels. The GLUT2 that is produced by the C and T versions of the gene is the same, equally capable of allowing glucose movement. However, the researchers found that liver cells with the C version make less GLUT2 than liver cells with the T version. This suggests that in the absence of metformin, individuals with type 2 diabetes and the C version are at a disadvantage compared to those with the T version when it comes to regulating blood glucose levels, but that metformin treatment overcomes and even slightly reverses this effect. Metformin still works in people with two copies of the T version of the gene, but more of the drug, or an additional medication, would be needed to achieve the same degree of HbA1c reduction. This discovery has broad applicability, because the C and T versions of the gene are both common in a wide variety of racial/ethnic groups, albeit to differing degrees.
For example, about 70 percent of African Americans have at least one copy of C, while 24 percent of Latinos do. With further research, tests to reveal a patient’s GLUT2 gene version could one day help further precision medicine by allowing health care providers to tailor metformin dosage for that individual, so that he or she takes neither more nor less of the medication than needed. Division of Digestive Diseases and Nutrition Projects: Inge TH, Courcoulas AP, Jenkins TM,… Buncher CR; for the Teen-LABS Consortium. Weight Loss and Health Status 3 Years after Bariatric Surgery in Adolescents. N Engl J Med 374:113-123, 2016. Weight Loss and Health Benefits from Bariatric Surgery in Teens with Severe Obesity: In a study of teens with severe obesity, bariatric surgery resulted in substantial weight loss and improvements in health and quality of life 3 years after the surgeries were performed; the study also identified risks associated with the surgeries. These findings are from the Teen Longitudinal Assessment of Bariatric Surgery, or Teen-LABS, study. Obesity increases risk for type 2 diabetes, cardiovascular disease, and many other serious conditions. Previous research has shown that adults with severe obesity (also known as extreme obesity) can experience dramatic health benefits from bariatric surgery. However, very little has been known about the effects of this surgery in adolescents, particularly over the long-term—even though it is used in clinical practice for this age group. Thus, researchers designed Teen-LABS, an observational study that enrolled adolescents who were already planning to have bariatric surgery. Their goal was to collect outcome data on health risks and benefits that could help with treatment decisions. Conducted at five U.S. clinical centers, Teen-LABS enrolled 242 people ages 13-19. Prior to surgery, all were obese, and nearly all had severe obesity, based on body mass index (BMI), a measure of weight relative to height. 
The majority of the participants in the study were Caucasian females, a demographic representative of patients who seek bariatric surgery at these clinical centers. The study focused on those who underwent either of two bariatric surgical procedures: gastric bypass (used for a majority of the teens), or sleeve gastrectomy. Before surgery, the participants’ average weight was 328 pounds. Three years after surgery, their weight decreased by an average of 90 pounds, or 27 percent. Some of the participants had type 2 diabetes, some had kidney disease, and many had high blood pressure or abnormal levels of blood lipids (cholesterol or triglycerides) prior to surgery. The study found that 95 percent of the teens who had type 2 diabetes had reversal of their disease, 86 percent of those with kidney damage experienced improvements in kidney function, and most of the teens with high blood pressure or lipid abnormalities saw improvements in these conditions 3 years after surgery. Additionally, 26 percent of the teens were no longer obese 3 years after surgery. Although a majority still had some level of obesity, not as many had severe obesity. The study also identified risks. During the study period, 13 percent of participants needed additional abdominal surgery, most commonly gallbladder removal. The study also found that although fewer than 5 percent of the teens were iron-deficient before surgery, more than half had low iron stores 3 years later. These results contribute important knowledge about the benefits and risks of bariatric surgery in adolescents. However, further research will be critical to determine the longer-term effects of bariatric surgery on health and well-being, including whether health improvements are sustained and whether additional risks emerge. 
This information will help teens, their parents, and their health care providers make more informed treatment decisions, so that young people with obesity can have improved health during adolescence and as they become adults. Chu H, Khosravi A, Kusumawardhani IP,…Mazmanian SK. Gene-microbiota interactions contribute to the pathogenesis of inflammatory bowel disease. Science. 352: 1116-1120, 2016. AND Lassen KG, McKenzie CI, Mari M,…Xavier RJ. Genetic Coding Variant in GPR65 Alters Lysosomal pH and Links Lysosomal Dysfunction with Colitis Risk. Immunity. 44: 1392-1405, 2016. Exploring the Genes That Keep the Gut’s Immune System in Check: Recent research into the genetics of inflammatory bowel disease (IBD) has pointed to abnormal interactions between the gut and the bacteria that inhabit it, implicating genetic defects in a process that cells use to break down microbial material. IBD is a painful and debilitating collection of diseases, including Crohn’s disease and ulcerative colitis, that are marked by inflammation and damage in the gut. The causes of IBD are unclear; however, the inflammation is believed to be caused by complicated interactions between genetic and environmental factors. In particular, research has pointed to an improper immune response to bacteria in the gut—a reaction that can be affected by human genetics. Variations in many areas of the genome have been associated with IBD, including some involved in immunity, but it has been difficult to determine how these variants might be contributing to the disease. Recently, two groups of researchers have identified how certain IBD genetic risk variants may affect the way gut cells respond to bacteria. Both groups focused on a process called autophagy, whereby damaged or unnecessary materials in cells—including bacteria and bacterial components—are packaged and broken down. 
One of the research groups concentrated on the genes ATG16L1 and NOD2, both of which code for proteins that are known to play important roles in autophagy and have variants that are implicated in IBD. The scientists found that immune cells from mice lacking the ATG16L1 protein were unable to suppress inflammation when exposed to a “friendly” type of bacteria called Bacteroides fragilis (B. fragilis) that normally resides in the human gut. B. fragilis helps keep the gut’s immune system in check by delivering certain bacterial molecules to intestinal immune cells. The bacteria deliver the molecules in small spheres, called outer membrane vesicles, that bud from the bacterial cells’ outer coating. These vesicles are engulfed, packaged, and broken down by immune cells in the gut, where their components suppress an immune reaction. However, the researchers found that mouse immune cells lacking functioning ATG16L1 protein were unable to respond to these vesicles, thus potentially failing to prevent an improper inflammatory reaction to B. fragilis and other “friendly” gut bacteria. Testing this idea in a mouse model of colitis, the scientists found that mice lacking functional ATG16L1 were not protected from colitis when they were given outer membrane vesicles from B. fragilis, but mice with ATG16L1 were. Mice and cells lacking functional NOD2 also had defective responses to these B. fragilis vesicles, supporting the idea that NOD2 could cooperate with ATG16L1 in suppressing inflammation. Importantly, mice engineered to carry a human genetic variant of ATG16L1 that is implicated in IBD, as well as cells from male and female IBD patients with this variant, also did not respond to these vesicles, suggesting that a failure of ATG16L1-mediated autophagy could be contributing to disease in some people with IBD. Another team of scientists investigated the role of autophagy as a cellular defense mechanism against potentially harmful bacteria. 
Some types of bacteria can invade cells, causing disease, and cells typically use autophagy to package and degrade the invading microbes. Armed with this knowledge, the researchers performed genetic screening in a human cell line to identify genes implicated in IBD that are involved in both autophagy and cellular defense against bacteria. Among the genes they identified was GPR65, which has variants associated with IBD. GPR65 encodes a protein that is important for the proper function of lysosomes, which are acid-rich globules in cells that break down material packaged for autophagy. The researchers found that male and female mice without functional GPR65 protein were more prone to a disease resembling human IBD when given a type of bacteria that causes intestinal inflammation in mice. This effect was seen when GPR65 was absent from either the cells lining the gut or the immune cells within the gut. The lysosomes of intestinal and immune cells lacking GPR65 were unable to properly degrade invading bacteria. This could be explained by the observation that the lysosomes were not positioned properly in the cell and were not as acidic as normal lysosomes. Importantly, the researchers also tested a human cell line engineered to have a genetic variant found in male and female IBD patients, and immune cells from IBD patients who have this variant, and they found that these cells were also defective in destroying invading bacteria. These results suggest that this genetic variant of GPR65 could promote IBD by crippling autophagy and cellular defense against disease-causing bacteria. By showing that certain genetic variants identified in IBD patients can cause defects in the way cells relate to, or defend themselves from, bacteria in the gut, these results provide possible links between the genetics and the biological processes of IBD. 
They also open the door to future treatments that could help restore proper relationships between bacteria and the gut immune system in people with IBD. Division of Kidney, Urologic, and Hematologic Diseases Projects: Harper JD, Cunitz BW, Dunmire B,…Bailey MR. First in Human Clinical Trial of Ultrasonic Propulsion of Kidney Stones. J Urol 195: 956-964, 2016. Moving Stones with Sound—New Ultrasound Technology Repositions Kidney Stones in People: Researchers have developed new ultrasonic propulsion technology that can reposition kidney stones and facilitate stone fragment passage in people. Kidney stones are one of the most common disorders of the urinary tract. Smaller stones may pass with little or no pain, while larger stones may get stuck along the lower urinary tract and block the flow of urine, causing severe pain and/or bleeding. Current treatments for kidney stones, such as lithotripsy, may leave behind residual stone fragments. Most fragments will pass on their own, but others may grow larger, cause pain, and lead to the need for additional treatment. Toward the goals of finding safe ways to reposition kidney stones and encouraging the passage of stone fragments, scientists developed ultrasonic propulsion technology. The technology uses a handheld device to generate a real-time ultrasound image to visualize the kidney stone, and directs controlled, short bursts of ultrasound waves toward the stone to try to make it move. In the first human clinical trial testing this technology, scientists found that it could reposition kidney stones in 14 of 15 men and women studied, and cause some degree of movement of both large and small stones. In fact, one person experienced pain relief after a large, obstructing stone was moved. These findings suggest that the procedure could successfully reposition kidney stones in some people. 
The scientists then examined six study participants who had residual stone fragments after previously undergoing a lithotripsy procedure to treat their kidney stones. Four of them passed more than 30 fragments within days after undergoing the ultrasonic propulsion procedure, demonstrating that the technology could facilitate the passage of stone fragments. An unexpected finding was that the technology may also be useful for diagnosis—in four people, what was thought to be one large stone was actually found to be a cluster of small, passable stones after they were moved. Stone size is an important factor that doctors consider when making treatment decisions, so having this diagnostic information could aid them in making those decisions. Importantly, the technology was found to be safe and did not cause pain. It is also noninvasive and could be performed in a clinic setting while people are awake, without the need for sedation. Ultrasonic propulsion technology is still being refined and tested in people, but with further research, it may eventually be possible to use this new technology after procedures that leave residual stone fragments to facilitate their passage and potentially reduce the need for future intervention. The technology may also be useful for moving large, obstructing stones; repositioning stones before surgery; and serving as a diagnostic tool. Orandi BJ, Luo X, Massie AB,…Segev DL. Survival Benefit with Kidney Transplants from HLA-Incompatible Live Donors. N Engl J Med 374: 940-950, 2016. Promising Result Reported from Multi-center Kidney Transplantation Study: Researchers have reported a survival benefit for people who received kidney transplants from HLA-incompatible live donors compared with either those remaining on the kidney transplant waiting list or those who received kidney transplants from immune system-compatible deceased donors. 
Human leukocyte antigens (HLAs) are proteins on the surfaces of human cells that identify the cells as “self” or “foreign” and perform essential roles in immune responses. There are multiple forms of HLAs, which vary among individuals and are analyzed in laboratory tests to determine whether one person’s organs and tissues are compatible with another person’s, and could be used in a transplant. The more closely the HLAs match between a donor and recipient, the less likely a transplant will be rejected by the recipient’s immune system. To make HLA-incompatible transplants possible, organ transplant recipients undergo “desensitization” protocols to remove antibodies in the blood that can harm the donated organ. Previous research from a single center indicated a survival benefit with kidney transplants from HLA-incompatible live donors as compared with those waiting for a compatible organ. To assess whether the survival benefit seen in the single-center study is generalizable on a national scale, a 22-center study was designed and conducted. The researchers assessed the survival of people who received kidney transplants from HLA-incompatible live donors, at multiple time points up to 8 years after transplantation. They compared these outcomes with the survival of two control groups—those who remained on the waiting list or received a transplant from a deceased donor, and those who remained on the waiting list but did not receive a transplant. The multicenter study reported that a kidney transplant from an HLA-incompatible live donor was associated with a significant survival benefit compared to the two control groups. As a compatible live kidney donor is rarely available, these results suggest that incompatible transplantation is now an option patients could consider.

Fiscal Year 2017
**********Division of Diabetes, Endocrinology, and Metabolic Diseases Projects********** Dabelea D, Stafford JM, Mayer-Davis EJ, …and Pihoker C; for the SEARCH for Diabetes in Youth Research Group. Association of Type 1 Diabetes vs Type 2 Diabetes Diagnosed During Childhood and Adolescence With Complications During Teenage Years and Young Adulthood. JAMA 317: 825-835, doi: 10.1001/jama.2017.0686, 2017. Youth with Type 2 Diabetes Develop Complications More Often than Peers with Type 1 Diabetes: Researchers have found that teens and young adults with type 2 diabetes develop common diabetic complications more often than their peers with type 1 diabetes in the years shortly after diagnosis. These findings are from the SEARCH for Diabetes in Youth study, which is the largest study of its kind in the United States. SEARCH includes a geographically and racially/ethnically diverse group of children and adolescents who were under 20 years of age when they were diagnosed with type 1 or type 2 diabetes. As both types of diabetes are on the rise in young people, SEARCH researchers examined how quickly and how often those diagnosed in youth with type 1 or type 2 diabetes go on to develop the kidney, nerve, and eye diseases that are common diabetic complications, as well as several risk factors for heart disease. Included in the study were 1,746 youth diagnosed with type 1 diabetes and 272 diagnosed with type 2 diabetes between 2002 and 2015. All had participated in follow-up examinations that measured risk factors for complications, and had diabetes, on average, for just under 8 years when their diabetes complications were assessed. From this assessment data, and taking into account the different ages at which the youth were diagnosed, the researchers estimated the chances of youth with diabetes developing complications over time. 
They estimated that by about age 21, approximately 32 percent of participants with type 1 diabetes and 72 percent of participants with type 2 diabetes would have at least one complication from diabetes or would be at high risk for a complication. For those with type 2 diabetes, this included nearly 20 percent with a sign of kidney disease, 18 percent with nerve disease, and 9 percent with eye disease. For those with type 1 diabetes, this included about 6 percent with a sign of kidney disease, 9 percent with nerve disease, and about 6 percent with eye disease. Measures for two risk factors for heart disease (hypertension and arterial stiffness) were greater for youth with type 2 diabetes than those with type 1 diabetes, but were close to equal between the two groups for a third risk factor (cardiovascular autonomic neuropathy). The researchers looked at factors including blood glucose (sugar) control, body mass index (a measure of weight relative to height), and blood pressure, but none of these factors explained the differences they observed, suggesting that more research into how and why diabetic complications occur in young people is needed. These findings also suggest that early monitoring of youth with both types of diabetes could result in earlier diagnosis and treatment of complications, which could ultimately contribute to better health over the lifespan. Nathan DM, Bebu I, Hainsworth D,…Lachin JM. Frequency of Evidence-based Screening for Retinopathy in Type 1 Diabetes. N Engl J Med. 376: 1507-1516, doi: 10.1056/NEJMoa1612836, 2017. Personalizing Eye Exam Schedule for People with Type 1 Diabetes: Researchers have developed an evidence-based screening schedule for an eye disease (retinopathy) in people with type 1 diabetes, with the frequency of screening tailored to an individual’s current level of eye disease and hemoglobin (Hb) A1c level, a measure of average blood glucose (sugar) control. Diabetes is the leading cause of new cases of blindness in adults. 
Vision loss, however, can be prevented if the damage is detected and treated in a timely manner. Currently, for people with type 1 diabetes, annual retinal examinations are recommended to screen for signs of retinopathy, starting 3 to 5 years after diagnosis. Previous results from NIDDK’s landmark Diabetes Control and Complications Trial (DCCT) and its follow-up study, Epidemiology of Diabetes Interventions and Complications (EDIC), demonstrated that a period of intensive blood glucose control lowered the risk of complications, including those involving the eyes. This led researchers to ask whether an annual eye screening is necessary for people with type 1 diabetes who intensively control their blood glucose levels. In the over 30 years that DCCT/EDIC participants have been followed, data from approximately 24,000 eye examinations were collected along with information about each participant’s eye health. From these data, DCCT/EDIC researchers modeled the likelihood that a person would progress from a lower level of retinopathy to very severe retinopathy in specific periods of time. They found that, for people with no or mild retinopathy, annual examinations might not be necessary. For people with moderate to severe retinopathy, however, more frequent examinations might be needed to detect worsening disease in time for treatment to prevent vision loss. They also found that the risk of eye disease progression was closely related to the participant’s HbA1c level. With the goal of limiting the likelihood of progression to very severe retinopathy between examinations to approximately 5 percent, the scientists developed an eye examination schedule based on a person’s current state of retinopathy and additionally on HbA1c level. Averaged over all levels of HbA1c, they estimated that a person could go 4 years between examinations if they had no initial retinopathy, 3 years if they had mild retinopathy, and 6 months if they had moderate retinopathy. 
For people with severe retinopathy, monitoring more frequently than every 3 months would be necessary to reduce the probability of developing very severe retinopathy before their next examination to approximately 15 percent. People at all levels of retinopathy with higher HbA1c levels were predicted to need more frequent eye exams, as they are at higher risk of developing eye disease. For example, the eye examination schedule for people at a current HbA1c level of 6 percent would be 5 years if they had no initial retinopathy, 5 years if they had mild retinopathy, 6 months if they had moderate retinopathy, and 3 months if they had severe retinopathy; whereas for people at a current HbA1c level of 10 percent, the corresponding schedule would be 3 years, 2 years, 3 months, and 1 month, respectively. Taking into consideration both the reduced number of screenings the scientists propose for people with no or mild retinopathy and the more frequent examinations needed for people with moderate or severe retinopathy, they also calculated how their schedule would affect the number of screenings overall for people with type 1 diabetes. They found that, over 20 years, screening according to their tailored examination schedule would, on average, result in a 58 percent reduction in the number of exams overall, compared to annual screening for everyone. Combining this overall reduction with an approximate $200 cost per screening and approximately 1 million people with type 1 diabetes, the scientists estimate that this screening schedule could result in a savings of approximately $1 billion over 20 years. Although the additional screening for those with worse retinopathy would increase their screening burden, it could also increase the likelihood of detecting further progression of retinopathy, so that therapy proven to preserve vision can be delivered in the timeframe before irreparable vision loss occurs. 
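The tailored schedule described above amounts to a simple two-factor lookup: a person's current retinopathy level plus HbA1c determines the time until the next exam. Below is a minimal sketch in Python of that lookup, using only the example intervals quoted above for HbA1c levels of 6 percent and 10 percent; the function name, the table structure, and the round-to-nearest-tabulated-level rule are illustrative assumptions for this sketch, not part of the published model, which covers the full HbA1c range.

```python
# Illustrative sketch of the risk-tailored eye-screening lookup described
# in the text. Only the two HbA1c levels quoted as examples (6% and 10%)
# are tabulated here; the published model spans the full HbA1c range.
SCREENING_INTERVAL_MONTHS = {
    # retinopathy level -> {HbA1c (percent): months until next exam}
    "none":     {6: 60, 10: 36},   # 5 years vs. 3 years
    "mild":     {6: 60, 10: 24},   # 5 years vs. 2 years
    "moderate": {6: 6,  10: 3},    # 6 months vs. 3 months
    "severe":   {6: 3,  10: 1},    # 3 months vs. 1 month
}

def next_exam_interval(retinopathy: str, hba1c: float) -> int:
    """Months until the next recommended retinal exam, snapping HbA1c to
    the nearer of the two tabulated example levels (an assumption made
    here for illustration, not the study's actual interpolation)."""
    row = SCREENING_INTERVAL_MONTHS[retinopathy]
    nearest_level = min(row, key=lambda level: abs(level - hba1c))
    return row[nearest_level]
```

Under this sketch, for instance, a person with mild retinopathy and an HbA1c near 10 percent would return in about 24 months, versus 12 months under one-size-fits-all annual screening.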
Importantly, this schedule has yet to be tested in real-life situations in people with type 1 diabetes. In addition, it is not known if it will be appropriate for people with type 2 diabetes, as it remains to be determined whether retinopathy progresses similarly in people with type 2 diabetes and people with type 1 diabetes. Regardless, this risk-based screening schedule has the potential to personalize treatment to reduce both undetected diabetic eye disease and the burden of annual retinal exams for some people, which may result in cost savings overall and better health for people with type 1 diabetes. **********Division of Digestive Diseases and Nutrition Projects********** Burnett LC, LeDuc CA, Sulsona CR, Paull D, Rausch R, Eddiry S, Carli JF, Morabito MV, Skowronski AA, Hubner G, Zimmer M, Wang L, Day R, Levy B, Fennoy I, Dubern B, Poitou C, Clement K, Butler MG, Rosenbaum M, Salles JP, Tauber M, Driscoll DJ, Egli D, Leibel RL. Deficiency in prohormone convertase PC1 impairs prohormone processing in Prader-Willi syndrome. J Clin Invest 127: 293-305, doi: 10.1172/JCI88648, 2017. An Enzyme Deficiency Contributes to Disease Symptoms in Prader-Willi Syndrome: Researchers have discovered a critical role for the enzyme prohormone convertase 1 (PC1) in the complex genetic disorder Prader-Willi Syndrome (PWS). PWS is caused by a missing portion of the genome, such that several genes normally passed down from father to child are absent; this leads to many detrimental effects on the body that begin in infancy and persist throughout adulthood. Beginning in childhood, affected individuals are often short in stature and develop insatiable appetites, which leads to chronic overeating, obesity, and an increased risk for diabetes and other disorders. Physical symptoms arise from poor regulation of various hormones, including insulin; growth hormone (GH); possibly the appetite-regulating hormone, ghrelin; and others. Most instances of PWS are due to a large genetic deletion on chromosome 15. 
However, researchers identified five PWS patients with a smaller deletion, defining a critical region sufficient to cause the major PWS-associated traits. This region contains three genes, including one known as SNORD116. While none of the existing PWS mouse models develop obesity, mice lacking a paternal copy of Snord116 (referred to as Snord116 p-/m+ mice) develop many of the other clinical features exhibited in their human counterparts, including overeating, decreased body length, and hormone impairments. It is well-established that a part of the brain called the hypothalamus plays a crucial role in regulating appetite through production of and interactions with various hormones. To study how the PWS genetic deletions affect the brain, investigators generated brain cells (neurons) in the laboratory from another type of cell that, unlike neurons, can be obtained from patient volunteers. Using a technique developed by other researchers, they first “reprogrammed” samples of patients’ skin cells to an early stage, stem cell-like (or pluripotent) state in the laboratory and then had them differentiate into (become) neurons. Because the induced pluripotent stem cell-derived (iPSC-derived) neurons contain the PWS patients’ genetic material, scientists could study the effects of the gene deletion encompassing SNORD116. In addition, they studied male Snord116 p-/m+ mice. Analysis of human iPSC-derived neurons revealed that the gene PCSK1, which codes for the PC1 enzyme, had reduced activity, suggesting the possibility of PC1’s involvement in the development of PWS. Furthermore, mice lacking paternal Snord116 had decreased PC1 levels compared to normal mice. PC1 processes prohormones (precursors to hormones) including proinsulin, pro-GH-releasing hormone, and proghrelin, into their bioactive forms—insulin, GH, and ghrelin, respectively. 
To determine whether the hormonal impairments observed in PWS are a consequence of impaired prohormone processing by deficient PC1, researchers measured hormone levels in vivo in Snord116 p-/m+ mice and human PWS patients. Compared to normal mice, Snord116 p-/m+ mice exhibited increased levels of proinsulin, pro-GH-releasing hormone, and proghrelin, indicating an inability of PC1 to properly process the prohormones. The ratio of proinsulin to insulin in the blood of PWS patients was elevated, but not to the same extent as that of a patient completely lacking PC1. These data suggest that impaired PC1 activity due to paternal deletion of SNORD116 drives the hormonal features of PWS. This research highlights the effectiveness of a combined approach using human cells and blood samples along with mouse models to study a complex genetic disorder. While the findings contribute to a growing body of knowledge investigating how the loss of a gene alters hormone levels and function in PWS, more research is necessary to determine whether other mechanisms are involved. Unalp-Arida A, Ruhl CE, Choung RS, Brantner TL, and Murray JA. Lower prevalence of celiac disease and gluten-related disorders in persons living in southern vs. northern latitudes of the United States. Gastroenterology 152: 1922-1932, doi: 10.1053/j.gastro.2017.02.012, 2017. AND Bouziat R, Hinterleitner R, Brown JJ,…Jabri B. Reovirus infection triggers inflammatory responses to dietary antigens and development of celiac disease. Science 356: 44-50, doi: 10.1126/science.aah5298, 2017. Uncovering Factors Linked to Celiac Disease: Two recent studies have provided important insights into celiac disease, including its prevalence in different areas of the United States and the possibility that a viral infection may trigger the disease in genetically susceptible people. 
The immune system is constantly poised to attack foreign material in the body, but, importantly, it will refrain from attacking benign substances, such as the food we ingest or the body’s own cells. In people with celiac disease, however, the immune system in the small intestine treats gluten—a protein naturally found in wheat, barley, and rye—as a foreign invader. The resulting immune response in the gut mistakenly identifies one of the body’s own proteins as foreign, damaging the intestinal lining, interfering with nutrient absorption, and leading to bloating, diarrhea, and anemia. The two genetic variants that are known to convey risk for celiac disease are very common—up to one-third of the U.S. population carries one of them. Yet only a small fraction of these people will develop the disease, meaning other genetic or non-genetic factors are likely involved. Some studies have sought to determine whether celiac disease is more common in some geographic regions than others, which could help pinpoint factors involved in the onset of the disease. In one such recent study, scientists combed through health data from 22,277 women, men, and children living in the United States who participated in a national health survey between 2009 and 2014. The survey included questionnaires, medical histories, and blood samples, allowing the researchers to determine the number of diagnosed celiac disease cases, as well as those that were previously undiagnosed but were detected in the serological tests performed during the survey. The researchers found that people living north of latitude 40 degrees North (approximately the northern border of Kansas) were over five times as likely to have celiac disease as those living south of 35 degrees North (approximately the southern border of Tennessee). People living between these latitudes were also over three times as likely to have celiac disease as people living south of the 35 degrees North line. 
The reasons for a higher frequency of celiac disease in northern states are not clear—genetic or environmental factors could be involved—but the trend does appear to be independent of race, ethnicity, socioeconomic status, and body mass index. The scientists also found that participants who had previously undiagnosed celiac disease that was detected during the survey had lower levels of vitamin B-12 and folate in their blood, likely reflecting a deficiency in the uptake of these nutrients because of intestinal damage. This deficiency was not observed in participants with diagnosed celiac disease, underscoring the importance of diagnosing the disease and undergoing proper treatment (i.e., avoiding gluten). Other studies have hinted that viral infections may contribute to the onset of celiac disease, but, until recently, direct evidence of a role for viruses has been lacking. A new study found that infection with a common virus, called a reovirus, may trigger celiac disease in people who are genetically susceptible to developing the disorder. People are typically exposed to reoviruses throughout their lives, but infections tend to go unnoticed because the viruses are cleared by the immune system, and any symptoms are usually mild. Nonetheless, the researchers thought that the immune responses evoked by these infections might lead to gluten intolerance in genetically susceptible people. To test this idea, the scientists first infected mice with two types of reoviruses that were originally isolated from humans, and they examined the effects on the immune systems of the mice. Both types of reoviruses infiltrated the intestinal cells, where they activated genes such as those involved in antiviral immunity. But one of the reovirus types, called T1L, evoked a more robust immune response and also activated genes in areas of the gut involved in regulating immune tolerance to ingested food. 
These changes appeared to disrupt the immune system by stimulating attack pathways while blocking suppression pathways, effectively interfering with the immune system’s ability to develop tolerance to certain dietary proteins. Next, the scientists sought to determine whether an immune reaction to T1L could lead to gluten intolerance in mice genetically modified to carry a human genetic variant that confers susceptibility to celiac disease. As in the experiments with non-genetically modified mice, the scientists found that T1L infection in these celiac disease-prone mice stimulated the immune system and prevented the mice from developing tolerance to ingested gluten. Lastly, the scientists examined plasma samples from women and men with celiac disease and found that they had higher levels of antibodies to reoviruses than people without the disease; conversely, people with high levels of reovirus antibodies were more likely to have celiac disease, linking celiac disease to immune responses against reovirus infections. Taken together, the results from these studies provide insight into factors that could be involved in triggering celiac disease in people who are at genetic risk, offering leads on potential approaches to disease prevention. More work along these lines of research could shed light on why celiac disease is more prevalent in northern areas of the United States and whether vaccination or other antiviral approaches may be effective in preventing the disease.
Division of Kidney, Urologic, and Hematologic Diseases Projects:
Kim AR, Ulirsch JC, Wilmes S, … Sankaran VG. Functional Selectivity in Cytokine Signaling Revealed Through a Pathogenic EPO Mutation. 
Cell 168: 1053-1064.e15, doi: 10.1016/j.cell.2017.02.026, 2017. A Personalized Medicine Treatment Plan Developed After Identification of a Rare Pathogenic Mutation: After discovering a rare genetic mutation responsible for a previously unknown severe blood disorder in a 6-year-old boy, researchers developed a personalized treatment plan for his newborn sibling, who was born with the same blood disorder. Doctors initially diagnosed the boy with Diamond-Blackfan anemia (DBA) when he was 1 year of age. DBA is a serious medical condition characterized by insufficient levels of red blood cells, also known as anemia. Red cells carry oxygen from the lungs to the body’s organs and tissues. The boy was given standard therapy for DBA. He was treated initially with blood transfusions, which provide a source of needed red cells, and then underwent a bone marrow transplant from a fully matched donor at 6 years of age. Although the doctors hoped the bone marrow transplant would be curative, as is typically the case for people with DBA, early signs showed that the transplant was not working as expected, and the boy unfortunately did not survive complications of the procedure. Clinical scientists became aware of this 6-year-old boy while they were conducting research to discover as-yet-unknown genetic causes of DBA. Given the unanticipated transplant outcome, the researchers wanted to learn the cause of the boy’s severe anemia, thinking it might differ from typical DBA and that new insights might help in developing a better treatment approach for others. An analysis of the boy’s genes did not reveal mutations known to cause DBA but did identify, for the first time, a mutation in the gene encoding the small protein hormone erythropoietin (EPO). The genetic mutation alters one of the 160 amino acid building blocks of the EPO protein: it changes an arginine amino acid to a glutamine amino acid. 
EPO normally stimulates red cell production by interacting with other proteins, called EPO receptors, on the surface of early stage (progenitor) red cells in the bone marrow. The investigators determined that although mutant EPO bound the EPO receptor with only slightly lower affinity than normal EPO, the mutation markedly decreased EPO’s ability to remain attached to the receptor, apparently resulting in a significantly diminished ability to stimulate red cell proliferation (i.e., an increase in the number of cells) in culture, as compared to normal EPO. These laboratory findings are consistent with the inability of mutant EPO to support adequate levels of mature red cells in the young boy. While these research findings were undergoing review for publication, the researchers learned that the parents of the 6-year-old boy had a newborn child, who also had anemia. Further testing confirmed that the newborn carried the same arginine-to-glutamine mutation in EPO. Equipped with new knowledge gained from research, the clinical scientists developed a treatment strategy. They obtained appropriate permissions to initiate a personalized medicine treatment plan consisting of injections of normal EPO produced in the laboratory, also called recombinant EPO. Recombinant EPO is often used to increase levels of red cells in patients whose red cells have been depleted by a different condition, for example kidney disease or chemotherapy for cancer. The researchers reasoned that recombinant normal EPO might also work for this child by latching onto EPO receptors more productively than the child’s own EPO, and thus restore red cell production. After 11 weeks of treatment, the child’s red cell production had increased—eliminating the need for blood transfusions. This study underscores the power of research to greatly improve the life of a patient. Additionally, this personalized medicine treatment plan could potentially help others with the same disorder. 
Kitada K, Daub S, Zhang Y, …Titze J. High salt intake reprioritizes osmolyte and energy metabolism for body fluid conservation. J Clin Invest 127(5): 1944-1959, doi: 10.1172/JCI88532, 2017. AND Stegbauer J, Chen D, Herrera M, …Coffman TM. Resistance to hypertension mediated by intercalated cells of the collecting duct. JCI Insight 2(7): e92720, doi: 10.1172/jci.insight.92720, 2017. Insights into Salt Handling, Water Balance, and Blood Pressure Regulation by the Kidneys: Two studies in mice have shed light on the complex relationships between kidney physiology, salt intake, water balance, and hypertension. One of the kidney’s critical functions is to achieve electrolyte balance in the body by controlling urine salt concentration and water retention. Impairment of this essential function can lead to hypertension (high blood pressure). Two recent reports explored the links between salt, hypertension, and kidney function using rodent model systems. Scientists have long believed that the body removes excess dietary salt through urination, leading to water loss that must be replenished—in other words, eating salty foods makes people thirsty. Recently, however, research has cast doubt on this simple relationship between salt and water consumption. In one previous study in 10 men, researchers found, surprisingly, that over time, increased salt consumption was associated with reduced water intake. In the present study, the team of researchers tested their previous observation experimentally using male mice that were fed either a low-salt diet with plain water or a high-salt diet with saline (salted water). Mice consuming a high-salt diet excreted more concentrated sodium in their urine than did mice on the lower-salt diet. Interestingly, over time the mice on a high-salt diet drank less fluid, retained more water, and consumed more food than did mice on a low-salt diet. 
These results raised an important question: how does the body remove excess salt without simultaneously expelling too much water? The scientists considered that urea, a biological chemical abundantly found in urine, could be a key factor because urea in the kidney is known to drive reabsorption of water from developing urine. They found that the kidneys of mice on a high-salt diet contained higher levels of urea compared with those on a low-salt diet, helping to explain the observed water retention. Further examination of the mice revealed that additional urea was produced by muscle and liver tissue in response to increased salt. The muscle tissue appeared to be breaking down some of its molecular components as fuel to generate energy, likely to compensate for the energy-intensive process of urea production. This need for additional energy could also help explain the increased appetite observed in mice fed a high-salt diet. Together, these results uncover a novel coordinated, energy-intensive response to dietary salt by the liver, muscles, and kidneys to elevate urea levels, thereby conserving water. In a separate study, scientists sought to gain a better understanding of the molecular basis of water maintenance and blood pressure regulation by the kidney. A segment of the nephron (the basic functional unit of the kidney) called the collecting duct fine-tunes the amounts of various essential substances, such as sodium, that can be retained in the body or excreted into the developing urine. The protein angiotensin II was previously shown to control water reabsorption in the collecting duct. To better understand angiotensin II’s role in the kidney, the researchers genetically engineered mice to lack the gene encoding the type 1 angiotensin (AT1) receptor, its essential protein partner, specifically in the collecting ducts. 
AT1 receptor-deficient mice had the same blood pressure as normal mice, and both groups developed hypertension similarly when they were fed high-salt diets. Mice were then administered angiotensin II, which is also known to induce hypertension. Blood pressure in normal mice predictably increased, but blood pressure in AT1 receptor-deficient mice unexpectedly rose even higher, even though eliminating AT1 receptors had been expected to blunt angiotensin II’s ability to raise blood pressure. These AT1 receptor-deficient mice excreted less sodium than did normal mice when given angiotensin II, suggesting that the resulting higher salt levels may have been responsible for the elevated blood pressure. The researchers then asked whether cyclooxygenase-2 (COX-2), a known regulator of angiotensin II function, was affected by AT1 receptor deficiency. Drugs that inhibit COX-2 function have been shown to influence blood pressure, leading the scientists to ask whether there could be a link between COX-2 and AT1 receptor activity in this segment of the kidney. They examined collecting ducts and found that those of normal mice given angiotensin II contained higher levels of COX-2 than did their untreated counterparts, but that the absence of AT1 receptors attenuated this response. Finally, the scientists again treated mice with angiotensin II to induce hypertension, but also administered a chemical inhibitor of COX-2 function. The COX-2 inhibitor eliminated the difference between AT1 receptor-deficient mice and normal mice, allowing the blood pressures of both groups to rise to similar levels and further implicating COX-2 as a mediator of angiotensin II-induced hypertension. Taken together, these results define a surprising, novel role for the angiotensin II-AT1 receptor-COX-2 pathway in the collecting duct as a regulator of blood pressure. 
These studies in mice challenge long-standing views and reveal the complexity of the kidney’s role in salt and water balance, and in blood pressure regulation. If the molecular pathways described are found to work similarly in people, these two studies could pave the way for a more detailed understanding of how the human body maintains water balance in response to salt intake, and could generate novel therapeutic approaches for reducing the risk of hypertension.
Fiscal Year 2019
Division of Diabetes, Endocrinology, and Metabolic Diseases:
Haller MJ, Schatz DA, Skyler JS, Krischer JP, Bundy BN, Miller JL, Atkinson MA, Becker DJ, Baidal D, DiMeglio LA, Gitelman SE, Goland R, Gottlieb PA, Herold KC, Marks JB, Moran A, Rodriguez H, Russell W, Wilson DM, Greenbaum CJ; Type 1 Diabetes TrialNet ATG-GCSF Study Group. Low-dose anti-thymocyte globulin (ATG) preserves β-cell function and improves HbA1c in new-onset type 1 diabetes. Diabetes Care 41: 1917-1925, 2018. Preserving Insulin Production in People with Newly Diagnosed Type 1 Diabetes: Researchers have discovered that treatment with a medicine that suppresses the immune system, called anti-thymocyte globulin (ATG), preserved insulin production and improved blood glucose (sugar) control for at least a year in people with newly diagnosed type 1 diabetes, as compared to placebo (no medicine). Type 1 diabetes is an autoimmune disease in which a person’s immune system destroys β (beta) cells in the pancreas that make insulin. Previous research has shown that people whose bodies continue to produce some insulin have better blood glucose control, less hypoglycemia, and reduced rates of disease complications. Therefore, current research is examining ways to preserve insulin production in people with type 1 diabetes. For example, a pilot study previously showed that treatment with ATG in combination with a modified protein called GCSF preserved insulin production for 1 year in people with established type 1 diabetes (disease duration of 4 months to 2 years). ATG is a medicine used to prevent or treat immune system rejection of a transplanted organ; GCSF is used to increase white blood cell counts in people undergoing chemotherapy. Researchers in NIDDK’s Type 1 Diabetes TrialNet built on the results of the pilot study to determine whether treatment with ATG alone or in combination with GCSF could preserve insulin production when used close to the initial diagnosis of type 1 diabetes. 
To examine this question, researchers enrolled 89 female and male children and adults, ages 12 to 42 years, with newly diagnosed type 1 diabetes (less than 100 days since diagnosis) in a three-arm clinical trial. One group received a single course of ATG administered via intravenous infusions; another group received the single course of ATG followed by treatment with GCSF administered through an injection every 2 weeks for a total of 6 doses; and the control group received placebo. The study was blinded, meaning that all participants received the infusions and injections but did not know whether they were getting medicine or placebo. After 1 year of follow-up, the researchers found that the group receiving ATG alone produced more C-peptide, a measure of insulin production, compared to the placebo group. However, C-peptide levels in the ATG/GCSF group were similar to those in the placebo group. People in both the ATG and ATG/GCSF groups had better average blood glucose control, as measured by hemoglobin A1c levels, than those in the placebo group. Trial participants continue to be followed to determine whether the treatment effects persist for 2 years. The findings show that a single course of ATG could preserve insulin production in people with newly diagnosed type 1 diabetes for at least 1 year, as compared to placebo, but that GCSF did not enhance the benefit. These results differ from the pilot study that found benefit from ATG/GCSF combination therapy, although the pilot study did not examine ATG alone and its participants had more established type 1 diabetes. Like most drugs that target the immune system, ATG treatment has side effects of administration, and its therapeutic effects wane over time after treatment. 
TrialNet is continuing to follow the study participants, and the data from the 2-year follow-up will help researchers determine whether treatment with ATG alone or in combination with other agents should be pursued for preventing or delaying type 1 diabetes in individuals prior to clinical diagnosis. Inge TH, Laffel LM, Jenkins TM, Marcus MD, Leibel NI, Brandt ML, Haymond M, Urbina EM, Dolan LM, Zeitler PS; for the Teen-Longitudinal Assessment of Bariatric Surgery (Teen-LABS) and Treatment Options for Type 2 Diabetes in Adolescents and Youth (TODAY) Consortia. Comparison of surgical and medical therapy for type 2 diabetes in severely obese adolescents. JAMA Pediatr 172: 452-460, 2018. Treating Type 2 Diabetes in Adolescents with Severe Obesity: In a new analysis comparing data from two different studies that evaluated treatments for adolescents with severe obesity and type 2 diabetes, researchers determined that bariatric surgery led to improved outcomes over treatment with medication. Type 2 diabetes is increasingly being diagnosed in youth, and it disproportionately affects youth from racial and ethnic minority populations in the United States. The Treatment Options for Type 2 Diabetes in Adolescents and Youth (TODAY) clinical trial showed that the disease may be more aggressive and difficult to treat in youth than in adults. Because the onset and severity of disease complications correlate with both the duration of diabetes and control of blood glucose (sugar) levels, those with early disease onset are at greater risk for complications than those who develop the disease later in life. Thus, it is imperative to find treatments that help this vulnerable population manage their disease and achieve better glucose control. To glean more information to compare treatment options, researchers returned to the outcomes of two NIDDK-supported studies: TODAY and the Teen-Longitudinal Assessment of Bariatric Surgery (Teen-LABS). 
In the TODAY clinical trial, youth with type 2 diabetes received either the anti-diabetes medication metformin alone; metformin in combination with another medication, rosiglitazone; or metformin with an intensive lifestyle intervention. The trial showed that metformin alone and metformin with lifestyle intervention were insufficient to control blood glucose adequately in about half of the participants. The trial also showed that, although the combination of metformin with rosiglitazone worked somewhat better than metformin alone, this drug combination failed to maintain adequate blood glucose control in a high proportion of participants. Teen-LABS is an observational study that has been evaluating health outcomes of adolescents with severe obesity who underwent bariatric surgery. Teen-LABS found that bariatric surgery resulted in major weight loss and improvement in overall health and quality of life. However, questions remained regarding treatment options for youth with severe obesity and type 2 diabetes. Because some of the Teen-LABS and TODAY participants had both type 2 diabetes and severe obesity, researchers embarked on a new analysis comparing, for the first time, bariatric surgery and medication as treatments for these conditions in youth. They analyzed data for 30 adolescents with severe obesity and type 2 diabetes who underwent bariatric surgery in Teen-LABS in comparison with data from 63 adolescents with these conditions who received medication in the TODAY study. The researchers found that hemoglobin A1c levels (HbA1c, a measure of average blood glucose control, with a higher value indicating worse control) worsened over 2 years of follow-up in adolescents treated with medication (an average increase from 6.4 percent to 7.8 percent), whereas HbA1c levels improved in adolescents treated with bariatric surgery (an average decrease from 6.8 percent to 5.5 percent). 
Adolescents treated with medication were also more likely to gain weight (an average increase of nearly 13 pounds), have worse diabetes (as indicated by HbA1c and other measures), and show no improvement in blood pressure or kidney function. In contrast, adolescents treated with bariatric surgery showed weight reduction (an average loss of over 94 pounds), remission of their diabetes, and improvements in blood pressure and kidney function. The striking improvements in those treated with bariatric surgery indicate that this could be a treatment option for adolescents with severe obesity and type 2 diabetes. Bariatric surgery, however, carries serious surgical risks that need to be balanced against the potential benefits. For example, within 2 years of having bariatric surgery, 23 percent of adolescents with type 2 diabetes required a subsequent operation and/or hospitalization for conditions related to the surgical procedure. Further research is needed to understand the long-term outcomes of bariatric surgery in this population. These results also continue to highlight the pressing need for better treatments for youth with type 2 diabetes—and for better prevention of diabetes in youth.
Division of Digestive Diseases and Nutrition:
Mohanan V, Nakata T, Desch AN, Lévesque C, Boroughs A, Guzman G, Cao Z, Creasey E, Yao J, Boucher G, Charron G, Bhan AK, Schenone M, Carr SA, Reinecker HC, Daly MJ, Rioux JD, Lassen KG, Xavier RJ. C1orf106 is a colitis risk gene that regulates stability of epithelial adherens junctions. Science 359: 1161-1166, 2018. Becoming Unglued: How a Genetic Variant May Affect the Gut Barrier and Contribute to Inflammatory Bowel Disease: Researchers have found that a genetic variant may impart risk for inflammatory bowel disease (IBD) by disrupting the cellular “glue” that keeps the gut’s lining intact. 
People with IBD suffer from chronic inflammation in the gut, resulting in symptoms such as diarrhea, cramping, and weight loss. Scientists have been sorting through the complicated mix of factors that contribute to IBD, including numerous possible genetic components that are important for maintaining effective physical and immunological barriers to the multitude of bacteria that inhabit the gut. The International IBD Genetics Consortium, of which the NIDDK-supported IBD Genetics Consortium is a member, has identified over 200 regions of the human genome that are associated with IBD. Scientists are now combing through these regions to identify genes—and variants of those genes—that are involved in the disease. One of the genetic variations that consortium scientists had identified as a risk factor for IBD was in a gene called C1orf106; however, until recently it was not clear exactly how variants of this gene might lead to disease. Researchers attempting to uncover the function of “normal” C1orf106 found that laboratory-grown gut cells produced high amounts of the C1orf106-encoded protein when they were in close contact with each other. This suggested that C1orf106 may contribute to cellular junctions—the “glue” that cobbles gut cells together to create a continuous, sheet-like barrier. Another hint was uncovered when the researchers found that the C1orf106 protein interacts with cytohesin-1, a protein that disrupts cellular junctions by activating a molecular switch called ARF6. Functional C1orf106 in cells caused degradation of cytohesin-1 and lower ARF6 activity, stabilizing cellular junctions. These signs pointed to a role for C1orf106 in maintaining the intestinal barrier by keeping cytohesin-1 levels in intestinal cells relatively low. Likewise, male and female mice engineered to lack C1orf106 had higher levels of cytohesin-1 than mice whose genes were unaltered. 
These mice also showed greater intestinal damage after they were infected by a bacterial pathogen, supporting the idea that C1orf106 is important for maintaining a barricade against gut pathogens. However, some variants of the C1orf106 gene may not be as effective as others. In fact, when the scientists replaced C1orf106 in cells with the specific variant of the gene that is associated with human IBD, the cells were unable to make enough of the C1orf106 protein to form proper junctions. These studies strongly imply that defects in C1orf106 contribute to IBD by failing to maintain an adequate intestinal barrier. This information could help to guide the development of improved therapy for people with this genetic variant, although more work is needed to determine if the observations from the mouse model hold true in humans. Lertudomphonwanit C, Mourya R, Fei L, Zhang Y, Gutta S, Yang L, Bove KE, Shivakumar P, Bezerra JA. Large-scale proteomics identifies MMP-7 as a sentinel of epithelial injury and of biliary atresia. Sci Transl Med 9, pii: eaan8462, 2017. New Biomarker To Diagnose Life-threatening Liver Disease in Children: Researchers have identified a protein present at high levels in blood from infants with biliary atresia that may enable early and accurate detection of this potentially deadly disease. Biliary atresia is a serious liver disease that occurs during the first few months of life. In this disease, bile ducts that drain from the liver, delivering bile acids to the intestine, become inflamed and scarred, leading to a back-up of bile into the liver. This back-up can result in liver damage, as evidenced by jaundice, or the yellowing of the skin and eyes. If not treated with surgery or liver transplantation, biliary atresia can lead to liver failure and is ultimately fatal in these infants. Although a rare disease, biliary atresia remains the most common form of severe liver disease in children and the leading cause for pediatric liver transplantation. 
While its causes are not fully understood, both inherited and environmental factors appear to play a role in disease development. Early diagnosis and treatment are critically important for ensuring the best outcomes for infants with biliary atresia, but often diagnosis is delayed because jaundice in infan