Common Fund Strategic Planning Social Media Summary

As part of the strategic planning process to develop new Common Fund programs for 2013, we requested feedback from the community on proposed ideas via a social media website. This interactive web-based dialogue solicited comments and provided a forum for discussion of ideas proposed by scientists and stakeholders at the “Innovation Brainstorm: Transforming Discovery into Impact” meeting and from the NIH Institutes and Centers (ICs). Interested parties could post comments and discuss posts to help us refine broad conceptual topics into programs that more closely address Common Fund criteria. Since this was not a voting process, comments that were simply "for" or "against" any particular topic were not quantified. We were looking for input about how to shape these ideas so that the Common Fund investment will have a catalytic, transformative impact on a broad range of diseases and conditions.

How Did People Participate?

Each Program in the Common Fund started either as a broadly stated concept that had to be refined so that it addressed defined but broadly relevant challenges, or as a narrow topic that could be broadened to have greater impact across diseases. We requested help in this process for the new topics by asking people to consider each idea and ask:

  • If the objectives outlined in this proposal were achieved, would it transform research efforts in many disease areas? If not, but if the idea could be modified or refined to have broad impact, what would those adjustments be?
  • Does the proposal define objectives that Common Fund investments could achieve in a 5-10 year period? If not, how could the specific objectives be clarified or adjusted to meet this criterion for Common Fund support? What WILL be achievable in this time period?
  • Will the idea benefit researchers across the board, such that the Common Fund program will help individual investigators who are funded by the NIH Institutes and Centers?

     

We received more than 290 comments across all ideas. More than 60 percent of these comments offered suggestions and concrete recommendations about how to refine or implement the proposed ideas as Common Fund programs; those comments are provided in this summary.


Artificial Organs as Tools for Translation

Nominator: Innovation Brainstorm participants

Major obstacle/challenge to overcome: Current knowledge of human health and disease is overly dependent on the results of in vitro models, which have variable value in replicating human systems. As an improvement over cell culture and some animal models, many researchers are turning to organ-mimics to perform some of these key experiments. Currently, most organ-mimics recapitulate only a fraction of the function and have little of the structure of the primary organ, but it is clear that both structure and function play a critical role in biological behavior. Experiments using organ-mimics also offer the potential to deepen understanding of the systems biology of organs and tissues.

Emerging scientific opportunity ripe for Common Fund investment: Emerging capabilities for the development of organ-mimics that maintain both structure and function of the original organ offer the opportunity to overcome limitations of in vitro studies that probe human biology. The development of three-dimensional organ-mimics could have short-term impact in drug screening and long-term benefit as replacement organs for the regenerative medicine toolkit.

Common Fund investment that could accelerate scientific progress in this field: Multidisciplinary work and interactions are essential in this field, and thus NIH awards in this area should be developed for both individual investigators and multi-investigator projects. Specific recommendations include:

  • Projects that integrate developmental biology, organ-quality assays, and bioengineering should be supported. The frontiers of developmental biology should be explored so that engineering approaches can be better informed.
  • The NIH should push for technology transfer mechanisms that accelerate tool- and instrumentation sharing.
  • Development of a human induced pluripotent stem (iPS) cell repository would enable studies of disease pathogenesis as well as those that seek to develop organoids from diverse genotypes.
  • The use of iPS cells to develop organoids with diverse genotypes would facilitate the development of more robust drug screening platforms that could distinguish safety/toxicology issues that vary with genotype. For example, these organoids could offer an improvement over the traditional hERG (human Ether-a-go-go Related Gene) test that is used to assess cardiac safety of new drugs.
  • The California Institute for Regenerative Medicine hosts a biorepository of iPS cells for the study of neurodegenerative disease; this sort of resource, on a broader scale, would be very useful to the scientific community.
  • Establish an NIH-funded “Jackson Lab” for organ-mimics.

Potential impact of Common Fund investment: Progress in this area has substantial promise, including a paradigm shift in conducting clinical trials, from costly and slow to cheap and fast. Other benefits of true organ-mimics include: i) pushing the frontier of developmental biology research and ii) accelerating preclinical evaluations (providing a next-generation “immunoassay”). Long-term benefits include the development and use of artificial organs, which are of obvious relevance for thousands of diseases and conditions.

Comments:

I think that this is a wonderful initiative and agree that the timing is perfect for this. To maximize success, it will be critical to consider how best to validate the in vitro assays/models. This is not an easy task, and thus may warrant future discussion, i.e. how can we ensure that the insight gained from these models is predictive of in vivo response? Perhaps this should be a criterion for receiving $ from this initiative - to demonstrate validity by correlating results to relevant in vivo models (preferably those that mimic the human condition)? To maximize the impact of this initiative - perhaps grantees could be assigned a catalyst - someone who has deep experience / knowledge linking in vitro assays/models to relevant in vivo biology - a good model for the “catalyst” is the Deshpande Center at MIT - that requires all grantees be assigned a catalyst who signs an NDA and helps to maximize success of projects by providing high-level insight - http://web.mit.edu/deshpandecenter/catalyst.html. It will also be critical for these models to be readily accessible and made available to others – thus key criteria = they will need to be relatively simple and 'transportable' so that they can be easily adopted by others. Developing relevant in vitro models from artificial organs should have an enormous impact on elucidating new biology and developing a new wave of therapeutics - key is for them to be relevant AND be amenable to global distribution/use.

 


A few key aspects make this “the right program at the right time”:

1. There is a growing consensus that (per Doris Taylor at the “Transforming Regenerative Medicine” meeting in 2008) “it seems that if you simply provide the cells with the right environment and then get out of the way, they know what to do.” Several groups now have found that even cell lines highly adapted to artificial growth conditions adopt tissue-appropriate structure and function if provided the right local cues. The trick, of course, is to know which local conditions are critical and which incidental.

2. “Organotypic” has a broad meaning. “Function” may be achieved with less “structure” than previously thought necessary. For example, many systems can deliver phenotypes that mimic those found in patients in “2.5D.” Fully reproducing the 3D structure of the target organ may not be necessary.

3. Biomechanics is emerging as a key process in development of organotypic function. Innovative bioengineering technologies that are miniaturized and/or high throughput enable the inclusion of such processes.

4. Another critical consideration is an appropriately structured nanotopography (for some tissues). For example, aligned scaffolds help orient cardiomyocytes to facilitate cell fusions to form a heart syncytium. Capitalizing on recent improvements in electrospinning and/or nano-printing and/or lithography empowers the generation of these platforms with high reliability.

5. In vitro test beds that faithfully mirror human organ function may be able to be multiplexed. Adding “organ cassettes” sequentially may offer the opportunity to mimic the entire human response.

6. Understanding the minimum organizational principles needed to deliver human-like responses characteristic of an organ will enable higher throughput systems. With such systems, scale up to cultures representing a wide array of genotypes within the human population could enable “virtual Phase IV” studies that simulate effects more reflective of the target population for a particular therapy.

7. Complex mimics of organ function will enable in vitro models of disease progression. It should be possible to design systems that represent slight deviations from normal through frank disease in a given tissue or organ.

8. The early stages of tumor formation and metastasis (currently impossible to study) should be amenable to an in vitro system (with real time imaging incorporated) that mimics the structure and function of the target organ.
 


This is an incredibly timely initiative as 3D tissues and organoids are poised to serve as human, “pre-clinical” or “surrogate” tissues that act as a translational modality to provide more meaningful correlations between in vitro screening assays for toxicity and efficacy and in vivo tissue outcomes in human clinical trials. These tools and platforms can now play a pioneering, strategic role in moving discovery science into a new paradigm that provides a novel focus on clinical relevance. Commercialization of an expanded repertoire and library of these powerful new 3D tissue technologies will immediately meet the broad needs of the translational research community. Building additional complexity into these existing paradigms will be a huge step forward towards meeting these broader needs.

At Tufts, we have established the Center for Integrated Tissue Engineering (CITE) through which we provide a service that generates 3D human tissue equivalents (HTEs) for broad use in the product evaluation, drug development and drug discovery processes. We now offer these HTEs to meet standardized performance benchmarks, as these models are predictive of human tissue responses. Such 3D tissues are being used by 14 companies as human, “pre-clinical” or “surrogate” tissues that act as a translational modality to provide more meaningful correlations between 2D, in vitro screening assays for toxicity and efficacy and in vivo tissue outcomes in human clinical trials. Responding to the comment above (beginning “I think that this is a wonderful initiative….”; great ideas!!), we have found that industry is not necessarily looking for “validated” models in the classical sense, but models that are more predictive than monolayer culture systems. These tools are already playing a pioneering, strategic role in moving discovery science into a new paradigm that provides a novel focus on clinical relevance. There is a critical mass of science that has been generated by the interface between the fields of regenerative medicine, tissue engineering, and the 3D models that already exist. As a result, the field is now primed for the complexity that can be built into these model systems to make them even more in vivo-like and robust, generating new repertoires of tissues for basic science discovery and for drug/product development by many more companies.

This significant demand from numerous biotech, cosmetic and drug development companies makes it clear that the following 3D tissue platforms are critically needed:

1. Standardized 3D models for predictive safety and efficacy screening in human tissues – Safety and toxicity testing in the drug and product development space using 3D tissues will allow the large-scale safety testing of new compounds that may have therapeutic potential in humans by providing more reliable correlations between in vitro studies and in vivo outcomes in humans. This will allow us to more efficiently screen the efficacy and safety of such compounds in more predictive assays including: (1) tissue penetration, (2) tissue metabolism, (3) phototoxicity, and (4) irritation, which currently exist only in 2D monolayer formats. Industry and academic labs need much more predictive 3D tissue screening assays to direct the transition to Phase I clinical trials.

2. 3D tissues that mimic human cancer as a research tool for target identification and validation in cancer drug discovery and development – 3D human tissues can serve as human cancer “surrogates” that mimic various stages of cancer progression in human tissues. These cancer surrogates will enable a significantly higher predictive value for success in clinical trials than 2D, cell-based assays.

3. Specialized and complex adaptations of 3D tissue models currently used to screen tissue damage and repair – 3D tissues that mimic human skin have been adapted to test and screen compounds in response to wounding and cellular damage. The addition of new cell types such as immune cells and vascular tissues will be a tremendous addition to these 3D tissue types.

4. 3D tissues that include microvasculature – Complexity needs to be added to 3D tissues and organoids by creating micro-scale bioreactor-like tissues that would provide control over both the physical and biochemical properties required for in vitro tissue engineering. Physical control over the local environment within the tissue will be achieved through the use of micromolding techniques applied to degradable biomaterials, to create fluid networks and scaffold structures engineered to provide appropriate mechanical and fluid flow properties. Biochemical control of the microenvironment would ideally be provided by functionalization of the reactor/tissue surfaces. These are just a few examples of the next generation of complex tissues that can be fabricated in the future.


Artificial Organs: From Lab Bench to the Body (see “Artificial Organs as Tools for Translation” in Innovation Brainstorm ideas)

Nominator: NIDCR

Major obstacle/challenge to overcome: The long-term clinical objective of this Program is to make artificial organs available for in vivo replacement, or in situ regeneration, of organs rendered non-functional by aging, tissue degeneration, birth defects, and injury. Common Fund investments could facilitate accomplishing this goal by supporting a research Program to develop powerful in vitro tissue platforms for drug screening, toxicology testing, disease modeling, and diagnostics, as well as in vivo strategies for organ replacement, by leveraging recent breakthroughs in cellular reprogramming, bioengineering, high-throughput technologies, and pharmacogenomics.

Successful building of new organs will require overcoming a number of challenges. The nature of these challenges will depend on the intended application; whether, for example, the organ will be used for the development of simple or sophisticated in vitro platforms, or if it will be used for replacing diseased organs in vivo. The utility of induced pluripotent stem (iPS) cells for organ building is primarily related to their pluripotent nature, because pluripotency permits virtually unlimited expansion of patient-specific, genetically matched undifferentiated cells. However, obtaining fully functional and homogeneous populations of stably differentiated, non-tumorigenic cells from iPS cells is notoriously difficult, and this limitation is considered to be a significant impediment to translational applications of these cells. Therefore, additional approaches are needed for obtaining abundant cell sources that do not possess the limitations of iPS cells.

Emerging scientific opportunity ripe for Common Fund investment: The proposed Program endorses and expands the idea of “Artificial Organs as Tools for Translation” that was derived from the Innovation Brainstorm meeting. However, we believe that building the CF Program solely on fully reprogrammed iPS cells, as currently planned, will significantly limit the overall success of the effort. We also argue that the results of in vitro and in vivo work should and will synergistically benefit each other. Therefore, separating them, and focusing primarily on in vitro screening technologies with only minor emphasis on in vivo organ replacement, as currently proposed, will diminish the long-term translational impact of this CF investment.

We propose to initiate a Program to create unprecedented opportunities for basic research and for translational and clinical applications by anchoring on the rapidly developing direct lineage reprogramming technologies that are widely recognized as paradigm-shifting approaches for creating artificial organs for in vitro assays and for organ replacement in vivo. Direct reprogramming can overcome limitations of iPS cell technologies, because it involves only partial reversal of the terminal differentiation state (as opposed to full reversal, as with iPS cells), thus leading to formation of patient-specific, lineage-committed embryonic or adult progenitors. Such cells can be expanded and are amenable to robust differentiation into functional somatic cells. Moreover, with the help of advanced bioengineering tools, it will be possible to achieve safe direct reprogramming in vivo, making in situ organ regeneration a reality. The drawback of directly reprogrammed cells is their relatively limited expansion capacity, which is similar to that of adult stem cells or embryonic progenitors, and this complicates the task of obtaining sufficient cell numbers for building human-size organs. Therefore, the intent of the proposed expansion of the “Artificial Organs as Tools for Translation” Program is to enhance the Program by taking advantage of both full and direct reprogramming to benefit translational application of these breakthrough technologies.

Common Fund investment that could accelerate scientific progress in this field: Recommended CF investments in this area (in addition to those proposed in the original Program) include: (i) Developing strategies for induction of full and direct reprogramming using novel gene delivery and small molecule approaches that will eliminate the need for genetic modification; (ii) Developing platforms to study epigenetic, proteomic, and transcriptomic changes for improved efficiency of reprogramming; (iii) Advancing science and technology for expansion and differentiation of reprogrammed cells; and (iv) Developing strategies for direct reprogramming in vivo.

Potential impact of Common Fund investment: The specific outcomes from this Program will include novel tools and standardized protocols for cellular reprogramming, expansion, and controlled differentiation, as well as a repository of functional, well-characterized multipotential cell populations for a variety of applications, including functional assays, diagnostics, and organ building in vivo and in vitro.

Since many general questions still need to be answered and new technologies, tools, and platforms need to be developed, this field will greatly benefit from the “incubator space” of the CF mechanisms. Once the original goals of the Program are achieved, the need for CF involvement will decrease and it will become more appropriate for the individual ICs to carry on with their own Programs to build artificial organs for applications in their respective mission areas.

NIDCR is in an excellent position to lead this CF effort because of its trans-NIH and multi-agency shared interests and collaborations through the Armed Forces Institute for Regenerative Medicine (AFIRM), the Nano Task Force and National Nanotechnology Initiative through the Office of Science and Technology Policy, the Multi-Agency Tissue Engineering Science (MATES) Interagency Working Group, and the newly established Intramural National Center for Regenerative Medicine.

Comments:

As one of the panel members from the brainstorming session who worked on the “Artificial Organs as Tools for Translation” proposal, I agree that this initiative should not be restricted to iPS cells and that adding an emphasis on direct reprogramming makes sense. I also agree that in vitro and in vivo work should go hand in hand, but I would like to emphasize that in vivo work should include a strong developmental biology component. Unless we continue to develop our understanding of how organs normally form in vertebrate embryos, we will not be able to take advantage of endogenous regeneration programs to the fullest. Therefore, I think that it is important to interpret part iv (direct reprogramming in vivo) broadly and to include developmental biology as part of this area, so as not to exclude some very exciting avenues to organ regeneration.

 


This is a fantastic initiative. In addition to reprogrammed cells (full and direct), it would also be of interest to consider other cell sources that may provide more immediate solutions, including adult stem and progenitor cells such as those listed on http://stemcells.nih.gov/info/basics/basics4.asp (e.g. mesenchymal stem and progenitor cells, neural stem cells, endothelial progenitor cells, epithelial stem cells, skin stem cells...). These can also be married with state-of-the-art approaches in bioengineering, high-throughput technologies, and pharmacogenomics to work toward organ regeneration or replacement.


Beyond GWAS (Genome-Wide Association Studies)

Nominator: Innovation Brainstorm participants

Major obstacle/challenge to overcome: Although GWAS (genome-wide association studies) have uncovered many genetic loci for a range of conditions and diseases, a major challenge is translating this knowledge into functional insights. One key roadblock is the inability to capture precisely various and diverse environmental measurements. Incomplete, non-standardized, and shallow collection of phenotype data contributes to the difficulty of using GWAS data to define mechanisms and/or suggest potential interventions. Insufficient sample sizes prohibit the clarification of the role and relevance of complex traits in health and disease. In some cases, valuable opportunities may be missed, as in harnessing genotyping data from randomized clinical trials that have rich phenotypic data. For the massive amounts of data that already exist, practical and effective strategies for integration lag behind. Possible remedies include new algorithms for performing higher-order ‘omics studies, a repository of rare knockouts, and more complete sharing of data and biospecimens.
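
To put the sample-size point in perspective, the short sketch below (in Python) approximates statistical power to detect a common variant explaining 0.01 percent of trait variance at genome-wide significance; the effect size and cohort sizes are illustrative assumptions, not figures from this summary.

from scipy.stats import norm

def gwas_power(n, var_explained, alpha=5e-8):
    # Approximate two-sided power for a variant explaining a small
    # fraction of trait variance, tested at genome-wide significance.
    ncp = (n * var_explained) ** 0.5      # expected z-statistic (non-centrality)
    z_crit = norm.isf(alpha / 2)          # ~5.45 for alpha = 5e-8
    return norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)

# Power rises from essentially zero to near certainty as the cohort grows
# from tens of thousands to a million participants.
for n in (10_000, 100_000, 1_000_000):
    print(f"n = {n:>9,}  power = {gwas_power(n, 0.0001):.3f}")

Under these assumptions, power is roughly 0.00 with 10,000 participants, about 0.01 with 100,000, and essentially 1.00 with 1,000,000, which is one way to see why very large cohorts are proposed below.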

Emerging scientific opportunity ripe for Common Fund investment:

Further progress in GWAS requires both persistence and innovation. While GWAS execution itself is routine and fairly well established (in the “D” portion of the growth curve), other areas are in a period of rapid growth [in the “C” region of the curve: single trait analysis, expression quantitative trait loci (eQTLs)], and still others require a substantial push to reach their potential (in the “A” and “B” areas: functional annotation of genetic variants, annotation of a reference genome (ENCODE, the ENCyclopedia Of DNA Elements), whole-genome analyses in unrelateds/families, large-scale phenotyping, and clinical translation). This last group is likely to be the most ripe for Common Fund (CF) investment.

Common Fund investment that could accelerate scientific progress in this field: Three proposed projects (each independent but complementary and potentially synergistic) could overcome some of the current roadblocks in this area.

  • Human Phe-Ge Project. This proposed project represents a very large-scale effort to create a “National Cohort” of people (DNA plus phenotypic data) for discovery research in health and disease. The large sample size (1 million people across the United States) would permit sufficient coverage of the human genome, along with a diversity of participants that reflects the U.S. population make-up. Clinical data would be harvested from electronic medical records (EMRs) (after participants opt in), and the cohort would be followed longitudinally. Web surveys (e.g. 23andwe) could harness the reach and power of social networking to gather data in real settings. Within the larger cohort would be a sub-cohort of approximately 1,000 people, who would be subjected, upon consent, to deep phenotyping and clinical validation. Data sharing would be free and wide, with appropriate consent in place from volunteer participants.
  • Functional Genome Project. This potential project would leverage functional information to find causal variants, employing ENCODE (http://www.genome.gov/10005107), epigenomics, and functional genomics strategies. Functional annotation of 1,000 individuals over multiple cell types / conditions would record transcription, DNA methylation / histone modifications, and DNA sequencing (phased whole-genome sequencing). The project aims to advance GWAS science by yielding a more granular phenotype that will enable faster translation of genomic findings to clinical applications.
  • Multidimensional Analyses for Genomic Studies. To further address the issue of GWAS data integration, this project would strive to provide context for genomic data by accessing environmental measures, incorporating population and family structure, and including epigenetic context. Higher-level interactions could be identified through the capture of functional interactions, pathway analyses, and novel combinatorics approaches. Candidate methodological innovations include more flexible analysis methods and study designs, whole-genome sequencing, and computational improvements that speed and expand processing capabilities.

     

Potential impact of Common Fund investment: Moving GWAS beyond its current capability offers faster movement from association to function, which will likely accelerate discovery for multiple traits. Clinical relevance of most GWAS to date is lacking: the proposed projects aim to lead to better clinical decision support, new diagnostics and therapeutics, improved coordination with industry, as well as the realization of the meaningful use criteria of the HITECH Act (the Health Information Technology for Economic and Clinical Health Act, enacted as part of the American Recovery and Reinvestment Act of 2009 and signed into law on February 17, 2009, to promote the adoption and meaningful use of health information technology).

Comments:

I'm excited about the Phe-Ge project.

1. There needs to be a standardized, complex mechanism for phenotyping humans, especially race, given the highly complex genetic variation of Africans, let alone African Americans, and the inter-mixing of many cultures and genomes for most Americans.

2. I also think the study should better represent the world population, not just the US population. Since the US is in the unique position of having economic resources and such genomic diversity in its population, the implications are going to extend past nation-wide; it will likely affect practices, diagnostics, and therapeutics throughout the world.

You should also take data on economic and environmental factors. The experiences of same race people of different economic backgrounds can greatly affect phenotype because of education and access to resources.

3. I would also suggest studies that don't rely only on medical data from hospitals. Is it feasible to do a Census type thing and solicit data, especially from vulnerable populations? Develop a collaboration with primary care physicians and for all the children that must get checkups each year for school, you can ask them and the parents extra questions and take blood tests.

Maybe even take extra blood tests when kids come in for vaccinations.

4. Also look to biomedical engineers and collaboration to help develop rapid diagnostics for biomarkers or genes of interest.

The ideas listed above are all interesting, but very early in the research spectrum. They are also variants on what is already being done, just done sooner. We need to stimulate innovation in the application of genomics and GWAS results, as late-stage innovation then drives more purposeful innovation in the discovery, validation, and credentialing stages. How about a call for population-level applications of genomics, with direct public health impact? We can't stay stuck in the one way of progressing science (it is too slow).

 


“One key roadblock is the inability to capture precisely various and diverse environmental measurements”

Where is the effort laid out to define and capture such environmental factors? More research is needed to bring the 'omics' of exposure (i.e., quantifying the exposome) up to speed with the omics of traditional health research.

 


Human Phe-Ge Project: What is missing in this proposal is the recognition that exposure and response must be characterized during critical windows of development. These windows are shared for some phenotypes and vary for others. This basic biological concept is no longer controversial, but using it to improve human health PRESENTS SIGNIFICANT LOGISTICAL PROBLEMS THAT CAN BE OVERCOME PROBABLY ONLY WITH A MECHANISM LIKE THE COMMON FUND. The evolving concept is that both the “dose/exposure AND the timing of exposure make the poison” (borrowed from Dr. Linda Birnbaum). In some cases, both the exposure and response together, during a particular point in the life span, far removed from the resultant phenotype, constitute the primary risk factor. Prevention may require a timed intervention. Neither the basic science nor the best time for intervention can be illuminated IN OUR LIFETIMES with a large mega study that begins today--whether there are a million or a billion subjects--because subjects must remain under observation from womb to tomb and so must subsequent generations. There is an approach that can yield answers now: choose existing study populations where exposure windows of interest can be characterized in combination with genotyping, epigenomics, metabolomics, and proteomics. There are many opportunities to make this happen by investing in well characterized birth cohorts in the U.S. and abroad, by lifting them from their struggle of one R01, one disease at a time, and recognizing their universal value to this emerging field and across all NIH Institutes. Were you to do this, cohorts followed for multiple generations, across points in the life-span when exposures count, where multiple phenotypes have been or can be characterized, can yield answers now--- without waiting 50 or more years for subjects in a newly launched study to mature.

I agree that using humans complicates these issues, in that the study will take 100 years or more and it is difficult to control diet, stress, exercise, genetic variability, as well as exposure to occupational hazards vs. exposure to modest amounts of environmental chemicals and metals. The MOUSE, especially with all of its tools of knockout and knock-in lines, is the best answer (i.e. they have livers-lungs-spleens-kidneys-brains not so different from humans). A FUNDAMENTAL EXPERIMENT should be set up in which whole-genome (“3rd generation”) sequencing is combined with ALL (known) forms of epigenetics [DNA methylation, microRNAs, chromatin remodeling, histone modifications, AND canalization/decanalization as soon as we understand more about this phenomenon], and also metabolite profiles of urine as well as individual tissues or cell types. A healthy control mouse (e.g. C57BL/6J?) should be compared with one variant (e.g. exposure to cigarette smoke?), while keeping everything else constant. The exposure could be carried out in utero and then the generation of pups compared with a 2nd generation (F2) and a 3rd generation---in which no further cigarette smoke or any other perturbation is given (beyond that in utero exposure). Thus, the experiment would look for immediate effects and also epigenetic effects.

 

This one FUNDAMENTAL EXPERIMENT would take at least 5 years to complete. Maybe 10 years. After which other environmental perturbations (a chemical, high-fat diet, a heavy metal, etc.) can be substituted. Also the environmental exposure can be to the weanling or to the sexually mature adult or to the aged mouse, instead of in utero exposure. There are many types of permutations, once one has analyzed the data from the original simple FUNDAMENTAL EXPERIMENT.


Given the ubiquitous influence of sleep and circadian timing on molecular and physiological processes, research aimed at understanding how the interaction of genetics and environment leads to the development of disease needs to consider the time of day at which the data are collected, and their relationship to the circadian cycle and sleep satiety of the organism.

There are examples of the importance of both circadian and sleep status ranging from cardiovascular parameters (heart rate, blood pressure, vascular resistance, thrombolytic activity) to endocrine (just about every hormone), to metabolic activity, to digestive enzymes, to nearly all, if not all, physiological functions. In addition, there is evidence from cancer chemotherapy that timing of administration can reduce side effects and increase effectiveness. The pervasiveness of metabolic, cardiovascular, and cognitive problems in people with restricted sleep and circadian misalignment are key issues germane to the breadth of NIH and to the missions of its Institutes and Centers.


While the Phe-Ge project ideas are very exciting, they sound diffuse and unfocused. Finding treatments for complex disorders is a major challenge in health today. Complex disorders are multifactorial in origin, requiring scientists from different disciplines to work together collaboratively and develop sensitive models and frameworks that will facilitate translation of therapeutics to the clinical setting. Given the current state of EMRs, using EMR data is a bad idea, as it has poor sensitivity. For example, information on diet, exercise, substance use, adversity and stress, and neighborhood toxic exposure is inconsistent to non-existent in EMRs, but these factors predict poor health outcomes reliably. Using disorder-specific interdisciplinary teams to obtain multifactorial biopsychosocial assessments in humans, both to identify at-risk groups and disease course and to support compound development that targets specific phenotype risk profiles with specific therapies, may be more feasible and give the best bang for the buck.


Molecular Phenotypes for Genome Function and Disease (see “Beyond Genome-Wide Association Studies (GWAS)” in Innovation Brainstorm ideas)

Nominator: NHGRI
Participating IC: NIDA

Major obstacle/challenge to overcome: Understanding how the human genome functions and how it is influenced by genetic variation in health and disease are major challenges of wide interest across NIH. The Innovation Brainstorm meeting suggested this area in “Beyond GWAS”: “Establish a functional genome project that leverages functional information to find causal variants — employing ENCODE, epigenomics, and functional genomics strategies”. Several projects are addressing pieces of these challenges, but none in the comprehensive manner required. GWAS have found thousands of human genomic regions associated with disease, but definitively identifying which genomic variants and elements in these regions are causal, rather than simply correlated, is a major challenge for the field. Mapping GWAS hits to functional elements catalogued by ENCODE and other efforts is providing some insights, but determining the causal links and understanding the mechanistic underpinnings are still very difficult with current resources.

Several critical gaps exist, including limited knowledge of variability between individuals for a range of molecular phenotypes; the correlations in molecular phenotypes across tissues; variability in somatic genomic changes/mosaicism among tissues within individuals; the influence of environmental exposures (e.g., diet, toxins, stress) on molecular phenotypes; and the molecular phenotypes of cell types in vivo. Furthermore, integration of data across these and other projects (ENCODE, Common Fund Epigenomics, Common Fund GTEx, etc.) and with GWAS and other disease studies is lacking.

The field needs experimentally tractable systems to generate integrated and comprehensive data resources to study gene function and how genetic variation leads to differences in function and disease.

Emerging scientific opportunity ripe for Common Fund investment: Recent improvements in high-throughput molecular assays and the availability of rich model organism resources provide an opportunity to interrogate gene function in vivo at an unprecedented level of detail. The cost of this project is much lower than it would have been even a few years ago since many of the technologies for molecular phenotyping, such as RNA-seq, ChIP-seq, and DNase-seq, are based on sequencing, the cost of which continues to decline rapidly.

This project would be synergistic with existing and new projects, some of which are already supported by the Common Fund and by Institutes and Centers (ICs):

  • Tens of thousands of human genome sequences are being produced through the 1000 Genomes Project, the Cancer Genome Atlas, the Type 2 Diabetes Project, autism, Framingham, and other projects.
  • The ENCODE and Common Fund Epigenomics projects are providing deep catalogs of molecular phenotyping data in a number of human and mouse cell lines and tissues; the modENCODE project is generating catalogs of functional elements in fly and worm. These catalogs can serve as references as the assays are applied to many individuals. The proposed project would take the lessons learned from the ENCODE and Epigenomics projects and leverage the data standards, analysis, and visualization tools they have developed.
  • GTEx is collecting expression data on several dozen tissues from post-mortem human donors as well as blood, skin, muscle, fat, artery, and peripheral nerve tissue from living surgery donors. The GTEx data and samples would complement the proposed resource by serving as an efficient test set for the functional predictions.
  • The International Knockout Mouse Consortium has knockout strains for 14,000 genes and the Collaborative Cross is a powerful resource of 1,000 recombinant inbred cross mouse strains that could be leveraged for this proposal.
  • The Drosophila Genetic Reference Panel is a set of 192 Drosophila melanogaster strains that have been sequenced and extensively phenotyped.
  • The Common Fund Single Cell Analysis project will develop methods that could be applied to this project.

Common Fund investment that could accelerate scientific progress in this field: The Common Fund could invest in the generation and analysis of multiple molecular phenotypes in model systems such as mice, rats, and flies. This resource would include measurement of gene expression and multiple additional molecular phenotypes (epigenomic marks, chromatin accessibility, transcription factor binding sites, etc.) in completely sequenced strains of model organisms. Using model organisms would allow access to a full range of tissues in different developmental, environmental, and disease states. The mouse Collaborative Cross (CC) and Knock-out Mouse Project (KOMP) are two resources upon which one could build this project, but they are not the only ones. The data set would show the correlations among the molecular phenotypes across tissues, to allow predictions based on the more accessible tissues.

The product of this project would be a public data resource to support work to interpret how variants, genomic elements, environmental factors, and molecular phenotypes are related, as well as proof-of-principle examples for predictive models of gene function. With this resource one could predict which genes and genomic elements are causal for phenotypes and how the elements interact. Experiments could test these predictions and determine the response to additional genetic or environmental perturbations in vivo. Relevance to humans could then be examined with focused studies, using resources like the Common Fund Genotype-Tissue Expression (GTEx) project. For example, mouse studies might show that correlated pancreatic, liver, and muscle chromatin states are associated with risk for Type 2 diabetes, in particular genetic strains and dietary environments. These states and associations could then be examined in humans with efficient, narrowly focused, molecular studies in the relevant tissues and donors.
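
As a minimal sketch of how such cross-tissue correlations might be used, the Python fragment below fits a per-gene linear model relating expression in an accessible tissue (blood) to expression in a harder-to-sample tissue (liver) across sequenced strains; the file names, tissue pairing, and 0.6 correlation cutoff are hypothetical placeholders, not elements of the proposal.

import numpy as np
import pandas as pd

# Hypothetical inputs: rows = mouse strains, columns = genes,
# values = normalized expression (e.g., log-scaled RNA-seq counts).
blood = pd.read_csv("blood_expression.csv", index_col=0)   # accessible tissue
liver = pd.read_csv("liver_expression.csv", index_col=0)   # less accessible tissue
genes = blood.columns.intersection(liver.columns)
strains = blood.index.intersection(liver.index)

rows = []
for gene in genes:
    x = blood.loc[strains, gene].to_numpy(dtype=float)
    y = liver.loc[strains, gene].to_numpy(dtype=float)
    r = np.corrcoef(x, y)[0, 1]                 # cross-tissue correlation
    slope, intercept = np.polyfit(x, y, 1)      # simple linear predictor
    rows.append((gene, r, slope, intercept))

report = pd.DataFrame(rows, columns=["gene", "blood_liver_r", "slope", "intercept"])

# Genes whose liver expression tracks blood expression could be monitored
# in the accessible tissue; the rest would need direct assays.
predictable = report[report["blood_liver_r"].abs() > 0.6]
print(predictable.sort_values("blood_liver_r", ascending=False).head())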

Many strains, cell types, and developmental stages, in a range of environments (such as various diets, smoking, environmental toxins, sun, and psychosocial stress) could be studied. The molecular phenotypes that would be surveyed in the model organisms include:

  • DNA sequence, to study both germline differences among individuals and somatic mutation/mosaicism in various cell types within individuals,
  • Comprehensive RNA analysis (e.g., mRNA, ncRNA, and short RNA expression),
  • DNase hypersensitive sites as a marker for DNA accessibility,
  • Higher-order chromatin structure,
  • Epigenomic features such as DNA methylation and histone modifications and variants, and
  • Transcription factor and RNA-protein binding sites.

The data would be freely available to the scientific community. The project would also require the development of improved computational analysis methods for integrating the multiple data types, predicting functional elements, and understanding how variation in function arises from sequence differences. Although the main data production effort would be generating the sequence and molecular phenotypes, some pilot projects would focus on using these data to predict which genomic elements are causal for some diseases or traits that are shared by humans and the model organisms.

This proposal is related to, but distinct from, the GTEx, ENCODE, and Common Fund Epigenomics projects. While GTEx directly studies human tissues, it has limitations on the ability to control post-mortem effects, a limited range of developmental stages that can be studied, and an inability to control and manipulate environmental and genetic factors. The animal models proposed here, on the other hand, allow great flexibility to control and manipulate the genomes and environment in many animals, in order to identify mechanistic relationships between the genome and multiple phenotypes. The ENCODE and Common Fund Epigenomics efforts are focused on developing the reagents and standards for characterizing functional elements in the genome and cataloging them in a small set of reference cell lines and tissues. The project proposed here leverages these efforts by applying them to experimental organisms in which to make causal inferences and testable hypotheses of genome function, by looking at a large set of tissues in many individuals, developmental stages, and in several environments. This proposed project is much more extensive and comprehensive than the current reference projects.

Possible extensions to this project: This project could expand to include:

  • the microbiome, which can be assayed easily by sequencing;
  • an expanded range of specific diseases;
  • differentiated human induced pluripotent stem (iPS) cells.

Potential impact of Common Fund investment: This project would produce a valuable resource of data sets and tools for understanding genome function, disease biology, and risk prediction in experimentally manipulable systems. Having these data sets in model organisms would allow researchers to study which genomic elements are mechanistically causal, not just correlated, for how the genome brings about phenotype. Once causal mechanisms in the model organisms are discovered, focused studies in humans could be carried out to test the predictions. Knowing the causal genomic elements and variants would allow researchers to study how they function in health and disease, to make accurate risk predictions, and to develop therapies based on this mechanistic understanding.

Comments:
I'm not sure where this would fit in the criteria, but I think an ethics model should also be a core component of this type of research.

When we're basing therapeutics off of a disease that you COULD develop and predicting phenotype based on genetics, I think we need to consider the implications of such.

Also predict ways people may be discriminated against based on this (e.g., insurance, cost, access, spiritual beliefs). Also, ways it may need to be regulated, before Congress gets involved.

I think this area has enormous potential, and it would be nice to have standards and protocols of how to act before the researchers and entrepreneurs are confronted with the ethical decisions.

 


Translating Findings on Human Disease Risk Variants into New Interventions: Coordinated Studies for Therapeutic Target Identification (see “Beyond Genome-Wide Association Studies (GWAS)” in Innovation Brainstorm ideas)

    Nominator: NIA

    Major obstacle/challenge to overcome: The potential to develop new therapies based on the expanding number of findings on human genetic variants’ relationships to disease risk is a well-recognized avenue for clinical progress stemming from advances in genomics (e.g., Green ED et al. [2011], Charting a course for genomic medicine from base pairs to bedside; Nature 470:204). A key early rate-limiting step in this pathway is identification of promising therapeutic targets based on epidemiologic findings on risk variants.
    The Challenge: In a few cases (e.g., Crohn’s Disease), substantial progress has occurred from identification of risk variants to identification of therapeutic targets, providing proof-of-principle for this approach. However, to date, the number of new targets identified by this approach is limited. A major challenge to expanding and accelerating such efforts is the fact that, after an association between a variant and disease risk is established, a critical mass of additional information is needed to determine whether there is a sufficiently promising therapeutic target to justify proceeding with subsequent, generally costly, steps in therapeutics development, e.g., screening small molecules, identifying lead compounds, and pre-clinical studies. The studies needed to identify and evaluate potential targets span a wide range of research areas, including:

     

    • Deep sequencing in the region of the identified risk variant, to examine more closely which sequence variants are related to disease risk and identify additional variants with similar or opposing effects.
    • Intensive clinical and physiologic phenotyping of individuals with and without the variant to ascertain potential pathways to target.
    • Determination of the effects of the sequence variant (either in a gene or in an intergenic region) on gene products, levels of expression, and/or differential responses to factors regulating expression.
    • Identification of tissues in which the gene is expressed and differences among tissues in the degree to which the variant affects gene expression and tissue function.
    • Development and use of in vitro assays or tests in animal models to study variants’ functional effects and effects on responses to drugs.
    • Assessing how the variant may differ in susceptibility to epigenetic modifications and the effects of these differences.

    The breadth of types of studies required to obtain the needed critical mass of information, and the need to integrate information from them, pose a substantial challenge. The range of expertise required includes genetics, cell biology, physiology, epidemiology, and clinical expertise in specific diseases. Although it is likely that there will be many individual studies that explore one or a few such effects of various genetic variants, it is presently very uncertain that individual investigator-initiated NIH grant applications alone will frequently provide and integrate the critical mass and range of data regarding a specific genetic variant to justify a subsequent drug development effort. Assembling coalitions of investigators spanning the above disciplines and providing the needed infrastructure for data sharing and collaborative analyses is very challenging. Without an NIH initiative, these challenges, coupled with high uncertainty of funding, are likely to deter even experienced investigators from the considerable effort needed to develop applications for such projects.

    This challenge also affects steps in therapeutics development downstream from target identification. There has been increased NIH support for structured programs to provide the infrastructure and coordination needed for small molecule screening and other steps focused on targets that have previously been identified. However, the steps from finding genetic risk variants through target identification have not been supported nearly as much by structured NIH programs, but rather have been left to investigator-initiated projects that generally address only isolated steps in the process. While investigator-initiated research has reflected enormous creativity and will continue to make important contributions, the efficiency of identifying targets for intervention might well be enhanced by support for a more integrated path of discovery. This proposal therefore calls for the testing of a complementary paradigm that supports continuity of research from genetic variant through target identification.
    How to overcome this challenge: This challenge could be addressed by an initiative to support multidisciplinary projects, each of which would obtain comprehensive information spanning the types of studies noted above for one or more variants associated with altered disease risk, and analyze this information to identify potential therapeutic targets and evaluate their potential for further development. This initiative would markedly enhance therapeutics development capabilities by making these unique contributions in a crucial and currently unfilled niche.

    Such studies could be supported through one or more Common Fund RFAs, with individual awards supported by the most appropriate IC, or multiple ICs if appropriate. Peer review considerations regarding selection of variants for these studies could include the strength of their relationship to health risk, the public health importance of the condition(s) they affect, and the potential for finding new therapeutic targets, based on current knowledge about functions of the gene in which the variant is found. If more active NIH planning is desired regarding the range of conditions and/or selection of genes on which such projects could be focused, one or more pre-RFA advisory workshops could be convened by a trans-NIH committee to identify particularly important foci. Such a workshop could also recommend criteria by which to evaluate the results of individual projects in regard to decisions about proceeding with subsequent therapeutics development steps after target identification.

    Coordination among projects could be facilitated by annual meetings and a coordinating center. An independent panel could review progress of the projects with regard to established benchmarks and advise on the rationale for their continuation. Based on these reviews, the efficiency of the set of projects could be enhanced by withdrawing resources from studies showing less promise for finding good targets and increasing resources for those with greater promise.
    How the proposed initiative would address this challenge and fill a gap in current efforts: The focused coordinated target identification activities described above would help to increase the rate of discovery of promising therapeutic targets above the current less-than-ideal level, by providing the incentives and organization for their efficient identification and evaluation and a mechanism for focusing on the most promising ones. The initiative would also address the challenge of fulfilling the therapeutic potential of findings on genetic risk variants by promoting substantial progress on the crucial early therapeutic development stage of target identification.

    Further, this structured approach would enhance the potential of current structured programs focused on steps after target identification by increasing the number of targets for consideration at the beginning of their therapeutic developmental “pipelines.” By increasing the number of promising targets, it could also provide synergy with the proposed National Center for Advancing Translational Sciences (NCATS), by enhancing opportunities for NCATS activities focused on subsequent stages of therapeutics development.

    Emerging scientific opportunity ripe for Common Fund investment: The increasing number of genetic risk variants identified by epidemiologic studies provides a well-recognized opportunity for the proposed activities to contribute to therapeutics development. Further, the many large population studies with extensive phenotype data, whose participants have already been genotyped, provide a cost-efficient platform for more detailed studies of specific genetic variants’ relationships to phenotypic differences. The expertise to conduct the proposed types of studies for target identification is available and improving rapidly. The potential contribution of such genetic findings and research expertise to target identification could be greatly enhanced by the proposed coordinated efforts to obtain a critical mass of information about selected variants and their effects.

    Common Fund investment that could accelerate scientific progress in this field: Therapeutic target identification could be accelerated by a Common Fund investment in the types of projects described above. Ideally, these might be supported by a seven-year investment (one year for detailed planning and protocol refinement, five years of data collection, and one year for analyses). An interim evaluation of ongoing projects would be used to inform decisions of subsequent resource allocation, selecting those projects that would continue and those that would be revised or discontinued. Based on costs of the types of studies that would be included in the projects, it is estimated that studies on the effects of 20 variants could be supported by an investment of approximately $30M (direct costs) over seven years. The average annual direct cost would be approximately $4.3M, though first- and last-year costs would likely be lower, with higher costs in the intervening years. It is possible that the studies could be organized as a private-public partnership, which could expand resources and the number of targets identified.
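
    As a quick check of the stated figures (the year-by-year split below is purely illustrative, not part of the proposal):

    \[
    \frac{\$30\,\text{M}}{7\ \text{years}} \approx \$4.3\,\text{M/year}, \qquad \text{e.g., } \$2\,\text{M} + 5 \times \$5.2\,\text{M} + \$2\,\text{M} = \$30\,\text{M}.
    \]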

    Potential impact of Common Fund investment: Identification of several new, well-validated, therapeutic targets by this program would have transformative, durable impacts that would persist after the Common Fund support ended. The program would have impacts in at least two domains:

     

    • Most obviously, it could advance subsequent therapeutic development and testing (either by private sector or NIH support) on the targets identified by the Common Fund program after this program was completed. (It is also worth noting that this approach could identify targets that might not have been pursued by the private sector for financial or other reasons, but which nonetheless merit further investigation because of their public health importance.)
    • More generally, the project would allow acquisition of experience with this coordinated approach to target identification and validation, and the formation of research teams with expertise in these new approaches. Such teams, and others modeled on them, could provide important contributions to the translational missions of various individual ICs, which could be supported by these ICs long after the Common Fund project ended.

       

    Comments:

    I think this sounds good. Especially after hearing a talk about how, combined with knowledge of genes and certain metabolic pathways, a doctor was able to predict which depression meds would work for a patient, given there are many brands and doctors usually seem to use luck and trial and error to prescribe certain meds. Very annoying to the patient, more so than the doc.

    I think proteomics should also be included, as an indicator of metabolic pathways. Likewise, as more biomarkers of disease states are discovered, they should be included in the assessment, provided tests are available.

     


     

    It will be important to take into consideration the unique features of important disease groups. Variations in genes associated with mitochondrial energy production are important risk factors that can direct therapeutic interventions. Because of the importance of mitochondrial function to metabolic diseases and more common degenerative disorders, the risk variants in both nuclear and mitochondrial DNA genes associated with energy production should be considered.

     


     

    Bringing Difficult Structures into Reach

    Nominator: Innovation Brainstorm participants

     

    Major obstacle/challenge to overcome: Despite ongoing Common Fund (CF) efforts to develop new technologies and a better understanding of the structural biology of membrane proteins, many other proteins of intense biological interest (e.g., large proteins, multi-subunit proteins, glycosylated proteins, complexes of proteins, conformationally mobile proteins and transient interactions of proteins) remain intractable to structural biological investigation.

    Emerging scientific opportunity ripe for Common Fund investment: Various opportunities and techniques appear ripe for investment as they are “stuck” at the inflection point of scientific progress over time. These include: i) small-angle X-ray scattering in solution (which requires experimental validation); ii) single-particle X-ray analysis (which needs engineering and refinement); iii) tomography (which needs improvement in resolution); and iv) powder and fiber diffraction (which need software and education).

    Common Fund investment that could accelerate scientific progress in this field: Potential CF investments include workshops to define and understand the current limits of emerging technologies, prioritize those for development, and improve access to these techniques as well as education on how to use them. Input from these workshops could inform the development of Requests for Applications (RFAs) for methods- and software development and testing.

    Potential impact of Common Fund investment: Advances in protein structure determination will provide greater availability of biologically relevant protein structures and complexes across diseases and NIH Institutes and Centers (ICs). Expanding the protein structure universe will also yield new templates for drug design, as well as three-dimensional maps for understanding protein function and mapping genomic variation.

     

    Comments:
    Under this heading the Common Fund should consider targeted support of the development of X-ray Free Electron Lasers (XFELs) for structural biology. The first applications of an XFEL to structural biology were heralded in two 2011 Nature papers (1,2), each with dozens of co-authors. One demonstrated diffraction and structure determination of a challenging membrane protein nanocrystal consisting of just a few dozen unit cells. The other presented results of snapshot imaging of a single virus particle. These studies demonstrate the potential of XFELs for biology and chemistry, providing a game-changing new experimental technique with unprecedented temporal and spatial resolution.

    The paradigm change introduced by the XFEL depends on two factors: (1) Nanocrystals form far more readily than their macroscopic counterparts but are not usable, even at the most brilliant synchrotron sources. (2) The XFEL produces photon pulses as short as 10 femtoseconds, yet containing 10^12 photons, which can produce observable diffraction from such nanocrystals. The pulses are so short that diffraction takes place before radiation damage occurs.

    The issues of technology and specimen development for the XFEL are huge, and worthy of collective support. These first two projects were brute-forced through, and almost everything that is needed to make the methodology routine and accessible remains to be done.

    1. Chapman HN, et al. Femtosecond X-ray protein nanocrystallography. Nature 2011;470(7332):73-81.

    2. Seibert MM, et al. Single mimivirus particles intercepted and imaged with an X-ray laser. Nature 2011;470(7332):78-86.

     


     

    It seems I am building a lot of molecular engineering buildings these days. These facilities are designed to explore the intersection of the molecular, biological, and physical sciences. The people who work in these labs focus on a number of relevant topics ranging from tissue scaffolding, synthetic biology, protein structures, nanofabrication, materials science, condensed matter physics, and optics; the list goes on. Some are looking for the next great tool, others are engaged directly in therapeutic pursuits. But the thing that strikes me is the need for more basic science at what I like to think of as the “transdisciplinary” level. To make new discoveries and insights, we need to train scientists who can think simultaneously in multiple disciplines. This is where quantum physics meets molecular biology. To truly exploit this field, I think we need a new network of centers, similar to the National Nanotechnology Initiative, where grants were created to establish major hubs or centers in a new scientific field. I think the time has come for such a network in Molecular Engineering. I know of a half dozen projects being planned domestically and two or three around the world. I think that if a larger-scale enterprise were formed along these lines, progress would be much swifter in the fields of regenerative medicine, biomedical engineering, biomedical devices, replacement organs, and the neurosciences.

     

    I agree with this comment. There is a need to bring together several different disciplines to better define and characterize protein structures under a broad range of physical conditions, that is, how the protein structure changes from its relaxed state (removed from its surroundings) to states in which it is deformed mechanically, chemically, etc. This also involves interactions such as protein-protein and protein-cell. Computational biology addresses some of these issues, but validation of models needs to be performed via experiments, and knowing the crystal structure of the molecule of interest is vital. An interdisciplinary approach should include all fields, e.g., physics, engineering, biology, chemistry, and medicine. New paradigms for structural biology will be needed that fully capture the complex interactions between cells, proteins, enzymes, cytokines, and drugs, to name but a few. This will be a new area of research for the next 10-20 years that needs to be addressed. It can be defined as molecular engineering or even biomolecular bioengineering, and the computational and experimental sides need to be combined interactively. Studying cells and molecular structures in isolation from their surrounding extracellular matrix (ECM) has run its course; they need to be studied in a more physiological context. Molecular dynamics and steered molecular dynamics have the ability to do this computationally.

     


     

    Cross-Cutting Issues in Computation and Informatics

    Nominator: Innovation Brainstorm participants

    Major obstacle/challenge to overcome: One common thread of nearly all the topics discussed at the Innovation Brainstorm meeting is data overload. In particular, there is an urgent need for integration of data sets and approaches, as well as of lines of inquiry that address multiple states of health and disease. Improved data sharing, as well as access to secondary data sets, is paramount to progress.

    Emerging scientific opportunity ripe for Common Fund investment: More interdisciplinary opportunities are required to tackle the data challenges in biomedicine, and as such, all efforts to ease these interactions would be well-spent.

    One example of an underused opportunity is cloud computing, which provides shared computational resources on demand via a computer network. This approach could be broadened within the biomedical realm, although some fields (e.g. protein folding) have already implemented it. Since data users submit tasks without possessing the software or hardware locally, the approach promotes cost and labor efficiencies.
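    To make the task-submission model concrete, here is a minimal sketch (not part of the proposal) of submitting an analysis job to a commercial cloud batch service using the AWS Batch API via boto3; the queue name, job definition, and input path are hypothetical placeholders for whatever a shared biomedical resource might actually provide.

        # Minimal sketch: submit an analysis task to a commercial cloud batch
        # service (AWS Batch via boto3) without owning the hardware or having
        # the software installed locally. Queue, job definition, and input
        # path are hypothetical placeholders.
        import boto3

        batch = boto3.client("batch", region_name="us-east-1")

        response = batch.submit_job(
            jobName="rnaseq-alignment-sample42",    # label for tracking
            jobQueue="genomics-analysis-queue",     # hypothetical shared queue
            jobDefinition="star-aligner:3",         # hypothetical container job definition
            containerOverrides={
                "command": ["align.sh", "s3://example-bucket/sample42.fastq.gz"],
            },
        )

        print("Submitted job:", response["jobId"])

    The same pattern applies to other providers; the point is that the investigator needs only credentials and a job description, not local hardware or locally installed software.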

    Developing new tools and opportunities for multi-disciplinary interactions will help integrate genomic and phenotypic data sets as well as advance the study and understanding of the broadly based “environment.”

    Common Fund investment that could accelerate scientific progress in this field: Currently, NIH-supported resources in this area are helpful but not sufficiently broad and/or powerful enough to address the growing need to integrate multiple data sets. The NIH could “democratize” this area of research by:

     

    • Creating innovation centers
    • Lowering the entry barrier for quantitative scientists
    • Presenting “prediction challenge” data sets for teams to solve
    • Developing and/or hosting a software commons for existing computational biologists
    • Increasing the usability of web-based computational tools for biologists
    • Creating an Office of Cyberinfrastructure within the Common Fund for trans-NIH oversight

    Potential impact of Common Fund investment: Developing and sharing broad-based computational tools and making them freely available to the scientific community has the potential to vastly increase the interoperability of data sets currently being generated in ‘omics studies. Doing so is necessary for full integration of knowledge that can apply across NIH Institutes and Centers (ICs) and disciplines.

    Comments:

    I find the list of topic areas pretty uninspired and uninspiring. While I recognize the tsunami of data that is washing over all of us, I don't think any of these address the real research question: how do we make sense of the data?

    The “innovation center” approach hasn't worked well in the past. What it does is concentrate a large amount of funding in a small number of groups, but the impact of those groups has been rather small. Why, you ask? First and foremost, innovation isn't concentrated in one or two places. Second, standards cannot be applied top-down--the best arise because people find them to be useful. And the center concept was built around the idea that a few innovative groups would produce tools that everyone would use. The sad fact is that it never really happened. By and large, the software and tools we develop and use come from many places--and those tools and places evolve over time as new methods replace older ones. Now if you want to give me an innovation center grant, well, that's a different story. But honestly, I think innovation centers have limited innovation by sucking up lots of money in a small number of places.

    What about prediction challenges? They are great, but are you going to pay groups to do the analysis? Most of us are not sitting around waiting for a challenge dataset to work on. And what is the real reward? There are a lot of challenges, and they haven't produced huge innovations.

    A software commons? Sourceforge seems to work pretty well, and it doesn't cost the NIH a dime.

    Increasing the usefulness of web-based tools is a great idea. But it isn't something most academics do well. If you want, provide webinars on UI design and user-centered design, but don't think that paying scientists to make prettier web pages is going to transform the world--we'll get pretty sites that are still not well-suited to users in most instances.

    And what is an Office of Cyberinfrastructure going to do? No, save the money and invest in research.

    And “The Cloud” as a solution. Let's not even go there. The problem isn't storing data. It's not running code. It's about knowing what to do with the data.

    I run the Center for Cancer Computational Biology (http://cccb.dfci.harvard.edu), a data analysis service center at Dana-Farber, and we've tried to script and standardize analyses. We've been able to do that for normalizing array data and for doing some QC on sequence data. Beyond that, each project and each analysis is different enough from the last one that we can't just run things through the pipeline. Each analysis, sad to say, is a custom creation at some point in the process. We need good methods and smart people, more than the newest buzzword technology, to be the focus of our investment.

    I also think that the bullet points presented above lack an innovative approach to motivating people to meet the stated goals. Maybe it is time to rethink how NIH invests its dollars and how we reward people.

    So what's a better plan? Well, here are my humble thoughts--feel free to rip them apart.

    1. Require that all grants generating genomic data have a data sharing and analysis plan, with some fixed portion of the budget devoted to it. Require that they have credible people on the project to do the analysis. And then require that when the budget is scaled back, the information management and analysis portions are not the first things cut. Make reporting on the analysis part of the annual reporting and really scrutinize it.

    2. Create Requests for Applications (RFAs) specifically for new methods development focused on integrative data analysis. Make the RFAs broad, but require that they take one or more types of data and do something interesting. Gene expression and epigenetic data. Literature mining and single nucleotide polymorphism (SNP) profiling for network inference. Clinical data and genomic data. Quantitative cycle number (CT) and gene expression. Be creative. The challenge we all see is that any single data type alone can only get us so far and what we really need are methods that can help constrain, guide, and validate our analysis. We know transcription factors influence expression. We have ChIP-seq and RNA-seq data (or the array-equivalents). But we have nothing that lets us predict anything from this in a generalizable way. Set aside 20% of the Common Fund to support innovative computational approaches.

    3. Reward those who really are doing this work. Since we want software tools that are both used and useful, give a percentile “bump” to proposals that are multi-institutional, multi-disciplinary, and boundary-spanning--but don't make it a check-off item; make it a review criterion that is evaluated by a study section and scored.

    4. Reward those who produce useful tools. If someone has a tool that is widely used or cited, give them a five or ten percentile bump on their next grant application and let the study section with the program officer review and score grants based on previous impact. Want to encourage useful software that is freely available? Make it easier for people to get their next grant if they produce software that is both useful and used. That will solve the interface design problem and the software availability problem better than anything else.

    5. Create funds for computational biology training grants and hold them to high standards. This isn't going to solve the problem tomorrow, but it will start to address the problem that is coming.

    6. Fund training workshops with serious cash, but extremely high standards. If you want to fund a center, fund one that will do broad-based training and give it a lot of money to get good instructors. The Canadians do it well. Look at the Canadian Bioinformatics Workshops (http://www.bioinformatics.ca/). Develop a curriculum that will be taught in cities across the country, for which the materials will be available, and where all the software has to be freely available to others. Mandate that the presentations are posted as videos on the web. Create resources to educate the next generation.

     


     

    The topic notes that a major obstacle to overcome is data overload. It appears that the proposed approach is focusing on the data overload problem among basic researchers. While this is obviously a critical problem, please also consider the data overload faced in the clinical arena, where medical knowledge is expanding at an exponential rate, results of definitive randomized clinical trials (RCTs) can still take over a decade to reach routine clinical practice, and U.S. adults receive only about half of recommended care. I would posit that even greater than the challenge of more efficiently generating knowledge about what should be done in clinical practice is the challenge of more effectively making sure that the vast amount of knowledge we already have on medical best practices is implemented in routine clinical care. In particular, it may be worth investing in the development and maintenance of open-source, standards-based knowledge management and clinical decision support infrastructure that can curate evidence-based medical best practices and be interfaced with a variety of electronic health record systems to provide appropriate, patient-specific care assessments and recommendations within clinicians' workflows. Such a public resource would go a long way toward beginning to address the data overload faced by practicing clinicians on a daily basis.

     


     

    Some of the clear bottlenecks we are currently facing in microbiome analysis and molecular evolution are (i) lack of availability of datasets (the Sequence Read Archive, SRA, has not been able to meet the community's needs for microbiome analysis, and community-led alternatives such as MG-RAST are emerging slowly, in part due to lack of resources); (ii) difficulty of transferring large datasets among sites; and (iii) lack of access by researchers to high-performance computing (HPC) resources. Cloud computing, and especially frameworks for hooking new analysis tools up to the cloud directly, could have a huge impact here, and could also help with the problem that many new tools are not available to end users in convenient interfaces and/or are not scalable. I don't necessarily think that NIH should set up its own cloud -- the infrastructure is very hard to set up even for supercomputing centers and large companies -- but exploiting commercial cloud resources (and especially setting up standard application programming interfaces, APIs, so that even students can easily plug in their novel analyses and apply them to vast datasets), together with appropriate training initiatives, could have a disproportionate impact. Additionally, support for open-source widget toolkits that produce scientific data displays that can easily be linked in the cloud and displayed in the browser could be really influential. The focus should be on open-source frameworks that reduce time-to-result both for investigators and for software developers, and that encourage ongoing iterative interactions between generators and analyzers of data.
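    As a purely hypothetical sketch of the kind of standard API envisioned above, the snippet below shows how a student might submit a containerized analysis against a hosted dataset without downloading it; the service endpoint, dataset accession, and container image are all invented for illustration.

        # Hypothetical sketch of a "standard API" for cloud-hosted analyses:
        # submit a containerized analysis against a hosted dataset by accession,
        # then poll for the result. Endpoint, accession, and image are invented.
        import requests

        API = "https://analysis.example.org/v1"  # invented service endpoint

        job = requests.post(
            f"{API}/jobs",
            json={
                "dataset": "SRP000000",  # placeholder accession
                "image": "ghcr.io/student-lab/diversity-metric:0.1",  # invented analysis container
                "params": {"rarefaction_depth": 10000},
            },
            timeout=30,
        )
        job.raise_for_status()
        job_id = job.json()["id"]

        # Poll for completion; a production service might use callbacks instead.
        status = requests.get(f"{API}/jobs/{job_id}", timeout=30).json()
        print(status["state"], status.get("result_url"))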

     


     

    The focus here seems to be on “there's a lot of data, and we need to analyze it/share it.” But the issue is bigger than that: we need to understand it. Computational modeling, and not just cloud computing, is where it's valuable to engage quantitative and computational scientists. The description above hints at this need for computational modeling when it talks about “prediction challenge” data sets, but it is not obvious from the “Major Obstacle.”

    Enabling participation by quantitative scientists involved in cross-cutting data analysis and computational modeling is an important role for Common Fund investment. The standard R01 funding mechanism strongly encourages the gathering of new data, rather than more unifying analysis and modeling. Of course, perhaps this just means that the R01 mechanism needs to be fixed to be more friendly to proposals involving unifying data analysis and modeling.

     


     

    As indicated by previous comments, the emphasis should be on developing transformative applications of existing and new computational models to the analysis of biological data.

    Creating software commons, improving usability, and deploying computing clusters do not really rise to the significance of a Common Fund program. There are plenty of all three types of entities available. The example of cloud computing, and its impact on our understanding of protein folding, has in fact little to do with the use of a distributed cluster (Folding@Home from Vijay Pande's group at Stanford and Rosetta@Home from David Baker's group at the University of Washington), and everything to do with the development of new and improved computational models and algorithms for protein folding. Indeed, for researchers with compelling computational problems, there is a plethora of computing resources available, including www.xsede.org, which used to be known as TeraGrid and, as that name suggests, has teraflops of computing available.

    The current barriers lie not with the objectives as currently stated, but with developing models and applications to analyze and integrate high-dimensional biological data.

     


     

    I would specifically like to comment on the following item:

    1. Developing and/or hosting a software commons for existing computational biologists

    I believe the NIH has done this in some measure through the www.nitrc.org website. However, what I find completely lacking in most NIH grants is the ability to reproduce and verify that the goals of the grant were indeed met (broadly speaking). In my experience, I have not seen a way in which those who get the grant have shown beyond doubt that what they proposed has indeed worked or, if it did not, why it failed. Measuring the success of grants by counting papers is a very poor way of judging the performance of the principal investigator (PI).

    As such, I would suggest a website where people post data and algorithms that can be verified by the reviewers the next time they see a grant from the same PI.

     


     

    For me, the bigger issue for data sharing is the NIH policy limiting access to data sets. While it is possible to identify some research subjects through bioinformatics, it is unclear what the error rate is, or what the effects of epigenetics and environment on phenotype are. Is this a practical risk, given that it is mitigated by the subsequently passed Genetic Information Nondiscrimination Act? Is the policy doing more harm than good, in terms of the research not done? Since NIH employees cannot access the data without permission, they also cannot analyze databases to check published results. Perhaps this has contributed to the increasing number of retractions in the literature and to contradictory findings. Let's not spend more money on cloud storage without a policy change.

     


     

    I strongly recommend including a focus on geospatial data, methods, and tools within this critical area for Common Fund investment. An initiative in this rapidly-growing field has the potential to benefit researchers in the social, behavioral, and biomedical sciences and across the diverse institutes and centers of NIH.

    Three key trends in spatial and spatiotemporal data analysis and modeling and geographic information science (GIScience) are relevant to cross-cutting issues in computation and informatics at NIH: 1) the explosion of real-time, spatiotemporal data from global positioning system (GPS)-enabled devices, distributed environmental sensor systems, satellite remote sensing, and (potentially) from geographically tagged electronic medical records; 2) development of new tools and methods for analyzing spatiotemporal data, including methods of geo-visualization, dynamic spatiotemporal modeling, and modeling of human mobility at scales ranging from the everyday to the life course; and 3) advances in computing technologies, service-oriented architectures, and cyberinfrastructure that are fueling the growth of distributed and collaborative services known as the geospatial web. These technological advances are enabling the scientific community to capture, analyze, and visualize large volumes of high-resolution, real-time data and provide opportunities for innovation in health and biomedical research.

    With the help of Common Fund support, the NIH could, for example, develop a data-sharing platform for research funded by the agency within a 5-10 year period. This concept would build on existing data-sharing requirements, and NIH has the leverage to prospectively develop an open-access data repository, perhaps starting in a few key areas and then expanding to cover all NIH-funded research. With respect to geospatial data in particular, the platform would involve geo-referencing where the data were collected, providing a geospatial search mechanism for studies and databases, and providing access to the data in the repository. This would include geospatial data either as collected by the researchers or as geocoded through the initiative.
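    As a purely illustrative sketch of the kind of geospatial search mechanism described above, the snippet below filters geo-referenced study records by a query bounding box using the shapely library; the study records and coordinates are invented for the example.

        # Illustrative geospatial search: return study records whose collection
        # sites fall inside a query bounding box. Records and coordinates are
        # invented; a real platform would query an indexed spatial database.
        from shapely.geometry import Point, box

        studies = [
            {"id": "study-001", "title": "Urban asthma cohort", "lon": -87.63, "lat": 41.88},
            {"id": "study-002", "title": "Rural diet survey", "lon": -93.10, "lat": 44.95},
            {"id": "study-003", "title": "Coastal exposure panel", "lon": -70.94, "lat": 42.36},
        ]

        def search_by_bbox(records, min_lon, min_lat, max_lon, max_lat):
            """Return records whose collection site lies inside the bounding box."""
            query = box(min_lon, min_lat, max_lon, max_lat)
            return [r for r in records if query.contains(Point(r["lon"], r["lat"]))]

        # Example query: an approximate bounding box around Chicago.
        for hit in search_by_bbox(studies, -88.5, 41.5, -87.0, 42.5):
            print(hit["id"], hit["title"])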

    Critically, the platform would have to be designed to share geospatial data from studies in a way that would preserve the confidentiality and security of participants. Part of the initiative would entail research and related efforts to understand and implement approaches that enable access to geospatial data without compromising privacy. Training and information sessions would be needed to explain how the NIH data sharing platform would work, how to contribute to it, data and metadata standards, how to access data, and how to access and use geospatial tools for analysis and visualization.

    The opportunity to seize on public-health and biomedical related advances in GIScience is too promising to let pass. I strongly recommend Common Fund investment in this interdisciplinary, cutting-edge field.

     


     

    On lowering the entry barrier for quantitative scientists:
    Open innovation approaches such as competitions, fairs and hack-a-thons could be very efficient in engaging computer scientists and software developers in biomedical problem solving.

    Repurposing cutting-edge technology developed by other mission-driven agencies such as NASA, the CIA, IARPA, and DoD can provide another interesting avenue for bringing in quantitative scientists. In particular, some of these agencies have developed techniques for working with high-dimensional data.

    On data overload:
    I agree with all the comments here that say that a lot of data is not what we need to focus on – we need to understand it. We also need to understand how the data we have fit together, what is useful among these data, how we can/should use it, and what is missing.

    Developing personalized information delivery and UI designs that employ artificial intelligence and other machine learning approaches might help us understand both the data and its consumers (as well as engage the latter).

    Regarding a software commons, I would like to revive the concept of knowledge environments for biomedical research (KEBR). NIH’s National Library of Medicine (NLM) led this initiative a few years ago. Data and software are parts of a bigger picture. We are already facing the problems of too many tools and too many ontologies, in addition to too much data. Building an environment that enables us to choose the proper tool for the task and the datasets we have might be useful.

    On Innovation centers:
    Innovation is constantly evolving and therefore requires a very dynamic structure to support it. It would make sense to build an environment where innovation is recognized, supported, and included in the research process regardless of where it comes from.

    Developing a system of work and academic credits in which every contribution counts (similar to the film industry), including credits for ideas, data sets, software, etc., may facilitate the advancement of innovation and knowledge environments.

     


     

    Meeting the Challenge of Big Data in Biomedical and Translational Science (see “Cross-Cutting Issues in Computation and Informatics” in Innovation Brainstorm ideas)

    Nominator: NLM
    Participating IC: NIBIB, NIGMS

    Major obstacle/challenge to overcome: The complexity of human biology in health and illness is increasingly being taken into account by research design, with individual studies collecting genomic, image, biosensor, and clinical data, along with information about sociocultural and environmental factors. And these large amounts of diverse data are almost always collected in digital form. Thus, modern biomedicine is confronted at once by great opportunity and great challenge. The opportunity presented by collecting multiple measures is to understand disease, and gain insight into its prevention, treatment, and cure, from a broad, encompassing perspective more likely to bear fruit than studies limited to a small number of measures. The opportunity presented by collecting digital data is the ability to share, compare, reaggregate, reuse, and integrate data, as well as to use these data for models and simulations in ways that have heretofore been impossible. The challenge, however, is to be able to organize, present, analyze, and manage these data to fully realize such opportunities. The challenge is one of “big data,” where handling and working with complex data at large scale is both quantitatively and qualitatively different than at a smaller scale.

    Emerging scientific opportunity ripe for Common Fund investment: As the translation of biomedical research results into improved human health accelerates, and as the diversity of clinically relevant measures grows to include those of basic biology, new approaches to big data, drawing from information science, informatics, computer science, and computational biology, must be developed and used to maximize the return on the research investment. Advancing the science of big data and developing associated tools requires test-beds to stimulate and shape conceptual progress and its reduction to practice. While this has happened in other fields, such as astronomy (which has benefited greatly as a consequence), a concerted effort to move these ideas and tools forward has not yet been made in translational biomedicine. The number, size, and scope of biomedical and translational research projects collecting large amounts of different types of data is now sufficient to offer numerous test-beds that would be demanding enough to move forward the science of big data and the development of a big data research environment. The time is right for seizing the opportunity these test-beds offer to drive the development of big data approaches in the context and service of translational science.

    Common Fund investment that could accelerate scientific progress in this field: This initiative would support the research and development of a big data research environment for each of several sites hosting translational science projects, collectively spanning analyses that might include genomic, image, sensor, clinical, sociocultural, environmental, and electronic medical record data. Some examples of elements likely to be developed under a given award include, but are not limited to:

     

    • Scientific foundation – e.g., meaningful analysis of multiple diverse data sets, scalable algorithms.
    • Informational foundation – e.g., identification/development of vocabularies, ontologies, metadata.
    • Technology and technical infrastructure – e.g., to move personal biosensor data to the environment.
    • Approaches for management of the data – e.g., semi-automated annotation, data compression and decompression, crawlers to associate particular data points (a minimal sketch of vocabulary-based annotation follows this list).
    • Approaches for the use of the data – e.g., a synthesis platform that could be used to conduct “preliminary clinical trials” in silico with adaptive trial methods, and methods to evaluate the contribution of multidimensional measures to particular clinical and health outcomes.
    • Approaches, technical and cultural, to share and compare data across research groups.
    • Training in the science, development, and use of big data and its technology.
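    As a toy illustration of the semi-automated annotation element noted in the list above, the sketch below maps free-text variable labels from a study to a small controlled vocabulary using fuzzy string matching, flagging unmatched labels for human review; the vocabulary and labels are invented for the example.

        # Toy sketch of semi-automated annotation: map free-text variable labels
        # to a controlled vocabulary term, and flag anything that cannot be
        # matched confidently for human review. Vocabulary and labels are invented.
        from difflib import get_close_matches

        CONTROLLED_VOCAB = [
            "systolic blood pressure",
            "body mass index",
            "fasting glucose",
            "smoking status",
        ]

        def annotate(label, cutoff=0.6):
            """Return (vocabulary term, needs_review) for a free-text label."""
            match = get_close_matches(label.lower(), CONTROLLED_VOCAB, n=1, cutoff=cutoff)
            return (match[0], False) if match else (None, True)

        for raw in ["Systolic blood presure", "body mass index (kg/m2)", "Fasting glucose mg/dl", "years of education"]:
            term, needs_review = annotate(raw)
            print(f"{raw!r:30} -> {term!r:28} review={needs_review}")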

    Awards would be made to support such integrated efforts to advance the science of big data and build a big data research environment associated with sites at which large clinical research projects and clinical trials are typically ongoing. While this initiative could be implemented in any of a number of ways, one possible implementation would be the use of cooperative agreements. The methods, results, progress, setbacks, and lessons learned would be shared among all of these cooperative agreements in an ongoing way so as to allow for an adaptive project process.

    Potential impact of Common Fund investment: Informatics approaches currently used have largely been developed in the context of more limited data types and amounts than large translational science projects are now producing. A big data research environment built around, and assuming, such large, multidimensional studies producing gargantuan amounts of complex data would represent not only a quantitatively different understanding, but a qualitatively different understanding of the basic biology of health and disease. This new understanding, based on an integrative perspective from ‘omics to the environment, would, in turn, provide new insights to improve human health, as well as clinical and public health decision-making.

    Comments:
    ASCO (American Society of Clinical Oncology) is rolling out a Real Time Oncology Knowledge Network, to be formally announced on Sept 16. By making the world's evidence base for oncology available, in real time, to researchers, clinicians, and patients, it has the potential to vastly accelerate many areas of research, while also speeding the translation of research into national clinical practice.
    This model could also be developed, supported & applied by NIH for many other disease and research areas. For example, all NIH supported (and FDA required) clinical research study databases could be archived, for real-time access, in NIH national research database(s), registries and networks.

     


     

    Methods to manage and analyze “big data” are going to be needed in order to accommodate the converging themes in healthcare - namely comparative effectiveness research, patient-centered care, personalized medicine, healthcare redesign, and quality - all of which ultimately lead to systems of rapid learning health care. This has been described by the Institute of Medicine (2007) and many others (e.g., J Clin Oncol. Sep 20 2010;28(27):4268-4274), and actioned through initiatives like the American Society of Clinical Oncology’s Rapid Learning System initiative described above. In a rapid learning system, data generated on a daily basis through routine clinical care feed into an ever-growing, coordinated data system. Using adaptive design and other approaches, the system learns by routinely analyzing captured information, iteratively generating evidence, and constantly implementing new insights into subsequent care. Each new patient’s care is informed by the treatment, outcomes, and experiences of the large numbers of similar patients who preceded him/her, and individual patient care is reinvested into this overall system of data.

    Critical to this process is linked information, informatics, and big data solutions. As a part of this proposal, consider an agenda item that specifically promotes innovation within the area of facilitating rapid learning systems through big data infrastructure and analytics. This includes development of approaches and algorithms for clinical decision support, methods for analyzing data in real-time in the field matched with individual patient data, and demonstration of bidirectional flow of research data to inform clinical care and clinical care to inform research. It also includes development of secure systems and approaches that link data generated in the basic sciences with the clinical sciences, and the culture to support it. Practical demonstrations through “use cases” are needed in order to understand logistical challenges, stakeholder perspectives, and areas for innovation.

    Of note, use of patient-reported outcomes approaches provide the opportunity to develop and test solutions in patient centric environments (see Med Care. Jun 2010;48(6 Suppl):S32-38.).

     


    Group Effects

    Nominator: Innovation Brainstorm participants

    Major obstacle/challenge to overcome: Exposures are highly variable and dynamic throughout the lifetime of an individual. Needed are systematic, unbiased screens for studying how multiple factors (e.g. microbiological, chemical, lifestyle and dietary exposures) interact to contribute to susceptibility to disease, disease progression, and treatment outcomes. In addition to curating/annotating data obtained using current models, improved testing systems are needed that are equipped to analyze multi-factorial issues.

    Emerging scientific opportunity ripe for Common Fund investment: Several opportunities exist to address the need for better models and analytic tools. These include the availability of inexpensive exposure screening tools (e.g. virochip, protein adducts) and bioinformatic techniques that can handle large, clinical datasets to track exposures. The development of screening tools, methods, and model systems that are particularly well suited for studying mechanisms of environmental influence also provide opportunities in this area. Point-of-care tools are likely to be especially useful to monitor exposure in global and other low-resource settings. Expanding these toward multiplex capability is another opportunity.

    Common Fund investment that could accelerate scientific progress in this field: The Common Fund (CF) could shift the curve to accelerate progress by expanding the number and quality of tools to systematically measure multiple exposures and by supporting the development of computational tools that will support multifactorial research: viral, bacterial, chemical, and dietary. Data handling for these types of studies is an enormous challenge. A database that catalogs and characterizes model systems that are suitable for studying multifactorial research would also be helpful.

    Potential impact of Common Fund investment: Implementing projects in this area could have significant impact in helping to better clarify the age-old question of the relative influences of “nature and nurture;” yet, it would go further by ultimately explaining how complex mixtures of genetic loci and environmental exposures influence health and disease susceptibility. In time, these insights will point to preventive strategies that help to fulfill the goals of personalized medicine.

    Comments:
    Dietary exposures are especially difficult to quantify since nutrients are eaten in foods and therefore inherently “group” together. Also, when more of one food is consumed, less of another food is eaten. Often these exposures, which are large, are ignored when evaluating other exposures.

     


    Microbiome, Part 2

    Nominator: Innovation Brainstorm participants

    Major obstacle/challenge to overcome: A substantial Common Fund (CF) effort to study the human microbiome has yielded many findings and insights. However, moving beyond sequencing to a functional understanding of the microbiome is now the challenge.

    Emerging scientific opportunity ripe for Common Fund investment: The emerging opportunity to understand the function of the microbiome has been generated by the sequencing data and the early correlation studies funded by the Human Microbiome Project. There are challenges that need to be overcome to exploit these data, but the opportunity exists to fundamentally shift the paradigm that describes the human-microbiome relationship and the impact that the microbiome has on the health of many tissues.

    Common Fund investment that could accelerate scientific progress in this field: The CF could accelerate the potential transformative impact of the microbiome sequencing data by investing in the following:

     

    • Computational tools that will allow deconvolution of complex data sets.
    • Functional assays that go beyond sequencing to understand the impact of the microbiome, including metabolomics, glycan-processing readouts, and short-chain fatty acid biosynthetic pathway analysis.
    • Functional model systems to understand bacteria-host interactions, bacteria-bacteria interactions, and bacteria-virus interactions. These model systems might include induced pluripotent stem (iPS) cell-derived organ cultures as well as animal models.
    • In vitro systems for microbiology and genetics of commensal bacteria.
    • Functional analysis of small molecules derived from microbes.
    • Systemic impact of the gut microbiome on other organs.
    • GWAS-MWAS (Genome Wide Association Studies-Microbiome Wide Association Studies) research (a minimal illustrative sketch follows this list).
    • Mother-child microbiome effects.
    • Any NIH-sponsored efforts in this area should be multidisciplinary to realize their fullest potential impact.
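    As a purely illustrative sketch of what the simplest GWAS-MWAS-style analysis noted above might look like, the snippet below tests for association between a host SNP genotype and the relative abundance of a single microbial taxon using a rank-based test; all data are simulated, and a real analysis would adjust for covariates, compositionality, and multiple testing.

        # Illustrative GWAS-MWAS-style association: does a host SNP (coded as
        # 0/1/2 minor-allele copies) correlate with the relative abundance of a
        # microbial taxon? All data are simulated for the example.
        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        n = 200

        genotype = rng.integers(0, 3, size=n)            # 0, 1, or 2 minor alleles
        noise = rng.normal(0, 1, size=n)
        abundance = np.exp(0.3 * genotype + noise)       # simulated taxon abundance
        rel_abundance = abundance / abundance.sum()      # crude relative abundance

        rho, pvalue = spearmanr(genotype, rel_abundance)
        print(f"Spearman rho = {rho:.3f}, p = {pvalue:.2e}")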

    Potential impact of Common Fund investment: Systems biology-derived “designer” probiotics may offer an inexpensive, holistic approach to disease prevention and treatment, although much research is necessary to realize this potential. It should be noted that probiotics have been used for decades, but systematic analyses of their efficacy and safety have not been conducted.

    Comments:
    While there is clearly a need for more work on the microbiome, the title “Microbiome part 2” is perhaps suboptimal. What we need is not more of the same, but rather a transformative new project that builds on the resource provided by the Common Fund Human Microbiome Project (HMP), exploits the unique opportunity we now have to explore the pervasive effects of the microbiome on human health and disease, and at the same time supports detailed mechanistic studies in animal models.

    One important question is whether this should be a Common Fund activity or should be relegated to the individual Institutes and Centers (ICs). There are several ways in which a Common Fund activity could be truly transformative.

    First, enrolling a large cohort for detailed phenotypic characterization, including the microbiome, and/or coordinating across many existing studies to allow a microbiome component could greatly expand our knowledge of which clinical indications are most likely to be microbiome-associated, would jumpstart biomarker discovery efforts, and would allow more targeted public health recommendations. Coordinating with projects that already have approval for large-scale genotype and phenotype analysis, including public release of the resulting data, such as the Personal Genome Project (www.personalgenomes.org), could be especially effective (for full disclosure, I should mention that I am a collaborator of the PGP although not funded by that project).

    Second, a Common Fund-supported infrastructure could highlight the commonalities in microbial communities that span diseases and body sites by providing a common standardized suite of laboratory and analytical techniques. One clear need is to provide a mechanism by which investigators, especially clinical investigators, can provide interesting specimens and get back interpretable results without themselves having to become experts in microbial ecology, next-generation sequencing, bioinformatics, and cloud computing. As the cost of sequencing continues to decline, the value of such a processing center and associated biospecimen repository, which could be interrogated with increasing resolution as methods continue to advance, will only grow. Balkanization of microbiome studies into individual ICs would make it much more difficult to compare studies by different investigators for whom the microbiome is not their primary focus, but whose studies could provide fascinating additional insight. A strong centralized portal that provides analysis at the whole-community level to non-experts, while allowing experts to tweak analyses to satisfy more advanced methodological questions, perhaps based in the cloud, could be really useful.

    Third, complementing human studies with mechanistic studies, including animal model and in vitro studies, and computational predictive models, will be essential to taking studies of the microbiome from their current descriptive state to the point where they can influence clinical outcomes. In particular, strong support for biochemical and genetic studies of non-model microbes (to enable systems biology), rules for microbial community assembly, and understanding of the different effects of the same species of microbe in the context of different host species and genetic backgrounds, is essential. In this context, a much broader sampling of human microbial diversity, especially in populations with unusual genetic backgrounds and lifestyles, would be incredibly valuable and would inform studies across many ICs.

    Taken together, there is a strong need for a centralized microbiome effort that is able to coordinate and aggregate the microbiome sequence data that will be produced by many laboratories as sequencing instruments and techniques become increasingly democratized. The NIH needs to support the microbial equivalent of Flickr, the imaging and video hosting website, where instead of data being kept in the equivalent of a shoebox in each investigator's attic, we instead have an integrated view of all sorts of features all over the planet that is searchable and usable by non-experts. This system will not be like the sequence read archive (SRA) with its focus on sequences, but rather designed to serve a broad user constituency with an emphasis on understanding the biological phenomena rather than the data structure.

     


     

    My concerns relate to data visualization, multivariate pattern recognition and experimental design.

    Often the importance of resourcing these aspects is strongly underestimated, at least at the initial planning stage. They should be distinguished from bioinformatics, which is a separate and different need. Multivariate pattern recognition, or data mining, is required to simplify the large datasets, determining, for example, which species, communities, operational taxonomic units (OTUs), or groups of microbes are potential markers; to safely validate conclusions; and to relate, for example, physiological/clinical state or descriptors to the community of microbes.

    Multivariate methods should be built in at the beginning; too often they are an afterthought, or are farmed out at the end to information scientists who have had limited input into the planning and scientific postulation. From the beginning it is important to consider issues such as sample sizes, standardization between labs, whether data are adequately quantitative (often requiring detailed feedback with experimentalists), and regular quality control monitoring of assays. Once data are being collected, multivariate methods for data mining are required for calibration (relating the multivariate microbial fingerprint to physiological, clinical, and other states), exploratory data analysis, and classification. There should be strong interaction with experimentalists as data are being acquired, so that the preliminary results of pattern recognition can inform the later experiments.
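    As a minimal sketch of the kind of multivariate classification workflow described here (a microbial community fingerprint related to a clinical state), the example below chains dimension reduction and a cross-validated classifier using scikit-learn; the OTU counts and clinical labels are simulated, and the method choices are illustrative rather than a recommendation.

        # Minimal sketch of multivariate pattern recognition on microbial
        # community fingerprints: scaling, dimension reduction, and
        # cross-validated classification of a clinical state. Data are simulated.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n_samples, n_otus = 120, 300

        X = rng.poisson(lam=5, size=(n_samples, n_otus)).astype(float)  # simulated OTU table
        y = rng.integers(0, 2, size=n_samples)                          # simulated clinical state
        X[y == 1, :10] += 3                                             # planted signal in 10 OTUs

        model = make_pipeline(
            StandardScaler(),
            PCA(n_components=20),
            RandomForestClassifier(n_estimators=200, random_state=0),
        )

        scores = cross_val_score(model, X, y, cv=5)
        print("Cross-validated accuracy:", round(float(scores.mean()), 3))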

    Multivariate methods should be incorporated into the proposal at the start, involving close communication between the experimentalists and those that will analyze their data. The data analysis group(s) should be fully involved in meetings and discussions with experimentalists and should have some sort of grasp as to the needs and problems of the experimenters, rather than isolated as a separate group.

    Too many large projects end up with big databases that lie in dusty computers: often, although large in size, they were not suitably planned, and they are certainly not adequately exploited given the large volume of data available.

     


     

    The Human Microbiome Project (HMP) is an important step in our understanding of the biology and medical significance of the human microbiome and metagenome. Substantial progress has been made in developing the tools for inquiry and defining the overarching concepts to move the field forward. However, the subject is vast, and the implications for human health and disease are profound. We are only at the beginning of the story, and further focus will permit both better understanding of the underlying biology and applications to all areas of human medicine. The study of human materials and model animal systems with strong phenotypes is essential for making progress in this field of applied genetics. Although a focus on bacteria is obvious and important, expansion of inquiry to include eukaryotes, viruses, retroviruses, etc., is also critical.

    Here are 10 areas of inquiry that should be pursued:

    1. Understanding microbiome characteristics in relation to families: what is inherited and what is not?

    2. Understanding secular trends in microbiome composition: what has been lost or gained?

    3. For diseases that have changed markedly in incidence in recent decades (there are several notable examples: childhood-onset asthma, food allergies, type 1 diabetes, obesity, inflammatory bowel disease, autism), are changes in the microbiome playing a role?

    4. Do particular signatures of the metagenome predict risk for specific human cancers and other diseases associated with aging? Can these signatures be pursued to better understand oncogenesis (work on H. pylori provides a clear example of this)?

    5. How do antibiotics perturb the microbiome--short-term and long term? Does the route of administration matter?

    6. How does the microbiome affect the pharmacology of medications? Can we type people to improve pharmacokinetics and/or reduce toxicity? Can we manipulate the microbiome to improve pharmacokinetic stability?

    7. Can we harness knowledge of the microbiome to improve diagnostics for disease status and susceptibility?

    8. Can we harness the microbiota (organisms that “understand” human biology well) for the creation of new drugs?

    9. Can we harness the microbiota specifically to create new narrow-spectrum antibiotics?

    10. Can we use knowledge of the microbiota to develop true probiotics (and prebiotics)?

     


     

    The roles of all microbes in the human microbiome need to be considered, particularly in light of the importance of host-microbe interactions in human health and disease. Bacteria, fungi, viruses, and protists all need to be considered because they are all part of the microbiome. I disagree that we are ready to move “beyond sequencing” into the next phase. There is more sequencing to be done in order to include critical microbes in the analysis, and CF support is imperative in the next phase. Sequencing of other microbes should not be viewed as an “afterthought,” dotting i's and crossing t's outside of a CF program. This extended sequencing effort is ESSENTIAL and should be viewed as taking the analysis to the next level. While we cannot guarantee a complete picture of the human microbiome, the bias in the current analysis toward bacteria severely limits any understanding of the complexities of the host environment and the roles of ANY microbes in humans.

     


     

    First, it's premature to underestimate the contribution of the Human Microbiome Project (HMP) before its results have been fully published. It's shown us what we know and, critically, don't know (yet) about the normal range of the human microbiome, and it's established a resource base comparable to GOS, TCGA, or other large datasets that can be mined by other studies over the course of years. Additionally, it's created a pool of computational and experimental expertise now ready to tackle a wide range of microbiome research.

    Biologically, I'd like to add my voice to the comments supporting a large, prospective cohort that could be followed epidemiologically with respect to microbiota, genetics, environment, and lifestyle. We've been closely involved with the NHS/HPFS/Framingham/etc. cohorts, and it's easy to see how they've been transformative with respect to genome-wide association studies (GWAS) and environmental disease risk; the Common Fund could invest resources that would create such a cohort for study of the microbiome in a broad population, possibly funded subsequently by individual NIH Institutes and Centers (ICs), over a wide range of conditions, with extensive phenotypic characterization, and over an extended period of time.

    Computationally, there have been several comments here contrasting 16S and shotgun metagenomic/metatranscriptomic approaches, and in most cases this is no longer a necessary “tradeoff.” Shotgun sequencing is cheap, continues to get cheaper, and the bioinformatics for inferring taxonomic abundances from it are constantly improving. And of course, as stated, it has the advantage of describing eukaryotes, viruses, and organisms not well captured by 16S. I would hope any “Microbiome 2” project would indeed focus on parallel systems-level data when possible, e.g., DNA, RNA, proteins, and metabolites. Also, as with TCGA, it is compelling to combine this with host information in a GWAS- or MWAS-like manner (also becoming more bioinformatically accessible). With the thought of moving microbiome diagnostics or therapeutics closer to practicality, I'd also suggest that something along the lines of MAQC be included in a “Microbiome 2,” so that we can leverage the HMP to begin establishing formal quality control guidelines for human microbiome assays.

     


     

    While I can appreciate that adding complexity to work being done on the current ‘microbiome’ project makes it more difficult and cumbersome, it currently does not give the whole picture. Unless we include all microbes occupying human environments (viruses, bacteria, eukaryotic fungal and protozoan protists), models of interactions derived from primary data being collected will not be relevant.

     


     

    While I commend the human microbiome work, the use of 16S rRNA primers to analyze the bacteria on humans misses all eukaryotic microbes, including those in the mouth, GI tract, vagina, and skin. Humans are colonized with many eukaryotic microbes, including fungi, parasites, and even worms, and these are present at all times in most humans. The human microbiome has in reality been a human bacteriome. If we are to start studying function, it is imperative that the eukaryotic microbes be counted and be studied for their interactions with the bacteria and the humans. A microbiome functional analysis without a third of the teams involved is not relevant. To date, there have been a few studies of the fungi in the mouth and vagina. To be sure, there are problems with detecting all of the eukaryotic microbes using polymerase chain reaction (PCR) of LSU rRNA, and the fungi are particularly difficult to get DNA from. However, techniques such as deep sequencing of body locations will clearly show the importance (or lack thereof) of eukaryotic pathogens in these body locations, once the frequencies of hits are adjusted for the size of genomes. It is also important to keep in mind that fungi are experts at secondary metabolism, producing antibiotics, antimicrobials, and carcinogens. To exclude these potential drug manufacturers in their known ecological niches is inappropriate. If the human microbiome is to truly go forward, it needs to include the eukaryotic commensal microbes. If it continues to only analyze the bacteria, it should justifiably be labeled the human bacteriome and suffer from a lack of completeness.

     


     

    The major obstacle is the lack of a comprehensive paradigm encompassing ecological modeling and a fuller understanding of eukaryotic microbes and mechanisms that can elucidate and allow the de-convolution of host-symbiont relationships. The paradigm in Microbiome: Part II absolutely needs to shift from structure to a greater emphasis on functional predictive models that can enhance individualized health physiology and nutritional states, as well as lead to promising biotherapies and novel vaccine approaches. Metagenomic, proteomic, and genomic technologies need to integrate standardization that will accelerate the identification of metagenomic, cellular, and immunologic interactions between host and eukaryotic microbes. Such 'systems biology' and validation tool development, in a fundamentally shifted paradigm that de-convolutes human-microbiome relationships, poses significant challenges. Functional assays that go beyond sequencing from the NIH Human Microbiome Project, the European MetaHIT, and other collaborative cross-agency efforts need to be shared, validated, and standardized. Shifting the context will yield clearer analysis of biosynthetic pathways and of metabolomic, glycan, and short-chain fatty acid processes, and new avenues for novel 'designer' probiotics of broad use on agricultural and aquaculture farms, in veterinary medicine, and in human health and disease prevention and treatment, including pluripotent stem cell approaches.

     


     

    I believe that the project is too bacteria-centric and that we need to aspire to really fully understand our metagenome, bacterial and otherwise. A huge burden of human disease comes from mammalian viruses. Thus, the program is too narrow in focus. We need to see high goals and a full picture as the aspiration here. Bacteria are fundamentally important, but the 'microbiome' includes non-bacterial members. Tools are being developed to address this broader issue of the full microbiome; this should be a major goal. If we believe that communities on or in us are important, we should study the entire community. I would like to see this explicitly stated, since it relates this potentially great idea to major areas of human enteric disease.

    I believe that the interaction between the full microbiome and our own genomes, including our individual variations, will be essential to unravel in order to make optimal use of even the bacteria-only data being generated. Thus I believe that the goal of the program should be "metagenesis": that is, to aspire to take into account all aspects of our metagenome, which includes us as well, the bacterial and non-bacterial members of our bodies, and our own genomes. The GWAS portion of the description encompasses this, but there are other approaches to human genetics that need to be integrated with the metagenome.

     


     

    I agree with many of the above comments regarding the dearth of information about viruses and fungi in the microbiome project. It is clear that there are very important balances between these organisms in their microenvironments that we can’t begin to address without a more thorough understanding of all the players involved.

     


     

    Diet, history of antibacterial use, family, and local environment are factors in humans that ought to be strongly related to the human microbiome. Ecological study at the whole-organism, family, and local community levels ought to be part of any such initiative. There is little potential benefit in understanding gene sequences or biochemical products of microbes without knowing how the symbiosis between microbes and mammals may help us to adapt to potentially cataclysmic changes in climate and in the availability of food and fossil fuel energy.

     


     

    We agree with the sentiment expressed in other comments above that Microbiome Part 2 should not be limited to bacteria only – instead, it should be expanded to include the non-bacterial members of the microbiome, namely fungi and viruses. Such expansion of focus is necessary since alteration in one component of the microbiome is known to influence the other components at the same site (oral, gut, skin, etc.). A stark example of how members of the microbiome are influenced by other members can be gleaned from the fact that removal of bacteria (e.g. by antibiotics) allows fungal overgrowth leading to overt fungal infections. Such a situation is commonly seen during antibiotic use among women, which leads to eradication of bacteria but increases the incidence of vaginal fungal infections. Therefore, it will be counterproductive to investigate the vaginal microbiome by focusing solely on the bacteria while ignoring the fungi. This is only one example of the possible interactions between microbiome members. Similar instances of interactions between bacteria, fungi and viruses can be derived from other patient populations including immune-suppressed individuals (e.g. cancer, HIV), emphasizing the need for studying bacteria, fungi, and viruses in the same samples in health and disease.

    While the first phase of Microbiome has provided substantial information and significant insight into the bacterial communities inhabiting humans, this research endeavor will be successful in its overarching goal only if Microbiome Part 2 [as part of the Common Fund (CF) effort] addresses the entire microbiota, including bacteria, fungi, and viruses.

     


     

    Expansion of the Microbiome project to include the other taxa is scientifically necessary if we are to understand the interactions, dynamics, and consequences of human microbial communities. In addition, it would be a terrific initiative to involve younger scientists and science students in data collection and analysis in a way analogous to the HHMI mycobacteriophage project. I recognize that there are Human Subjects, safety, and privacy issues to be addressed, but these should be solvable.

     


     

    All of the above comments seem like incremental changes to an existing project and perhaps the suggestions are that Human Microbiome Project (HMP) be extended for another round with the aim of balancing areas that have not been covered in the original project. This, however, does not seem like a compelling argument.

    Most, perhaps all, of the activity in human microbiome research can now be undertaken by individual Institutes and Centers (ICs), and this appears to be what is happening. Some Institutes have their own microbiome projects, e.g. NIDCR and NHLBI, and others support many microbiome-related grants. Disease-related demonstration projects whose funding was not extended by the Common Fund have largely been picked up by individual Institutes, which would seem to show that microbiome research does not require Common Fund support to be carried out and that the ICs are a more appropriate home.

    These latter projects that have transitioned from Common Fund support to IC support are still complex, with large patient cohorts, a disease focus, and multidisciplinary teams of clinical, genomic, and computational specialists. This is what is being called for by many of the commenters above, and it seems to be already happening.

    It may also be more appropriate for disease-oriented microbiome research to reside in the ICs now, since every tissue is different with its own challenges, and a one-size-fits-all model has possibly been played out in the HMP. Having projects housed in Institutes with the requisite expertise for their own disease- or tissue-focused microbiome interests may be necessary to move into the level of detail required to make the correlations between disease and the microbiome that are sought.

    The underrepresentation of virus and eukaryote components of the microbiome in previous studies would seem to be manageable in future research supported by ICs, and does not in itself seem a complete justification for a Common Fund initiative. That is not to say that there could not be Common Fund initiatives for virus or eukaryote research in general.
    Finally, little mention has been made of understanding the impact of the host genotype on the microbiome. Although this is a large area, given the GWAS and other work going on with respect to microbiome related diseases such as Crohn's, there may be much learned in this area from the work going on outside of the Common Fund.

    It would seem that the HMP has accomplished many of the goals that were set for it, but it is not clear that there is a strong case for another 5 years unless something more than incremental changes in direction can be proposed.

     


     

    Microbiome, Part 2 is an excellent opportunity to expand the bacteriome to a true microbiome: bacteria, fungi, parasites, and viruses. Most, maybe all humans are colonized with fungi that are highly adapted to their mammalian host, and probably also to their bacterial competitors. Changes in one (e.g., hormonal shifts, immunosuppression, diet, antimicrobial therapy) likely lead to changes in the other - this warrants further study. Also, it is clear that some fungi are difficult or impossible to culture - how many more have we missed altogether? The same is likely to be true for protozoan parasites; probably even more so when we consider the microbiome of humans living in resource poor regions of the world.

     


     

    I agree with most of the above comments. Having identified what is there, whether they are bacteria, fungi, or viruses, we need to move forward and investigate what factors control their composition and activity (e.g. food, endogenous secretions, microclimate), and how they interact with the mucosa and the body as a whole to promote health or illness. As no microbe, or class of microbes, exists on its own, complex mathematical analysis will be needed to predict the net effect that a specific perturbation (e.g. antibiotic, probiotic, nutrient) might have on the host.
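    As one illustration of the kind of mathematical analysis described here, a generalized Lotka-Volterra model is a common starting point for predicting how a perturbation propagates through a microbial community. The sketch below (Python; every growth rate and interaction term is invented for illustration) simulates a three-member community in which an antibiotic-like pulse transiently suppresses one member.

        # Toy generalized Lotka-Volterra model of a 3-member community; an
        # "antibiotic" pulse suppresses species 0 and we observe the knock-on effects.
        import numpy as np

        r = np.array([0.8, 0.5, 0.6])                   # intrinsic growth rates (invented)
        A = np.array([[-1.0, -0.3,  0.1],               # interaction matrix; negative diagonal
                      [-0.2, -1.0, -0.4],               # terms make each species self-limiting
                      [ 0.3, -0.1, -1.0]])
        x = np.array([0.5, 0.5, 0.5])                   # initial abundances
        dt, steps = 0.01, 3000

        for t in range(steps):
            kill = np.array([0.9, 0.0, 0.0]) if 1000 <= t < 1500 else np.zeros(3)  # antibiotic pulse
            dx = x * (r + A @ x - kill)                 # per-capita growth under interactions
            x = np.clip(x + dt * dx, 0.0, None)         # simple Euler step, no negative abundances

        print("post-perturbation abundances:", np.round(x, 3))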

     


     

    The GWAS-MWAS studies seem very important to me, because the gut flora alter nutrients and make toxins that could affect health more than our genomes do. For decades, this area has been overlooked, and funding it properly will take more than 10 years; it would be more like a Framingham study that tracks outcomes over 50 years. One may also have to collect “Immunobiome” data that reflect the interactions between humans and microbes.

     


     

    Given the evolutionary antiquity of microbiome associations with animals and plants, there are likely to be common mechanistic principles that define critical characteristics (e.g., metabolic integration, population dynamics) of these associations, and these principles and mechanisms would translate to understanding the human microbiome complex. Currently it is exceedingly difficult to describe and analyze the dynamics of collective gene activity even in a single organism, and microbiomes comprise hundreds or thousands of organisms. The prospects for determining the basis for physiological and species’ stability in complex populations of microbes interacting with host organisms are daunting. Ultimately it will be necessary to translate high throughput data into biochemical pathways and signaling relationships among circuits that regulate growth and effector production in the assemblage. Likely, there will be other, unanticipated parameters that will have to be accounted for to obtain a useful working model of a microbiome.

    To address these issues new perspectives and technologies will likely be required to obtain appropriate, population level data and to analyze it as a complex system. It is highly likely that the necessary insights and technological advances will not arise from studies of problematic and variable human systems; rather, these tools and insights will emerge from studies of tractable model organisms and model microbiomes.

    Progress will require:

    1. Development of relatively simple model organism systems that can be used to pilot the development of new technologies and approaches;

    2. Development of new analytical capabilities for quantitating, e.g., metabolic flux and signaling component fluctuations through microbiomes and their hosts, particularly under perturbation; and

    3. Development of systems theoretical tools, in an ecological context, to create and populate in silico models of complex assemblage dynamics.

    A Common Fund program targeted at encouraging and supporting these capabilities could generate knowledge of basic principles that could then inform experimentation and evaluation of parameters of therapeutic significance. Since it is becoming very clear as a result of current Human Microbiome efforts that health and disease states are linked with Microbiome activity, the applications of fundamental principles could be both widespread and fruitful.

     


     

    A Common Fund investment in a future microbiome initiative will pay the biggest dividends in the long term if the appropriate bioinformatic and computational infrastructure is created to support this data-intensive subject area. Whether we are talking about microbiome sequence/protein/metabolomic/visualization data or host genotype/disease phenotype/immunological data or other data types, it will be important to a) establish minimum common data standards for these data types, b) lower the barrier to data access, and c) broadly distribute the tools for data analysis to the larger scientific and medical communities. In this emerging world of personalized and genomic medicine, full and productive use of the datasets will require a flexible and friendly infrastructure which links the various microbiome datasets and the host datasets with appropriate, ready-to-use computational tools that are accessible to all, regardless of one's own access to bioinformatic resources or computational expertise. In fact, it will be the routine access and use of this network of data and tools by the broader community which will move this field towards translational outcomes as microbiome and related data are applied to the prevention and treatment of disease and in the support of health. The creation of a fundamental community resource which will support the needs of both the research community and the medical community and which paves the way for access by the public will be crucial to the integration of the microbiome into a full understanding of health and of disease. Even if the Common Fund chooses not to support a microbiome initiative in the future, it is imperative that the Common Fund consider the support of the infrastructure needed to capitalize on the current HMP investment and on the microbiome studies across the NIH.

     


     

    My sentiments are represented by the above comments. Please expand this project to include fungi and viruses and other microbes that could be of import.


    Human Microbiome, Part II - Microbial Product Characterization (see “Microbiome: Part 2” in Innovation Brainstorm ideas)

    Nominator: NCCAM
    Participating IC: Office of Dietary Supplements

    Major obstacle/challenge to overcome: The Human Microbiome Project has dramatically increased our understanding of the diversity and genetics of the microbial constituents of the human body. The next and even bigger step is to move beyond sequencing and organism identification to functional understanding of the microbiome and its relationship to a broad spectrum of autoimmune, infectious, metabolic, and other diseases. At a big-picture level, the challenge is to understand how commensal bacteria, probiotics, and their products may act and interact at mucosal surfaces and throughout the body to alter physiology and pathogenesis. For instance, it is known that intestinal microbes produce vitamins and make minerals more available to the host, and that carbohydrate fermentation produces short-chain fatty acids that can alter host absorption of calcium, magnesium, and phosphorus. Intriguingly, microbial products from one organism also can alter the gene expression and the functional metabolome of other resident microbes. As an example, in a model gut system the exogenously administered probiotic Bifidobacterium longum was shown to expand the variety of polysaccharides targeted for degradation by Bacteroides thetaiotaomicron, which is a prominent member of the adult human gut microbiota.1 These and other examples speak to the importance of identifying and understanding the protein, small molecule, and metabolite products of those microbes, and the relationship of these to beneficial or adverse functions of the microbiome and probiotics administered with an intention of altering it.

    Emerging scientific opportunity ripe for Common Fund investment: The Human Microbiome Project has provided the capacity to determine the composition of the microbiome and observe associations between the microbiome and a spectrum of disorders ranging from intestinal disease to obesity and from cystic fibrosis to cancer. This has set the stage for movement toward a more functional understanding of the microbiome-human relationship. A detailed characterization, including both identification and determination of function, of the small molecule, protein, and metabolic products of human microflora has become necessary to advance the field. Characterization of microbial products would focus initially on those produced by the most common oral and gut commensals. Additionally, early on it is vital to study the organisms used as probiotics with the intent of altering the oral and gut flora, because altered microflora may concurrently alter gut micro-ecology, intestinal barrier function, metabolic activity of the host, and mucosal and systemic immune function. A comprehensive analysis of microbial products, beginning now with organisms identified in the first stage of the HMP as being associated with health or disease and then expanded over time, is required in order to appreciate and predict the effects of changes in the microbiome resulting from such causes as antibiotic use, disease, or probiotic administration.

    Common Fund investment that could accelerate scientific progress in this field: The Common Fund could accelerate the potential transformative impact of the microbiome sequencing data by establishing projects that would focus on the microbial products of both common human flora and potential probiotic organisms. This initiative would encompass the following undertakings:

     

    • Establish repository capacity and production facilities for strains of organisms that would be made available to investigators upon request.
    • Determine sequences at the strain level since different strains of the same bacterial species have been shown to exhibit differential metabolic and immunoregulatory activity.
    • Determine and catalog microbial small molecule and protein products.
    • Assess strain-specific metabolomics.
    • Undertake functional analysis including toxicity of microbial products and metabolites. This would involve assembly and conduct of a panel of screening assays for effects on other microbes such as changes in their metabolome or the total gut metabolome or effects on the host such as changes in gut barrier function or immune activity.
    • Provide information for regulatory agencies, thus facilitating work in humans.

     

    Potential impact of Common Fund investment: Understanding of the organisms and their products and metabolites, and providing characterized and standardized resources to investigators, would be invaluable. There would be clarity as to genotype, phenotype, and function of microbes under study. Such comprehensive analyses of commensal organism and probiotic products are a prerequisite for elucidating how the oral and intestinal mucosa respond to their resident and transient flora. Detailed assessment of a repertoire of products of common microbes also provides a method for rational study of the resultant mucosal and systemic effects (both beneficial and adverse) of conditions such as antibiotic use, weight loss/gain, and disease. Further, it is crucial for the eventual understanding of the potential impact of probiotics on health since one might match the correct probiotic strain with the desired clinical outcome, be it immune function, barrier function or metabolism. Finally, these analyses may yield new therapeutic targets or insights into dietary interventions; and understanding variation in microbial products will shed light on how an imbalance in our microbiota may contribute to disease.

    1 Sonnenburg JL, Chen CT, Gordon JI. Genomic and metabolic studies of the impact of probiotics on a model gut symbiont and host. PLoS Biol. 2006;4(12):e413.

    Comments:
    None

     


    Molecular Classification of Disease

    Nominator: Innovation Brainstorm participants

    Major obstacle/challenge to overcome: Currently, “clinical syndromes” are often used to classify disease. The problem with this approach is that a given patient syndrome may contain significant heterogeneity with regard to molecular mechanisms of pathogenesis. As a result, the ability to identify pathogenic mechanisms in population studies is limited, as is the ability to quickly and efficiently identify who will benefit from therapeutic interventions. Thus, new approaches are needed for classifying patients and disease states that are more tied to the molecular basis of disease. Intermediate markers or “endophenotypes” may be helpful in this regard. Another obstacle to translation is a general lack of willingness to challenge dogma, which can perpetuate stale thinking and practice.

    Emerging scientific opportunity ripe for investment by the Common Fund: Progress in this area promises to fill gaps between molecular characterization and patient disease states, as well as to identify heterogeneity in classical clinical syndrome classifications. Recent advances in technologies that allow comprehensive profiling of patients at the molecular level and association of these profiles with clinical data provide an opportunity to completely redefine the way we think about and understand disease. However, these capabilities need to be developed further and expanded for regular use in the clinic.

    Common Fund investment that could accelerate scientific progress in this field: Innovation is needed in the way in which we classify patients. Examples include:

     

    • Expression analysis of patient samples
    • Epigenetic analyses of distinct cell types
    • Classification and measurement of behavioral symptoms
    • Integration of different phenotypes
    • Methods to measure the response to various perturbations

    The NIH could establish a well-characterized, central sample database to encourage data sharing and integration. New approaches to finding “lenses” to view complex biomedical problems could include funding coherent, high risk programs, as well as considering the relevance and ability of existing networks to pursue this work [e.g., Clinical and Translational Science Awards (CTSAs)].
    Potential impact of Common Fund investment: Molecular characterization of disease has obvious benefit across the board for diagnosis and treatment of all diseases. In addition, progress in this area would catalyze the transition from one-size-fits-all medicine to personalized medicine. Clinical trials could be done more quickly and efficiently, and the resources harbored by population studies may be better utilized.
     

    Finally, encouraging a mandate to challenge dogma would likely introduce broader thinking that will undoubtedly open new avenues for exploration.

    Comments:
    Unfortunately this reads like “more of the same.” What the NIH should be looking at is going beyond just doing expression profiling and tackling the underlying issues with biomarker discovery. How many patients have we profiled? Thousands. How many signatures have made it to clinical and translational application? Two or three?

    I think there is a significant need to change the way we approach biomarker discovery. This starts with creating a real mandate for data and methodologies to be available so that the biomarkers can be validated and so the data can be reused by others.

    This is going to require that the data are well annotated--not minimally annotated, but well described with appropriate clinical and outcome data. A breast cancer study without molecular subtype or outcome data just isn't all that useful. It should also require that source code for the methods be provided. Both of these will help assure that the biomarkers, sitting at the nexus between measurement and algorithmics, can be validated on other data sets and may have a chance of being applied in other settings.

    Second, we need to invest in systems biology approaches to finding biomarkers. It isn't single genes that tell the difference between disease subtypes or phenotypes, but alterations in pathways. At present, we do not have good methods to really find the pathways that distinguish phenotypes.
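    As a point of reference, the most common current approach to pathway-level analysis is simple over-representation testing; the sketch below (Python, with hypothetical gene counts) shows the idea, and the commenter's point is precisely that methods richer than this are needed.

        # Hypergeometric over-representation test for a single pathway
        # (hypothetical counts; in practice repeated over many pathways
        # with multiple-testing correction).
        from scipy.stats import hypergeom

        N = 20000   # genes measured
        K = 150     # genes annotated to the pathway
        n = 800     # differentially expressed genes
        k = 18      # pathway genes among the differentially expressed set

        p_enrichment = hypergeom.sf(k - 1, N, K, n)  # P(observing >= k pathway genes by chance)
        print(f"enrichment p-value: {p_enrichment:.3g}")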

    So it would be wonderful to see programs that encourage the generation of integrated data (genomic, transcriptomic, epigenomic) from phenotypes and the creation of resources to make those data, together with clinical data, available to the community. Second, we need an investment, through Requests for Applications (RFAs), in the development of methodologies focused on new approaches to biomarker discovery.

    This is a great opportunity to move the field forward by using a forward-looking approach.

     


     

    This proposal seems to be going down the path of personalized medicine. While laudable, it will be very expensive, resource-intensive, and difficult to achieve in a short time frame. Instead of classifying people with different disorders, we can put them in one universal category: aging. Rather than study 30,000 genes, we have perhaps a dozen stem cell genes, including Dr. Collins' progerin, to focus on. While there is NIA- and Institute or Center (IC)-specific stem cell research, perhaps Common Fund support could increase the feasibility and progress of this approach.

     


    Single Cell Analysis

    Nominator: Innovation Brainstorm participants

    Major obstacle/challenge to overcome: Population heterogeneity among cells in a given tissue is a critical issue whose importance bridges many areas of biomedicine: cancer, infectious disease, developmental processes, organs, and immune responses. However, it is well-known that current approaches are quite limited in that they can only achieve approximate ensemble analyses of cell populations. Roadblocks to progress in this area are biological and technological: Molecular and systems level description (and quantitation) of cells, organs, and disease processes requires a greater understanding of the behaviors of individual cells and the overall composition of the population.

    Emerging scientific opportunity ripe for Common Fund investment: Advances in engineering and nanotechnology provide the opportunity for transformative methods in single-cell and population-based analyses. The need for ultra-sensitive analytical methods and sophisticated computational tools calls for expertise from physicists, engineers, and computer scientists. It is possible that existing theorems on organizational behavior could be re-purposed for single-cell studies.

    Common Fund investment that could accelerate scientific progress in this field: Potential Common Fund (CF) investments in this area would go beyond most of the current emphasis on microscopic and imaging techniques (although those approaches are also useful and necessary). Potential new investments could be in mapping a single cell’s epigenome, proteome, and metabolome. In addition, CF investment is needed to extend recent proof-of-principle work in single-cell genome sequencing and transcriptomics that is highly innovative, but low-throughput and far from practice. CF investments should emphasize approaches that capture living (or recently living) cells in vivo without need for overexpression or artificial constructs.

    Potential impact of Common Fund investment: The ultimate motivation for more research in single-cell analysis is the potential for in vivo application to disease. Developing a robust set of tools to assess (and ultimately manipulate) single cells in situ is a key step toward achieving that goal. This achievement would have broad applicability across biomedicine: both for basic studies and for clinical use.

    Comments:
    Analyzing various parameters of single cells (kinase signaling states, epigenetic states, metabolomic states, etc.) would be truly transformative and lead to many breakthroughs in the way we study biological systems. Cellular heterogeneity is ubiquitous in both in vitro cultures and in vivo or primary tissues, but currently there are no tools available (except perhaps live-cell imaging) to address this. While the targets to be detected are diverse (proteins, DNA, metabolites, etc.), the problems of detection share a common technical challenge: small sample volume and low abundance of targets. It is my opinion that the field of microfluidics has, over the last two decades, been accumulating enough tools in the chest to adequately address this problem.

     


     

    There is tremendous heterogeneity in the extent and progression of cellular damage in various ocular diseases including age-related macular degeneration, diabetic retinopathy, and glaucoma. The molecular and genetic bases for this cellular heterogeneity are unclear. There is increasing recognition that perturbations of the genomic architecture by mobile elements such as retrotransposons are involved in the pathogenesis of numerous human diseases. The insertional profiles of mobile elements are highly variable across individual cells even in the same organ or tissue, and dependent on the disease state, exposure to environmental factors, stochastic fluctuations, and various other causes. Understanding the spatial and temporal relationship of the mobile element genetic architecture in individual cells to the state of their function or dysfunction in various stages of disease can facilitate molecular diagnosis and help predict disease progression and outcome. This information can also serve as a platform for testing interventions and enable the development of personalized medicine.

     


    Targeting the Dynamic Complexome

    Nominator: Innovation Brainstorm participants

    Major obstacle/challenge to overcome: The spatial and temporal dynamics of protein complexes and complex-drug interactions are difficult to characterize (and predict). In part, this contributes to the well-known in vitro/in vivo discrepancy between predicted and actual drug action and efficacy. Primarily, this is due to current limitations of in vivo validation processes. Experimental mapping of the dynamic complexome in normal and disease states would add significantly to overcoming this obstacle. More rational design and screening methods are needed for developing safe and effective drugs that specifically target complexes.

    Emerging scientific opportunity ripe for Common Fund investment: Recent progress in the development of tools and methods to map dynamic protein-protein interactions provides a mechanism through which disease pathogenesis can be better understood and new drugs can be designed. Specific challenges must be overcome for these possibilities to become reality.

    Common Fund investment that could accelerate scientific progress in this field: Common Fund (CF) investments in the following would have a transformative impact on the identification of new drugs, the functional annotation of existing drugs, and the identification and testing of candidates for polypharmacologic approaches:

    • Experimental mapping of the dynamic complexome in normal and disease states.
    • Development of computational tools and algorithms that allow predictive models for protein-protein and protein-drug interaction to be established and tested.
    • Development of drugs that target dynamic protein complexes through rational drug design and through screening approaches.
    • Development of novel methods to structurally characterize spatially and temporally dynamic complexes.

    Potential impact of Common Fund investment: Investment in this area would benefit basic and applied studies. Identification of new functionally distinct complexes that define cellular pathways will increase knowledge about pathways and signaling mechanisms shared among diseases and conditions. Mapping the complexome may have as much potential for distinguishing disease states as does mapping the genome. Progress in this area will also likely yield small molecules as probes for clinical samples and tissue engineering models. Clinically relevant impact also includes the identification of new drugs that are specific, effective, and which have better side-effect profiles than most currently used therapeutics.
     

    Comments:
    The analysis of complexes in living cells is one of the great challenges for imaging, and I think that this would be a high-impact funding opportunity with significant multiplier effect capability. It seems that there are two sides to complexes, complex assembly and complex disassembly, both of which can be targeted by drugs. It would be great to see some tools developed that could address both sides of these dynamic events.

    The concept of “in-vivo” is alluring, but is likely to slow the analysis to such an extent that it should be more broadly defined, for example, to include live cells and complex tissue-like environments—this preserves some reasonable level of analytical throughput, and provides important feasibility for these sorts of analyses in living systems, potentially animals.

    Since many of these methods will be used to screen for complexes, it would make sense to start in genetically accessible model systems, like yeast, where gene knock-out and replacement is scalable. Hopefully the transcription activator-like effector nuclease (TALEN) and zinc finger nuclease (ZFN) technologies will make these approaches almost as easy in mammalian and biomedically important systems in the near future, but facile chromosomal genetic manipulation is going to be essential for any approach that seeks to identify complexes under native concentrations and control.

     


     

    It would be incredibly valuable to be able to monitor spatial and temporal changes in the individual components of regulatory complexes within and on the surface of living cells. This knowledge is essential to understand normal and diseased cell behavior. Tools that enable such measurements with good sensitivity and time resolution would be invaluable for basic understanding and for construction of drug discovery assays. Fluorescence is currently one of the most sensitive and fast detection technologies. But we cannot count on creating fluorescent protein-labeled components that function properly in complexes, because the ~32 kDa labels would interfere with most of the relevant interactions. Even labeling purified components with dyes and microinjecting them into cells would be impractical. New concepts for the rapid and sensitive detection of the addition and loss of components from complexes are needed. Such detection systems could be incorporated into cultured cell lines and would be adaptable to high throughput screening.

     


     

    We strongly agree with these ideas. Indeed, the development of such tools is the explicit goal of our National Center for Dynamic Interactome Research. We cannot expect rapid and efficient drug discovery without a thorough understanding of the interactomes we are trying to target. We have repeatedly seen that appropriate tool development provides a tremendous impetus to multiple individual research projects, and that the discovery of new dynamic interactions very often immediately suggests new leads for therapeutic intervention. We also think that the proposal is timely, as never before have there been so many opportunities for applying new technologies to the exploration of the complexome; but crucially we need to home in on which technologies to apply, how to apply them, how to integrate the data obtained into dynamic complexome “maps”, and how best to read these maps to find the best sites for drug targeting. Such an initiative is clearly beyond the resources of the standard R01 grant model and must be supported by a mechanism that funds multidisciplinary, multi-lab collaborations formed with a clear mandate.

     


     

    The time is right to aggressively pursue a realistic, quantitative understanding of complex, dynamic molecular interactions across scales. It is evident from the recent literature and the evolution of some NIH programs that this can be achieved through a program emphasizing two important ideas: (1) the dynamic, transient nature of the molecular interactions that drive cell biology and physiology, and (2) the importance of integrated experimental approaches that cross artificial boundaries driven by technological constraints. One goal of such a program should be a suite of technologies and methods that can be applied at will to interrogate complex systems in support of questions in biophysics, cell biology, and drug discovery.

    There are many recent instances in the literature that point out both the need for such a program and the potential impact of this approach. For example, the advanced methods developed in the National Center for Dynamic Interactome Research (www.NCDIR.org) have resulted in significant insights into nuclear transport, the viral infection process, and cellular trafficking. Chemoproteomic experiments can radically improve our ability to define specificity of drug candidate binding to targets expressed not as single proteins but complexes (Bantscheff, et al. Nat. Biotech. 2011). Biophysical methods such as PLIMSTEX (Gross et al, Biochemistry, 2011) can rapidly and quantitatively characterize complex binding events in protein complexes for both small and large molecules. These emerging approaches are just the beginning of what can be accomplished in attacking the problem of dynamic molecular interactions.

    The development of robust methods for study of dynamic interactions across molecular scales will radically alter the way we think of systems biology experimental design, and will have a direct and immediate impact on translational research. In the systems biology community, there has been increased attention to the need for integration across technical boundaries. There is now a strong desire to integrate systems-level data for proteins, lipids, carbohydrates, and a broad range of metabolites. This is driven by a broad recognition of both the value of this approach and the nascent ability to accomplish it.

    Scientists obviously understand that large and small molecules interact in complex, dynamic ways to drive biological processes. However, we are all more or less hostages to the tools at our disposal. These technical constraints drive compromises in both our experimental approach and our thinking about the systems we study. New technologies enable new thinking because they allow us to ask new questions. Improved tools for realistic, integrated approaches will enable more sophisticated experimental design as well as new ideas about disease mechanisms, targets, and treatments.

     


     

    I think these are all exceptional ideas that speak to the core issue of studying cellular networks. I would like to see further resources directed to this area of science. My only concern, and this is the same with all “new” NIH projects, is that a large amount of money will be poured into a few projects that may or may not be successful. I propose setting up a smaller pilot grant program that is accessible to many investigators. The research that emerges as most feasible and innovative would then be selected for further funding. This will stimulate innovation. The current approach is to throw large amounts of money at a few projects, many of which fail because there was too much inherent risk and lack of validation of the approach. We have tried this approach over and over and it has done nothing but drain existing R01 resources. Thanks for considering my suggestions.

     


     

    NIH Award Strategies

    Nominator: Innovation Brainstorm participants

    Major obstacle/challenge to overcome: A common theme during the online discussion prior to the Innovation Brainstorm meeting and at the meeting itself was how to bring together disparate fields of science. Despite recent Common Fund (CF) efforts and programs across the NIH, the formation of teams and integration of multiple disciplines remains a major barrier.

    Common Fund investment that could accelerate scientific progress in this field: Potential CF Investments include:

     

    • Student Training: Create pilot Ph.D. programs in emerging, cross-disciplinary areas (e.g., stem cell biology and bioengineering). Such a program should receive CF investment for 10 years, to develop a curriculum and engage existing faculty. Students would perform rotations in cross-disciplinary labs.
    • Postdoctoral and Faculty Training Fellowships: Postdoctoral fellowship awards could be targeted to physical scientists and engineers, and trainees would be co-advised and trained in the different fields. Another idea is a faculty fellowship program that would fund salary and supplies for periods of three to nine months in which mid-career scientists and engineers would work in clinical labs or vice-versa.
    • Facilitating Ph.D. and M.D. Interactions: Topical workshops could bring together clinicians and professional scientists and engineers. One idea, modeled after the National Academies Keck Futures Initiative (http://www.keckfutures.org/ ), is to host a mini-course, in which attendees work in teams (8-10 people) to solve grand challenges, and then present and summarize their results. Workshop participants could also apply for seed funding afterward with awards of $100,000 and $250,000 for two years. If successful after the pilot period, awardee teams could then compete for a much larger pool of funds, around $1.5 million for five years.
    • Creating Mechanisms for Small Team Proposals: Fund small, interdisciplinary team proposals (of three to four investigators partnered with companies) that are larger than multi-PI projects but smaller than Center awards. Another idea is funding a Defense Advanced Research Projects Agency (DARPA)-like mechanism in which program officers facilitate team building and which allows innovative proposals that might not have fared well in traditional study sections to get done.
    • Low Hanging Fruit for Immediate Impact: The NIH could fund an intermediate stage of funding that validates university or donor-sponsored initiatives. This would encourage real investments outside of NIH and provide a federal “stamp-of-approval” for innovative or cross-cutting ideas that emerge. Another idea is to provide funding to support start-up companies, enabling them to establish an infrastructure that is a pre-requisite for attracting other investors.

    Comments:
    Today's (Sept 2, 2011) jobs report shows no growth in jobs. The president plans on giving a speech next week on jobs. The NIH can help by putting a significant amount (say $500 million) in SBIR Phase II +, for clinical studies and initial hospital testing, half to devices and half to drugs. Small businesses and major hospitals will benefit. Most importantly, the country will benefit from the new products entering the market and resulting jobs.

    Also, the NIH, Food and Drug Administration (FDA), Center for Medicare and Medicaid Services (CMS), Veterans Administration (VA), and Department of Defense (DoD) should meet quarterly to develop strategies to get these new products into commerce to:

    1. Improve the health of warfighters and veterans.
    2. Improve the health of all Americans and the World.
    3. Lower health care costs.
    4. Improve the American Economy.
    5. Create new jobs for Americans.

     


    Biomarkers for chronic pain using functional brain connectivity

    Nominator: NIDA
    Participating IC: NCCAM

    Major obstacle/challenge to overcome: Chronic pain is a debilitating condition affecting at least 116 million American adults, resulting in significantly reduced quality of life and an estimated annual cost of $560–635 billion.1 Unfortunately, its assessment is based solely on subjective self-report, using limited scales or measures, which are unsuitable for elucidating the different types and causes of pain (i.e., pain endophenotypes) and for rigorously evaluating the impact of targeted interventions. Self-report measures also hamper progress in the monitoring required to precisely dose a medication and then evaluate its comparative effectiveness among different individuals. Also, importantly, the field of pain management has long been challenged by the twin fears of undertreating pain in those who are suffering vs. triggering or facilitating a drug problem. Because of all these obstacles, there is a pressing need for a standardized, brief, and simple measurement that can translate, or at least reproducibly correlate, subjective pain experience into objective and quantitative readings for both clinical and research purposes.

    Emerging scientific opportunity ripe for Common Fund investment: In functional neuroimaging, there has been a recent explosion of findings on functional connectivity (FC) between brain regions, especially in the resting state (RSFC), which is defined as the signal coherence between discrete brain regions in the absence of a cognitive task. RSFC has uncovered discrete functional networks, where the strength or activity coherence can be quantified. Based on recent reports of differences in intrinsic brain network connectivity between patients with chronic pain and controls, it has been suggested that RSFC could be a suitable platform to develop objective biomarkers of pain. Moreover, the recent expansion of neuroimaging data sharing, especially of RSFC data in the 1000 Functional Connectomes Project, has demonstrated that data from different sources can be pooled to define subtypes of populations stratified by age, gender, medical conditions, and other variables, to enhance statistical power for discovery. If this level of between-labs consistency turns out to also apply to pain-related measurements, RSFC could revolutionize the field of pain research and management.
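    For orientation, the core RSFC quantity is straightforward: each brain region of interest (ROI) yields a time series from the resting-state scan, and connectivity is typically summarized as the pairwise correlation between those time series. A minimal sketch follows (Python; the data are simulated, and real pipelines involve extensive preprocessing, nuisance regression, and group statistics not shown here).

        # Minimal resting-state functional connectivity sketch: from ROI time
        # series (simulated here), compute the ROI-by-ROI correlation matrix.
        import numpy as np

        rng = np.random.default_rng(1)
        n_timepoints, n_rois = 240, 6                     # e.g., 240 volumes, 6 regions of interest
        shared = rng.standard_normal((n_timepoints, 1))   # a common "network" signal
        timeseries = 0.6 * shared + rng.standard_normal((n_timepoints, n_rois))

        fc_matrix = np.corrcoef(timeseries, rowvar=False)  # Pearson correlation per ROI pair
        fc_z = np.arctanh(fc_matrix - np.eye(n_rois))      # Fisher z (diagonal zeroed to avoid infinities)

        print(np.round(fc_matrix, 2))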

    Common Fund investment that could accelerate scientific progress in this field: Chronic pain is a clinical condition characteristic of a wide range of physical syndromes that collectively span the programmatic purview of many different Institutes and Centers (ICs). A request for applications (RFA) on this topic to fund five or six research project grants would enable multi-disciplinary teams (composed of pain clinicians, functional neuroimagers, and computational/network neuroscientists) to 1) develop techniques for image time-series analysis to identify brain RSFC signatures of different types of chronic pain, and 2) test the value of said signatures in a clinical context. For example, R21/R33 phased-innovation awards would enable initial collaborations to assess basic cross-sectional differences between controls and patients with different syndromes of chronic pain, and to develop and optimize new analytical tools for better identification of sensitive and specific RSFC biomarkers of pain. The Common Fund program concept would also enable comparative effectiveness research and data harmonization across funded projects, foster a consortium on pain RSFC biomarkers, and inform prospective evidence-based, personalized care of pain.

    Potential impact of Common Fund investment: Advances in image acquisition and data analytic approaches could yield a level of objectivity, sensitivity, and specificity that would be unprecedented for chronic pain. In theory, a single resting-state functional MRI scan could serve as a diagnostic procedure akin to a head MRI for brain cancer or other neurological diagnoses. Data derived from that scan may not only provide an objective and reliable marker, but also help identify optimum therapeutic approaches, lowering the costs and loss of productivity associated with ineffective pain treatments. Validation of pain biomarkers is critical in the development of pain medications and for the adequate use of prescription analgesics matched to the needs of individuals. When the proposed program achieves its objectives, the collaborative effort among funded projects will complete the characterization and validation phase of functional brain connectivity as biomarkers for chronic pain, helping to bring evidence-based, personalized management of pain closer to reality.

    1 Relieving Pain in America: A Blueprint for Transforming Prevention, Care, Education, and Research. Committee on Advancing Pain Research, Care, and Education; Board on Health Sciences Policy. Institute of Medicine of the National Academies (2011).

    Comments:
    This is a very timely concept. For many years, stimulus-driven fMRI has dominated the search for markers of pain in humans. With recent advances in DTI, resting-state MRI, evoked potentials, and MEG, the field has been opened up and accelerated. I think that it may be a mistake to limit the concept of pain biomarkers to brain activity, however; recent biochemical findings are likely to contribute as well, albeit ideally in concert with physiology.

     


     

    Pain “biomarker” is an oxymoron in the common vernacular. The quest for an “objective” measure of an inherently subjective phenomenon (pain) is likely to be a fruitless one. Those who portray self-report of pain as “unreliable” are mistaking variability for unreliability. First, pain is variable; it is not a static, trait-like experience. Second, there are lots of reasons for both between-person and within-person variability. There are many studies showing that the variability in self-report is predictable by other variables (50-60% of the variance, in fact). Therefore, by definition, the variability is not error or unreliability. Finally, we must remember that brain imaging studies of pain were validated against self-report. There are reasons to do brain imaging (and CNS imaging of pain), but it is not to find a biomarker. The biomarker is not what brings patients to the clinic, nor will it be the reason they are satisfied with their care. Embrace self-report and explain variability as your research goals.

     


     

    An objective measure of pain is the Holy Grail of our field. Pain, by definition, is phenomenal experience that encompasses not only sensory awareness but negative emotion and cognition as well. It will not reduce conveniently to pure sensation. It may prove difficult to identify a brain biomarker for pain that is not also a biomarker for somatically focused threat or attention.

    Biomarkers, like other measurement tools, must pass muster on two criteria: validity and reliability. A biomarker is valid if, and only if, it truly measures what it purports to measure, namely pain. Subjective and other behavioral measures are the only criteria we have for pain validation. A biomarker is reliable to the extent that it performs consistently between and within individuals. There are multiple forms of validity and reliability. I hope the proposal evolves to emphasize the central importance of systematically satisfying these criteria.

     


     

    Agreed on both the need for physiologic measures of pain and the potential utility of RSFC as an objective biomarker. Combining RSFC with other trait-based imaging modalities, e.g. voxel-based morphometry (VBM) or diffusion tensor imaging (DTI), may provide greater accuracy.

    The R21/R33 mechanism lends itself well to this work, as previous comments have mentioned. It could also be addressed with traditional R01 mechanisms.

     


     

    This is a particularly important idea. I would encourage the proposal to go beyond just using the image as a biomarker or predictive tool, but to use the image, and the regions that are dysfunctional, as a map to dictate which of the new non-invasive brain stimulation therapies might be tried, and where they should be focused. There is exciting work suggesting that transcranial magnetic stimulation (TMS) or transcranial direct current stimulation (tDCS) have acute and chronic antinociceptive properties.

    This proposal might also be integrated with a new technology called biofeedback fMRI, where subjects are given a real-time display of their brain activity, and then are trained in how to change the signal. We are using this now in the area of craving, but using it in pain is another likely fruitful area.

     

    I believe that this is an outstanding proposal because it goes to the core of the problem, which is the subjectivity of the pain symptom. It is not unusual for care to be denied to patients because there are no objective findings to support the patient’s complaint. The proposed strategy might also assist with chronic therapies, because changes in function, excitability, and plasticity related to pain improvement could be documented over time. As some colleagues comment, the recent development of non-invasive technologies including tDCS and TMS would benefit from these strategies as well.

     


     

    I am strongly in favor of this proposal, and the focus on functional connectivity is crucial. The neuroimaging field is rapidly moving toward identifying functional brain networks in health and disease, whereas until recently - and still common to most pain neuroimaging research - the goal has been to identify the roles of individual regions rather than networks. The proposed program would enhance the focus on brain networks in pain and could lead to new treatment approaches. I agree with the comment above that the addition of TMS and tDCS to manipulate these networks would also be very useful.

    I agree with the above comments and would like to add the importance of examining functional connectivity in animal models of pain. Recently there has been concern about the predictive validity of current animal models of behavioral disorders. Emphasis has been placed on going beyond behavioral endpoints and deconstructing pain symptom based syndromes into biological endophenotypes. Functional brain mapping, and specifically connectivity analysis, has been proposed as such an endophenotype. The possibility of a cross-translation of brain endophenotypes from humans into rodents may guide new conceptualizations of core pain syndromes. The development of animal models that mimic pain disorders at the circuit rather than behavioral level may also facilitate new therapeutic strategies.

     


     

    I would first say that the proposed idea identifies a laudable goal, and I expect many of us pain imagers feel this way. If we could really identify a neural “code” for the experience of pain, we’d have a conceptual breakthrough, and a great tool for many lines of research. However, several of the caveats offered are warranted. Restricting focus to functional connectivity, even more so to RSN connectivity, is too narrow. I’ll point out that this is only the latest of several physiological measures that has been offered as an “objective measure of pain” over the last few decades. Maybe we are closer to this holy grail this time, but I’d say we’re not close enough to put all our money on one horse.

     


     

    Given the difficulty in identifying a physiological or psychological etiology in many patients, especially older adults, the American Geriatrics Society hopes that this research will further explore chronic pain as a diagnosis in its own right.

    Proposed Common Fund Research Idea: Biomarkers for chronic pain using functional brain connectivity

    Proposed Common Fund Research Idea: Chronic Pain Conditions: A Transformative Classification for Stimulating Research, Improving Diagnosis, and Personalizing Treatment

    • AGS Feedback: This proposal puts excessive emphasis on functional neuroimaging and should also look at serological and genomic markers associated with pain, which must be emphasized in such a project. AGS recommends that this research idea be melded with markers of frailty in older adults as a way to gauge changes in pain associated with aging.
    • Our recommendation is to consider older adults, especially those with multiple chronic and often complex conditions as a focus of any research that attempts to reclassify the diagnosis and treatment of pain.
    • Proposed Common Fund Research Idea: Venture Fund for Research and Development of New Medications to Treat Chronic Pain (see “NIH Award Strategies” in Innovation Brainstorm ideas)
    • AGS Feedback: Because this proposal indicates that a bio-psychosocial definition is needed for pain, AGS is concerned that this type of approach may overlook the vulnerable and multi-comorbid aging population. Psychosocial models may involve pain, but to overemphasize such in the determination of biological/physiological origins may render psychiatric patients vulnerable to treatments that do not reflect organic causes of pain. Vulnerable elders may be particularly susceptible to such bias.

     

    AGS recognizes the need for collaboration between academia and industry moving forward in relation to research and development around new medications to treat chronic pain. Any such collaborative programs must have inclusion of aging populations and take into account special issues relevant to the geriatric population, like multi-morbidity, as part of such efforts.

     


     

    Aside from exploring the possibility of novel biomarkers for pain, this proposal has the potential to shed significant insight into the underlying mechanisms. One concern is that it is limited to chronic pain. An expansion to include acute, experimental pain would provide a foundation for the development of these novel approaches using carefully controlled stimulation. Given the tremendous heterogeneity of symptoms within a single chronic pain disorder, having a relatively clean pain state appears critical for assessment of the feasibility and validation of these techniques.

     


     

    Excellent and timely proposal. The ideal pain biomarker should be non-invasive, practical, relatively cheap, objective, and preferably targeted to the brain, especially for chronic neuropathic conditions:

    1. Why not include field potential measurements, such as EEG, which could be coupled to quantitative analysis of low- and high-band frequencies? EEG meets all of the above criteria and is notably cheaper and more practical than imaging.

    2. Why disregard animal studies? The field of pain research suffers from unreliable models, while the attrition rate of clinical studies is near 90%. A pain biomarker that could measure spontaneous and evoked pain in animal pain models would be a huge benefit for researchers and clinicians, in academia and industry alike.
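
    The EEG suggestion in point 1 can be made concrete with standard spectral analysis: band-limited power is read off a power spectral density estimate. A minimal, hypothetical sketch using SciPy, with a simulated single-channel recording and an assumed 250 Hz sampling rate:

        import numpy as np
        from scipy.signal import welch

        fs = 250.0                               # assumed sampling rate (Hz)
        rng = np.random.default_rng(1)
        eeg = rng.standard_normal(int(60 * fs))  # stand-in for one minute of a single EEG channel

        # Power spectral density via Welch's method.
        freqs, psd = welch(eeg, fs=fs, nperseg=1024)

        def band_power(low, high):
            """Approximate band power by summing the PSD over a frequency band."""
            mask = (freqs >= low) & (freqs < high)
            return psd[mask].sum() * (freqs[1] - freqs[0])

        low_band = band_power(4, 12)    # e.g., theta/alpha range
        high_band = band_power(30, 80)  # e.g., gamma range
        print(low_band, high_band, high_band / low_band)

    The same quantities can be computed from rodent field potentials, which is one way the animal-model concern in point 2 could share a readout with clinical studies.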

     


     

    Despite the recognition of this topic as relevant, important and timely, not one of the above scientists has suggested using actual pain patients as models. Individuals suffering from chronic pain are desperate to be heard and taken seriously by the medical community. Asking their opinions and gathering and analyzing the data (such as has been done in public health focus groups) would be beneficial to both the patients and the researchers. A well-designed clinical trial devoted to chronic pain management would be enlightening and possibly present a breakthrough. I'm disappointed that all comments disregard self-reports as unreliable and prefer biomarkers to patient input. Seems like a disconnect.

     


     

    I am glad to see that efforts are underway by the NIH to find additional ways of funding pain research, given the tremendous impact of pain on U.S. health. While subjective report of pain is clearly important, there are certain populations (e.g., non-verbal patients) for which biomarkers could play an important role in the effective assessment and treatment of pain. Biomarkers also have the potential to help us identify who may be at risk for developing chronic pain. I would personally like to see a more general call for pain biomarkers that includes RSFC/fMRI, but also other physiological techniques.

     

    Up to Top

    Centers for Research and Training in Quantitative and Systems Pharmacology

    Nominator: NIGMS

    Major obstacle/challenge to overcome: The focus of most of modern drug discovery has been on creating Ehrlich’s “magic bullet,” a drug that would hit only one target in the body, effecting the desired change. This model has been enormously helpful and has led to many drug successes. Yet even today, ninety percent of investigational drugs—i.e., those approved for trial—fail before being approved for use in patients. Many of these failures occur at the Phase II clinical trial level of development, making them very costly, and the majority fail for lack of efficacy. Furthermore, all drugs have unintended effects as well as intended ones, and drugs show person-to-person variability in their effectiveness and toxicities. Clearly, we can identify potential targets and make high-affinity ligands for them, but we lack 1) a comprehensive knowledge of the role of these targets in human physiology and disease, 2) a quantitative and multi-scale understanding of how the targets modulate each other, and 3) an understanding of how hitting more than one target sums to produce an observable phenotypic change. Industry scientists confirm that these are major impediments for drug discovery and development in most therapeutic areas.

    We submit that there is a profound need to make a major shift in our approach to drug discovery and understanding drug action to fill in the context within which targets operate and how they produce their therapeutic action and side effects. This proposal is not meant to encompass all of systems biology or pharmacology, but to add a more quantitative and integrative perspective to allow a systems-level understanding of drug action. Academic pharmacology, for the past thirty to forty years, has been largely focused on molecular pharmacology, providing an in-depth understanding of individual molecular targets within the body. There has also been a diminished level of academic research in clinical pharmacology, the discipline most aligned with understanding drug action. There is now a timely and urgent need to stimulate Quantitative and Systems Pharmacology (QSP), an emerging discipline that proposes to build from an understanding of a drug's molecular interactions to an understanding of its temporal and dynamic modulation of cellular networks, impact on human pathophysiology, and optimal use in the clinic. QSP builds upon classical and molecular pharmacology by adding omics approaches not available in earlier periods and recent modeling approaches that enable the analysis of high-volume data. It adds the horizontal integration and numerical quantitation of biological processes and mechanisms provided by systems biology and the vertical integration and statistical approaches characterized by pharmacodynamic/pharmacokinetic (PD/PK) modeling and clinical pharmacology. It is necessarily multi-disciplinary and highly integrative, operating across the biological hierarchy from biochemistry and cells to tissues and whole organs to animal studies and human patients. Furthermore, for research to move rapidly in this emerging area, there must be a scientific workforce to drive it; currently, this is also an underserved need.

    Emerging scientific opportunity ripe for Common Fund investment: This recommendation arises from two workshops held at the NIH in 2008 and 2010 (http://meetings.nigms.nih.gov/?ID=8316), with participants from academia, industry, and government, and a follow-up white paper. The stimulus for the workshops was the lack of integration taking place between the pharmacology and the systems biology being supported by NIH and the need to address the poor success rate in drug discovery and development. The workshops brought together researchers in pharmacology, systems biology, pharmacokinetics/pharmacodynamics, computer modeling, and related areas, with a focus on how systems biology was contributing to drug discovery and understanding of drug action now and in the future. The major result of the first meeting was a strong recommendation to repeat the workshop. The attendees recognized that they had something to offer each other, but felt they were currently far apart. The second workshop focused on three different therapeutic areas with the idea that they could learn from each other’s successes and failures. That QSP encompasses cells, organs, and virtually all therapeutic areas, with underlying principles to be discovered that span all of these, makes it highly appropriate for a Common Fund proposal. The opportunity now exists to bring together researchers in these various areas in a common effort to expand our knowledge of drug action beyond drug-target interactions to an understanding of how to use drugs alone or in combination to control biological systems that can produce shifts between disease and healthy phenotypes.

    Common Fund investment that could accelerate scientific progress in this field: The purpose of this Common Fund idea is to promote the use of QSP approaches for the study and elimination of a major roadblock in drug discovery and development, the complexity of drug targeting. The centers mechanism is recommended to facilitate collaborative development of pioneering research, research training, and outreach programs in this emerging area and therefore stimulate the field as a whole. The focus of the centers should be the generation and testing of new ideas in QSP. The primary justification for centers is the need for integration of research plus training, and integration across levels of biological organization, across scientific disciplines, and across therapeutic areas. We believe that a community is emerging from the cognate disciplines that is highly motivated and ready for this effort to begin. Initially, there will be limiting factors as outlined below in areas for exploration, but there exists the opportunity to package what exists and leverage it to greatly boost research in this area. It is also suggested that the opportunity exists to employ industry academic partnerships to fully engage the expertise and creativity of industrial partners in idea generation and testing that will benefit all sectors.

    Some of the immediate needs in QSP include the following areas thought to be ripe for exploration via collaborative/integrative centers: 1) the quantitative characterization of drug targets; 2) factors affecting patient response variability; 3) better animal and tissue models; 4) re-connection of medicinal chemistry and tissue pharmacology; 5) information exchange formats extending from chemistry to electronic medical records; 6) better computational models with pharmacological mechanisms; 7) development of systems approaches to failure analysis; 8) defining of core competencies for training in QSP; 9) development of pedagogical resources; 10) novel formats for training, including that for established investigators; and 11) novel academic industry partnerships covering research and training.

    Potential impact of Common Fund investment: The primary results of a successful QSP centers program are expected to include: 1) major advances in the fundamental understanding of how drugs act; 2) more direct translation of discoveries made in cells to patients; 3) improved biomarkers that assay directly the effects of drugs in tissues and patients; 4) a stronger scientific basis for multi-drug therapy and re-purposing of existing drugs and drug candidates abandoned in development; 5) a more rational basis for polypharmacy and predicting drug-drug interactions; 6) a higher success rate for new drug candidates successfully entering the market place with acceptable toxicities and predictive variability among patient types; 7) higher rates of success of clinical trials; and 8) a stronger investigator pool for academia and industry and a new generation of leaders in academic and industrial pharmacology.

    Comments:
    I enthusiastically support this idea and the opportunity it presents for furthering knowledge on the influences of circadian rhythms in disease and treatment. Inherent in Systems Pharmacology is the study of daily rhythms in drug action as well as daily rhythms in the signs and symptoms of disease. A valuable feature of this proposed idea will be enhanced study and training in time-of-day factors that affect drug action and disease manifestations. For centuries, it has been recognized that mortalities and morbid events associated with most diseases do not occur randomly throughout the 24-hour day. Rather, such occurrences tend to cluster at predictable intervals within each 24-hour day. Examples are the 2- to 3-fold increase in the risk of heart attacks in the morning after getting out of bed and the 3-fold increase in the risk of stroke between 6 AM and noon. Round-the-clock studies of drug action, sometimes termed ‘chronopharmacology’, have demonstrated that the effects of drugs also wax and wane throughout the 24-hour day. For example, from one time point to another in the circadian rhythm, it is common to find 3-fold differences in the toxicity of certain medications. Furthermore, genes, physiological systems, and cellular metabolic systems all show periodicities throughout the 24-hour day, characterized by higher and lower activity and sensitivity to regulatory factors. It stands to reason that the efficacy of pharmacological agents can vary with temporally changing properties of the target system. The rudimentary chronopharmacologic data presently available indicate that pharmacotherapies are more effective if therapeutic drug action is temporally concentrated at the times of day when it is most needed. However, efforts to translate chronopharmacologic knowledge into the practice of medicine have been infrequent and limited. Drug delivery methods, such as timed-release formulations and programmable pumps, are among the few current approaches to temporally focus drug action. Major obstacles for translating chronopharmacology into mainstream medicine are the patient’s circadian rhythms in response to the agent’s therapeutic and toxic effects, the need for more precise tissue targets for drugs, and the need for drugs with more precise effects on abnormally functioning tissues. Another obstacle is a general lack of willingness to challenge the dogma that drug effects are constant throughout the 24-hour day and to undertake the increased expense of studying efficacy and toxicity hour by hour round the clock.

     


     

    Progress in identifying the genetic control of circadian rhythms promises to fill gaps in mechanistic understanding of daily peaks and troughs in disease-related adverse events. Modern circadian methods can now identify specific tissue targets and temporal patterns of responsiveness to therapeutic intervention. Recent technological advances allow comprehensive profiling of patients’ circadian patterns at the molecular level and identification of optimum times of day for pharmacotherapeutic intervention. However, these capabilities need to be expanded for regular use in drug development and clinical practice.

    Innovation is needed in the way in which we study drug effects. Examples include:

      • 24-hour exploration of therapeutic and toxic effects of currently available drugs and new agents.
      • More precise identification of specific tissue targets for pharmacotherapies.
      • Improved drug delivery systems that optimize therapeutic effect and minimize adverse and toxic effects throughout the 24-hour day.
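
      The first bullet, round-the-clock exploration of drug effects, can be summarized quantitatively with a simple cosinor (cosine) fit to responses measured over the day. A hypothetical sketch in Python, with invented measurements:

          import numpy as np

          # Hypothetical drug-response measurements taken every 2 hours over one day.
          hours = np.arange(0, 24, 2)
          response = np.array([5.1, 4.6, 4.2, 4.0, 4.3, 5.0, 5.9, 6.5, 6.8, 6.4, 5.9, 5.4])

          # Cosinor model: response ~ mesor + a*cos(2*pi*t/24) + b*sin(2*pi*t/24), fit by least squares.
          t = 2 * np.pi * hours / 24.0
          X = np.column_stack([np.ones_like(t), np.cos(t), np.sin(t)])
          mesor, a, b = np.linalg.lstsq(X, response, rcond=None)[0]

          # Read the amplitude and peak time off the fitted rhythm.
          grid = np.linspace(0, 24, 241)
          fit = mesor + a * np.cos(2 * np.pi * grid / 24) + b * np.sin(2 * np.pi * grid / 24)
          print("mesor:", round(mesor, 2),
                "amplitude:", round((fit.max() - fit.min()) / 2, 2),
                "peak at hour:", round(grid[fit.argmax()], 1))
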
      This is an area that is likely to have high impact in the future. Although we are currently going through a difficult phase in drug development research, the future looks quite bright. The resources used to build databases like TCGA and COSMIC for cancers, and the other genome-wide association (GWA) studies now available through dbGaP, offer a rich source for network building and analysis to identify drug targets and predict adverse events. The information obtained from these large data sets by statistical analysis and network modeling needs to be translated into experimentally testable predictions for new drug targets. For this we will need to integrate the network models with dynamical modeling to understand drug disposition in the body and drug action in the context of different combinations of genomic and environmental backgrounds. Taking classical pharmacological analyses such as PK/PD models to the next level by integrating quantitative and systems pharmacology approaches has the potential to be a real game changer in converting biological knowledge into useful therapeutics for personalized medicine. The proposed centers can be a great vehicle to catalyze the conversion of knowledge from systems biology into new drug targets and drugs for many complex diseases.
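
      As a toy illustration of the dynamical, PK/PD layer this comment calls for, the sketch below couples one-compartment oral kinetics to a sigmoidal Emax effect model; every parameter value is invented for illustration only:

          import numpy as np

          # One-compartment PK with first-order absorption and elimination (illustrative parameters).
          dose_mg, F = 100.0, 0.8          # dose and oral bioavailability
          ka, ke, Vd = 1.2, 0.25, 40.0     # absorption rate (1/h), elimination rate (1/h), volume (L)

          t = np.linspace(0, 24, 97)       # hours after a single dose
          conc = (F * dose_mg * ka) / (Vd * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

          # PD: sigmoidal Emax model linking concentration to effect.
          Emax, EC50, hill = 1.0, 1.5, 2.0
          effect = Emax * conc**hill / (EC50**hill + conc**hill)

          print("peak concentration (mg/L):", round(conc.max(), 2),
                "peak effect (fraction of Emax):", round(effect.max(), 2))

      In the integrative centers envisioned here, the single empirical effect term would be replaced by network- or pathway-level models informed by the genomic resources mentioned above.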

       


       

      I enthusiastically support this proposal for all the reasons noted in the description and amplified by other comments. In addition, I would highlight the opportunities the program would create for training-education innovation. These programs would provide a platform for preparing a new generation of scientists with comfort and expertise in designing and executing projects on complex diseases and therapeutic strategies using quantitative, computational and engineering principles and systems approaches to work with the diverse and large sets of data that can be obtained from experimental and clinical sciences.

       


       

      A federal focus on QSP is badly needed, and it should live outside of organ-specific institutes. A more complete understanding of the systems that drugs interact with will yield better predictions about what those drugs will do in living systems. But as Yogi Berra may have said, “It's tough to make predictions, especially about the future.” These predictions will always need to be tested, so the translational nature of QSP needs to extend to studies of drug effects in populations. An added benefit is that population studies often yield hypotheses to be studied in the lab.

       


       

      This is probably one of the most important proposals among all in the Common Fund. Improvement of human health and treatment of disease are central to the NIH mission, and many of the successes of modern medicine have come from the discovery of effective drugs. Without diminishing the significance of the proposal as described, I would stress that, insofar as most drugs work by perturbing molecular and cellular systems, a Center in Quantitative and Systems Pharmacology must facilitate approaches that seek to 1) describe in quantitative and relational detail the drug response of disease-specific molecular and cellular systems, and 2) develop analytical models to distill these complex phenomena into clinically relevant concepts.

       

      I could not agree more with this comment. Analytical models, computationally based, are needed to accelerate new drug design for specific diseases at the molecular-structure-cell level. These interactions are very complex, far too complex to be studied only with in vitro and in vivo experiments. Realistic, quantitative, predictive models are needed to discover new drug targets. Computer-based models will speed up the time to discovery, be more cost efficient, and open up new paradigms for treating diseases. A shift needs to be made away from drugs targeting only the cell and towards the interaction between the cell and the extracellular matrix (ECM), such as in tumor metastasis. Analytical models have the ability to do this, and to do it well. However, validation will require experimental approaches, though these can be limited to the promising drug targets defined by the predictive models.

       


       

      For success it will be important to support the training of individuals who can study the effects of drugs in man (in the broadest sense) in a multidisciplinary way. The existing NIH T32 training programs in clinical pharmacology through NIGMS are a great start, and if expanded, could form the foundation for this training.

       


       

      The key to success of this proposal is enabling the measurement of drug effects in various tissues (and cell types within those tissues) throughout the body, not computationally predicting those effects. The reference to Ehrlich in the introduction to this proposal is insightful, because we have learned since that time the huge impact of the context of drug-receptor interaction, and we have learned how little we understand about that context.

      By developing biomarker and imaging techniques to quantify the effect of a drug (PK/PD) on relevant pathways--not just the receptor--throughout the body, we will generate a better understanding of how that drug affects physiology differently in different cells/tissues in the body. This is how we will better understand the relationship between dose, efficacy, and undesired effects. This requires bringing together experts in pharmacology, physiology, cell biology, medicine, and technical experts to enable the advances required to make accurate and meaningful measurements over time, not at one instant. Doing so will be truly impactful to our understanding of pharmacology.

      Computational approaches will become important only after real data are generated. Each assumption that must be made when generating a model must be considered nothing more than a guess--and more likely than not an inaccurate guess--until the assumption can be tested. Therefore, any computational modeling must wait until the technology to test assumptions in vivo has matured. Systems biology must be studied in vivo, not in silico.

       


       

      This is one of the most timely ideas among those proposed for the Common Fund. It is trans-disease and hence would qualify to be trans-NIH. Traditional pharmacology has been successfully rooted in the notion of targeting a single molecule in physiology to alleviate pathology. While most drugs have worked in an exemplary manner in acting on the target molecule, there is increasing recognition that most drugs also do “more than” alleviate the targeted pathology, and some of this “more than” can be deleterious. This is increasingly leading to FDA drug withdrawals and warnings. A second consideration is the increasingly apparent concept that one size (one drug) does not fit all. To understand what is a good drug and at what doses, it is essential that we have a well-defined notion of responders and non-responders. With increasing genotyping and phenotyping capabilities, this is within the realm of characterization.

      The above considerations recommend a course of action for “future pharmacology” along the following lines: the design and use of drugs will entail the integration of multi-scale normal physiology (molecular pathways, cellular networks, tissue morphology and remodeling, and organ function) with multi-scale ‘omics and phenotype measurements, and with the alterations of normal physiology in an individual’s context. Such integration will provide the framework both for designing drugs that intervene at the systems level and for titrating them to suit an individual’s response, to the extent the detailed analysis supports this. Most importantly, such considerations will be carried out at a quantitative or semi-quantitative level, as opposed to only a qualitative level.

       


       

      To make the most of the animal data that exist, we also need to become adept at performing systematic reviews of the experimental literature. Even better would be to commit funding to establish an ever-expanding database of such reviews, which would be freely searchable and available (like the Cochrane Library for clinical data).

       

      Up to Top

      Chronic Pain Conditions: A Transformative Classification for Stimulating Research, Improving Diagnosis, and Personalizing Treatment


      Nominator: NIDCR
      Participating IC: Pain Consortium

      Major obstacle/challenge to overcome: Chronic pain conditions afflict as many as one-third of the US population and incur $560-635 billion per year in incremental healthcare costs and lost productivity (IOM Report June 29, 2011). The long term clinical goal in alleviating chronic pain is to develop targeted therapies and identify patients responsive to these therapies, both of which are supported by etiological- and mechanism-based case definitions and diagnostic criteria of disease. A major challenge in the field is the lack of a mechanism-based case definition and diagnostic criteria for multiple chronic pain conditions. Common Fund investments could facilitate the development of a new objective, biopsychosocial classification system for chronic pain disorders to overcome this major obstacle. This new system will accelerate research by standardizing research diagnoses used across laboratories, enhance clinical diagnoses by developing more objective, mechanism-based measures of disease, and identify subjects responsive to new therapies by developing novel biomarkers of disease and clinical outcomes.

      Emerging scientific opportunity ripe for Common Fund investment: We propose a research program to develop a new, comprehensive, mechanism-based, biopsychosocial classification of chronic pain conditions. Three opportunities are ready for Common Fund investment. This proposal endorses the ideas and sharpens the focus of “Molecular Classification of Disease”, a topic that emerged from the Innovation Brainstorm meeting, and takes on sophisticated data management and analysis elements of the topics on “Beyond GWAS” and “Cross-Cutting Issues in Computation and Informatics”.

      Common Fund investment that could accelerate scientific progress in this field: This program would create a centralized data bank/repository containing information from a large chronic pain cohort to include study subjects with Temporomandibular Joint Disorders, Fibromyalgia, Chronic Fatigue Syndrome, Vulvodynia, Endometriosis, Irritable Bowel Syndrome, Interstitial Cystitis, Headache, Low Back Pain, Arthritis, etc., recruited and identified using today’s best diagnostic criteria. Many of these subjects will have multiple, comorbid chronic pain conditions. This cohort would be genotyped as well as phenotyped extensively using molecular, imaging and psychosocial methodologies. All data would be agnostically analyzed via pathway analyses and new algorithms for lumping and splitting in order to subtype and re-classify these chronic pain patients. Results emerging from the Common Fund incubator space would lead to a breakdown in the current “walls” separating these disorders (and researchers) and a transformation of diagnostic criteria based on a completely new classification of chronic pain conditions. After an intense 5 year effort, the data bank/repository and analytical tool set would become self sustaining with support from Pharma, the genotyping industry, and the NIH Pain Consortium.
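
      In computational terms, the “lumping and splitting” described above is unsupervised clustering of deeply phenotyped subjects, with the resulting groups compared back against today’s diagnostic labels. A hypothetical sketch using scikit-learn, with simulated phenotype data standing in for the real cohort:

          import numpy as np
          from sklearn.preprocessing import StandardScaler
          from sklearn.cluster import AgglomerativeClustering

          # Simulated phenotype matrix: rows = subjects, columns = measures such as
          # pain thresholds, psychosocial scores, imaging metrics, or molecular markers.
          rng = np.random.default_rng(42)
          phenotypes = rng.standard_normal((300, 12))

          # Agnostic subtyping: scale the features, then cluster without regard to clinical diagnosis.
          X = StandardScaler().fit_transform(phenotypes)
          labels = AgglomerativeClustering(n_clusters=4, linkage="ward").fit_predict(X)

          # Cluster membership can then be cross-tabulated against the original diagnoses
          # to see where the current "walls" between disorders do and do not hold.
          print(np.bincount(labels))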

      Potential impact of Common Fund investment: The outcome of this project will be a completely new way of discovery and management of chronic pain conditions: researchers currently housed in different laboratories collaborating in multidisciplinary teams to study pain, rapid discovery of therapeutic targets, development of novel analgesic therapies based on common mechanisms of disease, introduction of individualized medical treatments and identification of those likely to respond to therapy. Ultimately, results from this project will lead to an overall reduction in the burden of chronic pain, currently $560-$635 billion/year in the US in incremental healthcare costs and lost productivity.
      Chronic pain should be thought of as a disease unto itself, like other chronic conditions such as diabetes and heart disease, and not merely a symptom of disease. Research approaches to and management of chronic pain conditions must consider that, like other chronic conditions, disease progression and complexity, early identification and intervention, and effective therapies all influence patient burden and the economic costs of disease. A transformative classification of chronic pain conditions will ultimately reduce long-term morbidity and decrease the economic impact of these widespread conditions.

      Comments:
      A shift from a disease-based to a mechanism-based classification of pain could potentially be transformative and is to be very strongly encouraged. This will require a large multicenter collaborative effort, including precompetitive involvement of industry, but it needs to be fully transparent, with easy, open access to all data, even as the study progresses. A major goal needs to be to identify who is at risk and who responds to treatment.

      The challenge will be in standardizing the phenotyping of patients and in identifying potential confounding factors as well as in assembling large enough cohorts with appropriately matched controls.

      The start of such an effort will need to be the development of methodology to measure the pain phenotype using unbiased techniques (not the existing scales) and the validation of their sensitivity and specificity for detecting or reflecting pain mechanisms.

      Critical will be the choice of which patients to begin with for such a large chronic pain cohort. I would suggest that Temporomandibular Joint Disorders, Fibromyalgia, Chronic Fatigue Syndrome, Vulvodynia, Endometriosis, Irritable Bowel Syndrome, Interstitial Cystitis, and Low Back Pain should be the very last. These are conditions where the etiology and pathophysiology are poorly understood and the diagnostic criteria vague or uncertain. Surely it would be much more instructive to look first at conditions like post-herpetic neuralgia, post-surgical pain, peripheral diabetic neuropathy, and osteoarthritis, where the psychosocial elements are relatively smaller and the links between defined disorders and disturbances in the function of the nervous system, as well as genotypic associations, are likely to be much easier to identify. Then will be the time to tackle these other pain hypersensitivity syndromes.

      In addition to looking for common polymorphisms that may contribute to pain sensitivity, the susceptibility to developing chronic pain, and the response to treatment, a capacity for identifying patients with rare genetic mutations that produce a large phenotype must be included, using whole-exome sequencing.

      Given the advances in next-generation sequencing, it is time to move beyond SNP analyses, and capacity must be retained to perform functional studies on immortalized lymphoblasts/fibroblasts or induced pluripotent stem (iPS) cells.

       

      Having been part of some of the brainstorming meetings that ultimately led to the proposed idea, I respectfully disagree with the preceding comment. In my opinion it is particularly the various persistent pain syndromes that would most benefit from a shift from a symptom-criteria-based to an endophenotype-based approach. It is becoming increasingly clear to all involved parties (NIH, industry, investigators) that little “actionable” progress has been made in this area, in which each syndrome is myopically viewed from a particular subspecialty while the commonality of the syndromes is ignored. Extensive behavioral and biomedical endophenotyping of a large cohort of such patients (composed of a wide variety of the symptom-classified syndromes), followed by advanced biomathematical modeling aimed at detecting endophenotype-based clusters, seems the only way to move beyond the current stalemate. Such an approach would generate new disease entities, which can then be studied in depth with hypothesis-driven genetic and molecular approaches.

      I must also respectfully disagree with these comments. I believe he has it exactly backwards. Those conditions with the greatest variability in diagnostic signs, complicated etiologies, and controversial patho-psychophysiologies are the perfect candidates for the proposed project.

      FROM ORIGINAL POSTER: My issue is not that persistent pain syndromes need study, including attempts at endophenotyping and eventually genotyping, but that our knowledge base for them is so much smaller than for many other chronic pain conditions that it is perhaps premature to use limited resources to study them now, when they remain so poorly characterized clinically and even more so pathophysiologically. If there are no gold-standard clinical diagnostic criteria, it may be very difficult or impossible to collect the kind of homogeneous cohorts that are required for a GWAS. There is, I would gently argue, lower-lying fruit that may be much more suitable for initial phenotype-genotype pain studies: conditions with less of a behavioral/psychosocial component, where phenotypic clusters have already been revealed in several independent studies, most notably the German Neuropathic Pain cohort. We need, I argue, similar pilot data showing the nature of any endophenotype clusters for the persistent pain syndromes, as well as more epidemiological data on their co-morbidity and heritability, before investing in large cohort studies.

       


       

      Currently, evidence-based medicine is used to manage people with chronic pain; i.e., each individual’s clinical decision-making process is based on the treatments and outcomes of other people. The flaw in this approach is that the risk of chronic pain and the response to treatment vary from person to person due to variations in genetic makeup, environmental exposure, and insult to the body.

      “Personalized medicine” is the new model emphasizing the customization of healthcare, with all decisions and practices tailored to the individual patient based on their biological characteristics. Personalized medicine is now being translated from research, where innovations are designed to customize care and move toward a predictive, preventive, and personalized healthcare system. The success of any type of personalized medicine is critically dependent on reliable and increasingly precise diagnosis and monitoring of therapy.

      Our international consortium is now translating proton magnetic resonance (MR) innovations to allow personalized medicine to be effective for clinical use in areas including cancer, pain and head injury. The MR spectroscopy approach allows the brain chemistry of individuals to be monitored over time. For those with chronic pain there are unique markers reflective of long term pain. Once sufficient numbers of patients are examined in each pain category the MRS method can be analyzed by robust mathematical methods thus removing the need for a reader.

      Thus, with the capacity to monitor these biomarkers, a watershed moment exists for chronic pain sufferers. A personalized medicine approach will allow an understanding of pain biology. It will also crucially assist those clinical trials aimed at testing new therapies using a scientific approach that the FDA will find acceptable. Such an outcome measure will greatly improve the chances of success for new drugs; likewise, it will help limit the use of potentially harmful and ineffective ones in the future.

      Currently we are able to look at one brain region at a time using regions identified by other methods such as fMRI. However since it is becoming accepted that MRS can identify the earliest changes in human biology, we are gearing our technology development towards collecting this data from small voxels in a grid covering the entire brain. This will provide an independent assessment of the biological changes during the pain process and therapy.

      Using a different type of spectroscopy, based on the carbon-13 nucleus, there are methods in place to determine exactly where in the Krebs cycle these biochemical dysfunctions in the brain occur. This technology is already in operation to identify biochemical dysfunction in other neurological diseases.

      In conclusion, our group supports the prior comment that “using other techniques such as DTI or MR spectroscopy will more likely go beyond the question of ‘where’ and answer the deeper question of ‘why’ posed by patients and clinicians alike.”
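
      The reader-free, “robust mathematical” analysis of MR spectra envisioned in the preceding comment could, at its simplest, be a cross-validated classifier trained on metabolite features. A hypothetical sketch with scikit-learn; the features and labels here are simulated noise, so the reported accuracy hovers near chance:

          import numpy as np
          from sklearn.linear_model import LogisticRegression
          from sklearn.model_selection import cross_val_score

          # Simulated MRS features: rows = subjects, columns = metabolite ratios (e.g., NAA/Cr, Glx/Cr).
          rng = np.random.default_rng(7)
          features = rng.standard_normal((120, 6))
          labels = rng.integers(0, 2, size=120)    # chronic pain vs. pain-free control

          clf = LogisticRegression(max_iter=1000)
          scores = cross_val_score(clf, features, labels, cv=5)   # 5-fold cross-validated accuracy
          print("mean accuracy:", round(scores.mean(), 2))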

       


       

      Would it be possible from the beginning of this project to build in opportunities for follow-up of the cohort, and also to ensure appropriate representativeness of the group, so that longitudinal epidemiologic research might optimally be applied to the questions epidemiologic methods can address? These questions range from the basic natural history of chronic pain conditions to the exploration of biologic risk factors for progression, resolution, and long-term response to therapies. There is no question that chronic pain has received woefully inadequate attention, and as a result those who typically treat the condition are left with very few options accessible to patients besides opioids, for which there is also woefully little evidence on benefits and harms. This initiative could be a great first step in promoting research on pain and pain treatments.

       


       

      There is a bi-directional relationship between pain and sleep. Chronic pain is sleep-disruptive, and deficient sleep (in quantity and/or quality) also lowers pain thresholds. Therefore, important phenotypes to include are sleep timing, duration, and quality. Several previously validated subjective sleep tools and questionnaires are available, as well as previously validated objective sleep tools, including assays of melatonin (in saliva and blood) for characterizing circadian phase, activity measurement devices, and ambulatory electrophysiologic sleep measurement devices.

       


       

      Large multicenter initiatives to conduct biopsychosocial mechanism-based research in these complex and overlapping pain conditions are sorely needed. While investigators often agree that the biopsychosocial model is the most appropriate conceptualization of chronic pain, few studies actually attempt to implement the model. Rather, individual studies are designed to address specific components of the model (e.g. molecular mechanisms, psychosocial predictors). However, a key tenet of the model is that biological and psychosocial factors INTERACT to influence the pain experience, and research designed to capture and characterize these interactions is paramount. Indeed, new information regarding molecular mechanisms whereby psychosocial and environmental exposures are transduced could be of tremendous relevance to the pain field. It is imperative to perform large-scale studies to identify the unique and common biopsychosocial mechanisms that confer increased risk for developing these disabling pain conditions. Of particular importance will be careful calibration of research teams to ensure consistency in phenotyping methods across study centers.

       


       

      There seems to be a persistent, organic, reductionistic focus on investigation of potential underlying pathophysiology as the only route to better understanding and treating chronic pain disorders. A complementary way of investigating chronic pain is to add the richness of the experiential and functional impact of pain on the sufferer. Pain and its functional sequelae are complex psychophysical events that define pain and suffering as much more than an unpleasant sensory experience or avoidance of contact. The fact that pain often intrusively defines the sufferer, with profound and usually functionally limiting consequences, further traps the sufferer in a self-perpetuating spiral of pain, suffering, and dysfunction. The restoration of life to those suffering from chronic pain cannot be accomplished solely by the removal of the pain. It must also involve the restoration of a sense of integrity, worthiness, and the functional capacity to seek and achieve meaningful life goals. I believe that any transformative classification of chronic pain conditions for stimulating research, improving diagnosis, and personalizing treatment must incorporate the restoration of quality of life and functional capacity along with normalization of underlying anatomic and physiologic processes. An investment by the Common Fund could accelerate scientific progress by promoting the integration of alternative, patient-centered approaches with traditional medical approaches for diagnosing and treating chronic pain.

       

      I agree; reducing the burden of chronic pain most certainly will require an approach that avoids dichotomizing the mind-body continuum. Additional advantages of the large cohort study, if it includes banking of biologic samples, would be the ability to re-analyze data in light of subsequent biomarker and genetic hypotheses which may arise from such a cross-disciplinary effort. Seeking pathophysiologic commonalities across the broad spectrum of chronic pain anatomic diagnoses seems to me the only rational way to move the field forward.

       


       

      Of the over 100,000 individuals in the US with sickle cell disease, most adults and some children experience chronic pain as a typical clinical feature; it significantly impacts their quality of life and contributes to the substantial healthcare utilization, medication usage, and economic disadvantages of this patient population. Despite its clinical and patient-reported importance, very little is known about chronic pain in this disorder, as the NIH-funded natural history studies of sickle cell disease in the 1970s-80s did not study this aspect of pain symptomology. Modest progress is being made in understanding the contribution of other genetic factors to the variable severity of complications of this monogenic disorder, but the lack of pain phenotyping by mechanistic, endophenotype, or biopsychosocial approaches has largely prevented the scientific study of this predominant symptom complex in sickle cell disorders. Common Fund investment would be crucial in advancing this field by supporting collaborations among basic scientists, pain experts, and sickle cell investigators, who traditionally do not have the financial support of clinical revenues and frequently care for patients at resource-challenged inner-city or rural medical centers. Given the relative rarity of sickle cell disease, clinical trials of pain treatments for this disorder would benefit from testing therapies that have demonstrated effectiveness in other chronic pain disorders, but that will by necessity require large comparative studies to demonstrate similarities and differences across different chronic pain disorders, again demonstrating a strong need for Common Fund support.

       

      Up to Top

      Developmental Origins of Health and Disease: Disease Prevention Across Generations

      Nominator: NICHD, NIEHS
      Participating IC: NIMH, NIMHD, ORWH

       

      Major obstacle/challenge to overcome: It is clear that many complex diseases and conditions result from a combination of genetics and environment. What is not clear is when and how this interaction of genetics and environment actually leads to disease. The concept of developmental origins of health and disease (DOHaD) is a fundamental principle underlying many chronic diseases and conditions in children and adults. Decades of DOHaD studies suggest that a wide variety of early exposures occurring during periods of time where tissues and organ systems are developing markedly increase risk for (or even cause) disease across the life course. These “environmental” exposures are varied and include drugs, nutrition, chemicals, stress, microbes or viral infections. Examples of these non-communicable diseases and disorders (NCDs) include obesity, type II diabetes, insulin resistance, asthma, cardiovascular diseases, dyslipidemia, cognitive and behavioral disorders, neurodegenerative diseases, a variety of cancers, and reproductive disorders. Disadvantaged populations may experience greater exposure to these hazards and exhibit higher rates of disease incidence, morbidity and mortality. Understanding and modulating this risk in humans during critical windows of development offers the promise of primary prevention for many of these NCDs and may result in reducing health disparities.

      Although the range of diseases and conditions believed to result (at least in part) from early life exposures spans nearly all of the NIH Institutes and Centers (ICs), the concept of developmental origins has yet to be broadly adopted as a new research paradigm. A trans-NIH program funded by the Common Fund (CF) will support research to 1) characterize early life exposures and their health effects in a comprehensive way, 2) encourage cellular and molecular research on the mechanisms of these triggers (or stressors) on development, and 3) encourage the development of targeted interventional studies in human subjects. The initiative would result in a new awareness among researchers funded across the NIH to consider the role of developmental stressors in triggering the diseases and conditions they study. This would also jumpstart the field of transgenerational inheritance, i.e., the transmission of environmentally-induced phenotypes to subsequent generations without direct exposure. This phenomenon is well-described in non-mammalian systems, but despite the existence of several published examples, it remains highly controversial whether it truly occurs in humans or rodents, or how common it might be. Finally, the initiative would build the body of science needed to strengthen prevention research, an important element of health care reform.

      Emerging scientific opportunity ripe for Common Fund investment: No single IC has the capability or the expertise to integrate all the approaches and technologies needed to assess how genetics and multiple environmental triggers (or stressors) combine to affect health across the lifespan and even across generations. Furthermore, early life exposures even to a single stressor can lead to adverse health outcomes affecting multiple tissues and organ systems that are not readily appreciated in traditional single institute research programs. For these reasons, adopting a coordinated, trans-NIH approach is a critical step in changing existing paradigms about the etiology of a variety of diseases and conditions, and transforming this information into knowledge that targets the most advantageous times, and possible interventions, to prevent their occurrence. This program would also leverage and integrate current CF and trans-NIH initiatives such as those on epigenetics, exposure biology, genetics/genomics, the microbiome, bioinformatics, developmental biology and programming, stem cell development and differentiation, and animal, epidemiologic and clinical assessment of disease.

      Common Fund investment that could accelerate scientific progress in this field: We propose a CF program that will transform thinking about disease prevention: a comprehensive investigation of the developmental basis of a wide range of diseases and conditions. Such a program would enable the NIH to identify a host of environmental stressors that increase disease risk, and the mechanisms by which these exposures alter normal developmental programs, manifesting in disease or conditions, years and even generations later. Moreover, a comprehensive research program focused on the developmental origins of disease would enable scientists to pinpoint susceptibility windows – unique developmental time points at which humans are most susceptible to the combined effects of environmental exposures and genetic factors. Identifying these developmental windows and developing predictive biomarkers of exposure will dramatically increase our ability not only to understand disease etiology, but also to develop intervention strategies that will ultimately prevent disease, by reducing exposure. Possible investments include:

      1. Develop centralized, well-characterized, novel models and clinical research designs and analytic techniques that would promote effective multigenerational analysis and would leverage existing CF investments and infrastructure, such as that in mouse phenotyping (KOMP2).
      2. Identify stressors (nutritional, environmental, social) and investigate early gene-environment interactions that may perturb the normative development of various tissues and organ systems (such as the cardiovascular, neurological, immune, gastrointestinal, skeletal, endocrine, reproductive systems), increasing the risk of disease or conditions later in life and across generations. Particular focus would be placed on diseases, conditions or syndromes that have been steadily increasing in incidence and where health disparities are apparent in the United States. A focus on exposure characterization during early life would be imperative and would include vulnerable populations defined by race, ethnicity, and socioeconomic status.
      3. Apply state-of-the-art sequencing technologies to investigate epigenetic and genetic mechanisms by which early life events lead to developmental reprogramming, impacting disease risk in both children and adults (e.g., somatic or cognitive changes) long after the stressor is gone, and how increased risk is transmitted to subsequent generations through various mechanisms (i.e., germline, mitochondrial, or other changes).
      4. Use birth cohorts in human subjects to identify sex-specific developmental susceptibility windows that are specific to common diseases and conditions in early development.
      5. Identify biomarkers of developmental stress for single exposures and combinations that predict susceptibility to specific diseases and conditions later in life that could also be used to target and develop preventive interventions.
      6. Develop bioinformatics and statistical programs to allow the assessment and integration of developmental exposure to a variety of stresses and their importance in the development of disease outcomes.
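
      As one deliberately simplified illustration of the statistical machinery called for in item 6, the joint dependence of a later-life outcome on an exposure and on the developmental window in which it occurred can be examined with an interaction model. A hypothetical sketch using statsmodels, with all data simulated:

          import numpy as np
          import pandas as pd
          import statsmodels.formula.api as smf

          # Simulated cohort: exposure level, developmental window of exposure, and a binary outcome.
          rng = np.random.default_rng(3)
          n = 2000
          df = pd.DataFrame({
              "exposure": rng.normal(size=n),
              "window": rng.choice(["prenatal", "infancy", "childhood"], size=n),
          })
          # Invented "truth" for the simulation: exposure raises risk mainly when it occurs prenatally.
          lin = -1.0 + 0.8 * df["exposure"] * (df["window"] == "prenatal")
          df["disease"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

          # Logistic regression with an exposure-by-window interaction term.
          model = smf.logit("disease ~ exposure * C(window)", data=df).fit(disp=0)
          print(model.summary().tables[1])  # interaction terms index window-specific susceptibility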

      Potential impact of Common Fund investment: This would be the first CF program specifically designed to focus on disease prevention, and it will do so in an integrated and transformative manner. Once it is clear how certain diseases or conditions originate in early development – including which stressors or combinations of stressors are responsible for the altered programming, and during which developmental stages humans are most sensitive to these effects – effective strategies can be developed both to reduce exposure to the stressors and to intervene to reduce disease incidence. Such strategies have the potential not only to reduce the overall societal burden of disease but also to reduce or eliminate health disparities.

      Comments:

      We agree that a comprehensive centralized approach to the study of developmental origins of disease with a goal of pinpointing “susceptibility windows” and identifying “stressors” or other agents that may perturb normal development at critical times, could have major impact on understanding the complex chronic diseases that span the missions of all NIH Institutes and Centers. A trans-NIH approach is imperative not only because of the multiple diseases involved, but also because of the need for: (a) “big team science” approaches that are beyond the scope of any one IC; (b) interactions among a variety of basic scientists, clinicians and behaviorists; and (c) working across the lifespan. A “systems” approach that transcends the often narrow focus on a particular disease or stage in life is needed in order to identify the environmental influences and risk factors that lead to disease, as well as critical therapeutic windows. Such an approach would also leverage previous NIH investment in metabolomics, epigenetics, and the microbiome. Progress made within the 5-10 year timeframe of the Common Fund initiative would provide the basis for future studies to define mechanisms underlying the developmental origins of disease and elucidate new avenues for therapeutic interventions. Outlined below are those concepts that apply to the need for a common fund initiative on the developmental origins of health and disease:

      1. While it is becoming increasingly appreciated that exposures in utero have profound effects on the development of non-communicable diseases in the offspring throughout the lifespan, we are currently unable to effectively integrate these results in such a way that hypotheses are easily formed or interventions proposed. Thus, building a “framework” that provides the basis for future research will be instrumental in moving the field forward. For example, the construction of a developmental atlas spanning the potential susceptibility windows of preconception/conception/in utero/birth/neonatal/infant/early childhood/childhood/puberty/fertility/aging could be a lifespan framework usable by all investigators. Development of this atlas, as well as of a catalogue of exposure types and a measurement toolkit will lead to use of consistent terminology and variables, allowing comparisons between data obtained and enhancement of the contributions of individual studies. The developmental atlas would also facilitate characterization of timing of onset of disease-specific phenotypes in susceptible individuals. With this combination of overarching framework and tools, the initiative should provide the means by which to train and mentor future investigators across multi-disciplines in order to address the issues inherent in understanding the multi-faceted aspects of developmental origins of health and disease.

      2. Building a platform to enable communication among investigators, the sharing of results, and the linking of databases will facilitate collaborative efforts that would not otherwise be possible. Reaching this goal will require the interaction between sophisticated bioinformatics and social networking experts and multidisciplinary DOHaD investigators. The barriers that must be overcome to reach this type of collaborative effort would not easily be handled by a single IC and would benefit from the common fund mechanism. Inherent in this concept is the need for global outreach in order to maximize the opportunity to link with existing unique and extensive datasets and cohorts from across the world and in developing nations.

      3. While the need for the cataloguing and bio-banking of human specimens and tissues will be important, the challenge of obtaining samples from newborns may limit certain aspects of this goal. Nevertheless, the availability of “pregnancy tissues” such as placenta and umbilical cord and blood should be considered, along with the development of new measurement tools such as non-invasive and safe biomonitoring methods, as well as the development of methods for “omics” analysis of difficult to obtain or limited sample size of human tissues.

      4. While barriers inherent to the treatment of women during pregnancy and of offspring in early postnatal life are high, there is great potential to have a major impact on public health. Defining and developing safe yet effective interventions that could be used during identified therapeutic windows provide a unique and important opportunity for translation.

       


       

      Given the robustness of evidence that health and disease processes have their roots in early life, the transformative potential of research on developmental origins of health and disease cannot be overstated. Discovery in this area has the capacity to profoundly advance prevention and improve health because it enables intervention well prior to the onset of frank disease, targeting of unfolding disease risk processes, and casting a net that focuses on a broad swath of health rather than narrow, categorically defined disease entities. Although “early development” can be defined in many ways, a focus on the prenatal origins of health and disease has distinct advantages: there is strong evidence of its powerful and long-lasting influence, it provides a common ground across paradigms, complex intervening influences that transpire after the exposure can be measured as part of a cascading pathway, and developmental sequelae can be systematically measured from birth on.

      However, while this is an enormously promising area of scientific focus, substantial barriers impede necessary progress. These are due, in part, to the artificial partitioning of foci that occurs via individual Institute priorities and paradigms and corollary funding constraints. A Common Fund initiative therefore has high likelihood of breakthrough discovery and high impact yield. Highlighted below are key gaps which can be effectively addressed via the scale and breadth of a Common Fund initiative in this area:

      1. THE CHASM BETWEEN HEALTH DISPARITIES AND NEUROBIOLOGIC MECHANISMS RESEARCH IS A CRITICAL IMPEDIMENT TO SCIENTIFIC PROGRESS. Early life exposures are often deeply embedded in social and racial/ethnic inequalities. However, studies of neurobiologic mechanisms typically give scant attention to this issue in sampling or design and racial/ethnic minorities are underrepresented in these studies. In contrast, health disparities studies traditionally focus on social determinants of health and are not designed to specify the influence of social inequalities on biologic mechanisms of health and disease. In addition, rigorously bridging these perspectives requires great breadth of disciplinary expertise and a scale of funding that exceeds that of traditional grant mechanisms. In order to be transformative, a Common Fund initiative must build capacity for highly integrative studies that blend the core strengths of these perspectives in novel approaches and designs.

      2. FOCUS ON SPECIFIC, MODIFIABLE AND MEASURABLE EXPOSURES IS KEY. Research on the developmental origins of health and disease has been extremely variable in its characterization of exposure, including crude and limited exposure measurement, the absence of prospective, repeated, multi-method measurement of exposure, and failure to embed exposure in the context of co-occurring risks (thus making it impossible to isolate true “exposure effects”). Further, while it is intriguing that a range of exposures can alter health and disease pathways (from catastrophic earthquakes to maternal anxiety), the greatest yield will come from a scientific focus on preventable prenatal exposures that are common and have well-developed measurement methods. Although very large general studies that characterize a range of exposures and outcomes are useful, they preclude the depth of exposure measurement and the testing of specific mechanistic actions and well-specified outcomes that are key for intervention development. Prospective, multi-method exposure measurement, with an emphasis on preventable exposures characterized “in context,” is essential to the development of high-impact, innovative prevention.

      3. A PARADIGM SHIFT IS NEEDED TO ILLUMINATE THE “BLACK BOX” OF DEVELOPMENT IN DEVELOPMENTAL ORIGINS RESEARCH. Much of the evidence of the developmental origins of disease is based on studies that measure disease onset decades after the exposure. Fetal programming has been extensively theorized but its defining features and developmental expressions are not empirically established. No matter how strong the evidence that the origins of health and disease reside in early life, as long as development remains a “black box” in this pathway, the public health impact of this information will be severely constrained. For example, Gene x Environment interactions that are part of the causal chain from early life exposures to later disease are likely to cause “vulnerability” to disease, not the disease proper. Clearly, many exposed individuals do not develop the diseases for which the exposure puts them at risk. Without measuring susceptibility and its interaction with ensuing risks (both protective and amplifying), it is not possible to identify how and for whom early life exposures lead to disease. Identifying multi-level susceptibility phenotypes in humans, as they manifest from early in life and across development, and their interaction with other risks as an unfolding process, is absolutely critical to transformative research in this area. Recent advances in developmental science that permit meaningful measurement of phenotypes in the first years of life now enable this pathway to be measured “from the beginning.” This is key to establishing specific susceptibility processes that are distinctly linked to a disease pathway (rather than a general developmental vulnerability), pinpointing the phase of the disease process in which prevention will be most efficacious, and specifying the point in development when susceptibility is transformed to frank disease. Characterization of developmentally specified phenotypes also provides the essential target for measuring whether prevention effectively alters disease trajectories.

      The structure of standard NIH grant mechanisms (i.e., 5-year cycles) also artificially constrains the capacity to capture exposure, its mechanisms, and its impact on health and disease within the context of development. For example, if prenatal exposure is well characterized prospectively, it is virtually impossible within a single funding cycle to examine outcomes beyond the age of two years. Initiatives that focus on precise characterization of “developmental phenotypes” across developmental periods and their modifiability are likely to be of highest yield for generating innovative preventions. A novel funding structure (e.g., a funding period that extends beyond the usual five-year cycle with the flexibility to follow promising leads discovered in each phase) would also exponentially amplify scientific yield and impact.

      Launching the next generation of research on developmental origins of health and disease with emphasis on prevention is a high impact line of investigation with tremendous potential for reduction of public health burden. Common Fund initiatives would uniquely catalyze discovery and enable the field to move beyond incremental steps to bold and transformative science in this arena.

       


       

      This initiative presents a very important opportunity. Linking basic scientists to human population scientists will lead to testable hypotheses, identification of mechanisms, and translation to public health interventions. Rewarding trans-disciplinary collaboration should be an important feature of this initiative. Trans-disciplinary collaboration can slow individual productivity and place careers in jeopardy unless investigators are somehow encouraged to undertake the risk of collaboration. There are other immediate benefits: DOHaD research will be expanded to more fully incorporate environmental risk factors, which have not been emphasized in this field, despite their clear relevance to epigenetic processes. This initiative is also likely to open the field further to incorporate social and behavioral factors that are currently the subject of intense investigation in animal research. The role of DOHaD in establishing or maintaining health disparities is another fruitful area that would no doubt benefit from this initiative.

       


       

      Combined with the wider availability of genetic mapping tools, this is one of the most promising areas of discovery in my mind. Using the dramatic spike in Autism Spectrum Disorder (ASD) diagnoses over the last 30 years as an example, it seems clear that some set of environmental factors must contribute to the increasing incidence rates in ASDs. The proposed research enables us to accelerate our understanding of these links and begin to tease out the root causes of disease expression unique to individuals and their environments. I think there is also great potential for harm if genetic predisposition and environmental exposure change the actuarial risk analysis on an individual to the point where we know they are more likely to have a disease but less likely to be insured due to market/profit considerations. I believe the NIH is well suited to manage the bioethical considerations that this research brings.

       


       

      I strongly support this initiative. I suggest expanding the definition of early life exposures to include the parents' exposures before the child's conception which may have genetic and perhaps epigenetic effects. The effect of exposures on new germline mutation is well established in animals. Detection has been difficult but there are now suggestions that exposures induce new germline mutations in humans as well. The interaction of genotype and environment is virtually unstudied. The study of preconception exposures would greatly benefit from the proposed initiative for the reasons discussed by others.

       


       

      In order to address preventative aspects of both communicable and non-communicable diseases while thoroughly focusing on their developmental origins, I feel that research involving non-human primate models has much to offer. Traditionally, this NIH resource was utilized predominantly in the context of AIDS research, where it contributed substantially. During the past decade or so, however, an extensive body of evidence has been published with other “non-AIDS” non-human primate disease models. Thus, when formulating novel translational ideas, these newly emerging models could provide opportunities to fill the gaps between basic lab research, research with genetically defined mice, and clinical trials. Just a few examples include gene therapy, autoimmunity, and emerging viral infections.

       


       

      It will be important to take into consideration the unique features of important disease groups. Mitochondrial genes tend to evolve faster than nuclear ones, and nuclear genes that interact with mitochondrial genes also co-evolve more rapidly. Because of the importance of mitochondrial function to metabolic diseases and more common degenerative disorders, the complexity of this dual-genome co-evolution should be considered.

       


       

      To the list of environmental exposures we suggest adding “sleep deficiency,” which refers to insufficient or poor-quality sleep. A sleep deficiency experienced at a given stage in development may have effects on multiple systems, depending on genetic and other factors. Insufficient or inadequate sleep due to poor conditions for sleep (arising from medical conditions, or environmental or behavioral factors) has profound effects on health and cognitive function, including learning and memory, mood regulation, and immune, metabolic, and cardiovascular function. There is strong scientific evidence that sleep deficiency and circadian misalignment both increase the vulnerability for the development of cognitive impairment, depression, and cardiovascular, respiratory, and metabolic disorders, which influence morbidity and mortality. Given the ubiquitous influence of sleep and circadian timing on molecular and physiological processes, research aimed at understanding how the interaction of genetics and environment leads to the development of disease needs to consider the time of day at which the data are collected, and their relationship to the circadian cycle and sleep satiety of the organism. There are examples of the importance of both circadian and sleep status ranging from cardiovascular parameters (heart rate, blood pressure, vascular resistance, thrombolytic activity) to endocrine function (just about every hormone), to metabolic activity, to digestive enzymes, to nearly all, if not all, physiological functions. In addition, there is evidence from cancer chemotherapy that the timing of administration can reduce side effects and increase effectiveness. The pervasiveness of metabolic, cardiovascular, and cognitive problems in people with restricted sleep and circadian misalignment are key issues germane to the breadth of NIH and to the missions of its Institutes and Centers.

       


       

      This very thoughtful discussion on developmental origins of disease highlights the importance and challenges of gathering, linking, and interpreting multifaceted information across long periods of time. For example, information would be needed about inherent genetic and gender susceptibility factors as well as complex environmental exposure factors, which in turn may act directly on target tissues or through epigenetic mechanisms. Hypothesis-driven research will require consideration of multiple routes of exposure which, in turn, are dependent upon age and developmental stage, i.e., maternal placental and lactational function, and infant and child age-dependent behaviors. Longitudinal studies like the National Children's Study could have great value in this regard. As stated by others, systems approaches and interdisciplinary expertise and cooperation will be needed. Prevention of chronic diseases through early, preconception and prenatal interventions would have far-reaching economic and social benefits, and obvious implications for maintaining rigorous environmental protection. It also raises an interesting public health question: when is it too late for behavioral or other interventions to matter if much of our health is predetermined during gestation?

       


       

      This is an excellent idea, and previous comments further explicate the key points. Developmental origins of complex diseases are known, but we need sensitive basic models identifying the subgroups of individuals with specific susceptibilities using genetic, epigenetic, biological, physiological, cognitive, behavioral, and social factors. As mentioned earlier, such research requires large-scale support in order to bring together interdisciplinary teams of scientists for collaborative team science on this extremely important topic. It requires resources to conduct large-scale science with the support of large-scale infrastructures such as the Common Fund (CF)-supported Clinical and Translational Science Awards (CTSAs). The CF has previously provided support for such interdisciplinary collaborative research through its interdisciplinary research consortia, but this support is no longer there. There is urgent need to continue and expand such support to develop sensitive and specific multifactorial individual profiles of complex disease risk. In one example of an interdisciplinary research consortium effort supported by the CF, in the short period of 4 years, several specific neurobiological, RNA, and microRNA changes related to obesity, addiction, and associated diseases were detected, and new medication compounds were identified for testing and development as therapeutics. Such combined collaborative efforts would not have been possible without the CF, but such efforts need continued support and expansion. The idea described above particularly lends itself to such initiatives.

       


       

      I strongly support research on the developmental origins of health and disease. This area of research will almost certainly emphasize the use of CF support for interdisciplinary research so as to integrate knowledge across genetic, biological, behavioral, and population-based factors relevant to health and illness. As a result, this ambitious and extraordinarily promising scientific initiative will require thoughtful interdisciplinary collaboration, perhaps in interdisciplinary scientific teams. It is critical that the work of those teams, and any centers or consortia in which they are embedded, be evaluated for their ability to work collaboratively and effectively, and to produce accelerated and innovative science. Although this may require setting aside 10-15% of the funding allocation for the evaluation of interdisciplinary research, the knowledge generated from this allocation will have two potential direct benefits to science: 1) it will focus funded studies on implementing those structures and processes known to promote effective interdisciplinary scientific collaboration, and 2) it will enhance knowledge to strengthen future interdisciplinary scientific collaborations. As a result, the overall initiative will not only produce scientific breakthroughs on the developmental origins of health and disease, but also build the knowledge base on how best to conduct interdisciplinary science that leads to further discoveries.

       


      Disruptive Proteomics Technologies: Comprehensive Protein Identification in Clinical Samples

      Nominator: NHGRI
       

      Major obstacle/challenge to overcome: Our ability to detect and quantify proteins in complex (e.g., clinical) samples is progressing steadily, but it is increasingly clear that order-of-magnitude improvements in the associated technologies would enable very significant advances over a range of biomedical research areas. In other words, the current state-of-the-art is good, but limiting. A few of the specific limits are:

      • Expensive technology/instrumentation
      • Because of the above, labs have limited access, but demand is high
      • Current technology is not capable of proteome-wide measurements

      Although NIH does fund some technology development in this area, there are no programs specifically aimed at development of so-called “disruptive” technologies, i.e., those that could afford very rapid, very significant gains, similar to those that occurred in DNA sequencing technology.

      Emerging scientific opportunity ripe for Common Fund investment: The history of technology development for genome sequencing teaches that successfully fostering very significant technological advances in basic methods and instrumentation requires a sustained effort, significant funds, encouragement of diverse approaches, a tolerance for taking risks (moderated by ongoing evaluation across the portfolio) and very focused, precisely articulated, assessable program goals.

      We propose an analogous technology development effort that aims to produce order-of-magnitude improvements in the detection, identification, and quantification of proteins in complex samples. Moreover, the effort would explicitly emphasize an end-point relevant to clinical applications.

      Several NIH institutes do fund technology development in this area. However, the program proposed here is justified as a Common Fund effort both because its benefits will cut across NIH (see below) and because it requires concerted management of all the grants under one program towards precise program goals (see below) to maximize the chances for success. Similarly, it is important that this proposed program not be combined with other technology development efforts.

      Common Fund investment that could accelerate scientific progress in this field: In a long-term technology development effort such as the one proposed here, it is difficult to anticipate what basic methodology holds the best potential for very significant improvement. The current dominant methodology for high-throughput detection and quantification of proteins is mass spectrometry (MS); it holds good potential for further incremental improvement, and it is possible that order-of-magnitude improvements could be stimulated by a well-targeted program. In addition, there are other technologies that hold promise for significant improvements, though they are currently less developed than MS, and not well-supported. We therefore propose projects covering both MS and non-MS approaches.

      Specific funding components proposed:

      FOA 1: Technology Development: MS-based protein ID and quantitation. (Years 1-5)
      Goals include:
      i. 10-fold decrease in instrumentation cost (e.g., a $50,000 mass spectrometer)
      ii. 100-fold or 1000-fold increase in dynamic range
      iii. 10-fold increase in throughput

      FOA 2: Technology Development: Non-MS-based protein ID and quantitation.
      Goals include:
      i. Develop protein ID/quant technologies that approach/exceed MS-based methods with respect to: accuracy, dynamic range, throughput, cost, and ability to analyze post-translational modifications (PTMs).
      ii. Demonstrate orders-of-magnitude improvements with respect to dynamic range, throughput, cost.

      FOAs 1 and 2 would need to justify their approaches relative to eventual advantages for translational or clinical use, for example:

      - Improved discovery and/or assessment of biomarkers
      - Rapid sample turn-around time
      - Small input volume (1 mL of blood, etc.)
      - Stored or banked samples, resilience to sample handling variability
      - Analysis of clinically relevant sample sizes (100s to 1000s), with no loss of specificity

      For FOAs 1 and 2, it is likely that a phased approach with milestones will be advantageous for incentivizing rapid development and managing risk. For example, these FOAs might encourage many applications and a high level of risk/reward during an initial three-year period, followed by an option to renew for a larger amount of funds contingent on reaching milestones.

      One issue not explicitly considered above is the development of computational tools for data analysis and integration for large protein datasets. There are advantages to asking that this be integrated into the development of the technologies, and also to the alternative approach of writing a separate FOA. Staff will need to research this issue.

      If FOAs 1 and 2 are successful, a follow-on FOA focusing on specific clinical applications would be considered.

      Potential impact of Common Fund investment: Orders-of-magnitude improvements in this area would enable very significant advances across the NIH portfolio. In basic research, it would enable the assessment of all proteins in a mixture, which in turn would enable, for example, more comprehensive assessment of gene expression, now largely inferred indirectly from RNA expression. In discovery research, it would enable a more comprehensive assessment of the molecular consequences of variation (e.g., an addition to GWAS, GTEx, large cohort studies); for translational research, it is likely to afford many advantages for disease biomarker discovery and assessment. Finally, if all the goals are realized, there are clear ramifications for the clinic (patient sample testing, drug response/disease progression, etc.).

      Comments:
      Regarding Funding Opportunity Announcement (FOA) 1, I am not sure it is worth too much effort chasing 10x higher throughput. The data analysis is the current bottleneck, not the mass spectrometry (MS) throughput. I can acquire a week’s worth of data and spend months working out what it all means.

      On the other hand, increases in sensitivity and dynamic range, and decrease in instrument costs, are all very worthwhile goals!

       


       

      This may be one of the most far-reaching Common Fund initiatives, but the aims as stated will be difficult to accomplish. Specifically, FOA 1 describes largely technical advances, many of which are well under way, such as SWATH MS from Ruedi Aebersold's group and long liquid chromatography (LC) Orbitrap MS from Matthias Mann's group, among many others. Translating these advances into lower instrument and experimental costs is really beyond the scope of what NIH grants can accomplish.

      For example, consider the invention of the Orbitrap, which is far less costly to manufacture than superconducting-magnet Fourier transform ion cyclotron resonance (FT-ICR) instruments and offers improved performance for complex mixture analysis due to its high mass accuracy and fast sampling. Yet the cost of Orbitrap instruments, as compared to magnetic FT-ICR instruments, remains quite high.

      I also agree with the previous comment (“Regarding FOA1….”): shortchanged data analysis plagues most MS studies. The majority of recorded spectra still remain unidentified in most current complex mixture MS studies. Indeed, in the case of SWATH MS, its performance is fundamentally due to improved computational analysis (mProphet: automated data processing and statistical validation for large-scale SRM experiments. Nature methods 2011, 8(5):430-435).

      An alternative way to structure this initiative is to fund specific research with the goal of discovery and functional annotation of complex proteomes with clinical significance. These may include the plasma and urine proteomes of patients with sepsis, heart disease, liver disease, etc. Such focused projects will necessarily involve the technical and analytical innovations and improvements required for accurate, comprehensive, and functionally annotated proteomics using MS and other methods. Study of specific clinical conditions will be directly applicable to human disease, as the research results will lead directly to improved biologic understanding, novel biomarkers, and therapeutic targets. The choice of clinical conditions to study will be critical, both for the clinical impact and for the potential for success. Much has been written about this, and FOAs can be written accordingly.

       


       

      I agree with the other comments, especially “Regarding FOA1…” that data analysis and interpretation is a critical aspect for large scale and high throughput proteomics studies. The computational implications of this proposed Common Fund Investment must be considered carefully.

      Finally, I think limiting this scope to only clinical samples is a mistake. What if a plant or yeast research group, for example, has the best idea for an amazing innovation in proteomics sample analysis? Yeast, in particular, has been an amazing proving ground for technology in proteomics and genomics. These developments have had major impacts on all of biological research. A focus solely on clinical samples and biomarkers could limit the number of exceptional applications that could have major long-term impact on all NIH objectives.

       


       

      I believe that this is an important area of investment. Currently, there is really only one recurring study section (Enabling Bioanalytical and Imaging Technologies, EBT) that consistently results in funded proteomics applications. This study section and its predecessor have resulted in the funding of excellent proteomics technologies. However, EBT has a broader scope than protein mass spectrometry/proteomics. Other areas like genomics and structural biology have had far more investment, for example. With the advances in proteomics and mass spectrometry that have been occurring over the past few years, this Common Fund investment is overdue since major innovations may in fact be possible but have not been tested or developed by many due to lack of resources.

       


       

      From what I gather from the comments and my personal experience, there are several specific advancements that would be helpful in overcoming the obstacles:

      1. Some sort of readily available MS protein spectral database, not only to help assess “standard” MS protein spectra, but also to help standardize data between different models of MS devices. It could also help with data analysis and collaboration with statisticians, especially multivariate techniques and identifying and correlating spectra with proteins or disease states (see the illustrative sketch after this list).

      2. Protein purification techniques or some sort of microfluidic, high-throughput device that can automate sample preparation and decrease the time it requires.

      3. Quick and easy biological fluid filtering techniques; microfluidics combined with biochemical techniques could help automate and standardize the way proteins are purified, and thereby standardize protein identification.
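
      As a purely illustrative sketch of what the shared spectral database suggested in item 1 might enable, the short Python example below matches an observed spectrum against a small library using cosine similarity. The library format, bin width, and entry names here are hypothetical placeholders, not an existing standard or tool.

          # Illustrative sketch only: match an observed MS spectrum against a shared
          # spectral library by cosine similarity. The library format, bin width, and
          # names used here are hypothetical, not an existing standard.
          import numpy as np

          def bin_spectrum(peaks, mz_min=0.0, mz_max=2000.0, bin_width=1.0):
              """Convert a list of (m/z, intensity) peaks into a normalized vector."""
              n_bins = int((mz_max - mz_min) / bin_width)
              vec = np.zeros(n_bins)
              for mz, intensity in peaks:
                  idx = int((mz - mz_min) / bin_width)
                  if 0 <= idx < n_bins:
                      vec[idx] += intensity
              norm = np.linalg.norm(vec)
              return vec / norm if norm > 0 else vec

          def best_match(observed_peaks, spectral_library):
              """Return (name, cosine score) of the most similar library spectrum."""
              query = bin_spectrum(observed_peaks)
              scores = {name: float(np.dot(query, bin_spectrum(peaks)))
                        for name, peaks in spectral_library.items()}
              return max(scores.items(), key=lambda kv: kv[1])

          # Example with made-up peak lists of (m/z, intensity) pairs:
          library = {"protein_A": [(500.2, 10.0), (750.4, 30.0)],
                     "protein_B": [(400.1, 25.0), (900.7, 15.0)]}
          print(best_match([(500.3, 12.0), (750.5, 28.0)], library))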

      Raman spectroscopy could also provide spectra specific to proteins. The corresponding spectral database is even more lacking than for MS, and it will suffer from the previously discussed issues as well. I think any form of spectroscopy used to identify multiple biomarkers in complex media could benefit from the previous three aims.

      So, focusing not only on sample collection but also on device design and implementation is a great idea.

       


      Exploring the Extracellular Space

      Nominator: NINDS
      Participating IC: NCI

      Major obstacle/challenge to overcome: The Extracellular Space (ES) occupies the space between cells, outside their plasma membrane. The ES consists of the extracellular matrix (ECM) and the interstitial fluid. It is filled with an ionic solution of mainly NaCl and contains a complex cocktail of molecules necessary for: 1) cellular survival (including glucose, amino acids, lipids, etc.), 2) tissue integrity (involving macromolecules such as collagen, lipoproteins, proteoglycans, glycoproteins, etc.), 3) physiological function of cells and tissues (such as growth factors, cytokines, hormones, neurotransmitters, metabolites, cholesterol, proteases, protease inhibitors, etc.), and 4) transducing mechanical strain for proper tissue function. The volume, pH, and composition of the ES can differ significantly between tissues and are altered dramatically during pathological processes. Although there has been much focus on tools and technology development for the investigation of intracellular processes, proteins, and molecules, to date little attention has been devoted to the ES and how the cellular microenvironment contributes to health and disease.

      Emerging scientific opportunity ripe for Common Fund investment: Emerging tools and technologies, already funded by the NIH Common Fund and other agencies and being used to investigate and interrogate cellular processes, will greatly accelerate understanding and control of the extracellular milieu in health and disease. The convergence of the fields indicated below and the collaborative efforts of practitioners from these various disciplines will greatly advance our knowledge base of the ES.

      • Single cell analysis
      • Glycomics
      • Proteomics
      • Metabolomics
      • Glycoproteomics
      • Lipidomics
      • Molecular probes
      • Induced pluripotent stem cells (iPSCs) generation and differentiation

      NIH Common Fund investment would galvanize efforts towards a multidisciplinary approach towards determining the influence of the ES in health and disease.

      Common Fund investment that could accelerate scientific progress in this field: The NIH has some ongoing efforts in investigating the biology of the extracellular space. The majority of these activities are focused on the role of ECM stiffness – a feature found in cancer and most diseases. A more synergistic approach catalyzed by Common Fund investments and utilizing cutting-edge technologies as described above will provide a more global and comprehensive understanding of ES physiology and function and how these are perturbed in disease. In addition, Common Fund investments will provide impetus for studying the less appreciated flow of gradients and soluble factors in the tissue microenvironment. While we know of the signaling of many growth factors and cytokines, we know very little about how these molecules are organized in the ES – after protease digestion for example, or during cell migration or morphogenesis. Moreover, Common Fund investments will open up new areas of research on the exciting and novel roles of exosomes produced by cells of different origins. Recent findings indicate that tumor cells (and likely various other cell types), much like immune cells, also secrete or produce exosomes that are loaded with miRNA and other RNA species, as well as constitutively activated signaling molecules, such as AKT, which can modulate other cells in a paracrine fashion. There is some emerging evidence that the exosome cargo (e.g., miRNA) is actually released and picked up by neighboring cells. Exosome-mediated transfer of mRNA and miRNA is a novel mechanism of genetic exchange. Moreover, exosomal load assessment and exosomal molecular profiling, such as miRNA signatures, can serve as a source of diagnostic biomarkers that hold great promise for disease detection and monitoring.

      Potential impact of Common Fund investment: A thorough understanding of the cellular milieu can lead to breakthroughs in

      • understanding and control of homeostasis
      • intercellular communication
      • paracrine and autocrine functions
      • transport of nutrients, factors, metabolites, and degradation products to and from cells
      • establishment of resting potential of cells
      • tumor growth and metastasis
      • neurodegeneration
      • tissue injury and repair
      • regenerative medicine
      • drug delivery, since access of soluble drugs to cells in tissues is mediated by the ES, and ECM macromolecules serve as, and can be exploited as, attachment sites for various pharmacological compounds
      • drug targeting and development
      • biomarkers of disease
      • cell re-programming
      • cellular mimics of disease
      • elucidating disease processes by bridging the gap between intracellular and extracellular events, and the crosstalk that takes place

      Comments:
      Much of the current research is focused on intracellular mechanisms, and it seems that most of the intracellular pathways are now well described (or almost well described). What is lacking is the transfer of information (signal) from the ECM to the cell and from the cell to the ECM. The ES is complex, and simply stating that its mechanical strain transduces the mechanical signal is “too simple.” The ES will have many different states depending on the physical environment of the tissue (cells and ECM). For instance, when the tissue is destabilized from a rest state, such as by mechanical loads, chemical changes (pH, NaCl), cell migration, or enzyme degradation, the ES will change and all processes will adapt to these changes. These changes can be static or dynamic, and one would expect the cells and ECM to adapt to these micro-environmental (ME) perturbations. What will happen in the ME of the ECM if it is destabilized? Will protein-protein, protein-cell, enzyme-substrate, etc., interactions be different depending on the specific ME change? You bet they will, and we know very little about these interactions in real-time physiological situations at the ME level. And these ME changes will be tissue specific.

       


      Gene-Based Therapeutics: Manipulating the Output of the Genome to Treat Disease

      Nominator: ORDR
      Participating IC: NHLBI, NINDS

      Major obstacle/challenge to overcome: Gene-based therapeutics are tools to manipulate the output of the genome to treat disease. The most well-known gene-based therapeutic is gene therapy, which is most commonly done using viral vectors, although other vehicles (e.g., nanoparticles) can be used as well. Other gene-based therapies include small interfering RNA (siRNA) and oligonucleotide therapeutics, and zinc-finger nucleases and transposons to modify the genome directly.

      For many gene-based therapies, development and proof-of-principle preclinical studies are within the budget of a typical R01 grant award. The major obstacle is moving from preclinical research into clinical trials. Major hurdles at the preclinical level include limitations on the size and sequence of nucleic acids used in gene-based therapeutics, as well as tissue- and cell-type-specific targeting. Moving to the clinic, hurdles include the practical reality of scaling up production, funding for GMP drug manufacture and toxicology testing, and funding for clinical trials themselves.

      What is needed to overcome this obstacle is a program dedicated to making gene-based therapies a clinical reality. Our proposal is for the Common Fund to support such a mechanism, which would facilitate the translation of current gene-based therapies into clinical trials.

      Emerging scientific opportunity ripe for Common Fund investment: Gene-based therapies are clearly ripe for investment by the Common Fund. It has now been established that viral vector-based gene therapy is effective in humans [1, 2]. In addition, novel gene-targeted therapies are being developed and validated at an accelerating pace. In 2011 alone, we have seen the first evidence that zinc-finger nucleases can be effective in a mouse model of hemophilia [3], and the use of exosomes to deliver therapeutic siRNA across the blood-brain barrier in mice [4].

      NIH ICs have provided the majority of funding for the discovery and preclinical development of multiple gene-based therapies, and will do so in the future. However, they are not positioned to support translation to the clinic at the level that is necessary.

      Common Fund investment that could accelerate scientific progress in this field: We envision a program that combines aspects of RAID (Rapid Access to Investigational Drugs) and TRND (Therapeutics for Rare and Neglected Diseases), but is focused exclusively on gene-based therapeutics. Projects will be chosen for support by a streamlined competitive process, and funding provided in a step-wise manner dependent upon continued progress and meeting project targets (similar to RAID). For some viral-vector-based therapies, the new program could support investigators to utilize the NHLBI gene therapy resource program. For other types of therapeutics, the fund could provide support for large-scale production of GMP-grade nucleic acids or zinc-finger nucleases, as well as for contract research organizations (CROs) to perform animal toxicology testing. By funding such a large effort, significant cost savings would be expected based upon economies of scale.

      Another aspect of the program, modeled after the TRND program, would be to carry out Phase 0 and Phase 1 clinical studies to de-risk gene-based therapies, and thereby encourage adoption by industry. It is possible that industry could be involved with this program at an earlier stage in a public-private partnership.

      It should be emphasized that TRND does not work with biologics or gene therapy, so the new program would complement TRND, rather than duplicate effort. While gene therapy and biologics are a part of RAID, given the rate of development of new technologies in this area, and the potential clinical impact, we believe that a much larger program, exclusively focused on gene-based therapeutics, is needed.

      Potential impact of Common Fund investment: If the proposed Common Fund program were to achieve its objectives, the impact would be that one or more gene-based therapies would become established as a treatment option for patients with genetic disease. As a benchmark, gene-based therapy would become as common as bone marrow transplantation is currently at major academic medical centers. Such an outcome could transform the clinical outlook and lives of patients with genetic disease. This would be especially important for rare diseases where in most cases no other treatment options exist.

      Importantly, we anticipate that this program would dramatically impact basic science as well. A commitment to gene-based therapeutics by the Common Fund would certainly stimulate even more preclinical studies in this field, which would in turn feed into the new program and also provide new tools for basic science. As an example, if the program achieved its objectives, tools could be available that would make manipulating the genome of a mouse, in a specific cell population, as routine as transformation of bacteria is today. If this were to become reality, it would dramatically change the way biomedical science is done. With support from the Common Fund, these impacts are feasible within the 10-year time frame specified by the Common Fund criteria.

      1. Simonelli F, Maguire AM, Testa F, et al. Gene therapy for Leber's congenital amaurosis is safe and effective through 1.5 years after vector administration. Mol Ther. Mar 2010;18(3):643-650.
      2. Aiuti A, Roncarolo MG. Ten years of gene therapy for primary immune deficiencies. Hematology Am Soc Hematol Educ Program. 2009:682-689.
      3. Li H, Haurigot V, Doyon Y, et al. In vivo genome editing restores haemostasis in a mouse model of haemophilia. Nature. Jul 14 2011;475(7355):217-221.
      4. Alvarez-Erviti L, Seow Y, Yin H, Betts C, Lakhal S, Wood MJ. Delivery of siRNA to the mouse brain by systemic injection of targeted exosomes. Nat Biotechnol. Apr 2011;29(4):341-345.

      Comments:
      It will be important to take into consideration the unique features of important disease groups. Mitochondrial diseases for example, need to be tackled at both the nuclear genome and the mitochondrial genome levels.

      Because of the importance of mitochondrial function to metabolic diseases and more common degenerative disorders, the complexity of the dual-genome approach should be considered.

       


       

      The issue I find with gene therapy is impact, in that it seems to affect the treatment of none of the leading killers of people in the U.S. (heart disease, cancer, stroke, respiratory disease, accidents, diabetes, Alzheimer’s, flu/pneumonia, septicemia, suicide, liver disease, and hypertension).

      Because of this, I would suggest being more specific about the disease targets, e.g., tumor treatment, getting drugs across the blood-brain barrier to treat Alzheimer’s, or genetic tissue remodeling for diseased hearts, rather than just an open call for any disease.

      I mention this also because of the strong ethical implications. I think a strong ethical framework should also be discussed and developed to educate the general public, because I'm sure this field is just one bad PR campaign away from what happened with the stem cell debate.

      I think the call should also include making gene therapy affordable in some way. Why is gene therapy so expensive, and how could we get costs down? Is it because of the costs of clinical trials? If so, and if NIH funds this, will it ensure the costs of the therapy would be significantly lowered?

      There are also implications for insurance, especially if the ethical issues aren’t addressed and Congress gets involved, and especially given pop culture’s obsession with zombies, outbreaks of diseases manufactured in a lab, and other unforeseen consequences of manipulating the genome.

      I think with cost issues assuaged and ethical issues addressed, gene therapy has great potential, especially for TRND.

       


       

      I am writing you as Executive Director of the National Tay-Sachs & Allied Diseases Association (NTSAD) on behalf of the children and adults who are affected by Tay-Sachs, Sandhoff, Canavan, and related genetic diseases. I am also writing you on behalf of our Scientific Advisory Committee and the many scientists who have been involved in understanding the mysteries of these types of genetic diseases over the years. NTSAD was founded in 1956 by a group of parents who were devastated by the diagnoses of their children. Today, Tay-Sachs is still one of many related genetic diseases that are fatal in children and progressively disabling in their adult-onset forms.

      Gene therapy is an elegant solution for this type of neurodegenerative disease that is caused by a single gene defect. The Tay-Sachs Gene Therapy Consortium was founded in 2007 to advance research in the quest for a gene therapy treatment for Tay-Sachs and Sandhoff diseases. This scientific team is on the brink of advancing a treatment to human clinical trials in 1 – 2 years. Treatment proof of concept was demonstrated in several animal models of Tay-Sachs and Sandhoff disease, as well as GM1 gangliosidosis.

      This progress was made possible because of private funding through NTSAD, its family foundations, and many other donors as well as the NIH. We have a plan to raise more funds to complete pre-clinical studies and we are working hard to reach this fundraising goal, although it is significant for a group representing rare diseases. The huge challenge is how to carry the project forward starting from clinical trials through commercialization. I just completed a call with a team of concerned parents and board members figuring out how to finance the clinical trials and attract a corporate partner. We have heard from many people that we must first show proof of concept in humans before getting support from a biotech or pharma company. The Phase 1 cost is estimated to be less than $2 million which is tiny by drug development standards, but huge for us.

      The opportunity to have a TRND-like program for gene therapy could be the difference between life and death for affected individuals, most of them young children. Furthermore, this research lays the foundation for success in gene therapy treatments in other lysosomal storage diseases and leukodystrophies affecting the CNS. It could also have relevance for other neurological diseases including multiple sclerosis, Parkinson’s and Alzheimer’s disease. Also, recognizing that gene therapy may not be the sole answer to effectively treating the disease, we recommend combination therapy strategies (i.e., including not only multiple types of genetic manipulation but also gene therapy combined with cell therapy). NTSAD has funded several research grants in this area.

      A Common Fund investment of a relatively small amount of money could be the tipping point for creating successful life-altering treatments for a number of diseases where few or no treatments exist. NTSAD strongly supports the implementation of a Common Fund program for gene-based therapeutics.

       


      Innovative Mobile and Wireless Technologies (mHealth) to Improve Health Research and Health Outcomes

      Nominator: OBSSR

      Major obstacle/challenge to overcome: Mobile and wireless health (mHealth) technologies have developed at an exponential pace in recent years; however, the integration and translation of these cutting-edge technologies into rigorously evaluated health research and healthcare tools have lagged behind. For example, low-cost, real-time devices to assess disease, movement, images, behavior, social interactions, environmental toxins, hormones, and other physiological variables, have made remarkable advances in the last decade because of increased computational sophistication, as well as reductions in size and power requirements. The basic engineering and computer science knowledge exists to develop technologies that will alter the collection of health-related data for basic and translational research, clinical practice, healthcare delivery, and public health in ways that were not imaginable a decade ago. Scientific investments are needed to translate this basic science into quality mobile and wireless health technologies that also leverage other rapidly advancing biomedical technologies.
      In fact, development of mobile and wireless health technologies is currently progressing at a much faster pace than the science to evaluate their validity and efficacy. Unnecessary devices will be created with little medical impact because they are developed without an empirical foundation and input from the health research community. Private sector technology companies, along with a limited amount of public funding from NSF and NIBIB, support the basic development of novel wireless and mobile technologies, but NIH provides limited support toward the translation of these basic technologies into quality wireless and mobile solutions to facilitate research and improve health. Once a technology is fully developed, various NIH institutes support rigorous evaluation, but there is insufficient funding for the period between basic technology development and evaluation; that is, the development, integration, and validation of software and hardware required to turn these cutting-edge technologies into evaluable tools.

      Moreover, these technologies, which promise to sense and assess physiology, disease, behavior, and environmental changes continuously and in real-time, will generate an avalanche of multi-faceted, longitudinal data. The rich longitudinal datasets generated by these multiple inputs also require advanced analytics, akin to a high throughput approach to a continuous stream of data. These analytic tools and sophisticated visualization techniques will provide interpretable data for researchers and/or actionable data for healthcare providers and public health practitioners, as well as new approaches to efficient management of chronic disease.

      Emerging scientific opportunity ripe for Common Fund investment:
      Mobile and wireless health (mHealth) is a nascent and rapidly growing field. These technologies provide the potential to advance research, prevent disease, enhance diagnostics, improve treatment, reduce disparities, increase access to health services and lower healthcare costs in ways previously unimaginable. Real-time, continuous biological, behavioral and environmental data collected by wireless and mobile technologies will improve our understanding of the etiology of health and disease, particularly when integrated with data from areas such as genomics, biomarkers, and electronic medical records. These data are also essential for answering the difficult questions of gene-environment interplay in health and disease, adherence, and the developmental origins of adult disease, as well as informing the development of treatments and prevention programs that are preemptive, personalized and adaptive over time. Further, these tools have the potential to transform clinical trials. Remote monitoring and sensing can allow researchers to recruit and follow patients without the need or cost to transport them to a research or healthcare setting. This will increase participant access and decrease burden, while increasing sample representativeness and the quantity and quality of follow-up data, all at decreased cost.
      A major opportunity also arises from the potential of mobile and wireless health technologies to continuously monitor chronic medical conditions around the world, as well as to implement disease management plans that capitalize on this expanded information. Chronic disease conditions have been recognized in the developed world as a major source of morbidity and mortality. Similarly, in low- and middle-income countries (LMICs), chronic disease is increasingly being cited as an emerging problem and a major component of disease burden. An article in the NEJM (2007;356:209-211) notes that cardiovascular disease accounts for nearly 30% of all deaths worldwide and that the percentage in LMICs is similar to the global average. A fundamental characteristic of most chronic disease is that the medical profession manages the disease rather than ‘cures‘ it. The hypothesis that better monitoring will lead to better management, better outcomes, and reduced disease burden has yet to be tested.

      The need for rigorous research that examines the potential, as well as the challenges, of harnessing mobile technologies to improve health outcomes is critical to global health. Given the high penetration of cell phones and related technologies in LMICs, as well as the lack of bandwidth in many parts of these countries, research investments could illuminate the potential of these technologies to serve as the underlying infrastructure for transmission of health information and data in low-resource settings. For example, given the capacity for adaptive learning facilitated by these technologies and the potentially heightened level of empowerment experienced by users, research investments can inform how best to use mobile technologies to help educate and train the next generation of providers and patients in low-resource settings, as well as serve as a vehicle for behavior change across diseases and conditions. Equally exciting is the potential for these technologies to provide low-cost alternatives to traditional imaging modalities for screening of chronic, non-communicable diseases, such as cancer and heart disease. Therefore, in addition to the ways in which mobile and wireless technologies support research and health in the United States, numerous specific areas of global health research could benefit from increased and targeted NIH investment in this field.

      To ensure long-term impact of investments in mobile and wireless technologies and to improve health globally as well as domestically, NIH funding should be designed to ensure that both the technology developers and the researchers start with problems that demand solving, so that the field is needs-driven rather than product-driven. In addition, mobile and wireless technologies are part of an information and healthcare ecosystem in which systems must be able to communicate with each other; therefore, NIH can provide leadership to encourage and support the development of novel, interoperable solutions. Furthermore, significant support for building research capacity in this field will help to ensure a pipeline of investigators, both in the U.S. and abroad, who have the skills and experience to advance the field as technologies and public health needs evolve.

      Computer scientists, engineers, and biomedical/behavioral researchers are beginning to collaborate, and transdisciplinary groups are forming, making this area ripe for Common Fund investment now. In addition, the wide interest in this area provides an opportunity for potential federal (National Science Foundation, World Bank) and non-federal collaborations (e.g., Robert Wood Johnson Foundation, private technology and communication companies) that could augment Common Fund resources and increase the value of the initiative. In addition, NIH has an mHealth Scientific Interest Group that will ensure programmatic expertise across the Institutes and Centers.

      Common Fund investment that could accelerate scientific progress in this field: With its potential for providing low-cost, high-quality data to enhance health research and improve health outcomes around the world, mobile/wireless health is of growing interest to the NIH ICs, but no individual IC is able to foster the integrated development needed to move basic wireless/mobile technological development to evaluable solutions, especially since most of these technologies apply to multiple diseases and conditions. This initiative provides the funding to develop and translate novel technologies from prototype components to integrated and validated tools to advance health research, diagnose and treat disease, and promote health.

      Common Fund investment in this area would stimulate the required interdisciplinary efforts among computer scientists, engineers, and biomedical, behavioral, and social scientists to fill this translation, development, integration and validation gap. Funding would target four essential aims:
      1. Translation, development, integration of interoperable and affordable mobile and wireless technology into novel scientifically-validated tools for use in research, healthcare or public health;
      2. Validation and implementation of existing wireless and mobile devices into ongoing clinical trials, especially those addressing treatment of chronic disease;
      3. Development of “high throughput” analytic techniques for complex, comprehensive, and multi-streamed data, as well as modeling and data-visualization approaches to enhance the value of these data (a brief illustrative sketch follows this list); and
      4. Development of mobile health technologies that can address infectious and noncommunicable disease problems (obesity, cancer, diabetes, cardiac disease, etc.) around the world by facilitating disease prevention and behavior change.
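
      As a purely illustrative sketch of the kind of “high throughput” stream summarization envisioned in aim 3, the short Python example below reduces a simulated continuous heart-rate stream to hourly summary features and a simple alert flag. The data, column names, and alert threshold are hypothetical placeholders, not part of the proposal.

          # Illustrative sketch only: condense a continuous mobile-sensor stream
          # (simulated heart-rate samples) into hourly, actionable summaries.
          # All data, names, and thresholds here are hypothetical.
          import numpy as np
          import pandas as pd

          rng = np.random.default_rng(0)
          timestamps = pd.date_range("2012-01-01", periods=24 * 60, freq="min")  # one sample per minute for a day
          heart_rate = 70 + 10 * np.sin(np.arange(timestamps.size) / 120.0) + rng.normal(0, 3, timestamps.size)
          stream = pd.DataFrame({"heart_rate": heart_rate}, index=timestamps)

          # Reduce the raw minute-level stream to hourly features a clinician or
          # downstream algorithm could act on (grouping by hour of the day).
          hourly = stream.groupby(stream.index.hour)["heart_rate"].agg(["mean", "std", "max"])
          hourly["flag_tachycardia"] = hourly["max"] > 100  # hypothetical alert threshold
          print(hourly.head())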

      Potential applicants would include technology developers, industry partners, health researchers, and others in an iterative development process for which there is currently no model of public funding. Currently, mobile and wireless health research requires multiple grants targeting each step of development, causing delays and preventing research from keeping pace with technological change. Partnerships with industry and other stakeholders will facilitate commercialization and sustained development. Further, to address global health challenges and to facilitate the exchange of information, collaborations between U.S. investigators and partners in low-resource settings (both globally and in the U.S.) would be encouraged. This initiative would also develop a cadre of reviewers with experience evaluating grant applications that involve a combination of technical development aims and health outcome aims. By providing models for how to move these basic technologies through integrated development and rigorous outcome evaluation, this effort could eventually be subsumed by technology companies and basic technology funders extending their reach into integrative development, and by NIH and other clinical research funders expanding their interests into the integrated development required to prepare mobile and wireless applications for clinical evaluation.

      Potential impact of Common Fund investment: One impact of Common Fund investment in mobile and wireless technologies would be to move this field from developing devices to developing solutions for chronic disease management and other conditions. One recent example of the potential (Lancet, 2011;377:658-666) demonstrated that wireless pulmonary artery monitoring of individuals with chronic heart failure resulted in a 40% reduction in heart-failure-related hospitalizations over the six-month follow-up period. The infrastructure developed could also have a significant impact on research on disease monitoring, treatment, and management.

      Further, if this Common Fund program achieves its objectives, scientific and business models will be created for moving cutting-edge technologies much more quickly through integrated development to research evaluation. Currently, many of the wireless and mobile technologies being evaluated with NIH support are considered old, if not obsolete, by the technology community. As a result, the wireless and mobile technologies evaluated with NIH funding will be much more innovative and novel, and the pipeline from basic technological development to health research evaluation will be accelerated and streamlined. In addition, this initiative will support the development of the methods needed to analyze and present these complex data sets to enhance not only health research but also healthcare delivery and public health, as these large, complex data streams are summarized into actionable health information.

      Comments:
      A common fund for mHealth research is timely and could have a substantial impact on multiple domains of study within the next 5-10 years. A few points to consider:

      1. Enhanced focus on behavior change: I would like to see more attention given to research on general principles of mHealth technology that can be used to effect positive health behavior change applicable to a variety of medical conditions and behavioral disorders. Examples include: What are the most effective methods of summarizing and presenting information collected via sensors and/or self-report to influence health behaviors? How often and for how long should we expect users to interact with an mHealth device or software application to most effectively influence health behaviors?

      2. Iterative designs: Due to rapid advances in technology, and the multiple ways in which the technology may be applied, mHealth research is highly iterative. It is possible to make substantial gains in a relatively short amount of time by conducting a series of brief trials to identify the strengths and weaknesses of a given approach and make quick adjustments to enhance outcomes. This is possible in large part thanks to the copious amount of data that can be collected, even over a brief period, from mobile devices. However, current funding mechanisms focused on traditional clinical trials and hypothesis testing are not well suited for highly iterative research in which the ultimate intervention designs and outcomes cannot be clearly anticipated at the outset of the project.

      3. Interoperability and Generalizability: Given the rapid pace of advancement and the proliferation of devices, I would encourage a focus on research that will be applicable to a wide variety of devices, including those that have yet to be developed. This is related to my point #1: it is possible to conduct research on general mHealth principles that can be applied to a variety of conditions and types of technology.

      I am very excited at the prospect of a stronger emphasis on mHealth research and I look forward to future developments in this area.

       


       

      I feel that some areas to examine are:

      1. Translation of existing, proven behavioral therapies into the mobile space. Text messaging doesn't necessarily handle things like motivational interviews.
      2. Developing methods to prove that a behavioral message delivered in the mobile space has actually been absorbed by the recipient.
      3. Linking mobile sensors to biological outcomes, most likely through the use of machine learning protocols.
      4. Integration of personalized analytics (e.g., continuous, real-time detection of an analyte) with mobile devices.
      5. Development of a biosensor for remote detection of ECG tracings. I can get heart rate off small Doppler radars hung around the neck, but I can't get continuous telemetry-level data.
      6. Improved affect detection protocols.

       


       

      Here are some relevant questions I have been considering that could be incorporated into the above:

      1. How do we visualize and interact with mHealth sensor data to facilitate reflection on and improvement of behavioral health?

      2. mHealth sensors facilitate frequent capture of a variety of information about individuals in the natural environment. How do we deal with the privacy challenges inherent in collecting such detailed personal logs of personal health and daily activities?

      3. It’s unlikely we can ensure 100% accuracy of the data collected by mHealth sensors. How do we deal with these inaccuracies in mHealth applications? How much inaccuracy are we willing to accept?

      4. What safety features are needed to ensure individuals do not make poor or dangerous health decisions based on the information presented to them by mHealth systems?

      5. How do we integrate mHealth technologies with modern clinical practice? What protocols should physicians follow when working with these systems and how do we train them on these protocols?

       


       



       

      A common fund such as that described makes a lot of sense. I am thrilled to see it discussed for all the reasons stated above. It would move the field further faster.

      Just one add-on comment. The challenges of obtaining effective use of these technologies will not go away with such funding. If the technologies are developed and marketed to agencies that offer a welcoming environment, they are much more likely to be accepted. But this will likely require development of and/or adaptations to regulations, financing mechanisms, interagency agreements, staff training and incentives, communication systems, and organizational processes. The last thing we want is a bunch of great innovations languishing because the environment will not support them.

       


       

      You emphasize the implications of this technology for chronic and infectious disease research/interventions. I would strongly suggest that the implications of mHealth for behavioral disorders (e.g., substance abuse, depression, post-traumatic stress disorder or PTSD) also be mentioned. These disorders and their associated risky behaviors have great impact nationally and internationally, but have few low-cost, easily disseminable solutions. Alcohol and substance abuse, as well as mental disorders, require intervention development and evaluation as rigorous and theory-informed as that for chronic disease.

       


       

      This is very exciting, and very timely. A couple of thoughts:

      1. Part of the problem, alluded to in comments noting that researchers are evaluating antiquated technologies, is that our evaluation methodologies and funding mechanisms do not accommodate the rapid rate of development.

      2. Regarding the methodology, I would emphasize new evaluation strategies. Under the section “Validation of mHealth devices in clinical trials” it seems to suggest using standard evaluation methods. Clinical trials are resource intensive and require long periods of time. Intervention technologies are often antiquated by the time they are validated. Adding to the problem, these intervention technologies are often in “perpetual beta,” and are updated frequently. We have the opportunity to revolutionize our evaluation methods. These intervention systems acquire a vast amount of data, including outcomes, as part of the interventions. If we think of care delivery systems as engineering systems, we can begin to develop methods of continuous evaluation that will begin to break down the distinction between validation trials, ongoing monitoring for safety and effectiveness, and quality improvement.

      3. Regarding funding, the cycles are also slow compared to the rate of technological development. Often even innovative proposals are less than novel by the time they are funded, and are old by the time the work is done. Would it be feasible to develop something akin to an mHealth consortium (perhaps a bit like the NIDA clinical trials network) that would be a partnership between technology developers, clinical researchers and service providing organizations who could collaboratively develop, evaluate and implement cutting edge intervention technologies?

       


       

      Increased funding for mobile, wireless technologies will help to speed their development. But there is also a need for this science to progress in a systematic, organized manner. Currently what we are seeing is that any new technique (e.g., accelerometer-based physical activity monitors) breeds a proliferation of new devices as manufacturers rush to get their devices on the market, and the output from these devices is not comparable. Moreover, a manufacturer may have multiple generations of a device, and the algorithms change over time. It would be helpful if there were common datasets that could be used by manufacturers in developing their regression equations. There may also need to be rigorous validation of devices once they are produced. I believe the Food and Drug Administration (FDA) requires that pulse oximeters have a certain level of accuracy before they're approved, and the Centers for Disease Control and Prevention (CDC) is involved in setting accuracy standards for clinical laboratories that measure biomarkers (such as lipid profiles). Without these kinds of assurances for devices that measure behavioral variables, the data from these high-tech devices will be suspect.
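
      A minimal sketch of what such a shared calibration might look like, assuming a hypothetical reference dataset that pairs accelerometer counts with measured energy expenditure (all values and names below are illustrative, not an existing standard or dataset):

      # Hypothetical sketch: fit a counts-to-METs calibration regression against a
      # shared reference dataset so devices from different manufacturers can be
      # calibrated to the same criterion data. Numbers are illustrative only.
      import numpy as np

      def fit_calibration(counts, measured_mets):
          """Fit METs = a * counts + b by ordinary least squares."""
          a, b = np.polyfit(counts, measured_mets, deg=1)
          return a, b

      def predict_mets(counts, a, b):
          """Apply the fitted calibration to new accelerometer counts."""
          return a * np.asarray(counts, dtype=float) + b

      if __name__ == "__main__":
          # A real common dataset would pair device counts with a criterion
          # measure such as indirect calorimetry.
          reference_counts = np.array([120.0, 800.0, 2400.0, 5200.0, 8000.0])
          reference_mets = np.array([1.2, 2.0, 3.5, 6.1, 8.4])
          a, b = fit_calibration(reference_counts, reference_mets)
          print(predict_mets([1500, 4000], a, b))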

       


        • AGS Feedback: AGS believes that this is an important proposal, and specifically recommends that any research in this area look at the need for the following:

           

           

          o Applications (i.e., the software) should be able to run seamlessly on multiple devices (e.g., pad/tablet, smartphone, laptop/desktop).
          o Devices should be active and accessible regardless of location: hospital, rehab, skilled facility, or the community.
          o Different disciplines and providers should all have access to a common record so that everyone can see each other’s notes.
          o Patient-specific applications should be accessible through various devices and locations and adaptable to a multitude of health literacy levels.

      • To my mind one of the most compelling statements in the overview of this initiative is this: “Computer scientists, engineers and biomedical/behavioral researchers are beginning to collaborate, and transdisciplinary groups are forming, making this area ripe for Common Fund investment now.”

         

         

        At a time of genuine crisis in U.S. health care due to unsustainable growth in costs and relatively poor outcome/dollar ratios there is a need to fundamentally re-engineer the system. This is not likely to come through retrofitting the current medical-industrial complex. Nor is it likely to come from a few new drugs or devices that intercept isolated biological pathways or sustain the function of single organs or tissues. Rather, it is more likely to emerge through a parallel “health system” rich in new consumer behaviors that are supported by innovations in information technology, design and user experience. These will promote the primary and secondary prevention of disease and improved self-management of illness once it occurs. Innovations in information technology have been the most vibrant and transformative influences on our society for the past 50 years and they are likely to continue to be so for the foreseeable future. Thus, those of us interested in improving individual and population health must collaborate with computer scientists and engineers to adopt their approaches to problem solving and quality improvement.

         

        The Common Fund has an opportunity to accelerate growth in this new area of mHealth in the many ways superbly outlined on this page. Investments should be strategic so that they encourage the involvement of NIH’s partners in research both within HHS and across the many federal agencies with a stake in R&D such as National Science Foundation (NSF), Department of Defense (DoD) and Commerce. Also, a small percent of these funds should be allocated in ways that ensure that the discovery process addresses the breadth of science, technology and policy issues raised by mHealth including the social, economic, legal and regulatory factors that facilitate or hinder innovation in this domain.

         


         

        Outstanding direction as described and potential for high impact results in a high risk area.

         

        I wrote about the funding valley of death between prototype development and outcome evaluations in '06 for the Center for Aging Technologies, based on feedback from their gerontechnology researchers. That valley of death has only widened over time. This initiative is sorely needed to advance the related technical and theoretical fields of knowledge. As an NIH reviewer, I do see the need for a different mindset in evaluating this field at each stage of research. Early-stage studies with an n of 5 conducted in the real world can be more informative about implementation problems than an n of 1,000 in a lab setting, and more efficient and cost-effective than large-scale trials. Health outcome studies of early- to mid-stage technology prototypes can be premature when technology glitches hinder adoption and usage; yet such prototypes are very suitable for studies of human factors, ethics, and behavioral aspects. Mature technologies should be subject to outcome and comparative effectiveness studies against other technologies and traditional non-technological approaches. To do this in the fast-paced, evolving technology world, multiple cross-site collaborations are needed for large-scale deployments typically beyond any one organization's capabilities. Challenges in the technology field can be a good thing, stimulating innovation, not a fatal flaw.

         


         

        Wonderful synopsis of the promise and barriers. Moving from the technology to translation into clinical trials and daily care is indeed difficult to fund through NIH and National Science Foundation (NSF) under present reviewer structures and research priorities. For example, machine-learning algorithms are identifying the type, quantity and quality of upper and lower extremity daily activities for patients in the community. But to fully test reliability, validity, responsiveness, and cost of these potential daily monitoring and outcome measurement tools, clinical researchers in neurological diseases, for example, may have to test devices and algorithms against quality of life questionnaires and disease-specific and generic measurement tools, one disease at a time and for multiple levels of impairment and disability. That will take a generation of studies that will not keep pace with the advancing capabilities of technologies and informatics. So do consider a set of Common Fund solutions.

        The Common Fund could sponsor standard generic platforms for sensor data acquisition, Internet-based data transfer, and basic analysis, then upgrade them as innovation demands. If the Fund also identified and paid for promising tools to be added to applicable, already-funded industry and NIH clinical trials across interventions and diseases, progress would quicken. Greater use of devices would also further drive down the costs of sensors, smartphones, and data transfer. Engineers, computer scientists, and clinicians would gain opportunities to develop and field test the most promising applications in collaboration with trialists and, perhaps, agencies such as Medicare. NIH or a benevolent commercial entity could then maintain an open database from all such trials for data mining by investigators of all stripes and K-award trainees who choose the new field of mHealth research.

         


         

        Perhaps the Common Fund, with its focus on “the period between basic technology development and evaluation” could also support work on research ethics specific to mHealth.

        Two examples. First, massive databases, such as those associated with mHealth research, make more salient the surprising failures of anonymization. We have simply not caught up with this fact (cf. Paul Ohm's work in law). Second, ethical issues in mHealth go beyond protecting privacy. Widespread use of phones as research instruments could complicate protocols for clinical trials or infectious disease epidemiology, since phones provide not only a means to observe, but also a means to intervene, in “real time.” Moreover, access to phones outstrips access to basic healthcare and infrastructure: how should we intervene if a cholera epidemic is detected?

        We need to think through these issues (and more). Ethical considerations often crop up subsequent to basic technological development, but it is clear that they ought to be addressed before mHealth technologies are incorporated into health sciences research. This makes the Common Fund an attractive mechanism, and it is not clear that existing mechanisms are adequate to support the work that is needed.

         


         

        This is an excellent summary, particularly with the comments, of key issues that must be addressed for the promise of mHealth to be realized. As mHealth issues cut across institutes, it is only logical that common funds be used to fund this research.

        To reiterate and further expand on some statements made above, the NIH Common Fund has the opportunity to support more innovative methods for health behavior change intervention development through unconventional Requests for Applications (RFAs) and funding mechanisms that can accommodate iterative and unconventional research designs (e.g., Multiphase Optimization Strategy, sequential multiple assignment randomized trials, N-of-1, and mixed-model designs). Support for these types of studies will help reduce the time that it takes for research discoveries to translate into clinical practice. Beyond the direct benefit of that to public health, speeding up the scientific process is also imperative for fostering stronger, more collaborative (rather than combative) relations between academia and industry, as the academic timeline could be more easily accommodated by the mHealth industry.

         


         

        mHealth methodologies hold tremendous potential to increase our capacity for “pragmatic” trials and rapid, iterative learning research with broad reach, to study the multilevel factors that link individuals to communities and thereby influence health. Because mHealth technologies and methodologies literally connect people with their environmental surroundings, they also promise to broaden our approach, incorporating factors at both the community and individual levels. This means we can ask cross-level, transdisciplinary questions with unprecedented precision. Among other reasons, these questions are important because they have direct implications for decision-making about a range of public health policies, especially as they relate to existing health-related disparities.

        Yet in addition to mobile technology, research design, and statistical analysis, I want to suggest that we need to think hard about the nature of the outcomes we hope to affect, whether they are chronic disease states, isolated events like vaccinations, or some combination. Researchers focused on long-term chronic diseases have always had to rely on intermediate outcomes as they test their interventions. There is a slippery slope as we move this logic into the world of mHealth, perhaps best represented by all the tech shops that will tell you that “click-throughs” or new user counts are important intermediate behavioral outcomes. So we (and reviewers) have to ask what sorts of outcomes we are willing to accept as clinically meaningful. I hope the Common Fund will support research that gauges the “signal strength” of mHealth interventions vis-à-vis a range of real-world outcomes, intermediate and longer-term, so that we can begin to operationalize a set of standards regarding acceptable evidence of clinical utility.

         


         

        This innovative, fast-paced and growing field demands more attention, and the Common Fund could be pivotal in bringing endless possibilities to fruition. The Common Fund program has the potential to benefit researchers across the board and transform research efforts in many disease areas. Importantly, it has the potential to be used for treatment of mental health disorders, a neglected and important area of health that can often be overlooked. Given that depression can produce the greatest decrement in health (Moussavi, Chatterji, Verdes, Tandon, Patel, & Ustun, 2007; Andrews & Titov, 2007), it is imperative that we remain vigilant about its inclusion in the Common Fund program's objectives.

        Lastly, whilst this proposal carefully articulates some of the important challenges facing researchers in this field, it omits the need to identify strategies to ensure users don’t become desensitized to the expanding array of mobile health applications. With a proliferation of new technological devices, it is important to consider how to keep users engaged once the novelty wears off. Focusing on simple but effective and safe treatments that employ basic technology will ensure that access to expensive technology does not act as a barrier; instead, health care will become more readily accessible and cost-effective.

         


         

        A common fund is the ideal mechanism to meet these objectives.

        As indicated, the field is brimming with innovation and potential, but it requires support mechanisms designed to see projects from initial conceptualization through usability testing with patients, providers, and caregivers; field trials to ensure technological reliability and clinical signal; more formal clinical and cost-effectiveness trials; and integration with existing systems of care--all against a backdrop of regular platform updates to keep pace with new technological developments. There are numerous unanswered, basic human factors questions that should be addressed before proceeding to a formal clinical trial; an optimization framework (e.g., MOST; Collins et al., 2005, 2007) could maximize the benefits from mHealth interventions and avoid potential harm, but it is unclear how such long-term investments can be supported under current funding mechanisms.

        Clinical issues such as medication adherence are broad enough to allow such projects to deal with multiple disease areas. Even more broadly, these projects will allow the creation of much-needed guidelines on developing the infrastructure to manage continual changes in platforms, empirically-based and validated procedures for usability testing of healthcare devices and software, models of interdisciplinary communication and productivity, and the myriad of other issues for which there are few existing roadmaps.

        The common fund should also support development of open source initiatives (e.g., openmhealth.org) that provide modular, technologically-agnostic functions that span disease areas. A common library of validated, user-friendly components will exponentially speed development and evaluation of new mHealth solutions, as well as help to connect investigators and speed formation of an mHealth research community.

         


         

        Attention to research in mHealth has the potential to improve health-related behaviors for chronic physical and mental disorders. While it is crucial that we develop advanced technologies and applications, we should at the same time fully harness the basic capabilities of mobile phones, which are nearly ubiquitous. An unbalanced approach with a heavy focus on new technologies and sensors runs the risk of increasing health disparities and leaving low-income and underserved communities perpetually behind the curve due to limited access to technology. Although certain technologies that are currently being researched will indeed be antiquated by the time they are properly tested, studying the principles behind the technology will aid future development. For example, SMS may not be the intervention of choice in a few years, but messaging and prompting will likely remain a part of mHealth interventions. We should focus efforts to make sure that these highly innovative interventions are being tested and disseminated with the people who have the highest needs, who are often the most difficult to reach. Additionally, I fully support a push for increased funding of mHealth projects and for rethinking our research methods so that they are more adaptive. This requires new methods and openness on the part of funding institutions.

         


         

        I DO NOT think this is an area where we should spend $$. There is ample VENTURE and private capital for mHealth, and we will NOT have the kind of impact with CF funds. SBIR/STTR and other existing mechanisms could support worthwhile ideas. If you must spend NIH CF funds on mHealth, it should be tied to supporting clinical trials.

         


         

        An important point that I'd like to comment on: “In fact, development of the mobile and wireless health technologies is currently progressing at a much faster pace than the science to evaluate their validity and efficacy. Unnecessary devices will be created with little medical impact because they were developed without an empirical foundation and input from the health research community.”

        Not only do I agree that many unnecessary devices with little medical impact are likely being created as part of the commercial boom in mobile and wireless health technology, but further, I suspect that the market forces driving some of these commercial ventures are likely generating technology that does real harm. When commercial designers optimize towards outcomes other than health (e.g., advertising traffic or units sold), but market their products as health aids, devices/features that do real harm will almost inevitably result. I'd like to advocate for more research specifically designed to identify potentially harmful mHealth devices/features already on the market. The Food and Drug Administration’s (FDA’s) recently announced proposal to begin regulating certain mHealth applications seems like an important first step. Further, as pointed out, these devices often generate extremely rich data sets. In some instances it may be appropriate to begin incentivizing or even requiring companies to share de-identified data.

         


         

        This type of approach is critically important to accelerate in order to meet the needs of rural communities that lack access to health care providers (mental health, endocrinologists, oncologists, etc.) who could manage their care at a distance through technology. Access to broadband as a supportive technology infrastructure is also critical for the deployment of mobile technology devices for health improvement.

        Developing applications that blend bio/physical/geo sensing capacities with real-time feedback and interaction among the “host” (patient/client), providers, and artificial intelligence (AI) systems will require contexts that cut across a range of research areas. As a psychiatrist, I see real opportunities in connecting patients and their care supports (providers, family members, appropriate community supports) with actionable information on their own emotional, physiological, and social status. For example, the motion-sensing capacity of modern cellular devices could readily support an early alert system for mood disorders, e.g., monitoring when physical activity levels change, as has been documented in the run-up to mania or severe depression; this could also be tied to texting/communication levels, including the content of such communication, and offer opportunities for support and intervention. The increasingly negligible cost and the general ubiquity of these technologies also suggest great promise for extending support and interventions to large populations previously lacking access to care for mental health disorders.
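
        A rough illustration of such an early-alert rule, assuming daily step counts as the activity measure and arbitrary window sizes and thresholds (a sketch only, not a validated clinical algorithm):

        # Hypothetical sketch: flag a sustained change in daily activity relative to
        # a personal baseline, as one crude form of the "early alert" idea above.
        from statistics import mean, stdev

        def activity_alert(daily_steps, baseline_days=28, recent_days=7, z_threshold=2.0):
            """Return True when the mean of the most recent days deviates from the
            preceding baseline by more than z_threshold baseline standard
            deviations (in either direction)."""
            if len(daily_steps) < baseline_days + recent_days:
                return False  # not enough history to judge
            baseline = daily_steps[-(baseline_days + recent_days):-recent_days]
            recent = daily_steps[-recent_days:]
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma == 0:
                return False
            return abs(mean(recent) - mu) / sigma > z_threshold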

        Using the Common Fund approach to anchor and focus such research will help bring needed, broadly available and translatable resources to our most challenged populations.

         


         

        Thank you for this excellent summary of the opportunities and challenges surrounding mHealth.

        Patients, technologists, providers, and health science researchers will all be greatly served by this proposed common fund initiative.

        NIH has the opportunity to do for patient-centered innovation and evidence generation what DARPA and NSF did for information sharing generally when they created coherent and coordinated funding initiatives for the Internet.

        From the perspective of the openmhealth community (openmhealth.org), we hope that funded projects will focus on developing and sharing modular and open methods and tools and take advantage of the wealth of patient-generated and analytical data that mHealth systems offer.

         


         

        These are exciting technologies to consider for research and clinical care. As so many in our population today are wired in with their own websites and social media platforms, I wonder about the intersection between user-generated content about themselves (e.g., Facebook, Twitter, YouTube) and other-generated (medical record) content about patients. Are there opportunities to link a MyChart and a LinkedIn profile? Would that enhance detection of risk behaviors or communication between physicians and patients?

        These social media platforms also offer opportunities for targeted content. For example, a patient who writes a Facebook status update about trying to diet could have a linked advertisement for their health insurance's weight loss program. Linked advertisements on these sites have potential to be highly specific and relatively inexpensive, and thus provide direct health marketing to targeted populations to promote better health.

         


         

        A Common Fund for mHealth is vital since mHealth interventions have the potential to profoundly change treatment for substance use disorders and other psychiatric disorders (e.g., schizophrenia, depression, borderline personality disorder). In the past 20 years the theoretical model for these disorders has changed to a chronic disease model similar to that of diabetes or heart disease. Despite this conceptual change, treatment and research models, as well as patients’ expectations, continue to focus on individual treatment episodes rather than management of a chronic relapsing disease. Mobile systems can provide interventions more consistent with a chronic disease model, which either do not currently exist or are highly cost prohibitive. These characteristics, which are particularly important for disorders such as substance abuse, include greater and more frequent patient engagement, availability in the patient’s own environment, availability at any time or place, and intervention based on real-time biological, self-report, or other data.

        In addition, the development of mHealth interventions can greatly expand access to treatment. Currently, most individuals with substance use disorders do not seek treatment. Of those who do seek care, approximately 20-50% drop out of substance abuse treatment early, and 50-60% of completers relapse within six months. In addition, access to treatment and treatment alternatives is often severely limited or cost prohibitive. mHealth interventions have the potential to revolutionize treatment for these disorders by expanding treatment options and availability and by increasing the attractiveness of treatment for the large percentage of abusing and dependent individuals who do not seek treatment.

        As others have noted, research projects in this area have the potential to move from pilot to full-scale evaluation of effectiveness in a short period of time, but current funding mechanisms are generally not receptive to this wide-spectrum approach. Based on our experience, this research involves more iterative adaptation based on feedback and rapidly changing technology. Similarly, researchers need to focus on underlying theoretical principles and move away from simple evaluations of the feasibility and acceptability of mobile interventions.

        Finally, research needs to evaluate the implementation of mobile interventions. As has been shown with evidence-based individual therapies such as motivational interviewing, cognitive behavioral therapy, and interpersonal therapy, interventions shown to be effective in a specific research environment do not always generalize to widespread clinical practice. We should thus expect that implementation of technological interventions may depend on patient interest and compliance, organizational and provider acceptability, patient-provider communication, and cost and reimbursement feasibility. Because such interventions are easily scalable to regional and national levels, incorporating implementation research issues should be considered for a Common Fund program.

         


         

        To encourage adoption of standardized testing procedures, whether in mobile or more traditional methods, the data being collected must be describable in a platform-independent, machine-readable way. Too often, surveys and code books are in PDF or Excel files, or even worse, Microsoft Word files or MS Word files exported to HTML. This approach effectively kills sharing the data collection instrument in any sort of automated fashion. XML, YAML, JSON, even CSV: these may be buzzwords to some, but if the goal is really to share the methodology, the metadata (the data about the data) must be machine-readable, and must NOT require using a Microsoft product to read it.

        The Common Fund should encourage such a practice.
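
        As a minimal sketch, a single survey item and its response coding might be expressed as machine-readable metadata in JSON rather than a PDF or Word codebook (the schema and field names below are hypothetical, not an existing standard):

        # Hypothetical sketch: one survey item and its coding as machine-readable
        # metadata. Field names and the -9 missing code are illustrative only.
        import json

        codebook_entry = {
            "variable": "phq9_item1",
            "prompt": "Little interest or pleasure in doing things",
            "type": "ordinal",
            "responses": {"0": "Not at all", "1": "Several days",
                          "2": "More than half the days", "3": "Nearly every day"},
            "missing_codes": [-9],
        }

        print(json.dumps(codebook_entry, indent=2))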

         


         

        I have 3 young adult daughters with serious mental illnesses. All became symptomatic as teenagers and are living with their illnesses as they become independent adults.
        Mobile phones are an essential tool in every aspect of their lives.
        They are accessible and user friendly; they are already in everyone’s pocket. They could and should play a part in mitigating the impact of serious mental illness by managing symptoms and the side effects of medication.

        I particularly like the fact that mobile phone apps allow users to take more control over their lives by themselves. For example: the calendar and reminder alarm features on cell phones allow people with memory problems to stay on track with medication and daily responsibilities.
        I’d like to see this expanded.

        Examples:
        1. Apps to help control the rapid heartbeat associated with panic/anxiety attacks: music with a definite beat can cause a listener’s heartbeat to accelerate or decelerate in time with the music. Create apps that a user experiencing a panic attack can quickly access, cuing the speed of the music to the user’s own heart rate and then slowing the heart rate as the beat of the music gradually slows (a minimal sketch of this pacing idea appears after this list). A silent version that pulses strongly is another option.
        2. Apps to manage/reduce OCD behaviors.
        3. Apps to track and manage substance abuse behaviors--alcohol consumption and smoking for example.
        4. Apps to track mood states: calendar-like apps that the user can fill in, then transmit to a health care professional, for example after starting a new medication.
        5. Apps to track medication side effects: calendar-like apps that the user can fill in, then transmit to a health care professional, for example after starting a new medication.
        6. Apps to help people wake up in the morning: people taking psychiatric medication can have a very hard time waking up in the morning due to sedation, causing serious problems with school and job attendance. Develop cell phone alarm rings and ring patterns, as well as powerful, effective vibration apps, that can wake up these sound sleepers.
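
        A minimal sketch of the pacing idea in example 1, assuming a fixed ramp-down rate and target tempo (the parameter values are illustrative; a real app would adapt continuously to the user's measured heart rate):

        # Hypothetical sketch: start the music tempo at the user's current heart
        # rate, then ramp it down gradually toward a calmer target tempo.
        def tempo_schedule(current_hr_bpm, target_bpm=60, step_bpm=2, interval_s=30):
            """Yield (seconds_from_start, tempo_bpm) pairs: start at the measured
            heart rate and slow the beat by step_bpm every interval_s seconds
            until the target tempo is reached."""
            tempo = float(current_hr_bpm)
            elapsed = 0
            while tempo > target_bpm:
                yield elapsed, round(tempo)
                tempo -= step_bpm
                elapsed += interval_s
            yield elapsed, target_bpm

        if __name__ == "__main__":
            # Example: a user whose heart rate is 110 bpm during a panic attack.
            for t, bpm in tempo_schedule(110):
                print(f"{t:4d}s  play at {bpm} bpm")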

        The goal of mental health care is not just to treat symptoms but rather to improve the quality of the lives of people with mental illness and allow them to lead active, productive lives. Mobile and wireless technologies can help achieve this very quickly at relatively low cost. The possibilities are endless.

         


         

        One area needing further research is the extent to which self-management support interventions are clinically-linked and technology-enabled.

        Patients with, or at risk for, chronic disease need a complex set of services and supports. The challenge of integrating these services into the therapeutic regimen needs to be successfully overcome if we are going to make significant health and economic gains.

        Technology will no doubt play a role. The researchable question is what role and with what evidence.

        There are four fundamental advancements that need validation, independently and collectively, and, if shown to be important, widespread dissemination:

        1. Self-management support because patients need help mastering the knowledge, attitudes, skills and behaviors so necessary for good outcomes.

        2. Interventions because comprehensive theory-based, evidence-proven, long-term, longitudinal interventions work better than direct to consumer or non-planned health promotion approaches.

        3. Clinically-linked because patients are more likely to adopt new behaviors when the approach is in the context of a trusted therapeutic relationship and within an effective medical care system.

        4. And, technology-enabled because capitalizing on the amazing power of information technology leads to the delivery of cost-effective, scalable, engaging solutions that prevent and manage diabetes.

         


         

        Foundational research in mHealth technologies is imperative to ensure that limited health care resources are directed toward cost-effective solutions to improve health and health care delivery. U.S. health care costs constitute a crippling burden to the economy while failing to yield a healthy population. Our country ranks #1 in per capita health care spending, yet ranks only 50th in overall longevity and 47th in infant mortality. By 2019, CMS projects that U.S. health care spending will have risen to $4.6 trillion and 19.8 percent of GDP. Individuals with chronic disease account for 75 percent of spending – as well as 83 percent of Medicaid spending and 96 percent of Medicare spending. Two-thirds of the rise in health care spending can be attributed to individuals with chronic conditions.
        Rapidly emerging mobile and wireless technologies have the potential to lower the cost of care and lessen the prevalence of disease. Their benefits may be realized across numerous disease conditions (including congestive heart failure, high-risk pregnancy, kidney disease, asthma and sleep apnea, which are among the costliest) and population demographics. The capabilities of real-time, continuous monitoring of patients and advanced analytics applied to more robust data sets must be fully (and quickly) leveraged to combat our nation’s pervasive health care challenges.
        Numerous stakeholders – ranging from patients to health care professionals to developers to policymakers – need data-driven evaluations of these technologies. NIH is uniquely positioned to advance this vital research and ensure that efforts are directed at advancing both clinically- and cost-effective solutions. The West Wireless Health Institute, an independently-funded, non-profit medical research organization focused on lowering the cost of health care through technology and innovation, encourages NIH to expand on its commitment to improving individual and population health by funding mHealth research.

         


         

        This document, particularly with the comments, provides an excellent overview of some of the opportunities and challenges in mHealth. The growing interest from the NIH in funding work in this area over the last 5 years has been encouraging, and I think those efforts have been successful in increasing the number of researchers who are refocusing their careers on this transdisciplinary area. Building these collaborations takes time, and a common fund effort would accelerate the efforts underway, encouraging more researchers to take a risk on this new area rather than playing it safe. I’d like to reemphasize just a few points.

        1. While there is some great innovation going on in industry, I do not think that our public health funding organizations can rely on industry to validate that the technologies they are creating are truly improving individual or population health. Reason 1: There is a proliferation of websites and devices for behavior monitoring, but a dearth of public information about how these systems are used. There is little, if any, incentive for companies to report outcomes (or even usage) data on their products, particularly if the data are not good, and no independent verification of the small amount of information that is publicly available. Researchers will need to do this validation testing and verify that engagement (and the effect of the intervention) lasts more than a few days or weeks. In some cases researchers may be able to work with companies to do that, but in many cases I think they will need to develop their own technologies to do it properly. Reason 2: Health technologies developed by industry are going to be driven by potential profitability, and sometimes mHealth technologies that may provide great population health benefits if effective have lousy business models. In the future, if (or hopefully when) our healthcare system moves to a model where there are more financial incentives for companies to maintain wellness, perhaps this barrier to innovation will be reduced. Today, though, there are many exciting ways in which mHealth systems might improve health or wellness that should be explored but that have weak business models. Reason 3: Companies will tend to keep methods/algorithms/data proprietary while at the same time collectively proliferating the number of devices. This can slow down innovation at a time when we still don’t really know how best to exploit technologies for mHealth. The NIH can encourage work that increases the likelihood that open standards will be developed, which will accelerate research.

        2. The disease-focus of the Institutes is a constant source of frustration for those of us interested in technologies that may address more than one condition or behavior at a time. Addressing the entirety of a person’s behavior, condition, or exposure may permit fresh solutions to old problems. When even “pilot” R21 grants must be so disease specific, however, work exploring creative solutions is sometimes left unfunded.

        3. Experts in human-computer interface design have generally discredited the idea that it is possible to identify a problem, design an interface, and then test the interface. This “waterfall” model typically leads to poor designs. Iteration is key to good design, where users are engaged early in the design process and then many rapid, iterative design cycles are undertaken. The typical NIH proposal discourages this approach, for a variety of reasons. A common fund might enable RFAs that encourage this style of research.

        4. Engagement is undervalued and understudied in health intervention testing. By the time most studies have recruited study populations, there is a tremendous amount of subject selection bias and quite a bit of incentivizing (financial and otherwise) that may not lead to results that reflect how the devices would actually be viewed and used outside of a research study. The NIH (and reviewers) should encourage more work where iterative design, development, and testing is done with convenience samples of individuals but at large scales (e.g., using app stores and people who already have the consumer electronics needed). Then, once engagement is demonstrated and the technology is robust, the technology can be modified for tough-to-reach populations that might require more expensive studies. I see this type of work as unlikely to emerge without a cross-Institute approach.

         


        NIH Global Research Administration and Training Networks (GRAT-Net)

        Nominator: NICHD

        Major obstacle/challenge to overcome: The NIH has dramatically increased its funding of scientific and research training programs in developing countries to tackle a wide variety of global health challenges. Evidence-based solutions to address these challenges require establishing a research infrastructure in resource-poor organizational settings that can employ the best research practices – both methodologically and administratively. While many foreign investigators are trained in the newest scientific theories and research technologies, they and their home institutions may have little direct knowledge of, or experience with, best practices for project management, research oversight, data management, and fiscal accountability. Advancing research and improving the health of all of the world’s people will demand not only investments in the best science, but also investments in a sustainable research enterprise that can effectively and efficiently use and monitor research resources. It takes collaboration and communication between skilled individuals from various functional backgrounds, with access to proper resources, to support the best research management practices.

        Emerging scientific opportunity ripe for Common Fund investment: The increased NIH focus on global health only magnifies the need to develop networks of research administrative professionals and research-adept institutions within developing countries that can support and sustain investments by NIH and other funders over time. This initiative will use the NIH Common Fund and collaborative expertise to build Global Research Administration and Training Networks (GRAT-Net) to address this challenge. The GRAT-Net program would provide state-of-the-art information on grants/contracts management and policies and would train individuals from across the research enterprise to deal effectively with the evolving nature of research administration. GRAT-Net would also provide the newest technology for research administration and help to integrate research administrative capacity within and across institutions, while also forming a series of collaborative research networks in low- and middle-income nations.

        Common Fund investment that could accelerate scientific progress in this field: The GRAT-Net program would build upon the success of the NIH International Extramural Associates Research Development Award (IEARDA), which uniquely targets administrative capacity building at public and private universities in sub-Saharan Africa and India. It would also build on the expertise and experience of other IC-related efforts, such as Fogarty International Center’s Medical Education Partnership Initiative (MEPI) and its work with the President’s Emergency Plan for AIDS Relief.

        Phase I of GRAT-Net awards would focus on establishing core research management teams at international institutions (“nodal institutions”) that would then network with local or regional institutions (see Phase II below). A Principal Investigator would head an initial team of several individuals to be trained, through residency or other programs, in research administration. The team would develop strategic plans for their own institution and a future GRAT-Net consortium to serve a specified geographic area. In addition, the team and home institution would be:

        • Given resources to establish their own office of supported research.
        • Expected to collaborate and integrate their new research management functions with existing functions in business and management, information technology, and academic programs in their home institutions. This would be facilitated by intra-institutional training seminars and workshops, hosted by the institution’s office of supported research.
        • Expected to show direct or linked, multi-functional competencies in budget planning, financial and contracts management, scientific review, grants administration, overall research policies (conflict of interest, human subject protections, intellectual property, etc.), data management, development of research resources, and procurement. These competencies would be established through training, building collaborations, and acquisition of resources.

          Phase I would also include:

        • NIH developing new, or enhancing existing, research administrative curricula particularly suited for use in a variety of low- and middle-income countries and adapted to meet different cultural needs or new technologies. These materials would help support a wide range of train-the-trainer and knowledge-transfer activities.
        • Extensive support to procure and develop the information infrastructure and data systems for research administration within the home institution and, eventually, the wider network. This would include ensuring the availability of compatible and appropriate hardware and software to support a full range of research administrative activities.

        Phase II – Creating the GRAT-Net beyond the Nodal Institution
        In Phase II, nodal institutions would be expected to develop working collaborations with institutions with varying levels of research capacity, locally or regionally. This could include:

        • Train-the-trainer programs and other innovative activities to share expertise and develop research administrative capacity across functional areas. Joint course offerings, team courses, and use of social media would be encouraged.
        • Inter-institutional mentoring programs and internships would be encouraged.
        • Relationships would be furthered with multiple funders and community-based groups.

        Potential impact of Common Fund investment: The GRAT-Net program would allow the NIH to reduce duplication of fiscal and human capital currently dedicated to international research and consolidate resources to ensure the ongoing success of global research investments. Ensuring sustainable and knowledgeable research administrative capacity will increase the number of institutions, in low- and middle-income countries, that can contribute to the research enterprise successfully. The initiative would also increase the efficient and productive spending of NIH resources, expand the number of geographic areas in which the NIH could invest successfully, and encourage a broader array of funders, including non-governmental organizations, to collaborate in research efforts. Most importantly, the program will help the NIH more confidently target its global health research investments, not only to great science, but also to international institutions in areas with the greatest health needs.

        Comments:
        None

         


        Regulatory Science Initiative

        Nominator: NINDS, OSP
        Participating IC: NIDDK, FDA partnership

        Major obstacle/challenge to overcome: The rapid pace of scientific discovery coupled with development of new technologies presents a challenge to researchers, clinical investigators, and regulators as they work to translate basic scientific advances into approved medical products. Basic and preclinical research has been performed in large part independent of regulatory issues. In addition, it is clear that novel technologies and approaches to medical research are outpacing the ability of our regulatory system to incorporate them into current review practices and guidelines. To overcome these obstacles, NIH should support strategic initiatives that are essential to the translation of NIH-funded discoveries into diagnostics and therapeutics.

        Emerging scientific opportunity ripe for Common Fund investment: An investment in Regulatory Science will benefit all stakeholders by helping to advance and incorporate cutting-edge science into regulatory decision making and helping to develop improved tools, standards and approaches for assessing the safety, efficacy, quality and performance of medical products. Major advances in genomics and genomics-based medicine are also creating potential scenarios in the clinical setting that are relatively new to the FDA regulatory process. Moreover, the unprecedented partnership between the NIH and FDA through the Joint Leadership Council provides an extraordinary opportunity to coordinate therapy development efforts, including regulatory decision-making guidelines, between the two agencies.

        Common Fund investment that could accelerate scientific progress in this field: A number of scientific opportunities are ripe for investment in the area of Regulatory science and across the therapeutics development pipeline. For instance:

         

        • Advances stemming from next-generation sequencing technologies that help set the stage for personalized medicine, and other novel, genomics-based approaches to drug development and treatment, such as gene editing, regulatory RNAs, gene silencing, or transcriptional activation.
        • Nanotechnology, microfluidics, and stem cell technologies to alter the way experimental agents are tested for safety and efficacy, and ultimately delivered to patients as therapies.
        • New approaches to predictive toxicology including high-throughput screening strategies/models, new in-silico approaches and computer-simulation models.
        • Development of bioinformatics tools and approaches for data mining and meta-analyses on safety and efficacy that capitalize on the increasing volumes of clinical and medical product-related information in data repositories.
        • Novel approaches of conducting clinical trials for rare and neglected diseases.

        Potential impact of Common Fund investment: Pre-clinical and clinical investigators, and other researchers engaged in the diagnostics and therapeutics development industries, will benefit from more rapid integration of evidence-based knowledge into a regulatory framework, thereby quickening the pace at which basic science advances can move into the therapy development realm. For instance, in the area of stem-cell technologies, the NIH and FDA are working together to identify and define markers and characteristics of “stemness”, thus providing standards that the entire field can use for purposes of comparing studies and preparing for regulatory considerations. The possibility of individualized, autologous use of stem cell-derived therapeutics, organs, tissues, and other biomedical products is fast becoming a reality. Other emerging areas in regulatory science will advance, such as nanomedicine, personalized medicine, efficient and expeditious clinical trial designs, predictive toxicology, and biomimetic models that are able to simulate human conditions and better predict safety and efficacy. The NIH-FDA joint efforts in these areas would help to pave a clearer and more transparent scientific and regulatory path for the scientific community that will impact therapeutic product development and clinical practice.

         

        Comments:
        Development of animal models that faithfully recapitulate human disease complexities will be important for improving success of late-stage clinical trials. This issue is certainly highly relevant to tissue engineering and regenerative medicine-based therapies. Tissue degenerative diseases in humans often occur in aging individuals, and are typically associated with continuous cycles of tissue destruction, chronic inflammation, autoimmunity, fibrosis and scarring. These processes create an “unfriendly” environment for tissue regeneration, because the regenerated tissues are likely to be detrimentally affected or destroyed as a result of these pathologies. Yet common pre-clinical animal models that are used for testing tissue regenerative therapies do not take into account these complexities of human disease. Such tests are often conducted in young animals and under idealized conditions of acute injury. Thus efficacy of these therapies in such unrealistic animal models often is not predictive of their efficacy in patients in late stage clinical trials. Thus incentives should be provided to encourage development of animal models, particularly large animal models, which realistically recapitulate human disease. While these models are likely to be more expensive than their simplified versions, they will save money in the long-run, because non-promising therapies will be identified and eliminated earlier in the pipeline.

         


         

        Much of the interpretation of the experimental animal data is subjective and this has resulted in poor translation. Establishing and supporting an available and expanding database of systematic reviews of preclinical data on potential drugs (akin to the Cochrane library, which is for clinical data) would go far in focusing translational efforts.

         


         

        As important as safety is the issue of efficacy. Recent studies suggest that up to 65% of Phase III clinical trials fail because of lack of efficacy or differentiation. This is certainly the case in CNS, where less than 9% of all drugs that enter clinical development make it to approval. Animal models are notoriously poor at predicting clinical outcomes in psychiatry and neurology. Alternative and better translational models might include computer-based mechanistic disease modeling. This approach aims to understand human biology and the effect of drugs when they enter a particular human brain environment shaped by genotypes and comedications. Prospective support of these approaches in regulatory affairs can help improve clinical trial design by running virtual patient trials. In the case of CNS disorders, there is a large base of academic expertise in computational neuroscience, and other initiatives such as imaging genetics and the human connectome can provide the necessary basis for substantial breakthroughs toward the human virtual patient. Incentivizing these teams to explore human pathology, rather than preclinical animal models, through specific Requests for Applications (RFAs) could, in the medium term, significantly improve clinical trial outcomes. Even a small increase in clinical trial success can mean a world of difference for some patients.
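
        As a toy illustration of the virtual-patient-trial idea, the following Monte Carlo sketch simulates patients with genotype-driven drug sensitivity and compares response rates under drug and placebo (the model and all parameter values are invented for illustration; real mechanistic disease models are far more detailed):

        # Toy "virtual patient trial": each simulated patient has a genotype-driven
        # sensitivity to the drug; we estimate response rates for drug vs. placebo.
        import random

        def simulate_trial(n_patients=1000, effect=0.8, seed=42):
            rng = random.Random(seed)
            drug_responders = placebo_responders = 0
            for _ in range(n_patients):
                sensitivity = rng.gauss(1.0, 0.3)  # genotype-driven drug sensitivity
                baseline = rng.random()            # chance of spontaneous improvement
                drug_responders += (baseline + effect * max(sensitivity, 0.0) * 0.3) > 0.7
                placebo_responders += baseline > 0.7
            return drug_responders / n_patients, placebo_responders / n_patients

        if __name__ == "__main__":
            drug_rate, placebo_rate = simulate_trial()
            print(f"simulated response: drug {drug_rate:.2f} vs placebo {placebo_rate:.2f}")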

         


         

        Synergizing Omic Science with Patient Reported Outcomes

        Nominator: NINR

        Major obstacle/challenge to overcome: Current breakthroughs in omic research have increased awareness that diseases are complex disorders arising in response to the interaction among multiple genes, cellular metabolites, and environmental factors. In addition, there is growing speculation that how a person experiences illness may be genetically influenced. Hence, in this new era, scientific progress toward innovations in health will require the integration of diverse information from biologic processes, physiologic pathways, and behavioral models in order to predict and treat disease, improve survival, manage symptoms, and enhance quality of life. This symbiosis of disparate knowledge is necessary to ensure biomedical science improves health through its translation into practical clinical applications. Therefore, fostering synergy between omic science (genomics, epigenomics, transcriptomics, proteomics, metabolomics, and microbiomics) and patient reported outcomes such as symptoms (“sympt-omics”) and health-related quality of life (HRQL) may promote new pathways for reducing the burdens associated with chronic illness and enhance personalized health.

        Emerging scientific opportunity ripe for Common Fund investment: Progress in this area promises to fill gaps in broadening the links between genetic and molecular variants and patient reported outcomes in chronic illness. This opportunity will: 1) support cross-cutting research with substantial potential to create new perspectives for reducing the illness burden of chronic health conditions; and 2) encourage collaborative teams of diverse, interdisciplinary investigators to tackle the complex health and research challenges posed by chronic illness and to turn their discoveries into practical solutions for patients.
        Common Fund investment that could accelerate scientific progress in this field:

         

        • Identification of molecular biomarkers and other classifiers to assess individual susceptibilities to variations in patient reported outcomes such as symptoms and HRQL.
        • Development of novel methods, including modeling approaches and use of emerging technologies to measure and predict associations among omic variants and patient reported outcomes, such as symptoms and HRQL.
        • Design and testing of tailored interventions to improve treatment outcomes that are founded in the linkages among omic variants and patient reported outcomes, such as symptoms and HRQL.

          Potential impact of Common Fund investment: Synergy of omic profiling with practical application to patients will allow new pathways for improving treatment outcomes to emerge and ultimately reduce the chronic illness burden and enhance personalized health across the lifespan.

        Comments:
        Truly patient-centered, personalized healthcare is going to require the cross-cutting integration of a person’s most basic genomic and other data, including data that best represent the very aspects of personhood (e.g., experience, quality of life, symptoms, values). One will inform the other, and together this information will be required to tailor treatments to individuals and summarize characteristics of populations. Hence, this topic area is critical, not only to link our understanding of the biologic basis of the human experience, but also to ready us for the healthcare of the future.

        Currently, there is a series of converging themes in healthcare that differentially incorporate aspects of genomics and patient-reported outcomes, namely comparative effectiveness research, patient-centered care, personalized medicine, healthcare redesign, healthcare quality, and rapid learning healthcare (RLHC). RLHC has been described as the intersection of all of these (see IOM 2007, and J Clin Oncol. Sep 20 2010;28(27):4268-4274), and fundamentally relies on a system of linked data in which the interactions between different data sources are continuously evaluated, understood, and applied. In a rapid learning system, data generated on a daily basis through routine clinical care feed into an ever-growing, coordinated data system. Using adaptive design and other approaches, the system learns by routinely analyzing captured information, iteratively generating evidence, and constantly implementing new insights into subsequent care. Each new patient’s care is informed by the treatment, outcomes, and experiences of the large numbers of similar patients who preceded him or her, and each individual patient's care is reinvested into this overall system of data.

        Critical to this process are linked information and informatics, including the interaction between genomics and patient-reported outcomes. As a part of this proposal, consider an agenda item that specifically promotes personalized healthcare and rapid learning systems through innovation. For example: comparative effectiveness research around genomics-based tests, with patient-reported information such as quality of life serving as both predictor and outcome; multivariable models that include both genomic and patient-reported data and are tailored at the point of care to inform clinical decisions; and use of patient-reported information to triage patients who may be at greater risk and therefore need a specific genomic test.
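
        Purely as an illustrative sketch (the variable names, effect sizes, and simulated data below are hypothetical, not drawn from any cited study), the following shows one way such a multivariable point-of-care model could combine a genomic risk score with a patient-reported quality-of-life score in a single logistic regression:

            import numpy as np
            from sklearn.linear_model import LogisticRegression

            rng = np.random.default_rng(1)
            n = 5000

            # Hypothetical inputs: a polygenic risk score (PRS) and a 0-100
            # patient-reported quality-of-life (HRQL) score captured at the point of care.
            prs = rng.normal(0.0, 1.0, n)
            hrql = rng.uniform(0, 100, n)

            # Simulated outcome: the probability of a poor treatment outcome rises with
            # the risk score and falls with better reported quality of life (toy effects).
            logit = -1.0 + 0.8 * prs - 0.03 * (hrql - 50)
            poor_outcome = rng.random(n) < 1 / (1 + np.exp(-logit))

            model = LogisticRegression().fit(np.column_stack([prs, hrql]), poor_outcome)

            # Point-of-care use: predict risk for a new patient with PRS = 1.2, HRQL = 40.
            new_patient = np.array([[1.2, 40.0]])
            print("Predicted risk of poor outcome:", model.predict_proba(new_patient)[0, 1])

        In a rapid learning system, a model of this kind would be refit as new outcomes accrue, and the patient-reported score could just as easily serve as the outcome being predicted rather than as a predictor.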

         


         

        Could a sort of Google Health or other standardized site for patients to enter their symptoms somehow be included?

        I used to use Google Health religiously (I think Google is getting rid of the program) to monitor when I felt bad or when I went to the doctor, etc.

        I think we really need some sort of standardized network. I think patients are ready and willing to participate in these studies if it’s easy (like developing an app for the cell phone with reminders to enter symptoms after checkups or after surgery, etc).

        It should be standardized so that many researchers can have access to this info and help with the data mining, to enable a sort of scientific crowd sourcing.

        And rather than have patient samples that represent the U.S., how about samples that represent the world? Because I think it’s safe to assume that America is in the unique position of having the resources AND a very diverse genetic pool to pull from for these studies. Most countries will likely modify practices and therapeutics based on studies such as these. It would seem reductive and borderline unethical not to include what represents the world population as much as possible.

        And be sure to categorize people's phenotypes more so than just their race. If you do, really dig into their genetic history, rather than assuming someone's genetic history from the color of their skin or even what they culturally identify with.

         


         

        Many academic research centers and institutes possess the resources and infrastructure required to genotype and phenotype human research participants and patients. These exceptional, existing resources include gene arrays, mass spectrometers for 'omics' (peptide-protein-lipidomics, metabolomics), imagers/scanners (MRI, CT, PET, MRS), nanoparticles and physical chemists, and bioinformatics hardware/software. These existing resources should not be ignored as we (the NIH Common Fund) plan and move forward with “disease-oriented, patient-centered, personalized healthcare that provides cross-cutting integration of a person’s genotype, phenotype, behavior-type with other data.”

        Please do not re-create NIH Common Fund resources that are redundant with the exceptionally productive, existing Biotechnology Centers. This would be a wasted investment. Please reinforce and reinvest (competitive supplements) in the existing Biotechnology Centers, so they can continue to upgrade their equipment and expand personnel, synergize (electronically) with one another, and provide the technical expertise and analytical platforms required to move more rapidly toward fully integrated 'omic-based' genotype and phenotype analyses and biomedical advances that improve healthcare and its delivery.

         


        A synthetic cohort for the analysis of longitudinal effects of gene-environment interactions

        Nominator: NIA
        Participating ICs: NHLBI, NIMHD, NHGRI, NINR, OBSSR, ORD

        Major obstacle/challenge to overcome: Francis Collins has emphasized that there is a pressing need for a large-scale US prospective cohort study of genes and the environment, with a minimal sample size estimated at 500,000 (Collins, 2004; Manolio et al., 2006). However, even with a very large cohort, at least 7 to 15 years follow-up is needed to accrue enough incident cases to adequately power studies of common diseases (Burton et al, 2009).
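
        As a back-of-the-envelope illustration of why such long follow-up is needed (the incidence rate and case target below are hypothetical, chosen only to show the arithmetic), the required follow-up time can be estimated from the cohort size, the annual incidence of the disease, and the number of incident cases needed for adequate power:

            def years_of_followup(cohort_size, annual_incidence, cases_needed):
                """Approximate years of follow-up needed to accrue a target number of
                incident cases, ignoring attrition and competing risks."""
                cases_per_year = cohort_size * annual_incidence
                return cases_needed / cases_per_year

            # Example: a 500,000-person cohort, a disease with 0.2% annual incidence,
            # and a target of 10,000 incident cases for a well-powered analysis.
            print(years_of_followup(500_000, 0.002, 10_000))  # -> 10.0 years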

        Several NIH institutes support multiple high-quality cohort studies with rich biomedical, environmental, behavioral, and social data that, if strategically coordinated, could help elucidate how genes and environments interact to affect trajectories of health and disease (Willett et al., 2007). Recently, many of these cohorts have been extensively genotyped and some have undergone whole exome sequencing. Open sharing policies through dbGaP have created databases of extensive genotype data linked to phenotypes that are incompletely harmonized and whose data standards vary significantly. More complex analyses could be served by a synthetic cohort with harmonized data. The longitudinal data included in these cohorts increase the reliability of measures and afford the examination of change phenotypes, which are crucial for understanding the causal pathways leading to changes in health. An integrated, harmonized, synthetic cohort will also facilitate the selection of genetically distinctive subpopulations for specific studies of genetic and environmental influences and their interaction. It would accelerate and support new approaches to discovery such as PheWAS (Denny et al., 2010; see also phenotype mining), fine-grained admixture mapping (Shriner et al., 2011), or conditioning on known risk variants to find potential gene-gene interactions via GWAS (Wijsman et al., 2011). Harnessing the potential inherent in these existing studies will allow us to analyze the roles played by lifestyle factors, social circumstances, and environmental exposures in modulating disease risk and progression.

        We face a similar set of opportunities and challenges from the ever-growing number of patient registries, which increasingly include rich data but show little consistency to date in what data are reported or required. Investigators and patient advocacy groups alike express the need for creation of patient registries to facilitate research, but we are not yet getting the maximal benefit from the available data (Drolet & Johnson, 2008). In rare diseases especially, the ability to rapidly and accurately identify affected persons, and to know enough about genotype and phenotype to determine who might be informative for basic biology studies and who might be available for clinical trials, is critical to the efficient conduct of research studies across the biomedical spectrum. The feasibility of greatly expanding current patient registries is reinforced by the fact that patients are increasingly willing to share their data through participation in internet sites such as Patients Like Me (http://www.patientslikeme.com/), although the data collected through these sites have limited utility to researchers. Patients are also signing up to participate in research studies through sites such as ResearchMatch at Vanderbilt (https://www.researchmatch.org/). The value of existing and proposed registries is severely limited by inconsistencies in data standards, ontologies, and policies for access and sharing, and by the phenomenal inefficiencies of duplicated effort and investment. The need for such registries crosses multiple ICs.

        Emerging scientific opportunity ripe for Common Fund investment: The initiative would systematically evaluate design issues for a synthetic cohort study of genes, environment, health, and behavior, drawing from existing cohort studies that have rich longitudinal or early-life data. The large collection of longitudinal NIH-funded cohorts with valuable data on exposures throughout the life course had a profound impact on medical, behavioral, and social science even before the genomic era. Many of these cohorts have added genetic and biomarker collections and now provide unparalleled opportunities for exploiting these epidemiological findings at a genomic level. To date, however, attempts to harmonize or synthesize data collection efforts among studies have been modest, in part due to resource constraints and IC boundaries.

        Strikingly similar issues have arisen for patient registries, which have been proliferating rapidly for both rare and common diseases, with goals no longer limited to enhancing participation in clinical trials but extending to improving our understanding of patient outcomes (e.g., AHRQ’s 2010 Registries for Evaluating Patient Outcomes: A User’s Guide). The common obstacle faced by both cohort studies and patient registries is a lack of harmonization among existing efforts. Essential analyses requiring large samples are frustrated by the lack of a documented history of the creation of the registry, of common or documented consent procedures, of data standards, of common measures, and of time points for observation across the sample. Patient registries thus present an additional opportunity for trans-NIH harmonization. It should be possible to leverage the investment in the design of a virtual cohort to create a format for all NIH IC-supported registries. With a common set of data standards and policies for sharing of and access to data, all ICs could support registries within the virtual cohort system, which would create efficiencies of scale and consistency of operations to greatly enhance the impact of individual IC investments.

        With this initiative, we could also explore the possibility of collaborating with privately funded efforts (e.g., 23andMe) and cohorts funded by other government agencies. Although implementing the synthetic cohort design would require costs for harmonization and data collection, there are existing platforms for the former (e.g., P3G, www.p3g.org, resources developed by caBIG) and a growing number of freely available, high quality measures of phenotypes and exposures developed by NIH (e.g., instruments from the HRS, PROMIS, PhenX, and the NIH Toolbox). Furthermore, the HITECH and Affordable Care Acts have enhanced the prospects for the widespread incorporation of these measures in EHRs (e.g., the eMERGE network). This initiative is therefore well poised to create a time- and cost-effective harmonization plan.

        Common Fund investment that could accelerate scientific progress in this field: The recent special issue of Science (Feb 11, 2011) on large datasets emphasized that “large integrated data sets can potentially provide a much deeper understanding of both nature and society and open up many new avenues of research.” Better organization of and access to data, including the development of common metadata, is imperative to realizing emerging scientific opportunities. We propose that the Common Fund be used to invest in the three most important planning activities for a future synthetic cohort project and to explore the potential for extending this effort to cover patient registries.

         

        • Catalog and prioritize the existing NIH cohorts and form a harmonization plan for these studies.
        • Develop the analytical framework and bioinformatics infrastructure that would be required for the future synthetic cohort. This will include executing calibration studies to create crosswalks between measures (a minimal illustration of such a crosswalk follows this list), especially among longitudinal studies with rich behavioral, clinical, and biological phenotypes, and adding EHR information to the data where feasible.
        • Develop plans for data-sharing and consent policies and, crucially, detailed scenarios for the cost of synthetic cohorts of varying sizes and intensities.
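
        Purely as an illustrative sketch (the two instruments, their scales, and the calibration sample below are hypothetical), a simple crosswalk between two measures of the same construct can be built by linear equating on a sample assessed with both instruments:

            import numpy as np

            rng = np.random.default_rng(2)

            # Hypothetical calibration sample: 300 participants assessed with both
            # instrument A (roughly a 0-20 scale) and instrument B (roughly 0-60),
            # each measuring the same underlying trait with noise.
            true_trait = rng.normal(0.0, 1.0, 300)
            score_a = 10 + 3 * true_trait + rng.normal(0, 1.0, 300)
            score_b = 30 + 9 * true_trait + rng.normal(0, 3.0, 300)

            # Linear (mean/SD) equating: map instrument A scores onto instrument B's metric.
            slope = score_b.std() / score_a.std()
            intercept = score_b.mean() - slope * score_a.mean()

            def a_to_b(score):
                """Crosswalk a score from instrument A onto instrument B's scale."""
                return intercept + slope * score

            print(a_to_b(np.array([5.0, 10.0, 15.0])))  # equated scores on the B metric

        In practice, harmonization projects use richer item-level methods (for example, IRT-based linking), but the calibration-sample logic sketched here is the same.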

        Although we believe that in the long run the synthetic cohort would be cost-effective, given that ICs currently bear the full costs of the participating studies, there are uncertainties as to the actual per-participant costs of any follow-up Common Fund activity. It will therefore be crucial during the pilot to determine the longer-term, post-Common Fund maintenance costs of the synthetic cohort and to identify a credible source of funds (e.g., costs could be distributed across the ICs and agencies supporting the synthesized cohorts, buying the substantial benefits of harmonization). The target design would include a synthetic cohort of 500,000 well-phenotyped participants. We estimate the cost of this initial developmental and feasibility initiative at $2.0 million per year for two years, with funds to be divided between RMS needs and supplements to enable cohort study investigators to participate.

        Should the initial process of creating a virtual cohort prove successful, efforts could be extended to include issues related to patient registries. The initial investment by the Common Fund could create a program for investment of IC-specific resources for global benefit at marginal increases in cost. The Office of Rare Diseases has recently established a web-based Global Rare Diseases Patient Registry-Data Repository (GRDR), which could form the nucleus of the registry (see Forrest et al., 2010, and Rubinstein et al., 2010, for more information). To ensure maximal benefit, the NIH would require the following of all data deposited in a new, central Registry (a minimal sketch of how such tiered rules might be encoded follows the list):

         

        • Consent would require data sharing, with the level of sharing (controlled versus generally available) determined by the degree of PII in the data.
        • Data standards would be established by groups with appropriate expertise. Only data meeting the pre-defined standards would be accepted.
        • Access and use criteria would be established consistent with levels of PII.
        • Provision would be made for withdrawal of data from the Registry in response to participant request.
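
        Purely as an illustration of how such tiered access rules might be encoded in a registry's data model (the tier names, fields, and example identifier below are hypothetical, not a proposed NIH standard):

            from dataclasses import dataclass

            # Hypothetical access tiers keyed by the degree of personally identifiable
            # information (PII) in a deposited dataset.
            ACCESS_TIERS = {
                "no_pii": "generally available",
                "limited_pii": "controlled access (data use agreement required)",
                "identifiable": "controlled access (IRB approval and data use agreement)",
            }

            @dataclass
            class RegistryDeposit:
                dataset_id: str
                pii_level: str            # one of the ACCESS_TIERS keys
                meets_data_standards: bool
                withdrawn: bool = False   # set when a participant requests withdrawal

                def access_policy(self) -> str:
                    if self.withdrawn:
                        return "withdrawn at participant request"
                    if not self.meets_data_standards:
                        return "rejected: does not meet pre-defined data standards"
                    return ACCESS_TIERS[self.pii_level]

            print(RegistryDeposit("EXAMPLE-0001", "limited_pii", True).access_policy())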

        Potential impact of Common Fund investment: The plan for a synthetic cohort for the analysis of longitudinal effects of gene-environment interactions would establish the feasibility and cost of an intended scalable synthetic national cohort of people for discovery research in health and disease; in essence it would design an affordable but well-powered phenome-genome project as recommended at the recent NIH Innovation Brainstorm meeting. Because the founder cohorts are longitudinal, the synthetic cohort would provide rapid access to trajectory information as well as a richer characterization of the social, environmental, and genetic factors influencing health and health disparities. Leveraging such a system to harmonize new and existing registries would ensure ongoing value, and provide multiple benefits: increased sample size for study of the pathogenesis and treatment of rare diseases, the ability to study the overlap in pathogenesis of apparently unrelated diseases sharing an etiology (e.g., coronary artery disease and invasive melanoma; see Manolio, 2010), and the chance to observe g-e interactions for phenotypically specific or genetically overlapping disease states, identifying genetic and environmental “pathogens” that can be studied in a synthetic cohort. This planning activity could also inform the design of a de novo national cohort study, or determine to what degree a synthetic cohort could provide similar information while allowing a greater degree of innovation within individual studies to address questions within their specific areas of science.

        If determined to be feasible, the development of a centralized Registry, within which IC and outside organizations could deposit secure and sharable data, would allow more rapid translation of basic science information into clinically useful knowledge, greatly improved data quality, and enhanced engagement of patient advocacy groups in support of research at minimal cost. The centralized process would facilitate communication efforts to ensure broad awareness of the resource.

        References

        Burton, P. R., Hansell, A. L., Fortier, I., Manolio, T. A., Khoury, M. J., Little, J., & Elliott, P. (2009). Size matters: just how big is BIG? Quantifying realistic sample size requirements for human genome epidemiology. International Journal of Epidemiology, 38(1), 263-273.

        Collins, F. S. (2004). The case for a US prospective cohort study of genes and environment. Nature, 429, 475-477.

        Collins, F. S. & Manolio, T. A. (2007). Necessary but not sufficient. Nature, 445, 259.

        Denny, J. C., Ritchie, M. D., Basford, M. A., Pulley, J. M., Bastarache, L., and others. (2010). PheWAS: demonstrating the feasibility of a phenome-wide scan to discover gene-disease associations. Bioinformatics, 26(9), 1205-1210.

        Drolet, B. C., & Johnson, K. B. (2008). Categorizing the world of registries. Journal of Biomedical Informatics, 41, 1009-1020.

        Forrest, C. B., Bartek, R. J., Rubinstein, Y., & Groft, S. C. The case for a global rare diseases registry. The Lancet, 377, 1057-1059.

        Manolio, T. A., Bailey-Wilson, J. E., & Collins, F. S. (2006). Genes, environment and the value of prospective cohort studies. Nature Reviews Genetics, 7, 812-820.

        Manolio, T. A. (2010). Genomewide association studies and assessment of the risk of disease. NEJM, 363, 166-176.

        Science Staff. (2011). Challenges and Opportunities. Science, 331, 692-693.

        Shriner, D., Adeyemo, A., Ramos, E., Chen, G., & Rotimi, C. N. (2011). Mapping of disease-associated variants in admixed populations. Genome Biology, 12, 223-231.

        Willett, W. C., Blot, W. J., Colditz, G. A., Folsom, A. R., Henderson, B. E., & Stampfer, M. J. (2007). Not worth the wait. Nature, 445, 257-258.

        Wijsman, E. M., Pankratz, N. D., Choi, Y., Rothstein, J. H., Faber, K. M., and others. (2011). Genome-wide association of familial late-onset Alzheimer’s disease replicates BIN1 and CLU and nominates CUGBP2 in interaction with APOE. PLoS Genetics, 7(2), e1001308.

        Rubinstein, Y. R., Groft, S. C., Bartek, R., Brown, K., Christensen, and others. (2010). Creating a global rare disease patient registry linked to a rare diseases biorepository database: Rare Disease-HUB (RD-HUB). Contemporary Clinical Trials, 31(5), 394-404.

         

        Comments:
        The creation of very large, publicly available synthetic research cohorts with harmonized phenotypic data and extensive genomic data is a brilliant idea that could revolutionize the study of gene-environment interactions at relatively low marginal cost. There is sufficient uniformity in the life course across adjacent and overlapping birth cohorts that pooled analyses of rich and comparable longitudinal data across existing and new studies would accelerate the accumulation of highly reliable findings. One example of such an accelerated developmental design, from the social sciences, is a longitudinal study of youth development in Chicago in which each of several cohorts was followed long enough to overlap in age with the next older cohort. For more than 30 years, I have been principal investigator (PI) of a longitudinal study of one cohort – the Wisconsin Longitudinal Study (WLS) – which has followed some 10,000 high school graduates of 1957 from youth to their early 70s and which has now collected DNA from a large share of surviving graduates and their siblings. My colleagues and I would be more than glad to join in harmonizing our data and contributing it to this joint undertaking. Having worked over the decades with other leading social, behavioral, and biomedical researchers – many of whom already have a long history of data-sharing – I am quite confident about the willingness and ability of a large community of social, behavioral, economic, and demographic researchers to engage actively in the synthetic cohort enterprise. I also have every reason to believe that this effort will have enthusiastic support from colleagues at the National Institute on Aging and the National Institute of Child Health and Human Development.

         


         

        This is one of the most timely, crucially important, and innovative of the NIH Common Fund project proposals. Why? As almost everyone agrees, more and more evidence has shown that few diseases or conditions are caused purely by genetic factors; most are the result of interactions between genetic inheritance and environmental factors. Therefore, to expand our knowledge of how to improve the health of individuals and populations, it becomes imperative to conduct research that explores the effects of interactions among social, behavioral, and genetic factors on health. However, research on this crucially important topic is still very much underdeveloped, and identifying true gene-environment interaction effects is a tremendous challenge. This is mainly because there are many environmental factors that may potentially interact with many genetic variants; testing the effects of (many x many) possible interactions inevitably involves multiple comparisons that may increase false positive errors if the sample size is not large enough. Therefore, as the comment above (“The creation of very large…”) correctly summarized, the creation of very large, publicly available synthetic research cohorts with harmonized phenotypic data and extensive genomic data is a brilliant idea that could revolutionize the study of gene-environment interactions at relatively low cost.
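
        To make the multiple-comparison concern concrete (the counts below are illustrative assumptions, not figures from the proposal), the arithmetic of a naive genome-wide gene-environment interaction scan looks like this:

            # Illustrative numbers: 1,000,000 genotyped variants crossed with 20
            # measured environmental exposures gives 20 million interaction tests.
            n_variants = 1_000_000
            n_exposures = 20
            n_tests = n_variants * n_exposures

            alpha = 0.05
            bonferroni_threshold = alpha / n_tests

            print(f"Interaction tests: {n_tests:,}")
            print(f"Bonferroni-corrected p-value threshold: {bonferroni_threshold:.1e}")
            # -> 20,000,000 tests and a threshold of 2.5e-09, which is why very large
            #    harmonized samples are needed to detect modest interaction effects.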

        I have two suggestions for NIH and the international community to consider within the framework of this NIH Common Fund project proposal. First, I suggest that NIH consider taking an international leadership initiative to strengthen the analysis of longitudinal effects of gene-environment interactions on human health, including both the U.S. population and other populations with similar longitudinal data resources and urgent needs, such as rapid and large-scale population aging. This may substantially strengthen the statistical power of the harmonized phenotypic and genomic database for human health and could be done through intergovernmental collaborations and co-funding mechanisms.

        Second, given the fruitful outcomes of the Human Genome Project and the International HapMap Project, NIH and the international community (including relevant agencies of the interested countries) may consider launching a new International CentMap project with cost- and database-sharing. If it works, the International CentMap project could create a GWAS and/or exome-sequencing genomic and phenotypic database for at least several thousand or more centenarians from different countries and ethnic groups. As we may all agree, focusing on phenotype extremes is often a good approach to gain research leverage at reasonable expense. Currently, researchers either compare genotypic/phenotypic data between patients with a specific disease or disorder (extremely poor health) and normal people as controls, or compare genotypic/phenotypic data between a small sample of centenarians (extremely good health) and normal people as controls, and it of course makes good sense to do so. Logically, however, greater insight may be gained by comparisons among centenarians, who likely carry positive genes and/or live with good behavior/environment; patients with diseases or disorders, who likely carry negative genes and/or live with poor behavior/environment; and normal people as controls. Such triple comparisons may at least reconfirm, and even enhance, the findings of dual comparisons such as patients-controls and centenarians-controls. Unfortunately, genotypic data from large samples of centenarians are not yet available worldwide, while such databases for patients with various diseases and controls already exist widely. Clearly, the analysis of longitudinal effects of gene-environment interactions on human health would be significantly enhanced if an International CentMap database with large enough samples of centenarians became available through the efforts initiated by this proposed NIH Common Fund project. In sum, the scientific rationale for the International CentMap is to strengthen research on the effects of genetic and behavioral/environmental factors, and their interactions, on disease prevention and health promotion, rather than on longevity only.

        Finally, I wish to briefly introduce the Chinese Longitudinal Healthy Longevity Survey (CLHLS), jointly funded by NIA/NIH, the National Natural Science Foundation of China and other Chinese sources, and UNFPA. CLHLS conducted nearly 80,000 interviews nationwide in 1998, 2000, 2002, 2005, and 2008-2009. Among them, 14,376 interviews were with centenarians, 18,938 with nonagenarians, 20,823 with octogenarians, 14,285 with young-old aged 65-79, and 10,962 with adults aged 35-64. In the CLHLS, survivors were followed up and deceased respondents were replaced with new participants. Data on mortality and health status before death for 17,649 elders aged 65-110 who died between waves were collected in interviews with close family members of the deceased. Dried blood-spot samples were collected from 4,116 oldest-old aged 80-110 in the CLHLS 1998 baseline survey. We also collected 14,000 saliva DNA samples using Canadian-made Oragene kits and 1,945 full blood samples in our 2008-09 wave. In total, we have DNA samples from 18,093 individuals, including about 4,000 centenarians, 4,300 nonagenarians, 4,000 octogenarians, 3,250 elders aged 65-79, and 2,520 middle-age controls. However, these DNA samples have not yet been genotyped and analyzed, except for a pilot research project that genotyped SNPs of the FOXO and ADRB2 longevity candidate genes in a very small portion of DNA from about 1,000 oldest-old aged 90+. We are trying our best to apply for the funds needed for GWAS/genotyping and for analyses of genetic-behavioral-environmental interactions. Similar to the comment above on this NIH Common Fund proposal, as PI of the CLHLS and on behalf of my colleagues at Duke University and Peking University, I wish to express that we will be more than happy to join in harmonizing our CLHLS data and contributing to the international joint undertaking of analyzing the longitudinal effects of gene-environment interactions for improving human health and longevity.

        The advantages of a large synthetic panel based on interviews, genetic information, and physical measures from the existing longitudinal aging surveys have been well expressed in other submissions to this call: immediate availability of results, cost minimization through utilization of existing survey data and infrastructures, availability of a huge range of phenotypical information across medical, social, and economic domains, exploitation of the possibilities offered by advances in genetic measurement (1 million+ SNPs), etc. There is no doubt that the creation of such a cohort would provide a new and very rich vein of scientific material and also lead to improvements in how future aging studies are conducted and analyzed and in how the biological and social sciences can be optimally merged to give new insights into the human aging process. We fully share these views and will not re-iterate them in depth.

         

        We would like to focus specifically on the advantages of including a range of international surveys from the HRS “family” in this endeavor. This would have a number of specific advantages:

         

        1. The addition of an increased number of cross-country phenotypes and genotypes.
        2. The addition of variation in healthcare systems and policies (e.g. U.S. vs Europe).
        3. The fact that some of the studies outside the U.S. include innovative measures and tests over and above those available in the U.S. studies, such as the collection of venous blood in ELSA (England) and TILDA (Ireland), so widening the range and quality of possible investigations.
        4. The international studies include a number of the tests which appear on the PhenX list of desirable phenotypical measurements such as the retinal photo, detailed cardio-vascular measures and cognitive tests (trail tracing, etc) in TILDA.
        5. In addition, the international studies include some unique new measures such as the life history data in ELSA which allow the analysis of phenotypical information (collected on a retrospective basis) over a period extending to almost a century. Another example is the detailed physical measures of gait and balance included in TILDA.

         

        The international studies will thus provide new possibilities for the analysis of gene-environment interactions.

         

        Most of the international studies range in size from 5,000 to 20,000. For the purposes of full genetic analysis larger samples would be required (indeed this is the purpose of the synthetic cohort). However, analysis of the existing surveys would provide very useful pilot and preliminary data and would suggest where future large-scale data collection and analysis could be focused.

         


         

        The proposal appears to be one that could prove immensely valuable for the scientific community. It has the potential to transform research efforts for many diseases. The case is presented from a medical perspective, and further development of the environmental side, including other gene-environment interactions with other phenotypic outcomes, would enhance its potential. Well-being, personality, wealth, partnership behaviors, employment characteristics, and mental health could be included. Perhaps the synthetic approach is less able to capture systematically the known social and economic determinants of ill health, especially inequalities. If so, the pilot stage should explicitly address this.

        The methodology developed for creating such synthetic cohorts for gene-environment interactions has the potential for wider applicability. Perhaps international collaboration with long running UK cohorts (1946, 1958, 1970, MCS, ELSA, and ALSPAC) and other population studies with rich phenotypic data funded by economic and social or medical funders in the UK might further enhance the proposal.

         


         

        This is an initiative that deserves the highest priority. In a globalised world where national boundaries are increasingly becoming porous, international comparisons to understand the biological and environmental determinants of complex disorders are critical. The need to create synthetic cohorts that collect information on exposures over the life course and have been carefully characterized with regard to health outcomes using harmonized measurement strategies cannot be overemphasized. In order to disentangle the puzzle of complex disorders it would be extremely valuable to put together such synthetic cohorts from a range of environments and genetic backgrounds. The World Health Organization's (WHO’s) Study on Global Ageing and Adult Health (SAGE) has created a cohort of ~90,000 individuals, with the support of the National Institute on Aging, that will be followed over the next decade to begin with. Data are made publicly available with accompanying metadata and efforts at harmonization with other studies such as the HRS and ELSA. Dried Blood Spots have been collected and are being analyzed for a range of biomarkers. Future rounds plan to collect saliva samples for DNA and add other phenotypic measures. WHO is uniquely positioned to strengthen collaboration especially with regard to public health genomics and I strongly support this effort.

         


         

        This is an outstanding idea. The National Longitudinal Surveys of Labor Market Experience (NLS), sponsored by the US Department of Labor, has collected over 30 years of detailed, nationally representative data on the lives of 10,000 people born 1957-64, many of whom have siblings in the study, as well as a similar number of children of the female respondents – already a two-generation study. The greatest part of the cost of this project was paid for by the Department of Labor, although NICHD has sponsored the second-generation study – the Children of the NLSY. Together, these agencies have supported high-frequency, prospective data collection of extensive environmental, socio-economic, cognitive, and health data. We are already in a position to know, in incredible detail, the phenotypic characteristics of extended families. There is clear potential to re-purpose these data, with relatively modest investments, into a robust resource that is comparable to what other investigators have collected. At a time when we face serious budget constraints, the suggested approach will most likely be a very cost-effective way to help transform our research tool set without having to wait ten or more years to generate the necessary data ab initio.

         


         

        The timing and opportunity for such a cohort could not be better. The idea is not only academically and scientifically sound but sensible (which policy should be)! The challenge will be to build on harmonization efforts already underway so as to profit from and build on the energies and ideas already in the field. To that end, NIH should call a summit of the organizations mentioned in the text and catalyze a concerted move forward towards the systematic building of this virtual cohort.

         


         

        Building a synthetic cohort or cohorts is an important idea and a timely opportunity. Harmonizing longitudinal developmental phenotypes is a key advance, because research based on retrospectively recalled phenotypes or cross-sectional phenotypes is known to be inherently problematic for genomic discovery. The problems are exacerbated when retrospective or cross-sectional data are used to index environmental exposures for gene-environment interaction (GxE) studies. Mental health and neurocognitive phenotypes should be included to maximize value for money.

         


         

        This is a very valuable and timely proposal that will advance science in significant ways. At University College London we curate longitudinal cohort studies such as the English Longitudinal Study of Ageing (ELSA) and the Whitehall II study that are partly supported by NIH. These studies have rich phenotypic data stretching back many years, and participants are monitored for future morbidity and mortality. The studies include not only biomedical and genetic data, but extensive psychosocial, cognitive, behavioral, and economic information, making them very suitable for interdisciplinary research. These resources are very valuable, but are insufficient in size on their own for detailed analysis of gene-environment interactions. So the creation of a synthetic cohort would have significant benefits, allow more sophisticated analyses, and potentially lead to discoveries that will benefit human health and well-being. Harmonization of phenotypic information is already underway between UK cohorts and studies such as the Health and Retirement Study, and these efforts are supported by NIA. Such methods could be extended in the future to incorporate other cohorts and different types of data. This proposal therefore comes at the right time in the evolution of these population-level resources and would be hugely beneficial to scientific progress.

         


         

        This proposal is certainly timely and of very high priority. Gene–environment interactions will likely be the next scientific frontier, since genetic heterogeneities for most diseases of complex origin explain very little of the disease risk, and since exposures to individual environmental chemicals mostly show only weak associations, especially if the exposure has not been determined at the most vulnerable ages. Birth cohorts may provide crucial information for such research and meta-analyses, and they may provide banked samples from which missing data can be determined. The idea of a large synthetic cohort has also been launched in the European Union (EU), where some collaboration already exists between birth cohorts, with support from the European Commission. International collaboration is highly warranted and should include birth cohorts with increased exposures to important environmental chemicals in order to increase the statistical power and limit the impact of unavoidable imprecision in the exposure assessment.

         


         

        Scientific research depends on high-quality data. Generating high-quality data often requires substantial public investment, and the proposed idea of creating synthetic cohorts with both phenotypic and genotypic data presents great promise for providing the kind of data we need to advance the science in the most cost-efficient way. In doing so, a rich set of phenotypic data should be considered: the critical importance of socioeconomic determinants of health has been well recognized, so it is important to include not only harmonized phenotypic data on health and health behavior but also data on economic and social resources and behaviors.

         


         

        Creating a synthetic cohort of family-based data with both phenotype and genotype information would offer another opportunity. The National Longitudinal Survey of Youth (NLSY), for example, has collected family-based data sets, including nationally representative women of selected birth cohorts and their children. Such population-based family data sets with phenotypic and genotypic information will present an opportunity to further investigate the intergenerational transmission of genetic and socioeconomic resources. The NLSY has taken the first step of collecting saliva samples to build such family-based data, which have much to add to a synthetic cohort of unrelated individuals.

         


         

        Initiatives such as the one proposed here are crucial to enable further scientific developments. However, the construction of synthesized datasets providing high-quality information is extremely challenging and demands methodological rigor and adequate resources. To enhance the potential for pooling, it is obvious that design and information collected by studies should be ‘prospectively harmonized’. Emerging studies would, in that context, make use of common questionnaires and standard operating procedures. However, it is also important to foster, where possible, synthesis of data already collected (‘retrospective harmonization’). Naturally, the quantity and quality of information that can be pooled will then be limited by the heterogeneity intrinsic to the pre-existing differences in study design and conduct, but it could increase the utility of existing studies. Both prospective and retrospective approaches contribute to the harmonization agenda and each comes with its own specific strengths and weaknesses. However, regardless of the approach used, a series of factors must be carefully considered whenever one aims to harmonize and synthesize data. These factors include: 1) the design of studies and targeted populations must be compatible and properly documented in order to assess eventual bias in the final results generated; 2) the choice of variables to be harmonized must be scientifically valid and the longitudinal structure of data collection must be taken into full account in order to properly explore causal effects; 3) the targeted information collected or generated by the participant studies must be inferentially equivalent; and 4) adequate agreement on data access, intellectual property and other ethico-legal issues must be taken into account.

         


         

        The leadership team of the Panel Study of Income Dynamics (PSID) is strongly supportive of the proposal to create a synthetic cohort for the analysis of the longitudinal effects of gene-environment interactions and, if the proposal moved forward, would be keen to have PSID be part of the synthetic cohort. The proposal would complement other studies that are planned or underway (such as collecting and processing DNA from PSID sample members and the new consortium to support social science genome-wide association studies) and would especially help to expand ties between geneticists and researchers in the social and behavioral sciences.

         

        Begun in 1968, PSID is a longitudinal study of a representative sample of U.S. individuals and the family units in which they reside. It emphasizes the dynamic aspects of economic and demographic behavior, but its content is broad, including sociological, psychological, and health measures. PSID is the longest-running national panel (i.e., a survey that repeatedly assesses the same individuals over time) on individual and family dynamics and has consistently achieved extraordinarily high re-interview response rates of 96–98%. PSID began with a nationally representative sample of approximately 5,000 families. Tracking rules call for following the original-sample members’ children, who are themselves classified as sample members and are eligible for tracking as separate family units when they set up their own economically independent households. PSID interviewed individuals from these families every year from 1968 to 1996 and biennially thereafter—whether or not they were living together in the same dwelling. In 2007, the PSID sample comprised 8,289 family units with 12,850 respondents and their spouses/partners and a total of 22,580 individual family members (including respondents and spouses/partners). PSID follows sample members over the entire lifecourse and provides prospective panel data for conducting longitudinal analyses of individuals and families, including intergenerational and intragenerational analyses.

         

        Pooling genetic data from existing panel studies that have multiple survey waves and family-based designs will allow researchers to deploy sophisticated statistical models in the service of understanding how the social environment and the distribution of alleles in the U.S. population interact with each other. In other words, only such a study would provide researchers with the mechanisms—and sufficient statistical power—to attain both the genetic and environmental identification that is necessary for advancing our understanding of cause and effect with respect to how the genetic and social worlds intersect.

         


         


        Venture Fund for Research and Development of New Medications to Treat Chronic Pain (see “NIH Award Strategies” in Innovation Brainstorm ideas)

        Nominator: NIDA
        Participating IC: NCI

        Major obstacle/challenge to overcome: Chronic pain, which affects 116 million Americans and is a significant public health burden, is not adequately managed by current therapies. Although opiates are the most commonly prescribed medications for chronic pain conditions (e.g., cancer pain), their use poses important clinical risks such as abuse liability, diversion, and overdose. Other types of chronic pain (e.g., neuropathic pain caused by diabetes) are not well managed by either opiates or other approved agents (e.g., antidepressants). Currently, significant sums are being invested in testing medications to treat chronic pain, but these studies have not yielded significant progress in the treatment of this condition. There is an urgent need for research that helps us understand the neurobiological mechanisms of chronic pain, which in turn will help to identify new targets and thus new compounds to treat this condition.

        Unfortunately, it has been challenging to develop collaborations, and even more so to share resources, among industry, academia, and government investigators to advance the study of chronic pain. A concerted and synergistic approach among these three groups would greatly advance the understanding and management of chronic pain. It is expected that the development of a venture fund for research and development will facilitate collaboration among industry, academia, and government, which will result in the discovery of new targets and the development of new medications to treat this condition.

        The purpose of this program is to support eligible institutions that enter into a joint venture or collaboration with other entities that concomitantly provide support, in the form of funds or resources, to conduct research advancing the development of medications to treat chronic pain.
        Research may focus on the discovery of new potential therapeutic targets, new molecules with action on those targets, as well as Phase I safety/tolerability studies, single or multisite Phase II or III studies, or translational projects.

        Emerging scientific opportunity ripe for Common Fund investment: Currently, there are multiple individual efforts from industry, academia, and government to advance knowledge of the mechanisms of pain as well as the discovery and development of new pharmacotherapies; however, most of those efforts are not coming to fruition because of the lack of a coordinated and synergistic approach. This initiative is very timely because it aims to channel all those efforts and make them more synergistic in achieving the ultimate goal of safer and more effective medications to treat chronic pain.

        Common Fund investment that could accelerate scientific progress in this field: It is expected that the identification and validation of novel targets associated with chronic pain can lead to novel and effective therapies. A pilot phase is proposed that if successful would go on to a therapeutics development phase to be done in collaboration with private sector partners.

        Pilot Phase:
        Identification and validation of new therapeutic targets

        Therapeutics Development Phase:
        Identification of bioactive compounds for the new targets identified during the pilot phase
        Pre-clinical studies
        Early stage clinical trials

        Potential impact of Common Fund investment: The ability to effectively treat chronic pain conditions would affect more than 116 million Americans. In addition, identifying and validating chronic pain targets may also lead to diagnostic tests that could prevent or delay the onset of chronic pain conditions.

        Comments:
        Because large pharma companies are generally reducing investment in exploratory studies in the pain area exactly at the time that research in this area is beginning to take off in terms of potential target identification, I think this would be a great area for the NIH to direct resources to. There is great unmet clinical need, and some new means is required to produce new drugs ready for Phase 1/2 proof-of-principle testing. A key component of target selection and validation is that this needs to be, wherever possible, human-data driven, using human genetics and human neurons (from stem cell technology) for assays and screens, as well as whole-genome unbiased screens in model organisms. Since the nature of this early drug development work is so different from standard academic study, specific centers will be required where academics and professional drug developers can work together in an academic-industrial hybrid translational setting, bringing together the best practices from both: innovation and risk-taking from academia, and high-technology, parallel project management from industry.

         


         

        The bandwidth of target validation and drug validation in humans is truly an area of high need for investigation in the chronic pain field. However, to get traction in this workspace, cross-institutional effort is likely required, and it may be useful to survey institutional interest in forming therapy discovery and development partnerships to address chronic pain.

        Regarding the proposal put forward for comment:
        1. The currently proposed definition of a pilot phase as focused on “validation” of single targets, followed by and distinct from other activities, should be rethought. Validation activities typically include capacities currently viewed as belonging to the therapy development phase. Emphasis might also be given from the start to developing a portfolio of projects progressing through a validation path; this diversification of approach would also appeal to funders.
        2. The de-risking of targets, mechanisms and pathways has moved into pre-competitive space and may appropriately be viewed as a mandate of public, translational funding up to IND stage. The NIH is also well versed in moving programs as far as preclinical and IND stage. Presumably, Common Fund dollars can apply to a variety of mechanisms including cooperative agreements and SBIRs.

        3. The Venture Fund idea may be most attractive at the clinical stage, and presents the opportunity to leverage and aggregate other clinical investments (CTSA, ACTION) as drivers for a partnering fund.

         


         

        This is an important and growing health problem, and one that is difficult to approach at all levels. Identification of new therapeutic targets and bioactive compounds is needed, as are more pre- and early stage clinical trials. Research is also needed to track the clinical outcomes of current treatments and to establish a network of investigators who would be ready to respond quickly to any new trials that emerge from the early phase investigations. This might be set up as a network of individual investigators, a number of clinical research facilities dedicated to chronic pain management, or perhaps both. The SEER (Surveillance, Epidemiology and End Results) program for cancer might be a model for how to collect and integrate data from multiple sites.

        This idea comes at a critical crossroads in pain research. A few notable failures to translate findings from the lab to humans have resulted, along with the current economic environment, in a drastic decrease in R&D for new pain therapeutics done internally in big pharma. At the same time, an immense wealth of new information on mechanisms of pain and pain sensitization has been generated in recent years. The lack of industry funding for taking these discoveries forward represents a major hurdle to the development of new therapeutics.

        The vacuum created by industry leaving this domain could be effectively addressed by this NIH initiative. I think this is an excellent place for the NIH to invest. In particular, I would like to point out that target selection, validation, and trial design can be done in a strictly science-driven way in the context of academia, rather than being driven in large part by business models. Indeed there are many excellent potential targets for therapy development that have already been identified, and significant resources are needed to move them forward. Thus, there is great opportunity here to move viable concepts further along the de-risking path to therapy development. This may be just the paradigm shift needed to really help bring novel therapies to the patients who so desperately need them.

        Finally, I would like to add that money from the NIH, while desperately needed in the pain field (which is grossly under-funded relative to the immense disease burden on society), cannot solve this problem alone. New ways of thinking and real partnerships between academic labs, NIH core resources, and industry partners are needed. The NIH could really help facilitate this by looking at successful public/private partnerships and finding ways to replicate these models for other projects.

        This idea certainly identifies an area of critical unmet clinical need that is ripe for transformative research. The only suggestion that I would make is that the fund should not be focused solely on “new” targets. There are a number of promising targets that have been identified and which are supported by pre-clinical research. However, further development of drugs focused on these targets has been slowed or halted because of lack of funding for the clinical research necessary for validation in humans. Support for that form of translational research should be included within the fund's goals.
