22 July 1996
Few scientific disciplines have the broad appeal and the truly interdisciplinary nature of nuclear astrophysics. Laboratory data and nuclear theory, our best understanding of the microscopic world, are applied to the evolution of matter under nature's most extreme conditions of temperature, density, and energy. These conditions are themselves usually invisible and must be inferred from a combination of theory and astronomical observation. In recent years, there has been explosive growth in the astronomical data available, owing to increases in computer power, in the number of large earth-based telescopes, and in the number of telescopes in space. Laboratory nuclear astrophysics has also continued to develop, though at a slower pace. Even in these troubled times, the field of nuclear astrophysics, at least as measured by publications and people involved, has been growing. This growth reflects the interesting, fundamental problems the field addresses. Where and how has each of the elements (and their isotopes) been assembled? What came out of the Big Bang? How do stars, including our own sun, live and die? How do novae and supernovae work? How might things differ in other galaxies and, in our own galaxy, at other times? Why, as a consequence of physical law, is our universe the way it is, and ultimately, why are we ourselves the way we are?
Research in nuclear astrophysics has also been a prototype for fruitful collaboration between government laboratories and universities. Not only do the government labs provide much of the nuclear data, facilities, and expertise needed by university researchers, but the numerical techniques developed at these labs for radiation transport and hydrodynamics also find broad application in astrophysics.
Yet despite its overall appeal and general good health, there is a developing crisis in nuclear astrophysics - a crisis brought on, in part, by its very success and growth. A breakdown in communication is occurring. Nuclear data, won in the laboratory at great effort and cost, are not always making their way in a timely fashion into the leading astrophysical models. Those data that do make it are often fragmentary and sometimes biased by personal selection or individual experiences (e.g., who happened to be at what meetings). Similarly, the needs of the users of nuclear data are not always being communicated with sufficient clarity and prioritization to those who are in a position to make the measurements.
There are several reasons why communication between the lab and the model builder has deteriorated. One is the sheer volume of nuclear data the models now require. The quest for physical detail that modern computers now make feasible leads to the need for literally tens of thousands of nuclear reaction rates, binding energies, and lifetimes. No one person can keep track of all this information, let alone continually assess its reliability. Another problem is the demise of the standardized rate compilations that were produced for over 30 years at the Kellogg Laboratory at Caltech, i.e., the work of Professor Willy Fowler and his colleagues. While these rate compilations may have had some shortcomings, e.g., in terms of documentation and justification, they provided a generation of stellar model builders with a standardized set of nuclear reaction rates that made intercomparison of their models both easier and more meaningful. It is the nature of laboratory measurement that often only a part of what the astrophysicist needs is determined - the rate in a limited temperature range, for example, or a resonance energy with uncertain partial widths. Astrophysicists are reluctant to put such fragmentary information into their codes without guidance from a respected, experienced evaluator, particularly when it can have a big effect on the results. Another problem is that the rate information, more often than not, is published in journals or workshop proceedings that the astrophysicist does not frequently read. There are also "private collections" of results, e.g., Hauser-Feshbach calculations, that the collectors have been unwilling to share because the work involved in making the transfer was too great. Finally, there are sometimes conflicting measurements of the same quantity, which the astrophysicist is in no position to resolve.
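To see why a fragment such as a resonance energy with uncertain partial widths demands expert evaluation, consider the standard narrow-resonance approximation for a thermonuclear rate (a textbook result, not tied to any particular compilation):

    N_A \langle \sigma v \rangle = N_A \left( \frac{2\pi}{\mu k T} \right)^{3/2} \hbar^2 \, (\omega\gamma) \, e^{-E_r/kT},
    \qquad \omega\gamma = \frac{2J+1}{(2J_1+1)(2J_2+1)} \, \frac{\Gamma_a \Gamma_b}{\Gamma},

where \mu is the reduced mass, E_r the resonance energy, and \Gamma_a, \Gamma_b, and \Gamma the entrance, exit, and total widths. A measured E_r fixes the exponential temperature dependence, but until the partial widths are pinned down, the strength \omega\gamma - and with it the absolute rate at every temperature - remains undetermined. Supplying defensible values for such missing pieces is precisely the evaluator's judgment the astrophysicist is asking for.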
Traditionally, the work of compiling and evaluating nuclear data for astrophysics has been done by members of the nuclear astrophysics community who recognized the importance of this activity and have willingly volunteered time from their own research to serve the larger community. This important volunteer work now needs to be enhanced by a more systematic and coordinated approach. We believe that two steps are necessary in order to meet these pressing needs. The first step is the creation of a Steering Committee for Nuclear Astrophysics Data with a rotating membership of 6 to 12 nuclear physicists and astrophysicists having the joint expertise necessary to provide direction and coordination for the overall effort, to set priorities within such a program, and to establish the procedures and guidelines for the evaluations. The second step is the establishment of a central archive for all nuclear data that have direct astrophysical application. (This Center will not duplicate the function of the NNDC, the National Nuclear Data Center, but will supplement it with information of particular importance and usefulness to the nuclear astrophysics community.) One of the first actions of such a Center would be to collect all the useful data that currently exist and to store them in an easily and universally accessible location, presumably on an electronic network. However, this must not be just a place where old data are stored. It must be a living archive, one that is continually updated and cast in a form suitable to its users. Most importantly, it must be a "refereed archive", with oversight exercised by the Steering Committee.
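To make the notion of a refereed archive concrete, each entry might carry a rate together with its full provenance. The following sketch (written in Python purely for readability; every field name and value is a hypothetical placeholder, not a proposed standard) indicates the kind of record we have in mind:

    # Purely illustrative sketch of one archive entry.  All field names
    # and values are hypothetical placeholders, not a proposed standard.
    entry = {
        "reaction":  "c12(a,g)o16",                        # which rate this is
        "fit":       [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],  # placeholder fit coefficients
        "valid_T9":  (0.01, 10.0),                         # temperature range of validity
        "source":    "journal reference",                  # where the measurement appeared
        "evaluator": "name, date",                         # who approved the entry, and when
        "status":    "recommended",                        # vs. "provisional" or "superseded"
    }

With provenance of this kind attached to every rate, the Steering Committee's oversight is visible to each user, who can see at a glance whether a value is the recommended one and who stands behind it.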
A considerable amount of data presently exists that is not in a usable format (e.g., resonance properties that have not been converted into reaction rates) or that is usable over only a limited range of conditions. Usually such data are simply ignored. One function of the Center and the Steering Committee would be to ensure that the set of standard/"best" rates is given in a standardized form, one best suited for use in the astrophysicists' computer calculations and numerically stable at all reasonable temperatures. In some cases where there are conflicting or preliminary measurements, the Steering Committee will need to make the difficult choices and to provide the necessary caveats to the user so that realistic uncertainties can be assigned.
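As one concrete possibility for such a standardized, numerically stable form, each rate could be stored as coefficients of a fixed analytic function of temperature. The sketch below (again Python, for readability) evaluates the seven-parameter form used in REACLIB-style compilations; the coefficient values themselves would come from the evaluated archive, and nothing here is a proposed official format:

    # A minimal sketch: evaluate a reaction rate from one set of seven
    # fit coefficients a[0..6] of the kind used in REACLIB-style
    # compilations.  T9 is the temperature in units of 10^9 K.
    import math

    def rate(T9, a):
        return math.exp(a[0]
                        + a[1] / T9
                        + a[2] * T9 ** (-1.0 / 3.0)
                        + a[3] * T9 ** (1.0 / 3.0)
                        + a[4] * T9
                        + a[5] * T9 ** (5.0 / 3.0)
                        + a[6] * math.log(T9))

Because the form is smooth and analytic, it can be evaluated stably at any temperature within a fit's stated range of validity, and a network code needs only a single table of coefficients rather than thousands of ad hoc expressions.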
At least initially, it is important that 1 FTE (e.g., two half-time persons) at the post-doc level or higher be committed specifically to this project, to set up this Center. It would be important that these individuals (Nuclear Astrophysics Data Coordinators) be trained as nuclear astrophysicists (a PhD at minimum) and be familiar with the community of users. We believe that it is important that each Coordinator have a split appointment in which there is also ample opportunity for doing research in fields closely related to nuclear astrophysics. The Coordinators should also take a proactive role in obtaining new nuclear data of astrophysical value, as well as archiving the old and seeing that the data are in an optimal format (or formats) for use on the computer. Where appropriate, the Coordinators should call attention to the need for further measurements. The Coordinators' duties should include some evaluation work, though this would probably not be their primary responsibility. The Coordinators will serve as a liaison between the scientists doing evaluations (at many universities and laboratories), the astrophysicists using the evaluated rates, and the data group associated with the host institution. The Coordinators and the Steering Committee will also be responsible for coordinating this Project with other programs (such as EuroNet) in order to avoid unnecessary duplication.
Because of the DOE's experience in overseeing the archiving of nuclear data, we recommend that it set up a modest program (in the range from $100k to $150k per year, with the larger sum including some hardware costs appropriate for the initial set-up year) associated with the appointment of the Coordinators for the archiving, evaluation, and dissemination of nuclear astrophysical data. The infrastructure costs associated with the Coordinators' activities and the research half of the Coordinators' salary costs should be borne by the host institution. To the extent that the DOE remains the chief source of funding for the program, the Steering Committee may also be subject to DOE confirmation. Coordinators should be solicited through open recruitment and be selected jointly by the research group supporting their research and by the Steering Committee, with important input from the Data Group associated with the host institution.
The host institution should be determined by competitive proposals. The institution selected should be able to provide accommodation and resources for archiving, and preferably would also be able to commit some additional personnel on a part-time basis to provide professional advice to the Center and to help in some of the actual evaluation work. The Steering Committee would have the responsibility for regularly reviewing the work of the Coordinators and the Center.
Finally, while the archive should exist at a single site, it is expected that there will be electronic links to other data sets worldwide. It is important to recognize that this Project should be international in scope, involving scientists and evaluators in a wide variety of countries. Access to this data archive should be free and unrestricted - available to the widest feasible spectrum of users worldwide.