Differences between a Finding, a Conclusion, and a Recommendation: Examples
Table of Contents
- Defining the Terms: What Is a Finding, a Conclusion, and a Recommendation in M&E?
- Why It Matters: Understanding the Importance of Differentiating between Findings, Conclusions, and Recommendations in M&E
- How to Identify and Distinguish between Findings, Conclusions, and Recommendations in M&E
- How to Communicate Findings, Conclusions, and Recommendations Effectively in M&E Reports
- The Benefits of Clear and Accurate Reporting of Findings, Conclusions, and Recommendations in M&E
1. Defining the Terms: What Is a Finding, a Conclusion, and a Recommendation in M&E?
Monitoring and Evaluation (M&E) is a critical process for assessing the effectiveness of development programs and policies. During the M&E process, evaluators collect and analyze data to draw conclusions and make recommendations for program improvement. In M&E, it is essential to differentiate between findings, conclusions, and recommendations to ensure that the evaluation report accurately reflects the program’s strengths, weaknesses, and potential areas for improvement.
In an evaluation report, a finding, a conclusion, and a recommendation serve different purposes and convey different information. Here are the differences between these three elements:
1.1 What Is a Finding?
A finding is a factual statement based on evidence collected during the evaluation. It describes what was observed, heard, or experienced during the evaluation process, and it should be objective, unbiased, and supported by data. Findings are typically presented as a summary or a list of key points, and they serve as the basis for the evaluation’s conclusions and recommendations. By presenting clear and accurate findings, evaluators help stakeholders understand the strengths and weaknesses of the program or initiative being evaluated and identify opportunities for improvement.
1.2 Examples of Findings
Here are some examples of findings in M&E:
- “Program participants reported a high level of satisfaction with the quality of training provided, with 85% rating it as good or excellent.”
- “The program was successful in increasing the number of girls enrolled in secondary school, with a 25% increase observed in the target communities.”
- “Program beneficiaries reported improved access to healthcare services, with a 40% increase in the number of individuals accessing healthcare facilities in the program area.”
- “The program’s training curriculum was found to be outdated and ineffective, with only 30% of participants reporting that the training was useful.”
- “The program’s monitoring and evaluation system was found to be inadequate, with data quality issues and insufficient capacity among staff to carry out effective monitoring and evaluation activities.”
These findings represent objective, measurable results of the data collected during the M&E process, and can be used to inform program design and implementation, as well as to draw conclusions and make recommendations for improvement.
1.3 What Is a Conclusion?
A conclusion is a judgment or interpretation of the findings based on the evidence collected during the evaluation. It is typically expressed in terms of what the findings mean or what can be inferred from them. Conclusions should be logical, evidence-based, and free from personal bias or opinion.
Conclusions often answer the evaluation questions or objectives, and they provide insights into the effectiveness or impact of the program, project, or intervention being evaluated. By synthesizing the findings into a cohesive narrative, evaluators can provide stakeholders with a clear and actionable understanding of the program or initiative being evaluated. Conclusions can also inform future planning and decision-making, by identifying areas for improvement and highlighting successful strategies or interventions. Overall, conclusions are a crucial component of the evaluation process, as they help stakeholders make informed decisions about the programs and initiatives they are involved in.
1.4 Examples of Conclusions
Here are some examples of conclusions in M&E:
- “Based on the data collected, it can be concluded that the program was successful in achieving its objective of increasing access to clean water in the target communities.”
- “The data indicates that the program’s training curriculum is ineffective and in need of revision in order to better meet the needs of participants.”
- “It can be concluded that the program’s community mobilization efforts were successful in increasing community participation and ownership of the program.”
- “Based on the data collected, it is concluded that the program’s impact on improving maternal and child health outcomes is limited and further efforts are needed to address the underlying health system and infrastructure issues.”
- “The data collected indicates that the program’s impact on reducing poverty in the target area is modest, but still significant, and further investment in complementary programs may be needed to achieve more substantial reductions in poverty rates.”
These conclusions are based on the evidence presented in the findings and represent the interpretation or explanation of the meaning of the findings. They help to provide insight into the impact and effectiveness of the program and can be used to make recommendations for improvement.
1.5 What Is a Recommendation?
A recommendation is a specific action or set of actions proposed based on the findings and conclusions of the evaluation. Recommendations should be practical, feasible, and tailored to the needs of the stakeholders who will implement them. They should be supported by evidence and aligned with the goals of the program, project, or intervention being evaluated.
Recommendations often provide guidance on how to improve the effectiveness or efficiency of the program, project, or intervention, and they can help to inform decision-making and resource allocation. By presenting clear and actionable recommendations, evaluators can help stakeholders identify and prioritize areas for improvement, and develop strategies to address identified issues. Recommendations can also serve as a roadmap for future planning and implementation and can help to ensure that the program or initiative continues to achieve its intended outcomes over time.
Overall, recommendations are an essential component of the evaluation process, as they help to bridge the gap between evaluation findings and programmatic action. By proposing specific and evidence-based actions, evaluators can help to ensure that evaluation results are translated into meaningful improvements in program design, implementation, and outcomes.
1.6 Examples of Recommendations
Here are some examples of recommendations in M&E:
- “To improve the effectiveness of the program’s training, the curriculum should be revised to better meet the needs of participants, with a focus on practical, hands-on learning activities.”
- “To address the data quality issues identified in the monitoring and evaluation system, staff should receive additional training on data collection and management, and the system should be revised to incorporate additional quality control measures.”
- “To build on the success of the program’s community mobilization efforts, further investments should be made in strengthening community-based organizations and networks, and in promoting greater community participation in program planning and decision-making.”
- “To improve the program’s impact on maternal and child health outcomes, efforts should be made to address underlying health system and infrastructure issues, such as improving access to health facilities and training health workers.”
- “To achieve more substantial reductions in poverty rates in the target area, complementary programs should be implemented to address issues such as economic development, education, and social protection.”
These recommendations are specific actions that can be taken based on the findings and conclusions of the M&E process. They should be practical, feasible, and based on the evidence presented in the evaluation report. By implementing these recommendations, development practitioners can improve program effectiveness and impact, and better meet the needs of the target population.
2. Why It Matters: Understanding the Importance of Differentiating between Findings, Conclusions, and Recommendations in M&E
Differentiating between findings, conclusions, and recommendations is crucial in M&E for several reasons. First, it ensures accuracy and clarity in the evaluation report. Findings, conclusions, and recommendations are distinct components of an evaluation report, and they serve different purposes. By clearly defining and differentiating these components, evaluators can ensure that the report accurately reflects the program’s strengths and weaknesses, potential areas for improvement, and the evidence supporting the evaluation’s conclusions.
Second, differentiating between findings, conclusions, and recommendations helps to facilitate evidence-based decision-making. By clearly presenting the evidence supporting the evaluation’s findings and conclusions, and making recommendations based on that evidence, evaluators can help program managers and policymakers make informed decisions about program design, implementation, and resource allocation.
Finally, differentiating between findings, conclusions, and recommendations can help to increase the credibility and trustworthiness of the evaluation report. Clear and accurate reporting of findings, conclusions, and recommendations helps to ensure that stakeholders understand the evaluation’s results and recommendations, and can have confidence in the evaluation’s rigor and objectivity.
In summary, differentiating between findings, conclusions, and recommendations is essential in M&E to ensure accuracy and clarity in the evaluation report, facilitate evidence-based decision-making, and increase the credibility and trustworthiness of the evaluation.
3. How to Identify and Distinguish between Findings, Conclusions, and Recommendations in M&E
Identifying and distinguishing between findings, conclusions, and recommendations in M&E requires careful consideration of the evidence and the purpose of each component. Here are some tips for identifying and distinguishing between findings, conclusions, and recommendations in M&E:
- Findings: Findings are the results of the data analysis and should be objective and evidence-based. To identify findings, look for statements that summarize the data collected and analyzed during the evaluation. Findings should be specific, measurable, and clearly stated.
- Conclusions: Conclusions are interpretations of the findings and should be supported by the evidence. To distinguish conclusions from findings, look for statements that interpret or explain the meaning of the findings. Conclusions should be logical and clearly explained, and should take into account any limitations of the data or analysis.
- Recommendations: Recommendations are specific actions that can be taken based on the findings and conclusions. To distinguish recommendations from conclusions, look for statements that propose actions to address the issues identified in the evaluation. Recommendations should be practical, feasible, and clearly explained, and should be based on the evidence presented in the findings and conclusions.
It is also important to ensure that each component is clearly labeled and presented in a logical order in the evaluation report. Findings should be presented first, followed by conclusions and then recommendations.
In summary, identifying and distinguishing between findings, conclusions, and recommendations in M&E requires careful consideration of the evidence and the purpose of each component. By ensuring that each component is clearly labeled and presented in a logical order, evaluators can help to ensure that the evaluation report accurately reflects the program’s strengths, weaknesses, and potential areas for improvement, and facilitates evidence-based decision-making.
4. How to Communicate Findings, Conclusions, and Recommendations Effectively in M&E Reports
Communicating findings, conclusions, and recommendations effectively in M&E reports is critical to ensuring that stakeholders understand the evaluation’s results and recommendations and can use them to inform decision-making. Here are some tips for communicating findings, conclusions, and recommendations effectively in M&E reports:
- Use clear and concise language: Use clear, simple language to explain the findings, conclusions, and recommendations. Avoid technical jargon and use examples to illustrate key points.
- Present data visually: Use tables, graphs, and charts to present data visually, making it easier for stakeholders to understand and interpret the findings.
- Provide context: Provide context for the findings, conclusions, and recommendations by explaining the evaluation’s purpose, methodology, and limitations. This helps stakeholders understand the scope and significance of the evaluation’s results and recommendations.
- Highlight key points: Use headings, bullet points, and other formatting techniques to highlight key points, making it easier for stakeholders to identify and remember the most important findings, conclusions, and recommendations.
- Be objective: Present the findings, conclusions, and recommendations objectively and avoid bias. This helps to ensure that stakeholders have confidence in the evaluation’s rigor and objectivity.
- Tailor the report to the audience: Tailor the report to the audience by using language and examples that are relevant to their interests and needs. This helps to ensure that the report is accessible and useful to stakeholders.
In summary, communicating findings, conclusions, and recommendations effectively in M&E reports requires clear and concise language, visual presentation of data, contextualization, highlighting of key points, objectivity, and audience-tailoring. By following these tips, evaluators can help to ensure that stakeholders understand the evaluation’s results and recommendations and can use them to inform decision-making.
5. The Benefits of Clear and Accurate Reporting of Findings, Conclusions, and Recommendations in M&E
Clear and accurate reporting of M&E findings, conclusions, and recommendations has many benefits for development programs and policies. One of the most significant benefits is improved program design and implementation. By clearly identifying areas for improvement, program designers and implementers can make adjustments that lead to more effective and efficient programs that better meet the needs of the target population.
Another important benefit is evidence-based decision-making. When M&E findings, conclusions, and recommendations are reported accurately and clearly, decision-makers have access to reliable information on which to base their decisions. This can lead to more informed decisions about program design, implementation, and resource allocation.
Clear and accurate reporting of M&E findings, conclusions, and recommendations also supports accountability. By reporting transparently on program performance, development practitioners can build trust and support among stakeholders, including program beneficiaries, donors, and the general public.
M&E findings, conclusions, and recommendations also support continuous learning and improvement. By identifying best practices, lessons learned, and areas for improvement, development practitioners can use this information to improve future programming.
Finally, clear and accurate reporting of M&E findings, conclusions, and recommendations can increase program impact. By identifying areas for improvement and supporting evidence-based decision-making, development programs can have a greater positive impact on the communities they serve.
In summary, clear and accurate reporting of M&E findings, conclusions, and recommendations is critical for improving program design and implementation, supporting evidence-based decision-making, ensuring accountability, supporting continuous learning and improvement, and increasing program impact. By prioritizing clear and accurate reporting, development practitioners can ensure that their programs are effective, efficient, and have a positive impact on the communities they serve.
Writing the parts of scientific reports
22 Writing the conclusion & recommendations
There is likely some overlap between the Conclusion and the Discussion section. Nevertheless, this section gives you the opportunity to highlight the most important points in your report, and it is sometimes the only section that gets read. Think about what your research/study has achieved, and about the most important findings and ideas you want the reader to take away. Since all studies have limitations, also think about what you were not able to cover (this shows that you are able to evaluate your own work objectively).
Possible structure of this section:
Use present perfect to sum up/evaluate:
- This study has explored/has attempted …
Use past tense to state what your aim was and to refer to actions you carried out:
- This study was intended to analyse …
- The aim of this study was to …
Use present tense to evaluate your study and to state the generalizations and implications that you draw from your findings.
- The results add to the knowledge of …
- These findings suggest that …
You can either use present tense or past tense to summarize your results.
- The findings reveal …
- It was found that …
Achievements of this study (positive)
- This study provides evidence that …
- This work has contributed to a number of key issues in the field such as …
Limitations of the study (negative)
- Several limitations should be noted. First …
Combine positive and negative remarks to give a balanced assessment:
- Although this research is somewhat limited in scope, its findings can provide a basis for future studies.
- Despite the limitations, findings from the present study can help us understand …
Use more cautious language (modal verbs may, can, could):
- There are a number of possible extensions of this research …
- The findings suggest the possibility for future research on …
- These results may be important for future studies on …
- Examining a wider context could/would lead to …
Or indicate that future research is needed:
- There is still a need for future research to determine …
- Further studies should be undertaken to discover …
- It would be worthwhile to investigate …
Academic Writing in a Swiss University Context Copyright © 2018 by Irene Dietrichs. All Rights Reserved.
Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges (2015)
3 Findings, Conclusions, and Recommendations
The committee’s findings, conclusions, and recommendations are presented in this chapter and correspond to elements of the committee’s statement of task (Box 1.1), beginning with task A:
- Assess the principles for development of any new governance structure for the NNSA [National Nuclear Security Administration] National Security laboratories in order to:
  (i) give multiple national security agencies, including the Department of Defense [DOD], the Department of Homeland Security [DHS], the Department of Energy [DOE], and the Intelligence Community [IC], direct sponsorship of the national security laboratories as federally funded research and development centers [FFRDCs] so that such agencies have more direct and rapid access to the assets available at the laboratories and the responsibility to provide sustainable support for the science and technology needs of the agencies at the laboratories;
  (ii) reduce costs to the Federal Government for the use of the resources of the laboratories, while enhancing the stewardship of these national resources and maximizing their service to the Nation;
  (iii) enhance the overall quality of the scientific research and engineering capability of the laboratories, including their ability to recruit and retain top scientists and engineers; and
  (iv) maintain as paramount the capabilities required to support the nuclear stockpile stewardship and related nuclear missions.
The principles for development of any new governance structure, determined by the committee for item A, are discussed in Chapter 2.
THE ISSUE OF MULTIAGENCY SPONSORSHIP OF THE LABORATORIES
The committee addresses item A(i), multiagency sponsorship of the laboratories, below.
Finding 1.1. The committee found no evidence that DOD, DHS, or the IC want to take on direct sponsorship of the DOE/NNSA laboratories as FFRDCs.
Rather, these agencies reported difficulties obtaining adequate funding to support their own facilities, infrastructure, and laboratories. These agencies are eager to use the DOE/NNSA laboratories in areas that are critical to their own missions, but as a general matter are not interested in committing their own budgets to pay for recapitalization of major NNSA laboratory facilities and equipment. They are generally pleased with the quality of the services they receive from the NNSA laboratories, although they uniformly perceive the laboratories to be expensive in comparison to other research providers.
Finding 1.2. Agency representatives who presented to the committee did not report major problems in obtaining access to the DOE/NNSA laboratories.
Perhaps the most satisfied customer is DHS, which enjoys a unique status as a result of specific legislation[1] passed in the aftermath of the events of September 11, 2001. It guarantees DHS access to DOE/NNSA laboratories on an equal basis with DOE.[2] The committee was told, however, that while this legislation may have “greased the wheels” of access early in the history of DHS,[3] it does not currently accelerate the approval process for DHS Work for Others (WFO) projects compared with other agencies.[4]
[1] The Homeland Security Act of 2002 (P.L. 107-296, Title III, Section 309(a), November 25, 2002).
[2] Memorandum of Agreement (MOA), dated February 28, 2003, provides guidance enabling DHS to gain efficient access to specific DOE capabilities. DHS has the same access to the DOE labs as DOE does.
[3] Parney Albright, former Assistant Secretary, DHS, in comments to the committee on March 12, 2014.
[4] Kathleen Alexander, NNSA, comments to the committee on March 12, 2014.
Several previous studies have cited frustrations on the part of customers with the complexity of the WFO approval process and the length of time required to initiate new projects (or add scope and funding to existing ones).[5] NNSA has attempted to address these frustrations and to streamline the approval process,[6] but further improvements are necessary, such as those highlighted below in Recommendation 4.1.
Conclusion 1. A new governance model involving formal, multiple-agency FFRDC sponsorship of the DOE/NNSA laboratories would create more problems than it would solve and would be resisted by the other agencies.
Researchers at Sandia National Laboratories (SNL) presented an analysis to the committee[7] in which they qualitatively matched a series of key NNSA laboratory values with a range of possible governance models and concluded that the best fit would be an arrangement in which DOE remained the primary sponsor of the laboratories while DOD and the Office of the Director of National Intelligence (ODNI) became co-sponsors. The committee appreciates the value of analysis of radical changes that would promote greater involvement of the national security agencies in the governance of the NNSA laboratories. But it notes that there are no examples of successful multi-sponsor FFRDCs on the scale of the NNSA laboratories. One reason, no doubt, is the funding uncertainty inherent in such an arrangement; for example, jointly funded multiyear investment programs are vulnerable to reprogramming of funds by participating sponsors or Congress in response to changing priorities, and the processes for appealing and resolving such decisions are not obvious. Further, the large number of congressional oversight and appropriations committees with jurisdiction on national security issues makes it highly impractical to have multiagency sponsorship of the NNSA laboratories; overlapping committee jurisdictions would inevitably result in blurring the lines of authority and funding. DOE’s mix of national security and domestic responsibilities produces a complex web of congressional oversight and appropriations that already complicates funding DOE’s national security mission. Adding other agencies and their oversight and appropriations committees to the mix would be a recipe for further problems and complexities, not success.
[5] Congressional Advisory Panel on the Governance of the Nuclear Security Enterprise, 2014, Interim Report, April, available from the Institute for Defense Analysis, Alexandria, Va.
[6] Kathleen Alexander, NNSA, “Comments to The National Academies Committee on Assessment of Governance Structure of the NNSA National Security Laboratories,” presentation to the committee on May 5, 2014.
[7] Jill Hruby, SNL, “Perspectives on Governance of 21st Century National Security FFRDCs,” presentation to the committee on April 8, 2014.
The clearest line of authority (Principle 2) for an FFRDC is via single-agency sponsorship. Given DOE’s responsibility for the nuclear stockpile and the centrality of the laboratories in fulfilling that responsibility, DOE is now, and is likely to remain for the foreseeable future, the most appropriate sponsor for the national security laboratories. DOE thus must continue to assume both the authority and the responsibility to assure the success of the laboratories.
Recommendation 1.1. The Department of Energy should remain the sole sponsor of the NNSA laboratories as federally funded research and development centers.
NNSA, representing the sole sponsor (DOE), should provide strategic-level oversight of the laboratories and should, for example, negotiate the umbrella contract for the laboratories’ operations and assess performance under those contracts. DOE should also provide much of the federal funding for the laboratories’ programs and operations.
Recommendation 1.2. To complement the Department of Energy’s (DOE’s) sponsorship of the NNSA laboratories, the other national security agencies should have a strategic partnership with DOE that should be formally recognized and should give those agencies a seat at the governance table for the laboratories.
It is short-sighted to think of multiagency sponsorship as the only alternative model to the current governance approach of DOE as the sole sponsor (and sole source of governance) of the laboratories. Because the current system is not working well, most recent studies have proposed shared governance, not multiple sponsors (see Appendix E). The national security agencies should have roles in the laboratories’ governance that complement DOE’s role as sponsor and promote sustained strategic engagement, as described in Principle 5.
As sponsor of the national security laboratories, DOE/NNSA has overall responsibility for managerial oversight and funding for the laboratories’ core operations and investments, which are focused on nuclear weapons. As strategic partners,[8] the national security agencies would have the responsibility to help the laboratories, and their DOE sponsor, to understand the larger national security agenda and enable the laboratories and sponsor to meet future national security needs beyond those of nuclear weapons. This strategic partnership relationship stems from the national security agencies’ vested interest in the laboratories’ performance and development of capabilities to support the agencies’ future mission needs.
[8] The partnership envisioned here is among DOE and the other national security agencies in the execution of their national security missions through their use of the laboratories in appropriate ways.
As part of this strategic partnership function, it is essential that the national security agencies have a strategic dialog and relationship with the national security laboratories. The agencies need to be aware of the capabilities of the DOE laboratories that have relevance to their own respective missions and also need to provide the laboratories with assessments of their own future challenges and science and technology (S&T) needs. This will allow them to call on the laboratories’ special capabilities when needed, and, in turn, allow DOE and its laboratories to plan for and develop the necessary capabilities to meet the agencies’ future needs. Each strategic partner would, as now, obtain the benefit of the laboratories’ skills through the WFO process and, as appropriate, would invest or advocate for the investment in facilities or major equipment that serve its national security needs.
There are benefits for the laboratories, for DOE/NNSA, and for the other national security agencies in reinforcing this strategic partnership role. The work for other agencies helps to sustain key nuclear weapons capabilities, helps to attract and retain the best scientific talent, and provides value to the nation from its investment. Although an enterprising culture among the scientists and engineers in the laboratories is essential, the work undertaken by the laboratories should not dilute or distract from the achievement of the core weapons mission. Moreover, in line with Principle 4, the efficient use of national resources means that funds should go to the most capable and cost-effective performers; work should not be directed to the laboratories merely because they exist and have available staff.
This strategic partnership concept depends on three foundational ideas: (1) clarity of the national security mission, (2) a shared vision of the future, and (3) a collaborative approach to meeting future financial and capability needs. The national security agencies’ strategic partnership function should be pursued by way of an enhanced role for the Mission Executive Council (MEC; see below).
THE NEED TO DEFINE THE MISSION OF THE NATIONAL SECURITY LABORATORIES
Finding 2. Although the NNSA laboratories are now by law referred to as “national security laboratories” rather than “nuclear weapons laboratories,” no one has clearly articulated what this evolution means in terms of the mission of the laboratories or the
proper relationships with other national security agencies and laboratories.
Consistent with Principle 1, an efficient organization requires a clear mission statement.
Recommendation 2. The Department of Energy, in collaboration with the other national security agencies, should develop a clear mission statement for the national security laboratories.
The foundation for strategic planning should be a crisp and clear mission statement that articulates the role that the national security laboratories should play in supporting the national security agencies. Indeed, according to the Federal Acquisition Regulation section 35.017-1, one of the responsibilities of the sponsor of an FFRDC is to define the scope of its mission in the sponsor agreement. The mission statement of the FFRDC is to be reviewed by the sponsor no less than every 5 years.
The core mission is assuring the reliability, safety, and security of the arsenal of nuclear weapons. Ancillary elements clearly should encompass capabilities that flow directly from that mission, such as weapons-related intelligence and assessments, nuclear forensics, radiation-hardened microelectronics, radiation detection capabilities, and the like. The challenge is in defining the appropriate role for the national security laboratories with regard to S&T matters that are more distant from the core weapons mission. DOE can and should use the MEC in its coordinating capacity (in consultation with the laboratories) in developing a mission statement for the laboratories that reflects their evolving role. Although the laboratories should expect to be significantly involved in this process, it is essential that the new mission statement reflect the stakeholders’ (i.e., the DOE sponsor’s and national security agency partners’) buy-in and endorsement.
THE NEED FOR JOINT MULTIAGENCY STRATEGIC PLANNING THROUGH THE MISSION EXECUTIVE COUNCIL
Finding 3.1. Essentially no strategic planning has taken place in a multiagency context to determine the future national security capabilities needed by the United States, including those that should be funded and resident within the NNSA laboratories.
The committee’s review of current strategic plans at DOE, NNSA, and the laboratories found that while these organizations do strategic planning individually, there is no joint, integrated process for strategic planning that
involves DOE/NNSA, the other national security agencies, and the laboratories. The planning should include not only the articulation of current capabilities and future thrusts and directions, but also governance-related issues, such as assessments of human and physical capital, and strategies to assure that capabilities for the future are funded and developed to meet the national security agencies’ needs. Such strategic planning could reasonably be accomplished through an expanded role for the MEC.
Although not members of the MEC, the laboratories need to be regular participants in order to provide insights both as to existing capabilities and as to future capabilities that could advance achievement of the national security mission.
THE ROLE OF THE MISSION EXECUTIVE COUNCIL
Finding 3.2. The four-party Governance Charter and the MEC it established are significant beginnings to implement the national security agencies’ strategic partnership role in the governance of the national security laboratories. However, the MEC’s performance to date has not met the need for shared, long-term research and development (R&D) planning among the four national security agencies or addressed how the agencies would prioritize and fund the sustainment of national security laboratory capabilities. Moreover, the MEC has had limited engagement with the NNSA laboratories.
The process of interaction among the MEC member agencies is not yet mature, perhaps because turnover has been relatively high among MEC-related officials, and there have been relatively few meetings of the full MEC. MEC-commissioned topical working groups have made progress on near-term, tactical-level issues, which demonstrates the potential for the MEC to drive a long-term strategic dialog and planning process among DOE/NNSA, the national security agencies, and the national security laboratories.
The committee notes that senior-level agency commitment is important for the success of the four-party Governance Charter and for an enduring strategic partnership relationship between the national security agencies and DOE/NNSA. Such a relationship will enhance the laboratories’ ability to support the national security agencies in the future. This means an active strategic dialog, not only on which current laboratory capabilities need to be sustained, but also for identifying capabilities that are appropriate to be developed at the NNSA laboratories to deal with future national security challenges. This will require the national security agencies to share with other MEC agencies and the national security laboratories their strategic priorities and over-the-horizon challenges that
they need S&T help to address. 9 The laboratories, in turn, will need to share with MEC agencies their assessment of the impact of S&T developments on potential national security issues, as well as the status of their current and planned capabilities to address these issues. Participation of high-level agency leadership in the MEC—at the Under Secretary or key Assistant Secretary level 10 —is necessary for the MEC to successfully negotiate the requisite coordination of strategic planning, priority setting, and funding of laboratory capabilities to meet current and emerging national security challenges. Periodic cabinet-level interaction is also essential.
Recommendation 3.1. The Mission Executive Council should become the primary vehicle to define and implement the national security agencies’ governance role. It should develop and pursue an agenda focused on identifying strategic priorities and critical capabilities to deal with ongoing and upcoming national security challenges, coordinate approaches for supporting needed investments in the laboratories, and provide coordinated guidance and processes. It should provide the following:
- Authoritative, periodic, and structured strategic guidance to the Department of Energy (DOE) and the laboratories about member agencies’ medium- and long-term mission challenges and thrusts.
- Periodic, structured assessments of the laboratories’ performance and capabilities in meeting current national security mission needs and the impact of DOE’s oversight on this performance (as an integral part of the laboratory overall assessment process).
- A strategic dialog among the agencies and the laboratories about investments that may be needed to better meet anticipated future mission needs and how those investments can be structured and funded.
- A strategic dialog with the Office of Science and Technology Policy, the Office of Management and Budget, and the relevant authorizing and appropriating committees of Congress to discuss future laboratory needs and funding priorities as they relate to the laboratories’ broader national security mission.
- Periodic consideration of other (non-NNSA-laboratory) sources of science and technology to meet national security needs.
9 The MEC may conclude that it should involve other agencies in its deliberations. For example, the Department of State has an important role in nuclear nonproliferation policy and it might be invited to participate in MEC deliberations on that subject.
10 The level of the appropriate person may vary among individual agencies.
Recommendation 3.2. The high-level Mission Executive Council meetings should be complemented by biannual meetings (or more frequent meetings as necessary) at the cabinet level that would be chaired by the Secretary of Energy.
Such periodic meetings would ensure that the capability and funding priorities identified by the MEC are raised to the attention of senior policymakers.
A description of the activities of a more expansive, effective MEC, as envisioned by this committee, is presented in Box 3.1.
THE ISSUE OF COST
With respect to item A(ii) of the statement of task, an NNSA laboratory analyst 11 stated in a briefing to the committee that the costs to non-DOE agencies for contracting with the NNSA laboratories for S&T work through the WFO process are in the middle of the range of such costs for contracting with other potential research providers. However, at least in the perception of customer agency representatives who briefed the committee, the NNSA laboratories are considered to be relatively expensive compared with other potential research providers, and the high cost of the laboratories is a significant factor in their decision about where to conduct R&D work. 12 An apples-to-apples comparison of the fully burdened rates of different research institutions is not a simple analysis, and the committee did not attempt to conduct an independent assessment. However, it is plausible that the laboratories’ responsibilities for safeguarding and working with special nuclear materials, as well as the layers of oversight currently employed to ensure compliance with DOE’s and NNSA’s risk-averse approach to environmental, safety, and health issues at the laboratories would make their overall costs comparatively high.
At least a small part of the cost problem at DOE/NNSA laboratories may be alleviated through changes in culture and attitude. The DOE Office of Science laboratories have to operate under the same DOE orders as DOE/NNSA laboratories, but appear to do so more smoothly and cost-effectively. The committee received briefings from the DOE Office of Science, 13 a site
11 Dori Ellis, LLNL, “Enabling Interagency Work at the NNSA National Security Laboratories,” presentation to the committee on May 6, 2014.
12 Al Shaffer, DOD, “DOD Perspectives,” presentation to the committee on March 12, 2014; John Phillips, CIA (retired), “NAS DOE Governance Input,” presentation to the committee on March 13, 2014.
13 Steve Binkley, DOE, “A DOE View on NNSA Labs Governance,” presentation to the committee on March 12, 2014.
BOX 3.1 Mission Executive Council 2.0 Vision
The Mission Executive Council (MEC) has been slow to develop and is not yet effective in either establishing an appropriate governance role for the national security agencies with the DOE/NNSA laboratories or serving as a strategic science and technology (S&T) forum among the national security agencies and between the national security agencies and the laboratories. The committee’s vision of what the MEC should become—a new and improved MEC 2.0—is described below.
MEC 2.0 would provide a forum for two-way strategic communication between the laboratories and the national security agencies. In one direction, the specific science and engineering capabilities of the individual laboratories would be identified and promulgated among the participating national security agencies, as well as the laboratories’ views of strategic S&T trends and the national security challenges that may flow from them. In the other, the strategic S&T priorities and research and development needs of the participating agencies that are appropriate to be addressed by the laboratories would be identified and promulgated to the laboratories. MEC 2.0 would utilize this exchange of information to formulate a mission statement for the national security laboratories and generate an integrated, strategic S&T plan that would inform individual agency and laboratory planning.
MEC 2.0 would provide a forum for strategic dialog between the participating agencies and the laboratories with regard to investments that may be needed to sustain capabilities at the laboratories or develop new ones that the agencies consider crucial to their national security missions. As appropriate, MEC 2.0 would engage with the Office of Management and Budget, the Office of Science and Technology Policy, and relevant committees of Congress to help ensure that the identified priorities are funded.
MEC 2.0 would feature regular meetings of senior leaders of the participating agencies and include dialog as appropriate with the directors of the DOE/NNSA laboratories. The Secretary of Energy would chair periodic meetings of his/her counterparts from the participating agencies to review the MEC’s strategic direction and ensure that issues that arise are addressed in a timely and integrated fashion. No changes to the original charter of the MEC would be required to realize this vision. a
a The 2010 four-party memorandum of understanding (MOU) entitled “Governance Charter for an Interagency Council on the Strategic Capability of DOE National Laboratories as National Security Assets,” which includes the original charter of the MEC, is reprinted in Appendix F. In the fiscal year (FY) 2013 National Defense Authorization Act (P.L. 112-239, Sec. 1040(a)), Congress authorized the establishment of the Interagency Council on the Strategic Capability of the National Laboratories and then detailed the council’s membership (the Secretaries of Energy, Defense, and Homeland Security, the Director of National Intelligence, and the Administrator of the NNSA) and its responsibilities. The legislation does not specifically mention the MEC but mandates that “the President may determine the chair, structure, staff, and procedures of the Council.” The legislation calls for a report from the Council assessing, among other items, the implementation of the 2010 four-party MOU. As of this writing, that report had not yet been delivered to Congress.
office at one of the Office of Science laboratories, 14 and two senior managers of the Office of Science laboratories currently conducting substantial work for other national security agencies. 15 The committee found that the Office of Science’s approach to governance of its national laboratories and their WFO projects achieves a more cost-effective operation, along with higher customer agency satisfaction.
The DOE Office of Science’s governance relationship with its laboratories is characterized by partnership, strong mission-delivery focus, and a positive approach to enabling work for other federal agencies. Planning is a joint effort between the laboratories and the Office of Science, and the laboratories’ plans are reviewed annually. Other federal agencies provide feedback to the laboratories annually. Aligned with this mission-focused partnership, the site offices of the Office of Science laboratories are small in comparison to those of the DOE/NNSA laboratories. The committee received comments that this approach, culture, and attitude about governance in the Office of Science contribute to less complexity, reduced cost, and greater productivity at the Office of Science laboratories. It could serve as a model for change in the DOE/NNSA (see Recommendation 4.2).
STREAMLINING THE WORK FOR OTHERS PROCESS
The committee looked at the processes and procedures for the NNSA laboratories and the Nevada National Security Site (NNSS) to obtain approval to conduct work for other U.S. government agencies. The committee received oral briefings from NNSA headquarters, the laboratory directors of Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL), and the president of NSTec, LLC (the management entity of NNSS). In addition, substantive information was provided by laboratory personnel in response to several committee questions. 16
Considerable frustration exists between the laboratories and the sponsoring agencies around the current DOE approval process for WFO. DOE Order (O) 481.1C describes the requirements for the contracting officer to approve WFO (Section 4.c of the order). The major concern expressed by the laboratories, and to some degree the sponsoring agencies, is the
14 Roger Snyder, DOE, “Site Office Perspectives,” presentation to the committee on May 5, 2014.
15 Thomas Mason, ORNL, and Tony Peurrung, PNNL, untitled presentations to the committee on April 9, 2014.
16 Material was provided by Dori Ellis and William Goldstein of LLNL, Charles McMillan of LANL, Paul Hommert, Jill Hruby, and Matt Riley of SNL, and Raymond Juzaitis and James Holt of NNSS.
substantial time it takes on the part of the laboratory and the sponsoring agency to package the request for WFO to meet the field office contracting officer’s interpretation of the DOE order (and there are multiple contracting officers at each NNSA field office). This can be a major problem when a project is time sensitive. The committee did not find any DOE- or NNSA-wide standards for risk assessment or broad agreement for WFO projects that would be acceptable at an NNSA laboratory. Thus, the decision to approve is largely up to the contracting officer, who is generally not a domain expert, but becomes the risk acceptance official in cases where there is no clear benefit to DOE in allowing the laboratory to conduct the work. Briefings received by the committee indicated that the processing time—from the point that the non-DOE agency first contacts a laboratory to the point that the WFO proposal is in a form that the field office contracting officer will approve—can take 2 to 12 months. This process requires many iteration cycles and much effort on the part of the other agency and the laboratory personnel.
Table 3.1 shows the large number of WFO projects that were active in FY 2013 at each of the NNSA laboratories and the NNSS.
Not only are the totals in Table 3.1 large, but the project values can also be quite small, with a substantial fraction costing less than $100,000. Indeed, the numbers in Table 3.1 do not reflect the true number of approval requests that must be processed by the laboratories for a given project. Each funding increment, each new phase, and each addition of work scope for the same customer and the same project requires a new approval package that has to be submitted to the contracting officer. Both SNL and LLNL told the committee that the number of WFO packages that they may have to process each year can be three to four times the number of new projects initiated for any given year, requiring potentially thousands of transactions each year.
TABLE 3.1 Number of Active Work for Others Projects at the NNSA Laboratories and NNSS in Fiscal Year (FY) 2013
SOURCE: Compiled from discussions with various laboratory personnel.
LLNL has tracked the nominal number of days to get a WFO package through the process to approval; the in-laboratory time is typically double that in the field office. Both have been reduced over the years, but the total was about 32 work days in FY2013. This in-laboratory expense is typically covered in laboratory overhead, which contributes to overall costs for both DOE and the WFO clients.
Committee interviews with the leadership of the Pacific Northwest National Laboratory (PNNL) site office and the management of PNNL and Oak Ridge National Laboratory (ORNL)—two DOE Office of Science laboratories that perform a considerable amount of national security WFO—found that achieving approval of WFO projects at these laboratories was less difficult. One apparent difference is that the Office of Science headquarters takes an active role with the laboratories and site offices in annual planning for WFO, with the intention of approving a broad scope of work within which each laboratory may accept WFO projects.
In recognition of the above, NNSA has communicated with the DOE Office of Science to better understand its WFO process. NNSA recently established the position of NNSA Director of Interagency Work to “enable rather than control” WFO work at NNSA laboratories. This has resulted in a reduction in the approval time at NNSA headquarters and at the field offices. However, the processing times still vary widely from site to site. In conclusion, the NNSA WFO approval process remains a costly and burdensome form of transactional oversight, particularly as conducted by the field offices.
Finding 4. The current NNSA WFO approval process has been improved but still involves costly, repetitive steps at the laboratories and is associated with unnecessary transactional oversight and a lack of cooperation and advance planning between the laboratories and NNSA field offices.
DOE/NNSA has made progress in reducing the time required to approve WFO proposals, and on their side, customer agencies such as DHS 17 and the Defense Threat Reduction Agency 18 have streamlined the process by developing standard forms for requesting work from NNSA laboratories. However, the individual laboratories process on the order of 1,000 WFO-related actions each year, including very small projects and
17 DHS Form UA201208, supplied by James Johnson, Director, DHS Office of National Laboratories.
18 Defense Threat Reduction Agency Blanket Determination and Findings (D&F), AIC No. BAP119922631, September 13, 2011.
renewal of previously approved projects. 19 Because the review process for a small award is the same as for a large award, efficiencies can be gained by consolidating the review processes under umbrella agreements previously negotiated between DOE/NNSA and the sponsoring agencies. A clear mission statement (see Recommendation 2) would also facilitate the WFO approval process by making it clear what kinds of WFO the laboratories should take on and what kinds they should not.
Recommendation 4.1. NNSA should generate one or more Work Scope Agreements (WSAs) with each of the other national security agencies (the Department of Defense, the Office of the Director of National Intelligence, and the Department of Homeland Security) that is considered a strategic partner. The WSA would be the bounding document for bringing in new work from the strategic partner. Used in conjunction with a Work Boundary Agreement 20 (WBA) between NNSA and each laboratory, work that falls within the WSA/WBA envelope would require only the processing of the funding documents. Further approvals would not be needed.
The laboratories would then be held accountable for their WFO decisions in their next evaluation period. The committee provides conceptual examples of a WSA and a WBA between one NNSA laboratory and one sponsoring agency in Appendix H. Initially, Recommendation 4.1 could be implemented as a pilot project with one or more laboratories and one partner agency. If successful, the template could be replicated by the other laboratories and agencies. According to an estimate by an analyst at one laboratory (LLNL), this would reduce LLNL’s approval efforts by 25 to 30 percent, while also reducing the level of effort at the NNSA field office. The NNSA Administrator should set an outcome-oriented cost reduction goal of, say, 30 percent in WFO administration costs, and monitor progress against this metric.
The committee is not suggesting that there should be one set of WFO procedures for all non-DOE national security agencies, but rather one set for each partner (one for DHS, one for DOD, one for the IC, etc.). This would accommodate unique requirements while still expediting and simplifying the process. Once the proposed WFO processes have proven cost-effective, the committee believes there are further efficiencies to be gained by standardizing them DOE-wide, including DOE/Office of Intelligence, DOE/Office of Science, NNSA, etc.
19 Dori Ellis, LLNL, in comments to the committee on May 6, 2014.
20 The WBA is based on the operational constraints that would provide the envelope for WFO work at the laboratories—safety, security, facility requirements, total cost, etc.
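The WSA/WBA mechanism in Recommendation 4.1 amounts to a simple decision rule: a new request that falls inside the pre-negotiated envelope needs only funding-document processing, while anything outside it still goes to the contracting officer for individual approval. A minimal sketch of that rule follows; all field names and limits are purely illustrative, since the report does not specify the contents of a WSA or WBA.

```python
# Hypothetical sketch of the WSA/WBA "envelope" check in Recommendation 4.1.
# The Envelope stands in for the WSA scope areas and the WBA operational
# constraints (safety, security, facility requirements, total cost, etc.);
# none of these field names or limits come from the report itself.
from dataclasses import dataclass

@dataclass(frozen=True)
class Envelope:
    """Pre-approved WSA/WBA limits for one laboratory-agency pair."""
    approved_topics: frozenset       # scope areas covered by the WSA
    max_project_cost: float          # WBA cost ceiling, in dollars
    allowed_security_levels: frozenset

@dataclass(frozen=True)
class WfoRequest:
    topic: str
    cost: float
    security_level: str

def needs_full_review(req: WfoRequest, env: Envelope) -> bool:
    """True if the request falls outside the WSA/WBA envelope and so still
    requires individual contracting-officer approval; False if only the
    funding documents would need to be processed."""
    return (req.topic not in env.approved_topics
            or req.cost > env.max_project_cost
            or req.security_level not in env.allowed_security_levels)

env = Envelope(frozenset({"radiation detection", "nuclear forensics"}),
               5_000_000.0,
               frozenset({"unclassified", "secret"}))

in_scope = WfoRequest("nuclear forensics", 250_000.0, "secret")
out_of_scope = WfoRequest("cybersecurity", 250_000.0, "secret")

print(needs_full_review(in_scope, env))      # False: process funding docs only
print(needs_full_review(out_of_scope, env))  # True: full approval still needed
```

The point of the sketch is the asymmetry the recommendation creates: the expensive, iterative package review is reserved for the exceptional out-of-envelope case, while the routine case reduces to a mechanical funding transaction.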
Recommendation 4.2. NNSA should conduct a comparative assessment of the Office of Science approach to Work for Others, including planning, processes, working relationships between site offices and laboratories, and all associated oversight and approval actions.
Committee interviews with the leadership of the PNNL site office and the management of PNNL and ORNL—the two DOE Office of Science laboratories that perform a considerable amount of WFO—found less controversy and inefficiency around approval of WFO at these DOE Office of Science laboratories than at the NNSA laboratories. One apparent difference is that the Office of Science headquarters takes an active role with the laboratories and site offices in annual planning to include WFO, with the intention to approve a broad scope of work and core competencies within which each laboratory may pursue WFO projects. 21
MAJOR EQUIPMENT AND FACILITIES INVESTMENTS AT THE NNSA LABORATORIES BY OTHER AGENCIES
Other national security agencies have, from time to time, provided funding for new facilities and major capital equipment at the NNSA laboratories in support of their missions. The committee is aware of three such capital investments that have been made by non-DOE agencies in the past: DHS funding for the National Infrastructure Simulation and Analysis Center at SNL; IC funding for part of the refurbishment of an older facility at SNL; and joint funding from DOE/Office of Science, DOE/NNSA, and DHS for a facility at PNNL.
Finding 5. Based on the limited experience to date with investment by other federal agencies in major equipment and facilities at the national laboratories, there is no proven, systematic approach to assure such investments are made in a timely, cost-effective way.
In each case the committee is aware of, there was no standard process for non-DOE investment, and in each case, the process proved to be ad hoc, tortuous, and time-consuming. Success depended on sustained attention and effort of all parties involved, particularly of the host laboratory.
Recommendation 5. The Mission Executive Council (MEC) should be directed to develop a systematic approach for multiagency
21 See U.S. Department of Energy, Office of Science, “Laboratory Planning Process,” last modified October 22, 2014, http://science.energy.gov/lp/laboratory-planning-process/.
investment in the laboratories that allows for Department of Energy investment, together with investment from other federal agencies when and where appropriate. This approach should enable timely enhancement of both facilities and major equipment to meet a range of national security needs consistent with the new mission statement of the laboratories, or to meet a specific newly identified need. The MEC should define the principles for such an approach, oversee the development of a rolling 10-year plan, approve the annual plan, coordinate agency presentations to the Office of Management and Budget and Congress, and monitor agency accountability for meeting the plan.
THE NEED TO ATTRACT A TALENTED WORKFORCE AND MAINTAIN KEY CAPABILITIES
With regard to items A(iii) and A(iv) of the statement of task, the committee believes that the continued ability of the NNSA laboratories to attract and retain top scientific and technological talent is a critically important metric by which any new governance structure should be judged (Principle 3). In two recent studies, 22 the National Research Council (NRC) reviewed and commented on the quality of science and engineering at the NNSA laboratories, and emphasized the important role of interagency engagement, including WFO, in helping both to sustain the critical nuclear weapons capabilities and to attract and retain top talent. This committee agrees with the previous committee’s findings on the value of WFO and believes that its recommendations on recognizing and operationalizing the strategic partnership responsibilities of other national security agencies (Recommendation 1.2), clear definition of the mission (Recommendation 2), improved strategic planning (Recommendation 3.1), streamlined approval of WFO (Recommendation 4.1), and simplified recapitalization processes (Recommendation 5) will help to maintain critical capabilities at the laboratories and to attract and retain top talent.
Any assessment of the quality of science and engineering at a laboratory must include not only the technical people but also the management and leadership team. There is an aspect of the leadership within NNSA that deserves mention. It is important for sustainment of employee
22 National Research Council (NRC), 2012, Managing for High-Quality Science and Engineering at the NNSA National Security Laboratories , The National Academies Press, Washington, D.C. (Phase I report), and NRC, 2013, The Quality of Science and Engineering at the NNSA National Security Laboratories , The National Academies Press, Washington, D.C. (Phase II report).
morale that NNSA leadership include individuals respected by the technical community in the laboratories. While the most important qualification of the overall leader is the ability to run a complex organization efficiently, the leadership team should include individuals who have significant competence in relevant technical disciplines in order to foster a sense of partnership and teamwork between NNSA management and laboratory personnel. The success of the DOE Office of Science laboratories may be attributed in part to a leadership team that includes individuals who are part of the relevant scientific community and who are seen to identify fully with and be part of any technical success. 23 To establish such leadership, managers with respected technical competence should be strongly considered as part of the NNSA management team, building the appropriate linkage between NNSA management and its laboratories. Increased use of detailees with technical backgrounds from other agencies could help fill this need.
OTHER LABORATORIES THAT SHOULD BE INCLUDED
The committee addresses item B from the statement of task below:
- Recommend any other laboratories associated with any national security agency that should be included in the new governance structure.
The committee did not have the time or resources to evaluate all of the many laboratories (DOD laboratories, various FFRDCs associated with national security agencies, or even all of the DOE national laboratories) that do national security work for others. Nevertheless, as discussed below, the committee sees no reason why laboratories that conduct a large volume of interagency national security work should be excluded from the governance considerations outlined in this report. The committee found that in FY2012, two DOE/Office of Science laboratories, ORNL and PNNL, received major funding from other U.S. government national security agencies, largely for the purpose of performing national security research and development (approximately 24 percent of PNNL’s funding and approximately 16 percent of ORNL’s funding). Both laboratories have been successful in increasing their non-DOE national security work over the past 20 years. In addition, both ORNL and PNNL reported that they had succeeded in obtaining long-term, non-DOE investment in facilities for future use in support of the non-DOE federal agency work. Based on its discussions with ORNL, PNNL, and representatives of other government agencies, the committee concludes that these laboratories can and do readily respond to the non-DOE national security needs within the current DOE Office of Science administrative and governance guidelines.
23 Victor Reis, DOE, presentation to the committee on April 8, 2014.
The important role of NNSS, which receives approximately 19 percent of its funding from other agencies, has already been discussed. Hence, the committee makes the following recommendation.
Recommendation 6. The governance model described in this report should encompass other laboratories and facilities that receive a significant fraction of their funding from interagency national security work, including Pacific Northwest National Laboratory, Oak Ridge National Laboratory, and the Nevada National Security Site. Moreover, as a long-term goal, Work for Others processes should be applied uniformly across these institutions, as described in Recommendations 4.1 and 4.2.
Over time, other DOE laboratories or facilities may emerge that conduct a significant amount of interagency national security work, and if this occurs, the committee sees no reason why they should be excluded from the governance considerations proposed in this report. For example, the Government Accountability Office reported that at the Idaho National Laboratory in FY2012, 17.5 percent of work performed was WFO. 24
IMPLEMENTATION OF THE RECOMMENDED GOVERNANCE STRUCTURE
The committee addresses items C and D from the statement of task below:
- Discuss options for implementing the new governance structure that minimize disruption of performance and costs to the government while rapidly achieving anticipated gains.
- Discuss legislative changes and executive actions that would need to be made in order to implement the new governance structure.
The committee concludes that the optimal course is to build on and improve the existing governance structure. The implementation of a multiagency FFRDC governance structure such as that implied in item A of the statement of task would likely involve extensive modification of existing arrangements with multiple agencies. This would be resisted by the various national security agencies. The more modest approach proposed by the committee could be put into place by the Secretary of Energy in consultation with the affected agencies, building on a structure the national security agencies have already initiated.
24 Government Accountability Office, 2013, National Laboratories: DOE Needs To Improve Oversight of Work Performed for Non-DOE Entities, GAO-14-78, Washington, D.C.
Because the recommendations in this report largely build on the current governance structure, the committee believes they would not require new legislation. For example, the MEC has already been created, and the expanded role envisioned here is clearly within the scope of the MEC, as articulated in the four-party memorandum of understanding (reprinted in Appendix F ), although DOE and the other national security agencies will need to enhance their commitment to—and participation in—the MEC.
THE IMPACT OF MULTIAGENCY ENGAGEMENT ON PEER REVIEW
The committee addresses item E from the statement of task below:
- Assess the contribution and long-term impact of strategic multiagency engagement on the ability to maintain an effective peer review capability for the breadth of skills needed for the nuclear weapons missions.
The NNSA laboratories currently have a variety of mechanisms for peer review of their S&T activities; this is especially true of the unclassified activities. Because much of the work done by NNSA laboratories in carrying out their mission of stewardship of the nuclear weapons stockpile is classified, it is not subject to the normal peer review processes and evaluations that are common in the unclassified world. It is therefore critical that the laboratories maintain the breadth of internal expertise and the capacity for independent assessment within the classified environment to ensure that proposed solutions are robust and resources are spent efficiently. With respect to item E of the statement of task, the committee believes that the pool of qualified personnel available to the NNSA laboratories for peer review would be enhanced by the multidisciplinary challenges required for an expanded mission scope involving both nuclear weapons and national-security-related research. The committee also believes that such an expanded mission would enhance the attraction and retention of top talent to carry out such a broad and challenging scope of research. The NRC has an ongoing study evaluating peer review at the NNSA laboratories. 25
DOE’s national security laboratories have long played a critical role in U.S. national security. Given the rapid evolution of S&T generally, the globalization of enhanced S&T capabilities in particular, and the impact of S&T developments on the U.S. national security agenda, DOE/NNSA’s laboratories will remain a critically important resource for U.S. national security agencies for many decades to come. To ensure that the laboratories can meet national security needs in the future, adjustments are needed now to improve their governance and strengthen their strategic relationship with the non-DOE national security agencies. The recommendations in this report are designed to create a sustainably successful collaborative relationship among DOE, the other national security agencies, and the national security laboratories. Furthermore, the committee believes that everything recommended here is within the authority of the Secretary of Energy to implement. Implementing these recommendations would increase the probability that critical NNSA laboratory capabilities to support the national security of the United States will be available when needed in the future.
25 The cited NRC study was requested in Public Law 112-239, the National Defense Authorization Act for Fiscal Year 2013, Section 3144. It is sponsored by the National Nuclear Security Administration. According to the statement of task, the study will assess the following:
- The quality and effectiveness of peer review of designs, development plans, engineering and scientific activities, and priorities related to both nuclear and non-nuclear aspects of nuclear weapons;
- Incentives for effective peer review;
- The potential effectiveness, efficiency, and cost of alternative methods of conducting peer review and design competition related to both nuclear and non-nuclear aspects of nuclear weapons, as compared to current methods;
- The known instances where current peer review practices and design competition succeeded or failed to find problems or potential problems; and
- How peer review practices related to both nuclear and non-nuclear aspects of nuclear weapons should be adjusted as the three NNSA laboratories transition to a broader national security mission.
Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges is an independent assessment regarding the transition of the National Nuclear Security Administration (NNSA) laboratories (Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories) to multiagency, federally funded research and development centers with direct sustainment and sponsorship by multiple national security agencies. This report makes recommendations for the governance of NNSA laboratories to better align with the evolving national security landscape and the laboratories' increasing engagement with the other national security agencies, while simultaneously encouraging the best technical solutions to national problems from the entire range of national security establishments. According to this report, the Department of Energy should remain the sole sponsor of the NNSA laboratories as federally funded research and development centers. The NNSA laboratories will remain a critically important resource to meet U.S. national security needs for many decades to come. The recommendations of Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges will improve the governance of the laboratories and strengthen their strategic relationship with the non-DOE national security agencies.
6. Implications, Conclusions and Recommendations
While your data analysis will need to address every question asked, discussing such things as statistical significance and correlations, when you are ready to draw conclusions you must determine what the main findings of your report really are. Not everything is worth revisiting when drawing conclusions. It is quite likely that the readers of the final report have not spent much time thinking about the research; they want to understand it quickly, without having to read every last bit of analysis and data manipulation.
The final chapter of the research report must bring the research together and provide: an interpretation of the results, written in language that can be understood even by managers who are not well versed in statistical analysis; a summary of the critical conclusions that management or any other specific audience needs to be aware of; and strategic recommendations based on the findings of the research.
In more commercial reports the analysis of the data and the interpretation of the results may well go hand in hand, with only those findings directly relevant to the study objectives being discussed. Only summary tables and charts are part of the write-up. In these cases, the detailed analysis and a comprehensive set of tables and charts are usually confined to a technical report.
In the Data Analysis, the results for each question in the survey were discussed along with the appropriate statistical analysis and an illustration in the form of a table or chart. As part of the interpretation of the results, you need to go back to the findings previously discussed and interpret them in light of the subproblems you posed as part of your research question. This subproblem interpretation is based on the results of each research item. Whereas in the data analysis you only identified the results, without editorializing or commenting on them, you are now ready to draw conclusions about the data.
As part of the interpretation, you will want to place your results in the context of your literature review. That is, can you explain why other researchers might have reached different conclusions, and what are the implications if your data point to similar results? Since your literature review drove the development of your hypotheses, it is logical to discuss whether each hypothesis was supported or rejected as part of your interpretation.
In more commercial research reports, the data analysis and its interpretation are usually presented together; in more academic reports they are separated into two chapters (four and five), the first discussing only the direct conclusions based on presentations of numbers, percentages, and other hard data, and the second interpreting the work presented in chapter four. However, because they are so closely related, it is a good idea to prepare and write these two chapters in parallel, even for academic reports.
Summarizing conclusions is a two-step process, whereby
- you review the conclusions of all the hypotheses, and from these conclusions
- you draw overall conclusions for the research question itself.
These conclusions are usually listed numerically and then discussed one by one. The reasoning followed to reach each conclusion, and the data that support it, are incorporated into a brief editorial comment with respect to the overall interpretation.
It is absolutely critical at this point not to yield to the temptation to make concluding statements that apply the study's results beyond the parameters established under the problem definition. Indeed, you may even want to include a statement warning the reader not to interpret the results in a way that generalizes beyond the study's parameters.
You may, however, want to address how a potentially valid generalization could be made in your Recommendations section.
No matter how complete your study was, further research will always be required to shed more light on the research question, particularly if there is an interest in generalizing the findings beyond the study's parameters. You will also have found considerable gaps within the literature itself that should be addressed, and to which your study may or may not have contributed. A summary section with recommendations for further study is therefore appropriate.
If the research was undertaken on behalf of a client, it is also important to provide the manager with a set of recommendations that directly address the management situation that led to the research being commissioned in the first place. However, as much as the manager may want far-reaching recommendations, care must be taken that they are indeed anchored in the findings of the study and do not exceed its parameters.
6.6: Formal Report—Conclusion, Recommendations, References, and Appendices
- Examine the remaining report sections: conclusion, recommendation, reference list, appendices
What Are the Remaining Report Sections?
Conclusions and Recommendations
The conclusions and recommendations section conveys the key results from the analysis in the discussion section. Up to this point, readers have carefully reviewed the data in the report; they are now logically prepared to read the report’s conclusions and recommendations.
According to OACETT (2021), “Conclusions are reasoned judgment and fact, not opinion. Conclusions consider all of the variables and relate cause and effect. Conclusions analyze, evaluate, and make comparisons and contrasts” (p. 7) and “Recommendation(s) (if applicable) suggest a course of action and are provided when there are additional areas for study, or if the reason for the Technology Report was to determine the best action going forward” (p. 7).
You may present the conclusions and recommendations in a numbered or bulleted list to enhance readability.
References
All formal reports should include a reference page; this page documents the sources cited within the report. The recipients of the report can also refer to this page to locate sources for further research.
Documenting your information sources is all about establishing, maintaining, and protecting your credibility in the profession. You must cite (“document”) borrowed information regardless of the shape or form in which you present it. Whether you directly quote it, paraphrase it, or summarize it—it’s still borrowed information. Whether it comes from a book, article, a diagram, a table, a web page, a product brochure, an expert whom you interview in person—it’s still borrowed information.
Documentation systems vary according to professionals and fields. In ENGL 250, we follow APA. Refer to a credible APA guide for support.
Appendices
Appendices are those extra sections in a report that follow the conclusion. According to OACETT (2021), “Appendices can include detailed calculations, tables, drawings, specifications, and technical literature” (p. 7).
Anything that does not comfortably fit in the main part of the report but cannot be left out of the report altogether should go into the appendices. They are commonly used for large tables of data, big chunks of sample code, background that is too basic or too advanced for the body of the report, or large illustrations that just do not fit in the body of the report. Anything that you feel is too large for the main part of the report or that you think would be distracting and interrupt the flow of the report is a good candidate for an appendix.
References & Attributions
Blicq, R., & Moretto, L. (2012). Technically write. (8th Canadian Ed.). Pearson Canada.
OACETT. (2021). Technology report guidelines . https://www.oacett.org/getmedia/9f9623ac-73ab-4f99-acca-0d78dee161ab/TR_GUIDELINES_Final.pdf.aspx
Content is adapted from Technical Writing by Allison Gross, Annemarie Hamlin, Billy Merck, Chris Rubio, Jodi Naas, Megan Savage, and Michele DeSilva, which is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.
Writing in a Technical Environment (First Edition) Copyright © 2022 by Centennial College is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.
Recommendations are often included with a report’s conclusion, although they serve different purposes. Whereas a conclusion offers you the opportunity to summarize or review your report’s main ideas, recommendations suggest actions to be taken in response to the findings of a report. You can regard recommendations as a prompt to action for your readers. As you have seen from your planning, your report structure should lead up to the recommendations and provide justification for them. Just as a proposal grows from your project’s goals and objectives, a report should actually grow backwards from your recommendations. Having your recommendations accepted then becomes part of your purpose.
What makes a good recommendation? Effective recommendations:
- describe a suggested course of action to be taken to solve a particular problem;
- are written as action statements without justification;
- are stated in clear, specific language;
- are expressed in order of importance;
- are based on the case built up in the body of the report;
- are written in parallel structure.
A word of caution about writing recommendations: you should always consider your relationship with the reader first. If you have no authority to make recommendations, the reader may be hostile to their presence.
How to Write Recommendations in Research | Examples & Tips
Published on September 15, 2022 by Tegan George. Revised on July 18, 2023.
Recommendations in research are a crucial component of your discussion section and the conclusion of your thesis , dissertation , or research paper .
As you conduct your research and analyze the data you collected, perhaps there are ideas or results that don't quite fit the scope of your research topic. Or maybe your results suggest implications, or causal relationships between previously studied variables, that go beyond what extant research has covered.
Table of contents
- What should recommendations look like?
- Building your research recommendation
- How should your recommendations be written?
- Recommendation in research example
- Other interesting articles
- Frequently asked questions about recommendations
Recommendations for future research should be:
- Concrete and specific
- Supported with a clear rationale
- Directly connected to your research
Overall, strive to highlight ways other researchers can reproduce or replicate your results to draw further conclusions, and suggest different directions that future research can take, if applicable.
Relatedly, when making these recommendations, avoid:
- Undermining your own work; instead, offer suggestions on how future studies can build upon it
- Suggesting recommendations that are actually needed to complete your argument; instead, ensure that your research stands on its own merits
- Using recommendations as a place for self-criticism; instead, treat them as a natural extension point for your work
There are many different ways to frame recommendations, but the easiest is perhaps to follow the formula of research question, then conclusion, then recommendation. Here's an example.
Conclusion: An important condition for controlling many social skills is mastering language. If children have a better command of language, they can express themselves better and are better able to understand their peers. Opportunities to practice social skills are thus dependent on the development of language skills.
As a rule of thumb, try to limit yourself to only the most relevant future recommendations: ones that stem directly from your work. While you can have multiple recommendations for each research conclusion, it is also acceptable to have one recommendation that is connected to more than one conclusion.
These recommendations should be targeted at your audience, specifically toward peers or colleagues in your field that work on similar subjects to your paper or dissertation topic . They can flow directly from any limitations you found while conducting your work, offering concrete and actionable possibilities for how future research can build on anything that your own work was unable to address at the time of your writing.
See below for a full research recommendation example that you can use as a template to write your own.
While it may be tempting to present new arguments or evidence in your thesis or dissertation conclusion, especially if you have a particularly striking argument you'd like to finish your analysis with, you shouldn't. Theses and dissertations follow a more formal structure than this.
All your findings and arguments should be presented in the body of the text (more specifically, in the discussion section and results section). The conclusion is meant to summarize and reflect on the evidence and arguments you have already presented, not introduce new ones.
The conclusion of your thesis or dissertation should include the following:
- A restatement of your research question
- A summary of your key arguments and/or results
- A short discussion of the implications of your research
For a stronger dissertation conclusion , avoid including:
- Important evidence or analysis that wasn’t mentioned in the discussion section and results section
- Generic concluding phrases (e.g. “In conclusion …”)
- Weak statements that undermine your argument (e.g., “There are good points on both sides of this issue.”)
Your conclusion should leave the reader with a strong, decisive impression of your work.
In a thesis or dissertation, the discussion is an in-depth exploration of the results, going into detail about the meaning of your findings and citing relevant sources to put them in context.
The conclusion is shorter and more general: it concisely answers your main research question and makes recommendations based on your overall findings.