Social planning

Department of Defense: 2014 Climate Change Adaptation Roadmap

Climate change will affect the Department of Defense’s ability to defend the Nation and poses immediate risks to U.S. national security. The Department is responding to climate change in two ways: adaptation, or efforts to plan for the changes that are occurring or expected to occur; and mitigation, or efforts that reduce greenhouse gas emissions. This Climate Change Adaptation Roadmap (Roadmap) focuses on the Department’s climate change adaptation activities.

Nonprofit Leadership Development: A Model for Identifying and Growing Leaders within the Nonprofit Sector

This report covers core competencies, growing skills, and preparing for the future. The Looking-Glass Institute conducted extensive interviews with 36 executive directors in Pittsburgh, PA. Funded by the Bruner Foundation, the Heinz Endowments, and the Forbes Funds, this work has broad implications for both grantmakers and their grantees.

Philanthropic Capacity Building Resources (PCBR) Database

The PCBR database contains 401 descriptions of capacity-building programs being carried out by U.S. foundations. From this database, you may obtain program profiles and conduct searches for programs under one or more search categories, as defined below.

Sustaining Evaluative Thinking

Organizations use policies, usually written statements, to clarify their guiding principles, that is, their acceptable and unacceptable procedures. Developing specific, evaluation-related policies about, for example, leadership succession or staff development can help extend or sustain evaluative thinking in organizations.

Evaluation and HR

Organizations can learn a lot from and about their staff using evaluation data collection strategies such as surveys, interviews, and record reviews. As shown below, information about staff and their actions, perceptions, and feedback can be used to inform organizational practices. Remember: organizations that regularly use evaluative thinking ask questions of substance (including questions about their staff), determine what data are needed to address those questions and how the data could be obtained, systematically collect and analyze data (including staff feedback), share the results, and develop strategies to act on findings.

Evaluation and Technology

Organizations typically need technology systems for four main purposes: (1) management support (e.g., for use with financial, HR, or procedural information); (2) internal and external communication (e.g., email capability, proposal or report development and submission, contact with program participants and others); (3) program operations (e.g., archiving and record-keeping, best-practice research, program descriptions, logic model development); and (4) evaluation (e.g., of programs, overall organizational effectiveness, and clients). An organization that is committed to evaluative thinking embraces supportive technology use. In other words, it asks questions about whether the organization’s hardware and software inform organizational and program practices, systematically collects data about the technology systems in use to ensure that they truly are supportive, and analyzes and acts on the data that are collected (i.e., technology changes are made based on information).

Using Evaluation Findings

There are multiple ways to communicate about and use evaluation findings. Results of surveys, interviews, observations, etc., can be incorporated into planning processes, and, as described in both previous and following sections, basic assessments of many organizational procedures can be undertaken to inform ongoing organizational practice. When a more complete evaluation is conducted, however (for example, of a specific program), the results are usually summarized in an evaluation report. The following provides some guidelines for completing a formal evaluation report, which should be started during the data collection phase of the evaluation.

Commissioning Evaluation: Tips for Grantmakers and Grant Seekers

As stated previously, choosing data collection strategies (e.g., surveys, observations, record reviews) depends on the purpose of the evaluation, the evaluation questions, the time frame, and the available resources. Evaluation assistance can be obtained from independent technical assistance or evaluation consultants, evaluation or other technical assistance consulting firms, and sometimes universities with graduate programs that include training or projects in program evaluation. Before you hire any consultant or organization, be sure to find out whether they have: experience with program evaluation, especially with nonprofit organizations; basic knowledge of the substantive area being evaluated; good references (from sources you trust); and a personal style that fits your organization’s working style. Think long and hard about the purpose of the evaluation project you are considering. This bulletin will help you understand the differences between research and evaluation, what evaluation should cost, and what you should think about before you initiate it.

Getting Serious about Evaluations

The four key components of evaluative thinking (asking questions, systematically gathering data, analyzing data and sharing results, and developing action steps) can be applied to most aspects of your organizational practice. But they require you to put evaluation skills to use.

Board Members and Evaluation

Many years of participatory evaluation practice show that involving multiple stakeholders is beneficial. It is our steadfast belief that evaluators, funders, program providers, and their board members can all be meaningfully engaged in program evaluation, but all parties need to be on the same page about the following:

1. Evaluations are partly social (because they involve human beings), partly political (because knowledge is power), and only partly technical (Herman, Morris, & Fitz-Gibbon, 1996). All three of these features, not just technical design, should be considered when stakeholders discuss evaluation.

2. Evaluation data can be collected using qualitative methods (e.g., observations, interviews) and/or quantitative methods (e.g., surveys, practical testing of subjects). Although there has been much debate about which strategies and types of data are best, current thinking indicates that both are valuable, can be collected and analyzed rigorously, and can be combined to address key evaluation questions.

3. There are multiple ways to address most evaluation needs. Different evaluation needs call for different designs, types of data, and data collection strategies.

Mission

EERL's mission is to be the best possible online collection of environmental and energy sustainability resources for community college educators and their students. The resources are also available to practitioners and the public.

EERL & ATEEC

EERL is a product of a community college-based National Science Foundation Center, the Advanced Technology Environmental and Energy Center (ATEEC), and its partners.

Contact ATEEC at 563.441.4087 or by email at ateec@eicc.edu.