  • Cornell Office for Research on Evaluation - Welcome
    as well as whether or not things are working as desired. However, the validity of evaluation that has been conducted has rarely been challenged. This website provides information about our research activities (evaluation of evaluation), as well as basic information and resources for evaluators. Read More. Latest News: New NSF Awards (EAGER and TUES). New Publication: Urban, J. B., Hargraves, M., and Trochim, W. M. (2014). Evolutionary Evaluation: Implications for…

    Original URL path: https://core.human.cornell.edu/ (2014-10-20)
    Open archived version from archive


  • Cornell Office for Research on Evaluation - About
    the development and testing of systems evaluation methods, measures, tools, and systems. What we are really trying to do is integrate evaluation into the system of how people actually think about their work, so that evaluation almost disappears. Evaluation becomes so much a part of program management and planning that educators don't think of evaluation as something that is imposed on them; it becomes part of the evolutionary…

    Original URL path: https://core.human.cornell.edu/about/ (2014-10-20)
    Open archived version from archive

  • Cornell Office for Research on Evaluation - Current Grants
    the CORE approach are used in other specific programmatic contexts, such as the Program Leadership Certification professional development program and the Northern NY regional agriculture program. For more information, see Extension and Outreach. National Science Foundation: Integrating Research and Practice through Collaboration on Practitioner-Generated Models (NSF EAGER Award 1346848); Collaborative Research: Impact of the Summer Institutes on Faculty Teaching and Student Achievement (NSF TUES Award 1322861); A Phase II…

    Original URL path: https://core.human.cornell.edu/grants/ (2014-10-20)
    Open archived version from archive

  • Cornell Office for Research on Evaluation - Research
    Translational Research. Evaluation Policy. Regression Discontinuity: Initially developed by Donald T. Campbell, the regression discontinuity quasi-experimental design was the foundation of Dr. Trochim's graduate research. The Regression Discontinuity (RD) design is seen as a useful method for determining whether a program or treatment is effective. Research-based Tools and Dissemination: The Guide to the Systems Evaluation Protocol (SEP), The Netway and mySEP, Research Methods Knowledge Base, Concept Mapping, Concept…

    Original URL path: https://core.human.cornell.edu/research/ (2014-10-20)
    Open archived version from archive
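    The snippet above describes the regression discontinuity (RD) design only in general terms. As a rough illustration of the idea — not code from the CORE site — a sharp RD estimate can be computed by fitting separate regressions on each side of an assignment cutoff and taking the jump at the cutoff. The simulated data and effect size below are invented for the example:

    ```python
    import numpy as np

    def rd_estimate(x, y, cutoff):
        """Sharp RD sketch: fit a linear regression on each side of
        the cutoff and return the difference of their predictions
        at the cutoff (the estimated treatment effect)."""
        below = x < cutoff
        above = ~below
        # np.polyfit with degree 1 returns [slope, intercept]
        b_lo = np.polyfit(x[below], y[below], 1)
        b_hi = np.polyfit(x[above], y[above], 1)
        return np.polyval(b_hi, cutoff) - np.polyval(b_lo, cutoff)

    # Simulated example: treatment assigned when x >= 0,
    # with a true jump of 2.0 at the cutoff.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 2000)
    y = 1.0 + 0.5 * x + 2.0 * (x >= 0) + rng.normal(0, 0.3, x.size)
    effect = rd_estimate(x, y, 0.0)
    ```

    In practice RD analyses use local (bandwidth-restricted) fits near the cutoff rather than global lines; the global fit here is only to keep the sketch short.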

  • Cornell Office for Research on Evaluation - General Evaluation Resources
    The American Evaluation Association: This website has general information about the association, its conferences and journals, the Guiding Principles for Evaluators, a Career Center, and a Find an Evaluator function. http://www.eval.org. CDC Evaluation Working Group Home Page: http://www.cdc.gov/eval/index.htm. University of Wisconsin-Extension, Program Development and Evaluation: http://www.uwex.edu/ces/pdande. The Evaluation Center, Western Michigan University: http://www.wmich…

    Original URL path: https://core.human.cornell.edu/resources/ (2014-10-20)
    Open archived version from archive

  • Cornell Office for Research on Evaluation - Extension and Outreach
    research touching every aspect of life in New York. Cornell Cooperative Extension is a key outreach system of Cornell University, with a strong public mission and an extensive local presence that is responsive to needs in New York communities. The Cornell Cooperative Extension educational system enables people to improve their lives and communities through partnerships that put experience and research knowledge to work. CORE works with the University's Extension…

    Original URL path: https://core.human.cornell.edu/outreach/ (2014-10-20)
    Open archived version from archive

  • Cornell Office for Research on Evaluation - Our View
    wants to be evaluated. But the real issue is: how do we understand the world around us? Evaluation is a central mechanism and inherent part of asking, "What's going on here? Why is it going on the way it is going on?" Imagine trying to drive your car without windows, and you can imagine what a world without evaluation would be like. We think of evaluation as a form of looking ahead and providing feedback, much like a driver who decides which roads to take and is alert to roadblocks or other obstacles. Without evaluation, program managers are much like a blind driver. Evaluation systems enable managers to look forward. Evaluation provides them with feedback about how things are working, as well as whether or not things are working as desired. In other words, evaluation is essential to learning. We cannot have learning in our society unless we have the basic input about what's happening around us. When we are born, we are each born with senses. We are each born with a nervous system. We are each born with a central processing unit that processes that stuff. But when organizations or groups are born, they don't…

    Original URL path: https://core.human.cornell.edu/about/view.cfm (2014-10-20)
    Open archived version from archive

  • Cornell Office for Research on Evaluation - Protocol
    program staff exclusively, while in other cases this term may refer to members of the organization from various levels in the organizational hierarchy: program staff, administrators, funders, as well as participants and related stakeholders. The process of working through the Protocol will consist of collaborative meetings that will seemingly spiral through several focal points over time, as well as ongoing work around building a culture of evaluation in the participating organization. This process is essential to the nature of the SEP. It is through these discussions that members of the organization and its program practitioners will develop a new outlook on their work, one that will change both their understanding of how the program stakeholders perceive the program and their sense of purpose in what they are doing and why.

    The SEP is a standardized protocol that nevertheless enables any program to develop an evaluation uniquely tailored to that program. In this sense it addresses the administrative need in an evaluation environment to standardize evaluation approaches while respecting the variety of contexts within which programming is conducted. Putting evaluation concepts into a simple set of steps, which we call the Systems Evaluation Protocol, requires that we present the Guide in a linear format. In fact, an important objective for us in this work has been to instill the idea that effective modern evaluation requires evaluators to move beyond a linear mindset. Good evaluation requires feedback and is embedded within a dynamic, changing system. Although any written document is by definition linear, systems evaluation is a non-linear and iterative process (see Simple Rules). We expect that in various contexts it will be appropriate to perform steps out of the presented sequence or in tandem, as well as to revisit steps repeatedly throughout the process. As a reminder, there are three phases to evaluation: Planning, Implementation, and Utilization. Our Guides present the Protocol for the first phase only.

    These resources are available to download here (updated 10/8/12). You will be asked for contact info and your intended use of the protocol; this information is just to help us monitor how it's being used. You will not be placed on any email lists as a result. You will be redirected to a new page (the same as this page, but with links to the resources); you're encouraged to bookmark it, or you'll have to do this again next time.

    The Systems Evaluation Protocol has applications well beyond the field of STEM education. The guides outline the protocol steps for evaluation planning, but not for evaluation implementation and evaluation utilization. When followed, this series of repeatable steps can lead to the creation of a project logic model, a project pathway model, and an evaluation plan. Program leaders and staff learned evaluation skills that are applicable to all the program activities of their organization. Their paradigms of program evaluation and development were broadened to encompass the greater system within which a program is embedded. The Facilitator…

    Original URL path: https://core.human.cornell.edu/research/systems/protocol/index.cfm (2014-10-20)
    Open archived version from archive
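    The Protocol snippet notes that the SEP's planning steps yield a project logic model (and a pathway model linking outputs to outcomes). As a minimal sketch only — the class and field names below are our own illustration, not the schema used by the Netway or mySEP — a logic model can be represented as columns of inputs, activities, outputs, and outcomes:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class LogicModel:
        """Minimal program logic model: resources flow into activities,
        which produce outputs, which are expected to lead to outcomes."""
        program: str
        inputs: list = field(default_factory=list)
        activities: list = field(default_factory=list)
        outputs: list = field(default_factory=list)
        outcomes: list = field(default_factory=list)

        def columns(self):
            # Return the four columns in their conventional order.
            return [self.inputs, self.activities, self.outputs, self.outcomes]

    # Hypothetical example program, invented for illustration.
    model = LogicModel(
        program="Summer Science Institute",
        inputs=["facilitators", "lab space"],
        activities=["weekly workshops"],
        outputs=["40 students trained"],
        outcomes=["increased STEM interest"],
    )
    ```

    A pathway model would add directed links between specific outputs and the outcomes they are expected to drive; representing those as edges over these entries is a natural extension of the same structure.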