
Evaluating public engagement activities

Evaluation (the systematic assessment of the worth or merit of an activity, project, programme or policy) can be:

- formative (for ongoing improvement); or

- summative (to document success/failure).

There are many methodologies and approaches.

The prompts on this page will help you think about WHY and HOW you might want to evaluate your public engagement project or activity, and where to go for more detail.

(acknowledgement: this guidance is informed by the NCCPE Beginners Guide to Evaluation workshop)

If you would like to take a step-by-step approach to designing your evaluation, you can use our Evaluation Toolkit.

Benefits

A well designed and conducted evaluation will:

  • Afford an opportunity to improve
  • Help to clarify your objectives and determine whether you've been successful
  • Demonstrate the impact, value and benefits (including value for money) of your activity
  • Provide evidence and a record of achievement
  • Help to secure/retain funding
  • Inform and improve future activities
  • Inform the practice of others

Purpose

What are you evaluating, and why? The type of evaluation you choose will depend on this, for example:

  • Do you want to capture change over time or is this a one-off exercise?
  • Do you want to set some targets and measure whether you've met them?
  • Do you want to compare what you are doing with what others are doing?
  • Do you need external verification or is this just an internal exercise?
  • Are you interested in identifying best practice, e.g. on how to collaborate for mutual benefit?
  • Do you want to measure engagement from a community perspective?

Planning your evaluation

You will normally develop your evaluation plan alongside your project/activity plan, as the two should be linked.

Your evaluation plan should summarise what, why and how you are going to do things. It doesn't need to be a long document, and should include:

  • Aim - what do you want to achieve with the evaluation? (big picture)
  • Objectives - what do you need to do to achieve this? (Link to your project objectives and ensure they are Specific, Measurable, Achievable, Relevant and Time-defined)
  • Stakeholders - who is the audience for the evaluation?
  • Evaluation questions - what do you (and/or other stakeholders) want to know? Think in terms of measuring outputs (results), outcomes (overall benefits or changes) and impact (longer-term effects, influences or benefits)
  • Methodology - what strategy will you choose?
  • Data collection - what techniques will you use to collect your evidence? (see more below)
  • Data analysis - how will you analyse your data? (see more below)
  • Reporting - who will be reading your report?

When thinking about how you are going to conduct your evaluation, consider:

Triangulation - combining different approaches to develop a deeper picture of the activity and to help reduce bias. This involves capturing different perspectives on your activity (e.g. from the participants, from the deliverer (you), and from a neutral observer such as a helper, colleague or evaluator), and using a variety of collection techniques.

Creating a baseline - this is important so that you can measure and evidence any change; e.g. to know whether people's knowledge or attitudes have changed, you need to know where you started from. Where possible, build this into the engagement activity itself.
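The baseline idea above can be sketched in a few lines: collect the same measure (e.g. a self-rated knowledge score on a 1-5 scale) before and after the activity, then report the change. This is an illustrative sketch only; the scores are invented, and any real pre/post instrument would need to be designed for your audience.

```python
# Sketch of measuring change against a baseline. The scores below are
# invented example data, not from any real evaluation: each position is
# one person's self-rated knowledge (1-5) before and after the activity.

pre_scores = [2, 3, 1, 2, 4]    # ratings collected before the activity
post_scores = [4, 3, 3, 4, 5]   # ratings from the same people afterwards

# Per-person change; pairing pre and post is what the baseline enables
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_change = sum(changes) / len(changes)
improved = sum(c > 0 for c in changes)

print(f"Mean change: +{mean_change:.1f} points; "
      f"{improved} of {len(changes)} people improved")
```

Keeping the pre- and post-activity responses paired per person, rather than comparing group averages alone, lets you evidence how many individuals changed as well as by how much.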

Quantitative and qualitative data - ideally you should be collecting a mixture of these (e.g. responses to factual questions plus responses to open questions), so that you can explore and understand what is happening in more depth.

Sampling - you don't need to evaluate everyone and everything, just a representative sample. A larger sample takes longer to analyse and may not give you much more information. Quantitative data usually involves larger sample sizes (e.g. 40-60), and you should ask at least 100 people before expressing results as percentages. Qualitative data usually involves smaller sample sizes (e.g. 10-20) but provides more depth.
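The rule of thumb above (report percentages only once you have asked at least 100 people) can be encoded directly when tallying responses. This is a hypothetical sketch, not part of any toolkit; the function name and threshold constant are invented for illustration, with the threshold taken from the guidance above.

```python
from collections import Counter

# Minimum sample size before expressing results as percentages,
# following the sampling guidance above
PERCENTAGE_THRESHOLD = 100

def summarise_responses(responses):
    """Tally responses; add percentages only for large enough samples."""
    counts = Counter(responses)
    n = len(responses)
    summary = {}
    for answer, count in counts.items():
        if n >= PERCENTAGE_THRESHOLD:
            summary[answer] = f"{count} ({100 * count / n:.0f}%)"
        else:
            # Small sample: report raw counts, not percentages
            summary[answer] = f"{count} of {n}"
    return summary

# A small sample is reported as counts only
print(summarise_responses(["agree"] * 8 + ["disagree"] * 4))
```

Reporting "8 of 12" rather than "67%" for a small sample avoids giving a misleading impression of precision.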

Data Collection

When deciding how to collect your data, consider:

  • suitability for your audience
  • the questions you want answered
  • the time needed (for participants and evaluators)
  • venue and location
  • ethical considerations (respect, honesty, ownership, integrity, confidentiality)

It is possible to be creative with your data collection: there are many techniques, each with different strengths and weaknesses, e.g. response cards, questionnaires, interviews, focus groups, graffiti walls, drawings, observation, video, and images/photos.

Data analysis

This is often an iterative process, moving back and forth across three steps:

1. Noticing and collecting (downloading/typing up/labelling/debriefing)

2. Sorting and thinking (listening/reading/processing quantitative data)

3. Critical analysis and interpretation (comparing, contrasting, exploring themes & patterns/describing and illustrating findings eg tables, charts, text, quotes)

Remember to:

  • Allow plenty of time
  • Refer back to original aim, objectives and evaluation questions
  • Look for patterns and group data (i.e. coding)
  • Find representative quotes
  • Look for contradictory data
  • Be critical of your interpretation of data
  • Be reflective - what worked well? What didn't work well? What would you do differently?
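The grouping and coding steps above can be sketched programmatically. The example below assigns free-text feedback to themes by keyword matching so that each theme can be counted and illustrated with a representative quote. Everything here is invented for illustration (the themes, keywords and responses), and in practice qualitative coding is usually done by reading and hand-labelling responses rather than by keyword search, which will miss phrasing the keyword list doesn't anticipate.

```python
# Hypothetical coding frame: themes and the keywords that signal them.
# A real coding frame would emerge from reading the data itself.
THEME_KEYWORDS = {
    "enjoyment": ["fun", "enjoyed", "loved"],
    "learning": ["learned", "understand", "knowledge"],
    "logistics": ["venue", "time", "parking"],
}

def code_responses(responses):
    """Assign each response to every theme whose keywords it mentions."""
    coded = {theme: [] for theme in THEME_KEYWORDS}
    for text in responses:
        lowered = text.lower()
        for theme, keywords in THEME_KEYWORDS.items():
            if any(word in lowered for word in keywords):
                coded[theme].append(text)
    return coded

# Invented example feedback from an engagement event
feedback = [
    "Really fun evening, the kids loved it",
    "I learned a lot about the research",
    "Hard to find the venue and parking",
    "Enjoyed the talks but ran out of time",
]

coded = code_responses(feedback)
for theme, quotes in coded.items():
    # Report the theme count plus one representative quote
    example = quotes[0] if quotes else "-"
    print(f"{theme}: {len(quotes)} response(s), e.g. {example}")
```

Note that one response can carry more than one theme (the last comment touches on both enjoyment and logistics), which is exactly the kind of pattern the critical-analysis step is meant to surface.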

Making use of your evaluation

Spending time and energy on collecting data is pointless unless you use the information, learn from it and share it with others.

Consider who will be reading your report and tailor it accordingly. Remember to feed back findings to those who were involved (wherever possible), value their contribution and thank them. Ensure that the findings are acted on.

A formal report structure might include: summary; context of the evaluation; aims, objectives and evaluation questions; description of the activity/event; methodology; summary of evidence; overview of the activity/event; conclusions and recommendations.
