Module 6: Introduction to Evaluation & Research Proposals

Emily Sheehy

Created on September 14, 2025

Transcript

Module 6

Introduction to Evaluation & Research Proposals

Start

Special Acknowledgement

Index

Section 1: Evaluation vs. Research

Section 2: Overview of Evaluation Framework

Section 3: Overview of Research Proposals

Evaluation vs. Research

Objectives

Section 1: Evaluation vs. Research - Similarities & Differences

By the end of this section, you will be able to:

Describe why we conduct studies

Define the difference between Evaluation and Research

List the similarities and differences between evaluation and research as disciplines

OBJECTIVES

Evaluation Defined

"...Systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments, improve effectiveness, and inform decisions" (Patton, 2008)

Let's look at two definitions:

"...Identification, clarification, and application of defensible criteria to determine an object's value in regard to those criteria." (Fitzpatrick, 2011)

Types of Evaluation

Formative & Summative: Improvement + judgment and accountability
Process Evaluation: Analyzing program implementation and delivery
Outcome & Impact: Evaluating the effectiveness of a program
Developmental: For adaptive, innovative environments

Research Defined

“...Process of steps used to collect and analyze information to increase our understanding of a topic or issue.” (Creswell, 2018)

“...Systematic investigation of social phenomena through objective and replicable procedures for the purpose of discovering general principles.” (Babbie, 2021)

Types of Research

Design & Approach
The 3 Methods
Click the buttons for more
Purpose
Data Methods

Why do we Conduct Studies?

Influence Policy & Practice

Answers a Question

Informs Decisions

Generates Knowledge

Capacity Building

Now you know! What do you think?


What are the Reasons?

Accountability

Improvement



Click the right corners to find out!

WHY DO WE CONDUCT STUDIES? (CONT.)

Click through to reveal the hidden audio

ANSWER QUESTION

ACCOUNTABILITY

GENERATE KNOWLEDGE

INFLUENCE POLICY

INFORM DECISION

IMPROVE PRACTICE

BUILD CAPACITY


Commonalities of Research & Evaluation

Utilizes Data Collection Methods

Systematic Inquiry

Ethical Considerations

Theory-Informed

Evidence-Based

Transparency

Quality and Rigor

Software Tools

Commonalities of Research & Evaluation

COMPARISON TABLE

Knowledge Check

Knowledge Check pt. 2

Overview of the evaluation framework

Objectives

Section 2: Overview of the Evaluation Framework

By the end of this section, you will be able to:

Define how we measure Value in an evaluation study

List the five evaluation standards

Describe the six steps in the evaluation framework

OBJECTIVES

Evaluation

Things to consider during the planning phase of an evaluation:

  • What will be Evaluated?
  • What context should be considered?
  • What are standards?
  • What evidence is needed?
  • Can conclusions be compared to standards?
  • How will lessons learned be used to improve effectiveness?
Assigning Value

As with any study, an investigator starts with a question or a problem to solve, and the goal is to design and implement the study so that its findings have impact and/or value.

Click through the different terms to learn more

Info

1.) Relevance and Utility

Evaluation Standards

Info

2.) Rigor

3.) Independence and Objectivity

Info

Info

4.) Transparency

Info

5.) Ethics

EVALUATION:

6 Steps in an Evaluation Framework

1. Assess context and stakeholders

2. Define the program

3. Evaluation design (questions)

4. Gather credible evidence

5. Justify results

6. Ensure use and share lessons learned

WIC Program, Clinical Staff, Funders, Mothers

Maternal Health Example:

Breastfeeding Educational Intervention

  • Define the program: 6-week breastfeeding program (purpose, components, logic model)
  • Evaluation design: example mixed method, focus group and pre/post survey
  • Data collection step
  • Justify results: conduct data analysis; ensure conclusions are linked to data and note limitations
  • Communicate findings and dissemination plan for stakeholders
  • Evaluation standards: Relevance & Utility, Rigor, Independence, Transparency, Ethics

Source: Jacobson and Teutsch​

Knowledge Check

Knowledge Check pt. 2

Overview of research proposals

Objectives

Section 3: Overview of Research Proposals

By the end of this section, you will be able to:

Define the common elements of a Research proposal

Define the purpose of an Institutional Review Board (IRB)

List elements to consider in crafting a research question (PICO, FINER)

OBJECTIVES

Describe three ways to improve your literature review findings

Research Study: Common Sections

  • Introduction
  • Theory / Literature Review
  • Methodology
  • Data Analysis
  • Summary & Conclusions
  • Bibliography

Research Proposal: Details needed

(Note: Not a Linear Process!)

Methodology
Theory / Literature Review
Bibliography
Introduction
Summary & Conclusions
Data Analysis
  • Study Design
  • Sampling
  • Instruments
  • Analysis
  • Human Subjects & IRB Ethical Considerations
  • Results
  • References
  • Instruments
  • Budget
  • Project Plan
  • Title & Abstract
  • Background
  • Problem / question
  • Objectives
  • Findings
  • Literature Review

Reference: Creswell, Hulley, NIH Grants

IMRaD: Research Articles (& Abstracts)

IMRaD

Introduction

Methods

Results

Discussion

Research with Human Subjects

Human Subjects & Institutional Review Board (IRB)

The IRB ensures rights, safety, well-being, and compliance.

IRB review is required before any study with human subjects.

Review: How an IRB Protects Human Subjects (6:45)

Literature Review

Practical Suggestions

Hover over the boxes for more
1. Collaborate with librarians
2. Citation tracking
3. Diverse databases
4. Inclusion and exclusion criteria
5. Critically appraise
6. Organize data
7. Synthesize findings
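As a small illustration of suggestion 4 (inclusion and exclusion criteria), screening can be expressed as a simple filtering pass over your article records. This is a hypothetical sketch: the record fields and criteria below are assumed for illustration and are not part of this module.

```python
# Hypothetical screening pass applying inclusion/exclusion criteria to
# article records. Fields and criteria are illustrative assumptions.
records = [
    {"title": "Peer counseling and breastfeeding duration", "year": 2019, "peer_reviewed": True},
    {"title": "Infant formula marketing",                   "year": 2003, "peer_reviewed": True},
    {"title": "Blog post on lactation tips",                "year": 2021, "peer_reviewed": False},
]

def include(record):
    # Inclusion: peer-reviewed AND published within roughly the last decade
    return record["peer_reviewed"] and record["year"] >= 2015

screened = [r for r in records if include(r)]
print([r["title"] for r in screened])
```

In practice, criteria are documented in advance and applied consistently (often by two reviewers); the code only illustrates the bookkeeping.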

LITERATURE REVIEW: INSTRUMENTS & TOOLS

TYPES OF RESEARCH QUESTIONS

Good vs. Poor Research Questions

Good Question

  • Clear variables and population
  • Aligned with method + feasible
  • Supported by literature; next step or gap
  • Ethical and answerable

Poor Question

  • Lacks clarity, or many questions
  • Impractical
  • Already answered
  • Unethical, not measurable, speculative

What makes for a good question?


Good Question: Specific, focused, reasonable

Poor Question: Too broad or vague

RESEARCH QUESTIONS:

MATERNAL HEALTH EXAMPLE

Hover over or click the buttons for more

DIRECTIONAL

COMPARATIVE

DESCRIPTIVE

RELATIONAL / CORRELATIONAL

EXPLORATORY

CAUSAL

PICO stands for:

PICO Method

Population, Intervention, Comparison, Outcome

Crafting Research Questions

Example: In elderly patients with hypertension (P), how does a low-sodium diet (I), compared to a standard diet (C), affect blood pressure (O)?

FINER: Research Questions

Feasible
Interesting
Novel
Ethical
Relevant

Can it be conducted ethically, and does it minimize harm and respect participant rights?

Will there be enough time, resources, expertise, or sample size?

Does it add new knowledge, confirm or refute, or build on work?

Why does it matter? What is the impact?

Motivation and Curiosity

(Hulley, 2013)

Knowledge Check

Knowledge Check

SUMMARY

COMPLETED:
  • Differentiate between the disciplines
  • Describe ‘why’ we conduct studies
  • Define commonalities of both disciplines

Section 1: Overview of Research & Evaluation

  • Define how we obtain ‘value’ in evaluation
  • Describe the six steps in the Evaluation Framework

Section 2: Overview of Evaluation

  • Define the common elements of a research proposal
  • Define what and why we use an IRB
  • List methods to improve your literature review
  • Describe the PICO and FINER method for developing a research question

Section 3: Overview of Research

REFERENCES

MODULE 6 COMPLETED

Remember to review what you've learned!

Step 4:

Gather credible evidence

During step 4, we determine what evidence is needed to answer the overarching evaluation questions.

  • Indicators
  • Data Sources
  • Quality & Quantity
  • Logistics

Listen to the audio for a full description

Evaluation Standards

Rigor

Findings and their limitations must be reported. Rigor (and credibility) depends on thoughtful planning, implementation, and interpretation of results.

Builds academic expertise and researchers

Research

Encourages reflection and learning in organizations

Evaluation

WHY DO WE CONDUCT STUDIES?

Build Capacity

&

Evaluation Standards

Independence and Objectivity

Evaluations should be conducted as objectively as possible and independent of undue influence. Conflicts of interest and potential biases need to be avoided to ensure objective outcomes.

Focus the evaluation design

STEP 3

At this stage, we define the evaluation's...


Methods

Questions

Users

Purpose

Uses

Timeline


For this case, the primary purpose is to assess program effectiveness in improving breastfeeding practices.


Click the right corners to find out!

Step 1:

Assess context and stakeholders

In this step, an evaluator ensures that they understand an evaluation’s people, place and capacity.

  • Persons or organizations
  • Program Operations
  • Impacted Individuals
  • Users

Listen to the audio for a full description

Types of research

Design & Approach

Descriptive research: Works to describe a population, condition, or phenomenon, without analyzing relationships or causal effects

Experimental research: Involves manipulation of variables to determine causal effects; typically uses random assignment (cause and effect)

Observational research: Involves watching and recording behaviors and events as they naturally occur in the real world

Quasi-experimental: Groups are not randomly assigned; allows for comparison of groups

Correlational research: Examines relationships among variables, but cannot determine causation

Listen to the audio for more information

Step 3:

Evaluation Design

Define why the evaluation is being conducted, how the results will be used and who will learn/use the findings.

  • Purpose + Users & Uses
  • Questions & Methods
  • Agreements

Listen to the audio for a full description

Types of research

Data Methods

Qualitative: Analyzing text and words

Quantitative: Analyzing numeric data

Mixed: A mix of qualitative and quantitative methods

Listen to the audio for more information

To build new understanding, theories, or models that can be generalized

Research

Understand program or context

Evaluation

WHY DO WE CONDUCT STUDIES?

Generate Knowledge

&

Evaluation Standards

Relevance and Utility

To be useful, an evaluation must address information important to stakeholders; findings should be actionable and understandable.

Contributes to policy matters

Research

Influences policy or funding

Evaluation

WHY DO WE CONDUCT STUDIES?

Influence Policy

&

Supports evidence-based policies, publications, and academic discourse

Research

Helps program staff, funders, and stakeholders make decisions

Evaluation

WHY DO WE CONDUCT STUDIES?

Inform Decision

&

Types of research

Purpose

Basic

  • "Pure Research"
  • Advancing theoretical knowledge
  • Desire to understand fundamental principles in science and medicine

Applied

  • Solving real world problems using scientific method
  • Generate evidence that can be directly applied to improvement of facilities

Action

  • Reflective, collaborative process to investigate one’s own practice and outcomes + increase understanding
  • Cyclical (plan-act-observe-reflect-revise)

Translational

  • Scientific discovery into practical applications
  • Translating basic science into human trials, and then translating the research into practice and policy.

Listen to the audio for more information
Engage Stakeholders

STEP 1

In this first step, it is essential to identify and involve a broad range of stakeholders.

Click to reveal stakeholders
  • Rural health clinic staff
  • Community health workers
  • WIC program representatives
  • Breastfeeding peer counselors
  • Program funders
  • Local policymakers
  • Rural mothers
^ Stakeholders for this program ^

Step 6:

Ensure use and share lessons learned

Engage your interested parties and translate your results into actionable decision-making. Make a plan for how you will use your findings and how you will disseminate them.

  • Design
  • Preparation
  • Feedback
  • Follow-Up
  • Dissemination
  • Additional Uses

Listen to the audio for a full description

Step 5:

Justify results

Focus on obtaining answers by analyzing the data to provide evidence related to the evaluation questions.

  • Standards
  • Analysis & Synthesis
  • Interpretation
  • Judgement
  • Recommendation

Listen to the audio for a full description

Step 2:

Define the program
  • Mission & Objectives
  • Needs: Problem & Opportunity
  • Expected Effects
  • Activities
  • Resources
  • Stage of Development
  • Context
  • Logic Model

This step is critically important to ensure that your program is clearly defined so it may translate into tangible, measurable outcomes.

Listen to the audio for a full description

Funding Sources and Impact

Research

Determines if objectives were met and resources used well

Evaluation

WHY DO WE CONDUCT STUDIES?

Accountability

&

Evaluation Standards

Ethics

To ensure trust, the highest ethical standards need to be maintained during both planning and implementation, safeguarding information and considering impacts on all stakeholders and interested parties. Evaluation should be equitable, fair, and just, with cultural and contextual factors considered.

Qualitative Data:

Quantitative Data:

Gather credible evidence
Participant surveys at baseline, birth, and six months postpartum, measuring:
  • Breastfeeding knowledge
  • Intentions
  • Behaviors


Focus groups with mothers and interviews with program facilitators, used to identify:
  • Perceived barriers
  • Cultural influences

STEP 4

Data collection must be systematic, ethical, and aligned with the evaluation question.

Qualitative Data

Quantitative Data

Advances professional fields

Research

Improves efficiency and outcomes

Evaluation

WHY DO WE CONDUCT STUDIES?

Improve Practice

&

Evaluation Standards

Transparency

Investigators need to make evaluations transparent to ensure accountability. When an evaluation is completed, the findings should be released in a timely manner and in enough detail to allow potential replication.

To confirm or disprove scientific assumptions or relationships

Research

Determine if program needs improvement

Evaluation

WHY DO WE CONDUCT STUDIES?

Answer Question

&

Ensure use and share lessons learned

STEP 6

This final step focuses on communicating findings and facilitating their application.

Describe the program

STEP 2

Step 2 requires clearly articulating the program’s components, purpose, and logic.

CONCLUSION
ANALYSIS
Justify Conclusions

STEP 5

Conclusions must be clearly linked to the evidence and acknowledge limitations.

Compare pre- and post-intervention outcomes using statistical methods and thematic coding for qualitative data.
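The pre/post comparison described above can be sketched as a paired t statistic, one of the standard statistical methods for before/after outcomes. This is a minimal illustration using Python's standard library only; the scores are hypothetical placeholders, not data from the study in this module.

```python
# Hypothetical sketch: paired comparison of pre- vs. post-intervention
# knowledge scores (illustrative numbers, not real study data).
from statistics import mean, stdev
from math import sqrt

pre  = [52, 48, 60, 55, 47, 58, 50, 62]   # baseline survey scores
post = [61, 55, 66, 63, 52, 70, 58, 71]   # six-month follow-up scores

diffs = [b - a for a, b in zip(pre, post)]                 # per-participant change
t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))   # paired t statistic

print(f"mean change = {mean(diffs):.1f}, t = {t_stat:.2f}")
# prints: mean change = 8.0, t = 10.58
```

In practice the t statistic would be compared against a t distribution (here with 7 degrees of freedom) to obtain a p-value, for example with a statistics package or a t-table.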