Collecting and Analyzing
Evaluation Data
National Network of Libraries of Medicine
Outreach Evaluation Resource Center
National Library of Medicine
Planning and Evaluating Health Information Outreach Projects
Booklet 3
The Planning and Evaluating Health Information Outreach Projects series
Booklet 1
Getting Started with Community-Based Outreach
Find potential partners and collect information
about the community (community assessment)
that will generate ideas for health information
outreach projects.
Use booklet 3 to design methods to collect
and analyze community assessment data.
Booklet 2
Including Evaluation in Outreach Project Planning
Take the information gathered during your
community assessment to develop an
outcomes-based project and a plan for
outcomes, pre-program, and process
assessment.
Use booklet 3 to design methods to collect
and analyze outcomes, pre-program, and
process assessment data.
Booklet 3
Collecting and Analyzing Evaluation Data
Design quantitative and qualitative methods to collect and analyze data for your community assessment
plan developed in Booklet 1 and your outcomes, pre-program, and process assessment plan developed in
Booklet 2.
Collecting and Analyzing
Evaluation Data
Cynthia A. Olney, PhD
Evaluation Specialist
C.O. Evaluation Consulting LLC
olneyc@triad.rr.com
Susan Barnes, MLS
Assistant Director
National Network of Libraries of Medicine
Outreach Evaluation Resource Center
sjbarnes@u.washington.edu
Planning and Evaluating Health Information Outreach Projects
Booklet 3
2006
National Library of Medicine Cataloging in Publication
Olney, Cynthia A.
Collecting and analyzing evaluation data / Cynthia A. Olney, Susan Barnes. - Seattle,
Wash. : National Network of Libraries of Medicine, Pacific Northwest Region ; Bethesda, Md. :
National Library of Medicine, [2006]
(Planning and evaluating health information outreach projects ; booklet 3)
Supplement to: Measuring the difference / Catherine M. Burroughs. [2000]
Includes bibliographical references.
1. Health Education-organization & administration. 2. Community-Institutional Relations.
3. Information Services—organization & administration. 4. Data Collection—methods. I.
Barnes, Susan, MLS. II. National Network of Libraries of Medicine (U.S.) Pacific Northwest
Region. III. National Library of Medicine (U.S.) IV. Title. V. Series.
02NLM: WA 590 O51c 2006
Additional copies can be ordered from:
National Network of Libraries of Medicine,
Outreach Evaluation Resource Center
Box 357155
University of Washington
Seattle, Washington, 98195-7155
nnlm@u.washington.edu
http://nnlm.gov/evaluation/
This project has been funded in whole with Federal funds from the National Library of
Medicine, National Institutes of Health, Department of Health and Human Services.
Table of Contents
Preface i
Acknowledgements ii
Introduction 1
Introduction — Quantitative Methods 3
Step One — Design Your Data Collection Methods — Quantitative Methods 5
Step Two — Collect Your Data — Quantitative Methods 10
Step Three — Summarize and Analyze Your Data — Quantitative Methods 13
Step Four — Assess the Validity of Your Findings — Quantitative Methods 17
Introduction — Qualitative Methods 19
Step One — Design Your Data Collection Methods — Qualitative Methods 21
Step Two — Collect Your Data — Qualitative Methods 23
Step Three — Summarize and Analyze Your Data — Qualitative Methods 25
Step Four — Assess the Validity of Your Findings — Qualitative Methods 29
Take Home Messages 30
References 31
Appendix 1 — Examples of Commonly Used Quantitative Evaluation Methods 32
Appendix 2 — Ways to Improve Response Rates for Electronic Surveys 33
Appendix 3 — Examples of Commonly Used Qualitative Methods 34
Tool Kit
Case Example — Using Mixed Methods 35
Worksheet 1 — Planning a Survey 36
Worksheet 2 — Planning an Interview 37
Blank Worksheets 38
Checklist 40
Collecting and Analyzing Evaluation Data
Planning and Evaluating Health Information Outreach Projects, Booklet 3
Outreach Evaluation Resource Center
National Network of Libraries of Medicine, National Library of Medicine, 2006
Preface
This booklet is part of the Planning and Evaluating Health Information Outreach Projects series,
designed to supplement Measuring the Difference: Guide to Planning and Evaluating Health
Information Outreach. [1] This series also supports evaluation workshops offered through the
Outreach Evaluation Resource Center of the National Network of Libraries of Medicine (NN/LM).
The goal of the series is to present step-by-step planning and evaluation methods. Along with
providing information about evaluation, each booklet includes a case study and worksheets to help
you with your outreach planning.
The series emphasizes the relationship between planning and evaluation—this is why both words
are part of the series title. By including evaluation in the planning stage, you are committing to
doing it and you are more likely to make it integral to the overall project. Conversely, in planning
the evaluation you identify outcomes, which in turn help you to carefully assess project activities
and resource needs.
These booklets are aimed at librarians—from the health sciences sphere, particularly—and rep-
resentatives from community organizations who are interested in conducting health information
outreach projects. We consider "health information outreach projects" to be educational or aware-
ness activities designed to enhance community members' abilities to find and use information. A
goal of these activities might be to equip group members to better address their—and their family
members' and peers'—questions about health. Such outreach often focuses on online health in-
formation resources such as the Websites produced by the National Library of Medicine. Projects
may also include other sources and formats of health information.
The first booklet, Getting Started with Community-Based Outreach is designed for those who have
an idea for working with their communities but do not know how to start. It describes these steps:
1. Find partners for health information outreach projects,
2. Learn more about the outreach community, and
3. Inventory resources and assets.
The second booklet, Including Evaluation in Outreach Project Planning, is intended for those who
need guidance in designing a good evaluation plan. It discusses the following:
1. Develop an outcomes-based project plan,
2. Develop an outcomes assessment plan,
3. Develop a pre-project assessment plan, and
4. Develop a process assessment plan.
The third booklet, Collecting and Analyzing Evaluation Data, will probably be more understandable
to those with some experience in conducting health information outreach, but those just starting
health information outreach also may find it useful for planning their outreach programs. It presents
both quantitative methods (processes for collecting data and turning them into numbers) and
qualitative methods (processes for collecting non-numeric, descriptive information). For each
approach, it describes these steps:
1 • Design your data collection methods,
2. Collect your data,
3. Summarize and analyze your data, and
4. Assess the validity of your findings.
We strongly endorse partnerships among organizations from a variety of environments, including
health science libraries, community-based organizations, and public libraries. We also encourage
broad participation of members of target outreach populations in the design and implementation of
the outreach project. We try to describe planning and evaluation methods that accommodate this
approach to community-based outreach. Still, we may sound like we are talking to project leaders.
In writing these booklets we have made the assumption that one person or a small group of people
will be in charge of initiating an outreach project, writing a clear project plan and managing the
evaluation processes.
We also encourage evaluation practices that adhere to the Program Evaluation Standards
developed by the Joint Committee on Standards for Educational Evaluation, which can be
found at http://www.eval.org/EvaluationDocuments/progeval.html. [2] The utility standards
require that evaluation findings will serve the information needs of the intended users, primarily
those implementing a project or those with some vested interest in it. The feasibility standards
direct evaluation to be cost-effective, credible to the different groups who will use evaluation
information, and minimally disruptive to the project. The propriety standards uphold evaluation
that is conducted ethically, legally, and with regard to the welfare of those involved in or affected
by the evaluation. Finally, the accuracy standards indicate that evaluation should provide
technically adequate information for evaluating a project.
We sincerely hope that you find these booklets useful. We welcome your comments, which you
can email to nnlm@u.washington.edu.
Acknowledgements
We are grateful to our colleagues who have graciously provided feedback and input, especially:
Dana Abbey, Consumer Health Liaison, NN/LM MidContinental Region
Renee Bougard, Associate Director, NN/LM South Central Region
Kelli Ham, Consumer Health Coordinator, NN/LM Pacific Southwest Region
Claire Hamasu, Associate Director, NN/LM MidContinental Region
Betsy Kelly, Assessment and Evaluation Liaison, NN/LM MidContinental Region
Michelle Malizia, Outreach Coordinator, NN/LM South Central Region
Heidi Sandstrom, Associate Director, NN/LM Pacific Southwest Region
Debra Stark, Evaluation Specialist, University of Texas Health Science Center at San Antonio
We also deeply appreciate Cathy Burroughs' groundbreaking work, Measuring the Difference: Guide to Planning and
Evaluating Health Information Outreach and thank her for her guidance in our creating the booklets in this update and
supplement, the Planning and Evaluating Health Information Outreach Projects series.
Introduction
While conducting an outreach project, you
will need to make several decisions. As you
monitor project activities, you will need to
decide whether to make changes to your
plans. As the project nears its end, you will
decide how to report the results. You and
others invested in the project, referred to
as stakeholders, will have to decide if your
outreach project should be continued. If
you are going to make good decisions about
your outreach project, you need information
or data. In this booklet we use the word
"data" to include numbers, facts, and written
descriptions of comments gathered through
counting, surveying, observing, interviewing,
or other investigations.
During community and pre-project
assessment, data can help you identify groups
in your community that are in particular
need of health information outreach. Data
also can be used to assess the resources and
challenges facing your project. While you are
implementing your activities and strategies,
data can provide you with feedback for
project improvement — this is called process
assessment. During outcomes assessment,
data can provide the basis for you and other
stakeholders to identify and understand
results and to determine if your project has
accomplished its goals.
Therefore, much care must go into the design
of your data collection methods to assure
accurate, credible and useful information.
To really understand and assess an outreach
project, multiple and mixed methods are
required:
• "Mixed methods" means that a variety of
types of information sources are used to
assess your project.
Good evaluation usually combines both
quantitative and qualitative methods.
Quantitative methods gather numerical data
that can be summarized through statistical
procedures. Qualitative methods collect
non-numerical data, usually textual, that can
provide rich details about your project. Each
approach has its particular strengths and, when
used together, can provide a thorough picture
of your project.
This booklet is organized into two sections:
one for quantitative methods and one for
qualitative methods. After a brief overview,
each section focuses on a specific method
that is common and applicable to a variety of
evaluation projects. In the quantitative section,
surveys are the chosen method. For the
qualitative section, interviewing is the method
addressed.
However, we should note that neither surveys
or interviews are limited to collecting one
type of data. Either method can be designed
to collect qualitative or quantitative data
and, often, they are designed to collect a
combination of both.
You pick the type of method based on the
evaluation question you want to answer.
Figure 1 is designed to help you make a
decision about the type of method to use.
• "Multiple methods" means collecting data
from more than one source and not relying
on one survey or test or focus group to
provide an adequate assessment of your
program.
Figure 1: Choosing Type of Method
If you are trying to learn...
How many?
How much?
What percentage?
How often?
What is the average amount?
...choose quantitative methods (see page 3).

If you are trying to learn...
What worked best?
What did not work well...?
What do the numbers mean?
How was the project useful...?
What factors influenced success or failure?
...choose qualitative methods (see page 19).
Collecting and Analyzing Evaluation Data
Planning and Evaluating Health Information Outreach Projects, Booklet 3
Outreach Evaluation Resource Center
National Network of Libraries of Medicine, National Library of Medicine, 2006
Introduction — Quantitative Methods
Evaluation Using Quantitative Methods
Step 1
Design Your Data Collection Methods
Write your evaluation questions
Develop data collection tool (e.g., survey)
Pilot test data collection tool
Step 2
Collect Your Data
Decide whether to use a sample or all participants
(census)
Use as many methods as possible to increase
response rate (e.g., multiple mailings, personalized
pre-survey mailings and cover sheets, incentives)
Be sure participants give informed consent (e.g., via
the survey cover letter) before they start the survey
Step 3
Summarize and Analyze Your Data
Compile descriptive data (frequencies, percentages,
averages, medians, modes)
Put data into tables to aid analysis
Write a paragraph describing what each table
indicates about your evaluation questions
Step 4
Assess the Validity of Your Findings
Describe any shortcomings of your data collection
and how they affect your interpretation (e.g., low
response rates; problematic questions)
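The descriptive summaries listed under Step 3 (frequencies, percentages, averages, medians, modes) can be produced with a spreadsheet or a few lines of code. As an illustration only, here is a minimal Python sketch using the standard library; the satisfaction ratings are invented example data, not results from any actual survey:

```python
from collections import Counter
from statistics import mean, median, mode

# Invented example data: responses to a 5-point satisfaction item
# (5 = very satisfied ... 1 = very dissatisfied)
ratings = [5, 4, 4, 5, 3, 4, 2, 5, 4, 3]

# Frequencies and percentages for each response option
counts = Counter(ratings)
total = len(ratings)
for option in sorted(counts, reverse=True):
    pct = 100 * counts[option] / total
    print(f"rating {option}: {counts[option]} responses ({pct:.0f}%)")

# Measures of central tendency for the same item
print("mean:", mean(ratings))      # the average rating
print("median:", median(ratings))  # the middle value
print("mode:", mode(ratings))      # the most common value
```

These printed summaries map directly onto the kind of table the booklet recommends: one row per response option, plus a line of averages beneath it.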
Any data that can be counted is considered quantitative data, including attendance at classes or
events, participation or drop-out rates, test scores, and satisfaction ratings. Quantitative methods
show the degree to which certain characteristics are present, such as frequency of activities,
opinions, beliefs, or behaviors within a group. They can also provide an "average" look at a group
or population. For example, you might use quantitative methods to determine the average number
of times workshop participants look up health information online every week.
The advantage of quantitative methods is the amount of information you can quickly gather and
analyze. The questions listed below are best answered using quantitative methods:
1. How many clinics in our outreach project have bookmarked National Library of Medicine
resources on at least one of their computers?
2. On average, how much did trainees' confidence in using online health information
resources improve after training?
3. What percentage of participants in a PubMed training session said their skills in using the
resource improved as a result of taking the course?
4. How many people visited the resource Website during the grant period?
5. What percentage of visitors to a booth at a health fair showed interest in finding
prescription drug information online?
6. How likely are participants on average to recommend MedlinePlus to others?
7. What percentage of users improved their ability to find good consumer health information
as a result of our sessions?
Appendix 1 describes some typical methods for collecting quantitative data. The rest of this section
will focus on one of the most popular quantitative methods: surveys. This method has been chosen
because of its usefulness at all stages of evaluation. Surveys use a standard set of questions to get
a broad overview of a group's opinions, attitudes, self-reported behaviors, and demographic and
background information. Discussion is limited to written surveys such as those sent electronically
or through the mail.
Step One
Design Your Data Collection Methods — Quantitative Methods
A data collection method is a procedure for gathering information. For surveys, the method
comprises two parts: the questionnaire and the group that receives it. The first step in designing your
survey is to write out the general evaluation questions you want to answer. Evaluation questions are
different from your survey questions, which are specific, carefully formatted questions designed to
collect data related to the evaluation questions.
For instance, listed below are some sample evaluation questions.
• Community or pre-project assessment. During the planning stages of an outreach project, you can
use surveys to assess your outreach community members' beliefs, attitudes, and comfort levels in
areas that will affect your outreach strategies. Evaluation questions may be:
— "What health information resources do people in this community use most often? "
—"How many people are experienced Internet users?"
If you have a logic model, you should review the resource and activities columns to
help you to focus the needs assessment questions.
• Process assessment. Surveys are often used mid-project to get participants' feedback about the
quality of the activities and products of your outreach project. So your evaluation questions
might be:
— "How do participants rate the effectiveness of our teaching methods? "
— "How do participants rate the usefulness of the online resources we are
providing? "
— "How many people are likely to use the health resources after the training
session? "
You should look at the activities and inputs column of your logic model to determine the
questions you might want to ask.
• Outcomes assessment. At this stage, you use surveys to help assess the results of your outreach
project. So questions might include:
— "Do participants use the online resources we taught after they have completed
training? "
— "Have participants talked with their physicians about something they found at
MedlinePlus? "
— "How many health care professionals trained in our study said they retrieved
information from MedlinePlus to give to a patient? "
When designing a survey for outcomes assessment, you should review the outcomes columns of
your logic model.
Table 1: Aligning Evaluation and Survey Questions
Evaluation Question:
"How do participants rate the quality of the training session?"

Items for the Survey:
• How satisfied were you with the information presented during the training session?
(response options: very satisfied / somewhat satisfied / neutral / somewhat dissatisfied / very dissatisfied)
• Would you recommend this session to others? (response options: yes / no / don't know)
• Do you think you will use the online resources in the future? (response options: yes / no / don't know)
The second step is development of survey questions for your questionnaire to help you answer your
evaluation questions. One approach is to use a format like that shown in Table 1 to align survey
questions with evaluation questions.
Before you actually design your questionnaire, you might want to look at existing ones for their
format and layout. Examples 1-6 will give you some ideas for formatting survey questions. You
also could try contacting colleagues with similar projects. They may be willing to share their
surveys. Journal articles about health information outreach projects sometimes include complete
copies of questionnaires. If not, the article will provide the authors' contact information so that
you can request copies of their surveys. Writing surveys can be tricky, so you should consider
using questions from other projects that already have been tested for clarity and comprehension.
However, if you do copy verbatim from other surveys, always be sure to secure permission from
the original author or copyright holder.
Example 1: Two-Option
Have you used MedlinePlus since the training session?
□ Yes   □ No   □ Not sure
Comments
• The yes-no item works well for collecting factual information, like peoples' participation in activities,
exposure to publicity materials, or experience with specific online resources.
• Other two-option formats are "true/false," "support/oppose" or "agree/disagree."
• Include a "don't know" or "not sure" option for participants who either cannot remember or are not
sure about the information you are requesting.
Example 2: Best Option
The last time you looked for health information on the Internet, who were you getting it for? (choose one)
□ Myself
□ A family member
□ A friend or coworker
□ A supervisor
□ A client
□ Other (please describe________________________)
Comments
• Best option items are good for collecting information about the respondent's attributes and behaviors.
• Make sure that choices do not overlap so that each person can easily choose only one response.
• Provide an "other" response for options that are not included on the list.
Example 3: Multiple Option
Where do you get health information? (check all that apply)
□ From my doctor or clinic
□ Newspapers and magazines
□ Television
□ Radio
□ Friends or family members
□ Other (please describe_______________________)
Comments
• This is a faster version of the "yes/no" format: a check means "yes" and blank means "no."
• If your list of options gets to be more than 6 or 7 items, use a "yes-no" format instead. If the list is
too long, people may not consider every item. When forced to respond, they are more likely to look
at each item.
• Use "Other" even if you think you have listed all possible responses. People will use this option if
they are not sure where their option fits.
Example 4: Rating Scales
Version 1 Please check the option that indicates your level of agreement with the statement.
Because of the training session, I am much more confident about my ability to find
information about my health concerns.
□ Strongly Agree   □ Somewhat Agree   □ Uncertain   □ Somewhat Disagree   □ Strongly Disagree
Version 2 Please circle the option that indicates your level of agreement with the statement.
How helpful were the group exercises?
Very helpful   1   2   3   4   5   6   7   Not at all helpful
Comments
• These two formats are good for collecting information from respondents about their attitudes,
feelings, beliefs, and opinions.
• A neutral point is usually recommended for participants who do not have strong opinions in either
direction about the item.
• You can provide as many response choices as you want, but most experts believe 5-7 options are adequate.
Example 5: Rank-Order
Listed below are different health topics that could be included on a consumer health Website. Rank the
features in terms of how important each topic is to you, with "1" as the most important feature and "7" as the
least important.
___ Specific health conditions
___ Wellness information
___ Alternative medicine
___ Prescription drugs
___ Health insurance, Medicaid, Medicare
___ Clinical trials
___ Health news
Comments:
• This format should be avoided. Ranking items is a difficult task for respondents. Also, you may force
respondents to rank two items that are of equal importance to them. When possible, choose a rating scale
(Example 4) instead of a rank-order item.
• Statistical analysis of rank-ordered items is very tricky because responses across individuals are not
comparable. Using the item above as an example, two people may rank Prescription Drugs as the most
important feature of a Website relative to the other features in the list. However, the first respondent may
think everything on the list is important and the second may think nothing is important, so a "1" tells you
nothing about the strength of the importance to each respondent. To analyze this type of data, the best you
can do is show how many times an item was ranked, for instance, as 1 or 2.
Example 6: Open-Ended

List at least two important things you learned in the training session today.
1.
2.
Comments:
• This format yields qualitative data, but it is often helpful in interpreting the statistical information you
gather on your survey. To analyze open-ended questions, use the methods described beginning with
Step Three of the "Qualitative Methods" section of this booklet on page 22.
• Avoid starting a survey with open-ended questions. Open-ended questions can be overwhelming and
people may choose not to take the survey. Draw the respondent in with some interesting, easy
quantitative questions and save your open-ended questions for later in the survey.
The visual layout of your survey is also
important. Commercial Websites that offer
online survey software give examples of how to
use layout, color, and borders to make surveys
more appealing to respondents and easier for
them to complete. There are several popular
commercial products to create Web-based
surveys, such as SurveyMonkey
(http://surveymonkey.com/).
Once you have designed your survey, be sure
to pilot test it before you send it to your target
audience. Even if you think your wording
is simple and direct, it may be confusing
to someone else. It is very easy for survey
questions and options to be misunderstood,
and a pilot test will reveal areas that need to
be clarified. First, ask one or two colleagues
to take the survey while you are present and
request that they ask questions as they respond
to each item. Make sure they actually take the
survey, because they will not pick up confusing
questions just by reading it.
Once you have made adjustments to the
survey, give it to a small portion of your target
audience and look at the data. Does anything
look out of place? For instance, if a large
percentage of people are picking "other" on a
multiple-option question, you may have missed
a common option. Only after you have piloted
the survey are you ready to administer it.
The design stage also entails seeking approval
from appropriate committees or boards that
are responsible for the safety and well-being
of those participating in your project. If you
are working with a university, most evaluation
research must be reviewed by an Institutional
Review Board. Evaluation methods used
in public schools often must be approved
by the school board and community-based
organizations may have their own review
processes that you must follow. Because
many evaluation methods pose little to no
threat to participants, your project may not
require a full review. Therefore, you should
consider meeting with a representative
from the Institutional Review Board or other
committee to find out the best way to proceed
with submitting your evaluation methods
for approval. Most importantly, it is best to
identify all of these review requirements while
you are designing your methods; otherwise,
your evaluation may be significantly delayed.
Step Two
Collect Your Data — Quantitative Methods
As part of planning your survey, you will
decide whether to collect data from a
subgroup (sample) of your target population
and generalize their responses to the whole
population or to collect data from the entire
group targeted by the survey (census).
Sampling is used when working with large
groups of people where it is impractical to
send a survey to everyone, so you send the
survey to a portion of the group. Random
sampling means everyone in the population
has an equal chance of being included in the
sample. For example, if you want to know
how many licensed social workers in your
state have access to online medical journals,
you probably do not have to survey all
social workers. If you use random sampling
procedures, you can assume (with some
margin of error) that the percentage of all
social workers in your state with access is
fairly similar to the sample percentage. In
that case, your sample provides adequate
information at a lower cost than a census. For
details about random sampling, see Appendix
C of Measuring the Difference. [1]
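The random sampling idea described above can be sketched in a few lines of code. This is an illustration only: the population list, sample size, and response count are invented, and the margin-of-error line uses the standard 95%-confidence approximation for a proportion rather than anything prescribed by this booklet:

```python
import math
import random

# Hypothetical sampling frame: all licensed social workers in the state
population = [f"social_worker_{i}" for i in range(5000)]

# Simple random sample: every member of the population has an
# equal chance of being selected, with no one chosen twice
sample_size = 400
sample = random.sample(population, sample_size)

# Suppose (invented figure) 212 of the 400 sampled workers report
# having access to online medical journals
with_access = 212
p = with_access / sample_size

# Approximate 95% margin of error for a proportion: 1.96 * sqrt(p(1-p)/n)
margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)
print(f"estimated access rate: {p:.1%} ± {margin:.1%}")
```

The point of the sketch is the booklet's own: with random selection, the sample percentage stands in for the population percentage within a calculable margin of error, at far lower cost than a census.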
With smaller groups, it is possible to conduct
a census by sending the survey to everyone.
In this case, any information you summarize is
a description of the group of respondents only.
For instance, if you survey all seniors who
were trained in your outreach project to use
MedlinePlus and 80% of them said they used
it at home one month after the session, you
can describe how many of your trainees used
MedlinePlus after training. This percentage
provides important information about a
result of your outreach project. However,
because you have not randomly sampled
from among all seniors who have ever been
trained on MedlinePlus, you cannot make a
generalization that 80% of all seniors who
get training on MedlinePlus use it within one
month of training.
The quality of your survey data, whether
collected through a sample or a census,
depends heavily on how many people
complete and return your questionnaire. The
percentage of people who return a survey
is known as response rate. When a high
percentage of people respond to your survey,
you have an adequate picture of the group.
But when you have a high percentage of
nonrespondents, characteristics of the group
remain unknown to you, making it difficult
for you to interpret your results. Therefore,
your results may be biased and unreliable.
For instance, the respondents may have
been more enthusiastic or more dissatisfied
compared to nonrespondents. If the survey
was administered electronically, those who
returned the survey may be more computer-
literate. However, though you may suspect
bias when your response rate is low, you may
not know how or by how much.
Statisticians seldom agree about what
constitutes an adequate response rate, but
few would accept levels below 50%. Using
techniques like those described in Figure 2,
survey researchers usually obtain response
rates in the range of 50-80% [3], which seems
to be the acceptable standard among most
survey researchers.
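Response rate is a simple ratio. A minimal sketch with invented counts, excluding undeliverable surveys from the denominator since they never reached a potential respondent:

```python
# Invented counts for illustration.
surveys_sent = 200
undeliverable = 12   # returned as undeliverable; excluded below
returned = 118       # completed surveys received

response_rate = returned / (surveys_sent - undeliverable)
print(f"Response rate: {response_rate:.1%}")  # prints Response rate: 62.8%
```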
Collecting and Analyzing Evaluation Data
Planning and Evaluating Health Information Outreach Projects, Booklet 3
Outreach Evaluation Resource Center
National Network of Libraries of Medicine, National Library of Medicine, 2006
Figure 2: How to administer surveys
1. When using mail surveys, always send a personalized pre-survey letter to the target audience
from someone influential or well-liked by the group. For electronic or on-line surveys, send a
personalized pre-survey e-mail message announcing that a survey will be sent via email within
the next week.
2. Within a week of the pre-survey letter, send the survey with a personalized cover letter (e.g.,
"Dear Jane Smith") or personalized email with a link to the survey.
3. Within a week after sending the survey, send a personalized reminder postcard or email.
4. Within two weeks, send or email another survey, again with a personalized cover letter.
5. Keep track of undeliverable surveys. If you mail surveys, be sure to use first class mail so
undeliverable surveys are returned to you. If you send surveys through email, keep track of the
returned emails and, if possible, send print surveys to those participants. This mixed-mode
approach has been shown to increase response rates for electronic surveys.
6. Consider using these tips to increase your response rates:
• Certain survey design principles may increase response rates. Be sure to start your survey
with interesting questions that are easy to answer. Do not start with open-ended questions
because they may make the survey seem overwhelming to respondents. Most research shows
that demographic questions should be at the end of the survey because respondents find them
boring or, in some cases, offensive.
• Incentives may help your response rate. For mailed surveys, research indicates that the
best time to send an incentive is with the first survey, not after the survey has been returned
to you.[5] For web surveys, one study showed that being entered into a lottery for a larger
financial incentive seemed to work better than prepaid or postpaid incentives. [6] It is
important to note, however, that most survey researchers think that making multiple contacts
(such as those described in this box) has an equal or greater positive effect on response
rates compared to incentives. So if you have to choose between incentives or postage for
replacement surveys, choose the latter.
Figure 2 defines a typical protocol for administering mailed surveys. Studies show that these
procedures are effective for surveys sent either through regular mail or email. [3,4] Because online
surveys are becoming increasingly popular, Appendix 2 of this booklet presents more detailed
suggestions for designing and sending electronic surveys that may help to increase response rates. [4]
Getting a high response rate can be difficult,
even when you implement procedures for
improving it. If you fail to get a return rate of
50% or more, you may wonder if the data are
worth analyzing. Very few evaluators would
discard data. Instead, they would analyze it
but try to discern where the bias might be.
If resources allow, they also may attempt to
contact nonrespondents with a short version
of the survey to assess the level of bias in the
sample. Evaluators also may compare their
findings from surveys against information
they have collected through focus groups,
interviews, and other qualitative methods
to see if the numbers are consistent with
survey findings. The important thing is that
you report your data along with the potential
biases so that readers of your report can make
an informed assessment of the credibility of
the findings.
The cover letter is an important part of the
survey process. It should include information
that might affect an individual's decision
to participate. On the one hand, it is a
motivational tool to induce the recipient to
take the time to respond to the survey. On the
other, it serves as a vehicle to
inform the individual of any potential risks
of participation. This is called "informed
consent." If you must have your project
reviewed through an institutional review board
(IRB) or some other type of review board, you
should get specific details of what should be in
the letter. If you are not working with an IRB,
evaluation ethics still require you to provide
some standard information for respondents
before they take the survey:
• Why you are conducting the survey and
why their participation is important,
• How you plan to protect the respondent's
confidentiality or anonymity,
• The risks and benefits to the respondents
who choose to participate,
• The voluntary nature of their participation
and their right to withhold answers at any
point in the survey, and
• How their responses will be reported and
to whom.
Once you have received the last of your
surveys, you will have accumulated raw data
that you must try to understand. To do so, you
must summarize the raw data so you can then
analyze it.
Step Three
Summarize and Analyze Your Data — Quantitative Methods
The first step in analyzing quantitative data is to summarize the responses using descriptive statistics.
When you collect and summarize quantitative data, your result is a distribution of scores for each
item on your survey (except open-ended items). A distribution is simply the collection of all ratings
or scores for a particular item, ordered from the lowest to the highest value. Table 2 presents some
of the most common descriptive statistics: frequency counts, percentages, and measures of central
tendency (mean, median, and mode).
Table 2: Examples of Descriptive Statistics

Question: Please indicate your level of agreement with this statement: "I am
more confident about finding prescription drug information on the Web after
taking this training session."

Response (value)        Frequencies   Percent   Valid Percent
Strongly agree (5)      54            54.0%     55.7%
Somewhat agree (4)      36            36.0%     37.1%
Uncertain (3)           5             5.0%      5.2%
Somewhat disagree (2)   2             2.0%      2.1%
Strongly disagree (1)   0             0.0%      0.0%
Total                   97            97.0%
Missing                 3             3.0%

N = 100    Mean = 4.41    Median = 5    Mode = 5

Definitions

N — The number of people responding to the survey. (Note: 100 people returned a
survey, but only 97 responded to this particular question.)

Frequencies — The number of respondents choosing each response.

Percent — The number of those choosing that response divided by the number of
people who completed the survey.

Valid Percent — The number of respondents choosing that response divided by the
number of respondents who answered the question. In this example, 100 people
completed the survey, but only 97 actually responded to this particular question.

Mean — The "average" response in your distribution, computed by adding all
responses and dividing by the number of respondents who answered the question.

Median — The score in the middle of the distribution, with half of the scores
above and half below. To find it, sort your distribution from highest to lowest
ratings, then find the number that equally divides the distribution in half. For
the 97 people who answered this question, the 49th score divides the distribution
in half; the 49th (median) score is a "5." When the majority of ratings fall at
either the high or low end of a rating scale, as they do here, the median is
usually the preferable measure of central tendency because it is not affected by
a few extremely low or high ratings.

Mode — The most frequent response. For many demographic and two-option questions,
the mode is the only measure of central tendency that can be reported. This is
also true for questions that ask respondents to provide more than one response,
such as "check all that apply" questions.
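These statistics can be computed with Python's standard `statistics` module. The ratings below reproduce the frequencies in Table 2 (54, 36, 5, 2, 0, with 3 missing); the layout of the output is our own.

```python
from statistics import mean, median, mode

# Ratings on a 5-point scale (5 = strongly agree); None marks a
# respondent who skipped the question.
ratings = [5] * 54 + [4] * 36 + [3] * 5 + [2] * 2 + [None] * 3

answered = [r for r in ratings if r is not None]
n_returned = len(ratings)   # 100 surveys returned
n_answered = len(answered)  # 97 answered this item

# Frequencies, Percent (out of all returned surveys), and
# Valid Percent (out of those who answered this question).
for value in (5, 4, 3, 2, 1):
    freq = answered.count(value)
    print(f"{value}: n={freq}, percent={freq / n_returned:.1%}, "
          f"valid percent={freq / n_answered:.1%}")

print("mean:", round(mean(answered), 2))
print("median:", median(answered))  # 5
print("mode:", mode(answered))      # 5
```

The valid percents computed here match Table 2 (55.7%, 37.1%, 5.2%, 2.1%, 0.0%).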
Table 3: Participants' Self-Report of Confidence in Using Databases (N=50)

                                          Strongly            Neither Agree              Strongly
                                          Agree     Agree     nor Disagree    Disagree   Disagree
The training session helped me develop
more confidence in using MedlinePlus.     23 (46%)  16 (32%)  9 (18%)         2 (4%)     0 (0%)
The training session helped me develop
more confidence in using PubMed.          10 (20%)  22 (44%)  13 (26%)        3 (6%)     2 (4%)

Analysis: The majority of respondents agreed or strongly agreed that the training sessions helped them gain confidence in using the NLM online resources. Ratings seemed to be slightly more positive for MedlinePlus. This indicates that we achieved our objective of increasing confidence in use of online resources with the majority of our participants.
Tables are very helpful for understanding your data. Tables 3-7 show formats that will help
you analyze your descriptive data. After you compile a table, write a few notes interpreting the
numbers.
You may simplify your data to make the positive and negative trends more obvious. For instance,
in Table 4, the "Strongly Agree" and "Agree" responses were combined into a "Positive" category
and the "Disagree/Strongly Disagree" responses were put into a "Negative" category.
Table 4: Participants' Self-Report of Confidence in Using Databases (N=50)

                                          Positive            Neutral            Negative
                                          (Strongly Agree/    (Neither Agree     (Disagree/Strongly
                                          Agree)              nor Disagree)      Disagree)
The training session helped me develop
more confidence in using MedlinePlus.     39 (78%)            9 (18%)            2 (4%)
The training session helped me develop
more confidence in using PubMed.          32 (64%)            13 (26%)           5 (10%)

Analysis: This table makes the pattern of positive ratings more obvious for the items introduced in Table 3. It also confirms that ratings were more positive for the MedlinePlus session compared to the PubMed session. One explanation might be that PubMed is more difficult to use and requires a longer training session or more training sessions compared to MedlinePlus.
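Collapsing scale points into broader categories is a simple aggregation. A minimal sketch using the MedlinePlus counts from Table 3:

```python
# Frequency counts for one item (the MedlinePlus row of Table 3).
counts = {
    "Strongly Agree": 23,
    "Agree": 16,
    "Neither Agree nor Disagree": 9,
    "Disagree": 2,
    "Strongly Disagree": 0,
}

# Map each broad category to the scale points it absorbs.
groups = {
    "Positive": ("Strongly Agree", "Agree"),
    "Neutral": ("Neither Agree nor Disagree",),
    "Negative": ("Disagree", "Strongly Disagree"),
}

n = sum(counts.values())  # 50 respondents
collapsed = {label: sum(counts[m] for m in members)
             for label, members in groups.items()}

for label, total in collapsed.items():
    print(f"{label}: {total} ({total / n:.0%})")
```

The output (Positive: 39, 78%; Neutral: 9, 18%; Negative: 2, 4%) reproduces the MedlinePlus row of Table 4.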
Table 5: Average Number of NLM Resources Used Before and One Month After Training (N=80)

                                              Average # of      Average # of Websites
                                              Websites Before   One Month After
                                              Training          Training                 Difference
How many of the following Websites have you
used in the past month? (Check all that
apply of 6 resources.)                        1.85              3.37                     1.52

Analysis: Of the six Websites we demonstrated in the training session, participants on average had used fewer than two of them before training. One month after training, they had, on average, visited more than three of the Websites. This finding suggests that we chose Websites that our participants found to be useful.
Sometimes, you may want to see how participants' attitudes, feelings, or behaviors have changed
over the course of the project. Table 5 shows how to organize pre-project and post-project
data into a chart that will help you assess change. Table 5 also presents means rather
than percentages. Data that represent a wide range of scores, such as attendance rates for a large
number of training sessions, sometimes are easier to analyze using averages. You could also use
means or medians in place of percentages if you have rating scales such as those presented above
in Step 1 (see Example 4).
You may wonder if the findings vary for the different groups you surveyed. For instance, you may
wonder if nurses, social workers, or members of the general public found your resources as useful
as the health librarians who had your training. To explore this question, you would create
cross-tabulation tables.
Table 6: Average Number of NLM Resources Used Before and One Month After Training,
Broken Down by Profession (N=80)

                            N    Average # of       Average # of Websites    Increase
                                 Websites Before    One Month After          in Use
                                 Training           Training
Health Science Librarians   20   3.7                4.3                      0.6
Social Workers              20   1.3                3.0                      1.7
Nurses                      20   2.2                3.6                      1.4
General public              20   0.2                2.6                      2.4

Analysis: We did not seem to increase the variety of Websites used by the health science librarians, probably because, on average, they already had used more than half of the Websites we demonstrated. Our training seemed to have the greatest impact on the general public, who had used very few of the Websites. For planning future sessions, we may want to conduct a preliminary survey to find out what Websites are popular with health science librarians so we can adjust our training content to cover Websites they do not know.
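A cross-tabulation groups individual responses by a background variable and summarizes each group. A minimal sketch with a handful of invented response records:

```python
from collections import defaultdict

# Invented individual responses: websites used before and after training.
records = [
    {"group": "Nurses",         "before": 2, "after": 4},
    {"group": "Nurses",         "before": 3, "after": 3},
    {"group": "Social Workers", "before": 1, "after": 3},
    {"group": "Social Workers", "before": 2, "after": 3},
]

# Bucket the records by profession.
by_group = defaultdict(list)
for r in records:
    by_group[r["group"]].append(r)

# Compute per-group means and the pre/post increase.
summary = {}
for group, rows in by_group.items():
    before = sum(r["before"] for r in rows) / len(rows)
    after = sum(r["after"] for r in rows) / len(rows)
    summary[group] = (before, after, after - before)
    print(f"{group}: before={before:.1f}, after={after:.1f}, "
          f"increase={after - before:.1f}")
```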
Table 7: Comparison of Those Who Used Resources After Training Compared to Targets in Objectives

                                             Actual   Goal   Difference
Number of participants using MedlinePlus
after training                               62%      50%    +12%
Number of participants using PubMed
after training                               45%      50%    -5%
Analysis: We exceeded our criterion for the number of participants who used MedlinePlus after they
took our training sessions. However, we were slightly under our goal for PubMed. On the other hand,
because PubMed is more academic and MedlinePlus is more consumer-oriented, it is possible our
users simply had more occasion to use MedlinePlus the month following the session. We may want to
explore this in a follow-up interview with a few users who took both sessions to see if there are ways to
improve the PubMed training.
Finally, you also may want to compare your findings against the criteria you identified in your
objectives.
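Comparing observed percentages against the targets set in your objectives can be sketched like this; the values are taken from Table 7, while the layout is our own:

```python
# Observed outcome percentages and the targets set in the objectives.
results = {"MedlinePlus": 0.62, "PubMed": 0.45}
targets = {"MedlinePlus": 0.50, "PubMed": 0.50}

differences = {}
for resource, actual in results.items():
    diff = actual - targets[resource]
    differences[resource] = diff
    status = "met" if diff >= 0 else "missed"
    print(f"{resource}: actual {actual:.0%}, goal {targets[resource]:.0%}, "
          f"difference {diff:+.0%} ({status})")
```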
Step Four
Assess the Validity of Your Findings — Quantitative Methods
Validity refers to the accuracy of the data
collected through your survey: did the survey
collect the information it was designed
to collect? It is the responsibility of the
evaluator to assess the factors that may affect
the accuracy of the data and present those
factors along with results. Threats to validity
of surveys usually fall in one of the following
categories:
• Response rate. As mentioned above,
when small percentages of respondents
return surveys, the potential for bias must
be acknowledged. Even when using the
strategies discussed earlier in Step Two (see
Figure 2), you may not obtain an adequate
response rate. If resources allow, you
can assess the degree of bias somewhat
with follow-up interviewing or surveying
of nonrespondents. For instance, if
you suspect that those who responded
were biased in the favorable direction,
you could conduct a phone survey with
a random selection of 10% of your
nonrespondents, asking a few simple questions
to explore the extent of bias.
• Low completion rate of specific sections
of surveys. If many respondents do not
complete certain sections of the survey,
you will have to question the findings
of that part of the survey. For instance,
respondents may not finish the survey,
leaving final sections or pages blank. To
avoid this problem, keep your surveys as
short as possible. For electronic surveys,
provide a "progress bar" that tracks the
percentage of questions completed as the
respondent proceeds through the survey.
• Low completion rate of questions. Even
if you have a respectable response rate,
you may have questions that are left blank
by a number of respondents. There are
several reasons why respondents do not
answer particular questions: they may
not find a response that applies to them,
the question format may be confusing, or
they may not understand the question. The
best strategy for avoiding this problem is
to carefully pilot your questions. If your
survey asks questions that are sensitive or
threatening, your best strategy for getting
responses is to conduct an anonymous
survey.
• Socially desirable responding.
Sometimes respondents are embarrassed
to answer questions truthfully. If
possible, avoid using questions that ask
people to disclose information that may
be embarrassing or threatening. This
challenge may occur if your survey asks
respondents to report health behaviors
such as drinking, drug use, or even dietary
habits. If you must ask such questions,
providing anonymity may enhance the
accuracy of responses. You may be able
to find published studies that estimate
the extent to which people in general
overestimate or underestimate certain
health behaviors (such as daily calorie
consumption).
You cannot prove validity. You must build
your case for the credibility of your survey by
showing that you used good design principles
and administered the survey appropriately.
After data collection, you assess the
shortcomings of your survey and candidly
report how they may impact interpretation of
the data.
Surveys allow you to collect a large amount
of quantitative data, which then can be
summarized quickly using descriptive
statistics. This approach can give you a
sense of the experience of participants in
your project and can allow you to assess
how closely you have come to attaining your
goals. However, based on the analysis given
for each of the tables above, you may
notice that the conclusions are tentative. This
is because the numbers may describe what the
respondents believe or feel about the questions
you asked but they do not explain why
participants believe or feel that way. Even
if you include open-ended questions on your
survey, only a small percentage of people are
likely to take the time to comment.
For evaluation, the explanations behind the
numbers usually are very important, especially
if you are going to make changes to your
outreach projects or make decisions about
canceling or continuing your efforts. That is
why most outreach evaluation plans include
a combination of qualitative and quantitative
methods.
Evaluation Using Qualitative Methods
Step 1
Design your data collection methods
Write your evaluation questions
Develop data collection tool (e.g., interview guide)
Pilot your interview guide
Step 2
Collect your data
Interview a purposeful sample of participants. Have
an idea of your sample size, but be flexible (add more
to answer new questions; stop interviewing if you hear
nothing new)
Provide informed consent information to participants
before starting the interview
In preparation for Step 3, make notes immediately
after each interview
Step 3
Summarize and analyze your data
Read through all text and generate a list of themes
Code all interview data systematically
Organize data by theme
Interpret the findings
Step 4
Assess the validity of your findings
Describe any information that could affect your conclusions
(exceptions to the typical themes, alternative explanations)
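Steps 3 and 4 are largely manual work with interview text, but once passages have been hand-coded with theme labels, organizing them by theme can be automated. The theme codes and excerpts below are invented for illustration:

```python
from collections import defaultdict

# (theme code, excerpt) pairs produced by hand-coding interview text.
coded_passages = [
    ("access_barriers", "I don't have a computer at home."),
    ("confidence",      "Now I look things up before my appointments."),
    ("access_barriers", "The library's computer lab is always full."),
]

# Organize the coded excerpts by theme.
themes = defaultdict(list)
for code, excerpt in coded_passages:
    themes[code].append(excerpt)

for code, excerpts in themes.items():
    print(f"{code} ({len(excerpts)} passages)")
    for e in excerpts:
        print("  -", e)
```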
Introduction — Qualitative Methods
Qualitative methods produce non-numerical
data. Most typically these are textual data
such as written responses to open-ended
questions on surveys, interview or focus group
transcripts, journal entries, documents, or field
notes. However, qualitative researchers also
make use of visual data such as photographs,
maps, or videos.
The advantage of qualitative methods is
that they can give insight into your outreach
project that you could never obtain through
statistics alone. Qualitative methods seem
particularly useful for answering the following
types of questions:
1. Why were certain activities more effective
than others?
2. What important changes happened with
clients as a result of their training?
3. How did our clients use the resources
outside of training?
4. Why did some clients continue to use the
resources while some did not?
5. What barriers were discovered in
implementing the project? Which ones
were dealt with effectively and which ones
continued to be a problem?
6. What unexpected outcomes (positive
or negative) occurred as a result of our
project?
7. How was the intervention valuable to
clients and different stakeholder groups?
Qualitative evaluation methods are
recommended when you want detailed
information about some aspect of your
outreach project. Listed here are some
examples of the type of information best
collected through qualitative methods:
• Community or pre-project assessment.
Qualitative methods are useful for
identifying factors in the community that
may impact the implementation of your
project. These may include readiness
of different groups in the outreach
community to use the technological
resources you want to introduce,
community resources that can help your
outreach effort, or level of support among
community leaders for your project. This
type of information is usually discovered
better through qualitative methods
like interviews and observations of the
community.
• Process assessment. Qualitative methods
are useful for getting specific feedback
about outreach activities from those
involved in the project and answering the
"why" questions of process assessment:
Why are morning training sessions more
popular than evening ones? Why do
we have more women signing up for
training sessions than men? Who in the
community is not signing up for training
sessions and why?
• Outcomes assessment. Qualitative
methods can provide compelling
examples of your results in a way that
numbers will never capture. While
numbers may tell you how many people
use MedlinePlus after a training session,
you will get examples of how they
used it through qualitative methods like
interviewing or responses to open-ended
questions. Because of the exploratory
nature of most qualitative methods,
you also are more likely to find out
about unexpected outcomes (positive
and negative) when you talk with those
involved in the project.
Appendix 3 describes some typical qualitative
methods used in evaluation. Interviewing
individual participants will be the focus of
the remainder of this booklet because it is a
qualitative method that has broad application
to all stages of evaluation.
As with quantitative methods, your first step
in an interviewing project is to write your
evaluation questions.
Step One
Design Your Data Collection Methods — Qualitative Methods
The process for writing
evaluation questions is the same as the one
described under quantitative methods. In fact,
you may decide that you want to use both
quantitative and qualitative methods to answer
the same evaluation questions. For instance, if
the evaluation question is
"Do participants use the online resources we
taught after they have completed training? "
You may decide to include a quantitative "yes/
no" question on a survey that is sent to all
participants, but you may also decide to interview
ten or twelve participants to see how they used
the resources.
Your next step is to design an interview guide:
a list of questions that you plan to ask each
interviewee. Interviewing may seem less
structured than surveys, but preparing a good
interview guide is essential to gathering good
information. An interview guide includes all of
the questions you plan to ask and ensures that
you collect the information you need. Patton
discusses different types of interview questions
such as those presented in Table 8. [7]
Table 8: Types of Questions

Experience/behavior questions
  Information collected: What did respondents do?
  Example: "The last time you needed health information, where did you go to get it?"

Sensory questions
  Information collected: What did respondents experience through their five senses?
  (This is a variation on the experience/behavior question but focuses on what they
  saw, heard, touched, smelled, or tasted.)
  Example: "How did your doctor act when you showed her the information you found
  at MedlinePlus?"

Opinion/Value questions
  Information collected: What do respondents think or believe to be important?
  Example: "What do you like best about MedlinePlus?"

Feeling questions
  Information collected: What were respondents' emotional reactions?
  Example: "How did you feel when you could not find information about your child's
  health condition?"

Knowledge questions
  Information collected: What factual information does the respondent know?
  Example: "What are the busiest times of day for the computer lab?"

Background/Demographic questions
  Information collected: What are the characteristics of your respondent?
  Example: "What do you do for a living?"
The order of the questions also can influence
the interview. You need to start with questions
that will allow you to gain rapport with the
interviewee. Patton includes the following tips
for developing and ordering interview questions
[7]:
• Start with noncontroversial experience
or behavioral questions that are easy to
answer, straightforward, and do not rely
on much recall. Sometimes interviewees
can provide better opinions and feelings
if they first describe an actual
experience.
• Questions about the present are easier to
answer than questions about the past and
future. If you plan to ask about the future
or past, ask a "baseline" present question
like "Where do you usually go when you
need to find health care information?"
Then you can ask "Have you gotten health
information anywhere else?" followed
by "Are there other sources of health
information you know about that you might
use in the future?"
• Knowledge and skill questions may be
threatening when posed out-of-context.
Try embedding them with experience
questions. For instance, you might first
ask, "What training sessions have you
taken to learn about online consumer health
resources?" followed by, "What are some
things you learned in those sessions?"
• Use a demographic question like "How
long have you worked in the medical
center?" to establish rapport with the
interviewee. You also may need to ask
this type of background question to make
sense of the rest of the interview. However,
keep demographic questions to a minimum
because they can be boring and they may
be too personal to be asked early in the
conversation.
• Avoid questions that can be answered with
one word or phrase. Rather than asking
"how effective was the training session?"
which sounds a lot like a survey question,
ask "What did you learn at the training
session?" or "How did the training session
help you?"
• Try to ask about one idea per question.
You might introduce a line of inquiry with
multiple ideas in a statement like "Now I
want to ask about what you like and dislike
about PubMed." But focus by asking,
"First, what do you like?"
• Be sure to use language that the interviewee
understands. It is sometimes difficult to
recognize jargon or acronyms, so you
might want to pilot test your questions with
someone outside of your field to make sure
the language is understandable.
• Avoid starting questions with "why." Why
questions tend to be unfocused, and you
may not get the information you really
want. Less focused questions are also more
difficult for the interviewee to answer.
Instead of asking, "Why did you decide to
become a hospital volunteer?" you might
ask "What attracted you to becoming a
volunteer at this hospital?" or "When you
decided to become a volunteer, what made
you choose to work in a hospital?"
As with a survey, it is a good idea to pilot your
interview questions. You might pilot your guide
with someone you are working with who is
familiar with your interviewees. (This step
is particularly important if your interviewees
are from a culture that is different from your
own.) Sometimes evaluators consider the first
interview a pilot interview. Any information
they gather on the first interview is still used,
but they revisit the question guide and make
modifications if necessary.
Finally, be sure your interview project is
reviewed by the appropriate entities. Interviews
are so personal that they may not seem like research,
and you may forget they are subject to the same
review procedures as surveys. So do not make
this assumption, or you may face a delay in
collecting your data.
Step Two
Collect Your Data — Qualitative Methods
Like quantitative methods, interviewing requires
a sampling plan. However, random sampling
usually is not recommended for interviewing
projects because the numbers of interviewees
are so small. Instead, most evaluators use
purposeful sampling (sometimes called
purposive sampling), in which you choose
participants that you are sure can answer your
questions thoroughly and accurately.
There are a number of approaches to purposeful
sampling and use of more than one approach is
highly recommended. The following are some
examples described by Patton [7]:
• You may want interviewees who represent
the "typical" user or participant, such as
the typical health information consumer
or typical health care provider in your
community.
• To illuminate the potential of your project,
you may decide to interview people who
have made the most out of the training you
have offered.
• To explore challenges to your strategies and
activities, you might choose to interview
those who did not seem to get as much from
the project or chose not to participate in
outreach activities.
• You may decide to sample for diversity,
such as interviewing representatives from
all of the different stakeholder groups in the
project.
• You might set criteria for choosing
interviewees, such as participants that
completed 3 of 4 training sessions.
• If you have difficulty identifying potential
interviewees, you can use a snowball
or chain approach where you ask
knowledgeable people to recommend other
potential interviewees.
There are occasions where random sampling of
interviewees is warranted. In some cases, you
will increase credibility of your results if you can
demonstrate that you chose participants without
knowing in advance how they would respond
to your questions. In some circumstances, this
is an important consideration. However, you
must realize that a random sample generated for
qualitative evaluation projects is too small to
generalize to a larger group. It only shows that
you used a sampling approach that would rule
out biases in choosing interviewees. [7]
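If you do opt for random selection, the draw itself takes only a few lines. The sketch below, with made-up participant names, picks three interviewees at random; fixing the seed makes the selection reproducible, which helps when documenting your sampling procedure for stakeholders. It demonstrates the unbiased-selection idea only, not a generalizable sample.

```python
import random

# Hypothetical pool of training participants (names are invented).
participants = [
    "Ana", "Ben", "Carla", "Dmitri", "Elena",
    "Farid", "Grace", "Hector", "Imani", "Jun",
]

# Draw a small random sample of interviewees without replacement.
# A fixed seed makes the draw reproducible for your records.
random.seed(42)
interviewees = random.sample(participants, k=3)
print(interviewees)
```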
Convenience samples, in which participants
are chosen simply because they are readily
accessible, should be avoided except when
piloting survey methods or conducting
preliminary research. The typical "person-on-
the-street" interviews you sometimes see on the
evening news are an example of a convenience
sample. This approach is fast and low-cost, but
the people who agree to participate may not
represent those who can provide the most or best
information about the outreach project.
A common question asked by outreach teams is
"how many interviews do we need to conduct?"
That question can be answered in advance for
quantitative procedures, but not for qualitative
methods. The usual suggestion is that you
continue to interview until you stop hearing new
information. However, resource limitations
usually require that you have some boundaries
for conducting interviews. Therefore, your
sampling design should meet the following
criteria:
• You should be able to articulate for
yourself and stakeholders the rationale for
why you have selected the interviewees in
your sample.
• Your list of interviewees should be
adequate in number and diversity to
provide a substantial amount of useful
information about your evaluation
questions.
• The number and diversity of your
interviewees should be credible to the
project's stakeholders.
As you plan, always be prepared to add a few
interviews in case you find information that
should be pursued further. Your interviews may
uncover some exciting, unexpected responses
that you will want to explore further.
The ethics of interviewing require that you
provide introductory information to help the
interviewee decide whether or not to participate.
You can provide this information in writing,
but you must be sure the person reads and
understands it before you begin the interview. If
your project must be reviewed by an institutional
review board, you must follow its guidelines
for providing informed consent to interviewees.
However, with or without institutional review,
you should provide the following information to
your interviewees:
• The purpose of the interview and why their
participation is important;
• How their responses will be reported and to
whom;
• How you plan to protect the interviewee's
confidentiality;
• The risks and benefits of participation;
• The voluntary nature of their participation
and their right to refuse to answer questions
or withdraw from the interview at any time.
If you want to record the interview, explain what
will happen to the recording (e.g., who else will
hear it, how it will be discarded). Then gain
permission from the interviewee to proceed with
the recording.
Step Three talks about summarizing and
analyzing your interview data. In preparation for
this step, you should take reflective notes about
what you heard. These notes differ from the
notes you take during the interview to describe
what the participant is saying. Reflective notes
are taken shortly after the interview (preferably
within 24 hours) and include your commentary
on the interaction. Miles and Huberman [8]
suggest these memos should take from a few
minutes to a half hour. Some of the issues
you might include in reflective notes are the
following:
• What do you think were the most important
points made by the interviewee? Why
do you consider these important (e.g., the
respondent talked about the topic several
times, or no other interviewee mentioned
these points)?
• How did the information you got in this
interview corroborate other interviews?
• What new things did you learn? Were there
any contradictions between this interview
and others?
• Are you starting to see some themes
emerging that are common to the interviews?
• Be sure to add descriptive information about
the encounter: time, date, place, informant.
• Start to generate a list of codes with
each reflective note and write the codes
somewhere in the margins or in the corner of
your memo.
Miles and Huberman [8] also offer other
suggestions for these reflective notes:
• Was there any underlying "meaning" in what
the informant was saying to you?
• What are your personal reactions to things
said by this informant?
• Do you have any doubts about what the
informant said (e.g., was the informant not
sure how open he or she could be with you)?
• Do you have any doubts about the quality of
the information from other informants after
talking with this person?
• Do you think you should reconsider how you
are asking your interview questions?
• Are there other issues you should pursue in
future interviews?
• Did something in this interview elaborate
or explain a question you had about the
information you are collecting?
• Can you see connections or contradictions
between what you heard in this interview
and findings from other data (such as
surveys, interviews with people at other
levels of the organization, etc.)?
These notes may include things like themes
that seem to be emerging, questions that have
arisen during a specific interview, or conclusions
you may want to confirm at another interview.
This practice will make Step Three a little less
overwhelming.
Step Three
Summarize and Analyze Your Data — Qualitative Methods
As with quantitative data, you must develop a
plan for compiling and analyzing qualitative
data. Analysis may seem overwhelming
because of the sheer volume of the
information you collect, but it will seem more
manageable if you approach it in phases.
Plan. First, during planning and interviewing
keep the amount of data you collect under
control. As described in Step One, you should
check your interview guide against your
evaluation questions to make sure you only ask
questions that are relevant to your project. This
will prevent you from collecting unnecessary
data. As discussed in Step Two, you should
keep notes that will help you become familiar
with your data as you collect it.
Code. Once you have completed most of
your interviews, the next step is to code the
data. In this step, you identify, categorize,
and label the themes or patterns in your data.
Review your transcripts, reports, and notes,
indicating major themes in the margins. Make
a list of the themes as you read. You can
also read your notes keeping your evaluation
questions in mind. For instance, you may have
conducted interviews to learn how participants
in a training session are using the training
and whether they have recommendations
for improving future sessions. Therefore,
you may read through the notes looking for
examples that fit themes related to "results"
"unexpected outcomes," "barriers to project
implementation," and "suggestions for
improvement." It is perfectly acceptable to
have a list of themes ahead of time and to add
themes as you read.
Once you have reviewed the material and
generated a list of major themes, go back to
your documents and code more systematically.
You do this by identifying "units" of
information and categorizing them under one of
your themes. A unit is a collection of words
related to one main theme or idea and may
be a phrase, sentence, paragraph or several
paragraphs. You can tell you have too many
words if you need more than one major theme
to categorize the unit.
One simple approach to coding is to highlight
each unit of information using a different
color for each major theme. You can print
the data and use highlighting markers, but the
highlighting function of a word processing
program also works nicely.
Organize. Next, put all the units with the
same highlight color together on one page
with a heading that reflects the category they
represent. You might want to use bullets
to separate the different units. Now, read
through each list and see if you can find
subthemes. For instance, under results, you
might find "results affecting participants" and
"results affecting the community." You could
use the comment function in Word to note
these subthemes, but it might be easier to print
the list with a large right margin and write the
subthemes in the margins.
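The organizing step above, gathering every unit with the same code onto one page, can be sketched as a simple grouping operation. The coded units below are illustrative.

```python
from collections import defaultdict

# Coded units as (theme, text) pairs, as you might transcribe them
# from highlighted printouts. The units below are illustrative.
coded_units = [
    ("results", "The medicine got rid of her heartburn."),
    ("barriers", "People don't always want you to know things."),
    ("results", "Her sister has a good chance of surviving it."),
]

# Gather every unit with the same code onto one "page" per theme.
by_theme = defaultdict(list)
for theme, text in coded_units:
    by_theme[theme].append(text)

for theme, units in by_theme.items():
    print(theme.upper())
    for unit in units:
        print(f"  • {unit}")
```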
Figure 3: Coding Interview Data
The following section is from a fictional interview with a lay health adviser from a faith-based outreach
program. It has been coded using the highlighting method described in the text. The colors have the
following codes:
• uses of MedlinePlus
• outcomes
• barriers
• suggestions for improving the program
Interviewer: Describe some ways you have used MedlinePlus in your work here?
Respondent 1: This lady from the community came to see me because she was having terrible heartburn
- almost every day. We looked up heartburn on MedlinePlus.
Interviewer: What did you find?
Respondent 1: We found out there are better medicines than what she was taking and she did not have to
get a prescription. She talked to the pharmacist because she is on other medication, because MedlinePlus
said don't mix these pills with other pills. But the pharmacist told her it was okay for her to take them, but
that if the heartburn comes back she should see her doctor.
This woman said the medicine got rid of her heartburn almost immediately.
Interviewer: Do you have any other examples?
Respondent 1: There was a woman whose sister was diagnosed with breast cancer and she was so worried.
We read a little bit about it and found out that "stages" tell you how serious the cancer is. She went back
and asked her sister about her breast cancer and found out it was stage 1. That means her sister has a really
good chance of surviving it.
So this lady was so relieved.
Also, everyone was hearing about this bird flu and we were coming up on Thanksgiving. The ladies who
come to our Thursday brown-bag lunch meeting were saying they didn't know if they should serve turkey
this year. So the other lay health adviser and I printed some information off of MedlinePlus, passed it
around and we discussed it.
We discovered that bird flu is not in the United States, so we can have turkey for Thanksgiving as always!
Interviewer: Have you had any problems finding information for people?
Respondent 1: No, we can always find information on the topics people bring up. But sometimes people
don't want to tell us too much about their problems, especially if it is kind of a sensitive topic. We all know
each other around here, so people don't always want you to know things about them.
Interviewer: So how do you help them?
Respondent 1: We try to just show them in general how to search for a health topic, then give them privacy
with the computer. It works okay as long as they know a little bit about using a computer.
Interviewer: What kind of help could the librarian give you with getting MedlinePlus known in your
community?
Respondent 1: We have some new lay health workers starting in a month or so and she does a good job of
showing how to use MedlinePlus, so it would be good if she could come to some of their training sessions.
Transcript, Page 4
Figure 4: Organizing and Analyzing the Coded Data
The "Uses of MedlinePlus" theme has been organized onto one page and subthemes have been identified. A description is also provided for each theme and subtheme. Note that the interviewee is identified so that the coder can go back to read the original interview. You might also want to put the page number of the unit.
Code "Uses of MedlinePlus" Code Description: Uses of MedlinePlus by Health Advisors
• Respondent 1: This lady from the community came to see me because she was having terrible heartburn - almost every day. We looked up heartburn on MedlinePlus. [p4] (Subtheme: Learn about health problem)
• Respondent 1: We found out there are better medicines than what she was taking and she did not have to get a prescription. She talked to the pharmacist because she is on other medication, because MedlinePlus said don't mix these pills with other pills. But the pharmacist told her it was okay for her to take them, but that if the heartburn comes back she should see her doctor. [p4] (Subtheme: Learn about prescription drug)
• Respondent 1: There was a woman whose sister was diagnosed with breast cancer and she was so worried. We read a little bit about it and found out that "stages" tell you how serious the cancer is. She went back and asked her sister about her breast cancer and found out it was stage 1. That means her sister has a really good chance of surviving it. [p4] (Subtheme: Learn about a loved one's health problem)
• Respondent 1: Everyone was hearing about this bird flu and we were coming up on Thanksgiving. The ladies who come to our Thursday brown-bag lunch meeting were saying they didn't know if they should serve turkey this year. So the other lay health adviser and I printed some information off of MedlinePlus, passed it around and we discussed it. [p4] (Subthemes: Learn about current health topics; Get information for presentation)
• Respondent 1: We try to just show them in general how to search for a health topic, then give them privacy with the computer. It works okay as long as they know a little bit about using a computer. [p4] (Subtheme: Teach use of M+)
Notes: One of the projected outcomes of teaching lay health advisers about M+ was that people in the community would have better access to useful health information. Our interview with Respondent 1 gave us an idea of how the lay health advisers use M+. Respondent 1 used it one-to-one to help community members find information about health conditions and about drugs. She helped another person look up information about a family member's health condition. This is an important use of M+ because this woman was quite worried but she couldn't go to her doctor to ask about her sister's illness. Because she was not her sister's caretaker, she could not talk to her sister's doctor. Where else could she learn about breast cancer? The lay health workers also used the information to inform a group about a timely topic that has been in the news a lot. Finally, they tried to help community members who do not want to disclose their illness by just giving general instructions on how to use M+.
The process described here is just one of many
approaches that can be used. For instance, a
method using the "text-to-table" function in
Microsoft Word is described in a publication
at http://idde.syr.edu/Krathwohl/Chapter14/
Considerations.htm [9]. For complicated
projects involving a great deal of data there
are a number of software packages on the
market designed specifically for qualitative
data analysis, like ATLAS.ti (http://www.
atlasti.com) and NVivo 7 (http://www.
qsrinternational.com).
Interpretation. The interpretation stage
involves making sense of the data. The most
basic approach is to summarize the themes
that you identified in the data. (See "Notes"
in Figure 4.) Then, you could use some of the
following approaches to further analyze your
data:
• Write answers to some of your evaluation
questions like "What results did we
get?" "What worked well?" "What were
the challenges?" and "What can be
improved?"
• See if you can come up with a
classification scheme for your data. For
instance, you might be able to classify
your interview data into categories of how
MedlinePlus is used after training.
• The analysis might even involve some
counting. For instance, you might count
how many users talked about looking up
health information for themselves and
how many used it to look up information
for others. This will help you assess
which uses were more typical and which
ones were unusual. However, remember
these numbers are only describing the group
of people that you interviewed; they cannot
be generalized to the whole population.
• See if the themes differ by group. For
instance, you may find that users in
the health professions and general
public users value different features of
MedlinePlus.
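The counting idea in the list above amounts to tallying coded responses; the codes below are hypothetical. Remember these counts describe only the group you interviewed.

```python
from collections import Counter

# Hypothetical codes for how each interviewee said they used
# MedlinePlus after training: for themselves, for others, or to
# prepare a presentation.
uses = ["self", "others", "self", "self", "others", "presentation", "self"]

counts = Counter(uses)
print(counts.most_common())
# [('self', 4), ('others', 2), ('presentation', 1)]
```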
There are numerous approaches to analyzing
qualitative data. Two excellent resources
for beginners are "Analyzing Qualitative
Research" at the University of Wisconsin-
Extension Website [10] or Glesne's Becoming
Qualitative Researchers [11]. Qualitative
Data Analysis by Miles and Huberman [8]
also provides methods for analysis, although it
is a little more advanced.
Step Four
Assess the Validity of Your Findings — Qualitative Methods
As with surveys, you will need to assess the
validity of your interview data. Qualitative
researchers use the word "trustworthiness"
instead of validity, but the concept is the same.
Validity actually refers to the accuracy of the
data collection instrument. In interviewing,
you as the interviewer are the "instrument,"
so you need to assess the steps you took
to guarantee that the interview data you
collected is as thorough, accurate, inclusive
of all viewpoints, and unbiased as possible.
Following some of the steps listed below will
help you assess the validity of your findings:
• Be sure you can articulate the rationale
behind your sample.
• As you identify themes and patterns,
seek information that does not support
your findings. For instance, if you are
interviewing participants from an online
resource training project and getting
glowing responses, seek out some
interviewees who did not seem to get as
much from the training.
• Use multiple methods of data collection
and look for consistency. This is called
"triangulation." When you interview, you
should use at least one other source of
data and see if the sources corroborate one
another. For instance, you may compare
your data to some focus group data from
the same project. You do not have to
triangulate with other qualitative data. In
evaluation, it is not unusual to compare
interview findings with survey data.
• Have more than one person code and
analyze the data. Both coders should work
independently at first, then come together
to compare and discuss findings. The
coders are not likely to have identical
findings. However, there will be some
overlap in concepts and the dissimilarities
are likely to provide a more thorough
interpretation.
• Ask participants to read your
interpretations. They can tell you if you
are representing their views thoroughly
and accurately.
• Get an outsider to review your evaluation
data, data collection processes, and
methods to see if he or she agrees with
your conclusions.
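One simple way to quantify the overlap between two coders mentioned above is percent agreement: the share of units to which both coders assigned the same theme. The codes below are illustrative; chance-corrected statistics such as Cohen's kappa are more rigorous, but this gives a quick check before the coders meet to discuss differences.

```python
# Two coders independently assigned one theme to each of the same
# five units (codes are illustrative). Percent agreement is the
# fraction of units on which the coders chose the same theme.
coder_a = ["results", "barriers", "results", "suggestions", "results"]
coder_b = ["results", "barriers", "outcomes", "suggestions", "results"]

agreements = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = agreements / len(coder_a)
print(f"{percent_agreement:.0%}")  # 80%
```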
You can find more information about
validating your qualitative data in the
references listed at the end of Step Three [8, 10, 11].
Take Home Messages
Collecting and Analyzing Evaluation Data
Be prepared to mix qualitative and
quantitative data. Mixed approaches
often tell the whole story better than either
approach alone.
Quantitative methods are excellent for
exploring questions of "quantity": how
many people were reached; how much
learning occurred; how much opinion
changed; or how much confidence was
gained.
The two key elements of a successful
survey are a questionnaire that yields
accurate data and a high response rate.
With surveys, descriptive statistics usually
are adequate to analyze the information
you need about your project. Charting and
making comparisons also can help you
analyze your findings.
Qualitative methods are excellent for
exploring questions of "why": why your
project worked; why some people used the
online resources after training and others
did not; or why some strategies were more
effective than others.
A good interview study uses a purposeful
approach to sampling interviewees.
Analysis of interview data entails
systematic coding and interpretation of the
text produced from the interviews. Multiple
readings of the data and revised coding
schemes are typical.
In interviewing, you as the interviewer are
the "instrument," so you need to assess
the steps you took to guarantee that the
interview data you collected is as thorough,
accurate, inclusive of all viewpoints, and
unbiased as possible.
References
1. Burroughs C. Measuring the difference: guide to planning and evaluating health information
outreach. [Web document]. Seattle, WA: National Network of Libraries of Medicine, Pacific
Northwest Region, September 2000 [cited 26 June 2006].
2. The Joint Committee on Standards for Educational Evaluation. The standards for program
evaluation. Thousand Oaks, CA: Sage, 1994.
3. Cui WW. Reducing error in mail surveys. [Web document]. Practical Assessment, Research &
Evaluation 2003;8(18) [cited 14 June 2005].
4. Dillman DA, Tortora RD, Bowker D. Principles for constructing web surveys (technical report
98-50). [Web document]. Pullman, WA: SESRC, 1998 [cited 26 June 2006].
5. Armstrong JS. Monetary incentives in mail surveys. Public Opinion Quarterly 1975;39:111-116.
6. Bosnjak M, Tuten TL. Prepaid and promised incentives in web surveys. Social Science
Computer Review 2003;21(2):208-217.
7. Patton MQ. Qualitative research and evaluation methods. 3rd ed. Thousand Oaks, CA: Sage,
2002.
8. Miles MB, Huberman AM. Qualitative data analysis. 2nd ed. Thousand Oaks, CA: Sage, 1994.
9. Krathwohl DR. Considerations in using computers in qualitative data analysis. In: Methods of
educational and social science research: an integrated approach. [Web document]. Online
revision of 2nd ed., Chapter 14. Long Grove, IL: Waveland Press, 2005 [cited 6 October 2005].
10. Taylor-Powell E, Renner M. Analyzing qualitative research. [Web document]. Madison, WI:
University of Wisconsin Extension, 2003 [cited 4 October 2005].
11. Glesne C. Becoming qualitative researchers. 2nd ed. New York: Longman, 1999.
Appendix 1
Examples of Commonly Used Quantitative Evaluation Methods
Method: End-of-session evaluations or surveys
Examples of sources: Trainees; service recipients
Examples of information collected: Satisfaction with training; intentions of using the resources in the future; beliefs about the usefulness of the resources for various health concerns; confidence in skills to find information

Method: Tests (best if conducted before and after training)
Examples of sources: Trainees
Examples of information collected: Ability to locate relevant, valid health information; ability to identify poor quality health information

Method: Follow-up surveys (conducted some time period after training), using attitude or opinion scales (e.g., strongly agree, agree, etc.) or dichotomous scales (yes/no)
Examples of sources: Trainees; collaborative partners
Examples of information collected: Usefulness of resources for health concerns (becoming more informed about treatments, learning more about a family member's illness); use of resources as part of one's job; level of confidence in using the resource; sharing the resource with co-workers, family members, etc.; use and usefulness of certain supplemental products (listservs and special Websites)

Method: Records (frequency counts, percentages, averages)
Examples of sources: Website traffic information; attendance records; distribution of materials
Examples of information collected: Hits to Website; amount of participation on listservs; training participation levels; retention levels (for training that lasts more than one session); numbers of people trained by "trainers"; number of pamphlets picked up at health fairs

Method: Observations (absence/presence of some behavior or property; quality rating of behavior, Excellent to Poor)
Examples of sources: Trainee behavior; site characteristics
Examples of information collected: Level of participation of trainees in the sessions; ability of trainee to find health information for the observer upon request; number of computers bookmarked to resource Website; number of items promoting the resources made available at the outreach site (handouts, links on home pages)
Appendix 2
Ways To Improve Response Rates for Electronic Surveys
Electronic surveys provide an excellent alternative to mail or telephone surveys. In general, they
can be much less expensive. Companies like SurveyMonkey [http://surveymonkey.com] make
creating Web-based surveys easy for novices and fairly affordable. Research has provided some
insight into best practices for electronic surveys.
1. Carefully consider how the choice of electronic survey may affect response rates. Some
groups, like employees in an organization with Internet access, members of professional
organizations, or listserv participants may be computer-oriented and may prefer electronic
surveys. Others may have limited use of technology or choose not to use it.
2. Use the general principles of administering surveys described on page 11 in Figure 2. Send
a preliminary, personalized cover letter, alerting respondents to the coming web-based
survey. If possible, make sure the letter comes from someone they trust or like and make
sure the respondent can see the name without opening the email (such as in the "FROM" or
"SUBJECT" field). If people do not recognize the sender of an email message, they may not
open it.
3. Keep the survey as simple as possible so that it will load quickly.
4. Start with a simple, interesting question. Use recognizable formats (two-option questions;
rating scales) that look like questions respondents have seen on print surveys. Be sure that the
respondent can see each item and related responses on one screen.
5. Use question formats similar to those seen on written surveys.
6. Do not have items that force respondents to answer before they can move on to the next item.
Such items frustrate respondents and could cause them to quit before finishing.
7. Give instructions for the respondents with the least amount of computer experience. Some
people may not understand how to scroll for more questions, how to use drop-down boxes,
etc. If you find that the instructions take up too much space, consider different formats for
respondents with different levels of computer experience.
8. Use grouping mechanisms (like color or boxes) to help respondents connect questions and
responses.
9. Give participants an indication of the survey's length. When possible, put all questions on one
screen so respondents can see the length of the survey. For short surveys, put all questions
on one page. For surveys with multiple pages, use a "progress bar" available in many online
survey software packages or notations like (Page 1 of 6) on each page. In the introductory
screen, give information such as the number of total questions, number of screens, or
estimated time to complete the survey. If respondents tire of answering questions and see no
end in sight, they are likely to quit before finishing.
Source: Dillman DA, Tortora RD, Bowker D. [4]
Appendix 3
Examples of Commonly Used Qualitative Methods
Method: Interviews
Description: People with knowledge of the community or the outreach project are interviewed to get their perspectives and feedback.
Examples: Interviews with people who have special knowledge of the community or the outreach project; focus group interviews with 6-10 people; large group or "town hall" meeting discussions with a large number of participants.

Method: Field observation
Description: An evaluator either participates in or observes locations or activities and writes detailed notes (called field notes) about what was observed.
Examples: Watching activities and taking notes while a user tries to retrieve information from an online database; participating in a health fair and taking notes after the event; examining documents and organizational records (meeting minutes, annual reports); looking at artifacts (photographs, maps, artwork) for information about a community or organization.

Method: Written documents
Description: Participants are asked to express responses to the outreach project in written form.
Examples: Journals from outreach workers about the ways they helped consumers at events; reflection papers from participants in the project about what they learned; electronic documents (chats, listservs, or bulletin boards) related to the project; open-ended survey questions to add explanation to survey responses.
Tool Kit — Case Example
Using Mixed Methods
Part 1: Planning a Survey
A health science library is partnering with a local agency that provides services, support, and
education to low-income mothers and fathers who are either expectant parents or have children
up to age 2. The project will train agency staff and volunteers on search strategies for
MedlinePlus and the Household Products Database, with the goal of improving their ability to find
consumer health information for their clients. The objectives of the project are the following:
Objective 1: At the end of the training session, at least 50% of trained staff and volunteers will
say that their ability to access consumer health information for their clients has improved because
of the training they received.
Objective 2: Three months after the training session, 75% of trained staff and volunteers will
report finding health information for a client using MedlinePlus or Household Products.
Objective 3: Three months after receiving training on MedlinePlus or Household Products, 50%
of staff and volunteers will say they are giving clients more online health information because of
the training they received.
All staff and volunteers will be required to undergo MedlinePlus training conducted by a
health science librarian. Training will emphasize searches for information on maternal and
pediatric health care. The trainers will teach users to find information with Health Topics, Drug
Information, Directories, and Clinical Trials. The training will also include Household Products.
To evaluate the project outcomes, staff and volunteers will complete a survey three months
after training. Worksheet 1 demonstrates how to write evaluation questions from objectives, then
how to generate survey questions related to the evaluation questions. (This worksheet can be
adapted for use with pre-program and process assessment by leaving the objectives row blank.)
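Once the survey data are in, findings can be compared against the targets in the objectives with a short script. This is a minimal sketch: the responses below are invented for illustration, and percent_meeting is a hypothetical helper, not part of any survey package.

```python
from collections import Counter

# Invented post-training survey responses, one entry per respondent.
ability_improved = ["strongly agree", "agree", "neutral", "agree", "disagree",
                    "agree", "strongly agree", "agree", "neutral", "agree"]
found_info_for_client = ["yes", "yes", "no", "yes", "yes",
                         "yes", "no", "yes", "yes", "yes"]

def percent_meeting(responses, positive):
    """Percentage of respondents whose answer counts toward the objective."""
    counts = Counter(responses)
    return 100 * sum(counts[r] for r in positive) / len(responses)

# Objective 1 target: at least 50% agree their ability improved.
obj1 = percent_meeting(ability_improved, {"strongly agree", "agree"})
# Objective 2 target: 75% report finding information for a client.
obj2 = percent_meeting(found_info_for_client, {"yes"})

print(f"Objective 1: {obj1:.0f}% (target 50%) -> {'met' if obj1 >= 50 else 'not met'}")
print(f"Objective 2: {obj2:.0f}% (target 75%) -> {'met' if obj2 >= 75 else 'not met'}")
```

Tabulating responses this way also produces the counts needed for the descriptive-statistics tables recommended elsewhere in this booklet.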
Part 2: Planning an Interview
Six months into the training project, the team considered applying for a second grant to expand
training to clients. They decided to conduct a series of key informant interviews to explore
the feasibility of this idea. Worksheet 2 demonstrates how to plan an interview project. The
worksheet includes a description of the sampling approach, the evaluation questions to answer, and
some interview questions that could be included in your interview guide.
Blank versions of the worksheets used in the case example are provided for your use.
Tool Kit — Worksheet 1
Planning a Survey

Objective 1: At the end of the training session, at least 50% of trained staff and volunteers will say that their ability to access consumer health information for their clients has improved because of the training they received.

Evaluation Questions:
• Do staff and volunteers think the training session improved their ability to find good consumer health information?
• Did the training session help them feel more confident about finding health information for their clients?

Survey Questions:
• The training session on MedlinePlus improved my ability to find good consumer health information. (strongly agree/agree/neutral/disagree/strongly disagree)
• The training session on MedlinePlus made me more confident that I could find health information for the agency's clients. (strongly agree/agree/neutral/disagree/strongly disagree)

Objective 2: Three months after the training session, 75% of trained staff and volunteers will report finding health information for a client using MedlinePlus or Household Products.

Evaluation Questions:
• Did the staff and volunteers use MedlinePlus or Household Products to get information for clients?
• What type of information did they search for most often?

Survey Questions:
• Have you retrieved information from MedlinePlus or Household Products to get information for a client or to answer a client's question? (yes/no)
• If you answered yes, which of the following types of information did you retrieve? (check all that apply)
( ) A disease or health condition
( ) Prescription drugs
( ) Contact information for an area health care provider or social service agency
( ) Clinical trials
( ) Information about household products
( ) Other (please describe _)

Objective 3: Three months after receiving training on MedlinePlus or Household Products, 50% of staff and volunteers will say they are giving clients more online health information because of the training they received.

Evaluation Questions:
• Are staff helping clients get online health information more often now that they have had training on MedlinePlus or Household Products?
• What are some examples of how they used MedlinePlus or Household Products to help clients?

Survey Questions:
• The training I have received on MedlinePlus or Household Products has made me more likely to look online for health information for clients. (strongly agree/agree/not sure/disagree/strongly disagree)
• Since receiving training on MedlinePlus or Household Products, I have increased the amount of online health information I give to clients. (strongly agree/agree/not sure/disagree/strongly disagree)
• Give at least two examples of clients' health questions that you have answered using MedlinePlus or Household Products. (open-ended)
Tool Kit — Worksheet 2
Planning an Interview Project

Interview Group: Staff

Sampling Strategy:
• Agency director
• Volunteer coordinator
• 2 staff members
• 2 volunteers
• 2 health science librarian trainers

Evaluation Questions:
• How ready are the clients to receive this training?
• What are some good strategies for recruiting and training clients?
• How prepared is the agency to offer this training to their clients?
• Do the health science librarians have the skill and time to expand this project?

Sample Questions for the Interview Guide:
• What are some good reasons that you can think of to offer online consumer health training to clients?
• What are some reasons not to offer training?
• If we were to open the training we have been offering to staff and volunteers to clients, how likely are the clients to take advantage of it?
• What do you think it will take to make this project work? (Probe: recommendations for recruitment; recommendations for training.)
• Do you have any concerns about training clients?

Interview Group: Clients

Sampling Strategy: Six clients recommended by case managers:
• All interviewees must have several months of experience with the agency and must have attended 80% of sessions in the educational plan written by their case manager.
• At least one client must be male.
• At least one client should not have access to the Internet from home or work.

Evaluation Questions:
• How prepared and interested are clients to receive training on online consumer health resources?
• What are the best ways to recruit agency clients to training sessions?
• What are the best ways to train clients?

Sample Questions for the Interview Guide:
• When you have questions about your health, how do you get that information?
• How satisfied are you with the health information you receive?
• If this agency were to offer training to you on how to access health information online, would you be interested in taking it?
• What aspects of a training session would make you want to come?
• What would prevent you from taking advantage of the training?
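When client records are available electronically, screening candidate interviewees against sampling criteria like those in Worksheet 2 can be done in a few lines. A minimal Python sketch; the records and field names below are invented:

```python
# Invented client records for illustration; a real list would come from
# the agency's case-management files.
clients = [
    {"name": "A", "months_with_agency": 8,  "attendance": 0.90, "male": False, "home_internet": True},
    {"name": "B", "months_with_agency": 2,  "attendance": 0.95, "male": True,  "home_internet": True},
    {"name": "C", "months_with_agency": 6,  "attendance": 0.85, "male": True,  "home_internet": False},
    {"name": "D", "months_with_agency": 12, "attendance": 0.70, "male": False, "home_internet": True},
    {"name": "E", "months_with_agency": 9,  "attendance": 0.80, "male": False, "home_internet": False},
]

# Basic criteria: several months with the agency (here taken as 6 or more)
# and at least 80% attendance in the educational plan.
eligible = [c for c in clients
            if c["months_with_agency"] >= 6 and c["attendance"] >= 0.80]

# Diversity checks: the final sample needs at least one male client and at
# least one client without home or work Internet access.
has_male = any(c["male"] for c in eligible)
has_no_home_internet = any(not c["home_internet"] for c in eligible)

print([c["name"] for c in eligible])   # candidates meeting the basic criteria
print(has_male, has_no_home_internet)  # both must be True for the sample
```

A script like this only narrows the pool; the case managers' recommendations still determine who is actually invited.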
Tool Kit — Blank Worksheet 1
Planning a Survey
Objective
Evaluation Questions
Survey Questions
Objective
Evaluation Questions
Survey Questions
Objective
Evaluation
Questions
Survey
Questions
Tool Kit — Blank Worksheet 2
Planning an Interview Project
Interview Group
Evaluation Questions
Sampling Strategy
Sample Questions for the Interview Guide
Interview Group
Evaluation Questions
Sampling Strategy
Sample Questions for the Interview Guide
Tool Kit — Checklist
Checklist for Booklet Three: Collecting and Analyzing Evaluation Data

Consider whether your question is best answered using quantitative methods, qualitative methods, or both.

Quantitative Methods - Surveys

Step One: Design Your Data Collection Methods
□ Write evaluation questions that identify the information you need to gather.
□ Write survey questions that are directly linked to the evaluation questions.
□ Pilot test the questionnaire with a small percentage of your target group.
□ Have your methods reviewed by appropriate individuals or boards.

Step Two: Collect Your Data
□ Decide whether to administer the survey to a sample or to everyone in your target group.
□ Follow procedures known to increase response rates.
□ Write a cover letter to motivate and inform respondents.

Step Three: Summarize and Analyze Your Data
□ Summarize your survey data using descriptive statistics.
□ Organize your data into tables to help answer your evaluation questions.
□ If assessing outcomes, compare findings to targets in your objectives.
□ Write a brief description of the results.

Step Four: Assess the Validity of Your Findings
□ Critically review your data for shortcomings.
□ Candidly report to stakeholders how any shortcomings may affect interpretation.

Qualitative Methods - Interviews

Step One: Design Your Data Collection Methods
□ Write evaluation questions that identify the information you need to gather.
□ Write an interview guide using open-ended questions.
□ Pilot test the interview guide with one or two people from your target group.
□ Have your methods reviewed by appropriate individuals or boards.

Step Two: Collect Your Data
□ Design a purposeful data collection plan.
□ Include information to motivate and inform respondents.
□ After each interview, spend a few minutes writing notes about the interview.

Step Three: Summarize and Analyze Your Data
□ Read through your interview transcripts and notes to develop a code list.
□ Write a brief description of each theme.
□ Code all your interview data systematically.
□ Organize the coded text by code or theme.
□ Interpret the findings.

Step Four: Assess the Validity of Your Findings
□ Revisit the rationale behind your purposeful sample.
□ Look for data that disproves your conclusions or seems to contradict main themes.
□ Look for corroboration of your conclusions through other evaluation data.
□ Have two or more coders work on the same data and discuss different interpretations.
□ Ask participants to review your conclusions to see if descriptions are accurate and thorough.
□ Get an outside reviewer to look at the data and see if he or she agrees with your conclusions.
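The step of organizing coded text by code or theme can be sketched in a few lines of Python. The codes and interview excerpts below are invented for illustration:

```python
from collections import defaultdict

# Invented (code, excerpt) pairs, as might be produced while reading
# interview transcripts against a code list.
coded_excerpts = [
    ("readiness",   "Most of my clients already use the computers at the library."),
    ("barriers",    "A lot of clients don't have Internet access at home."),
    ("recruitment", "Flyers at the front desk worked well for our parenting classes."),
    ("barriers",    "Evening sessions are hard for parents of young children."),
    ("readiness",   "Clients ask me health questions all the time."),
]

# Group the excerpts by theme so each theme can be described and checked
# against the evaluation questions.
by_theme = defaultdict(list)
for code, excerpt in coded_excerpts:
    by_theme[code].append(excerpt)

for theme in sorted(by_theme):
    print(f"{theme} ({len(by_theme[theme])} excerpts)")
    for quote in by_theme[theme]:
        print(f"  - {quote}")
```

Grouping the quotes this way also makes the validity checks easier: contradictory excerpts within a theme stand out, and a second coder can review the same groupings.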