Measuring the Difference
Guide to Planning and Evaluating Health Information Outreach

Catherine M. Burroughs, M.L.S., Principal Author
National Network of Libraries of Medicine, Pacific Northwest Region
cburroug@u.washington.edu

Fred B. Wood, D.B.A., Project Officer
National Library of Medicine

September 2000

National Library of Medicine Cataloging in Publication

Burroughs, Catherine M. (Catherine Mary)
Measuring the difference: guide to planning and evaluating health information outreach / Catherine M. Burroughs, principal author; Fred B. Wood, project officer. Seattle, Wash.: National Network of Libraries of Medicine, Pacific Northwest Region; Bethesda, MD: National Library of Medicine, [2000].
"September 2000."
Includes bibliographical references.
1. Information Services--standards. 2. Community-Institutional Relations. 3. Planning Techniques. 4. Program Evaluation--methods. I. Wood, Fred B. II. National Network of Libraries of Medicine (U.S.). Pacific Northwest Region. III. National Library of Medicine (U.S.). IV. Title.
02NLM: Z675.M4 B972m 2000

Additional copies can be obtained from:

National Network of Libraries of Medicine, Pacific Northwest Region (NN/LM, PNR)
Box 357155, University of Washington
Seattle, Washington 98195-7155
nnlm@u.washington.edu
www.nnlm.nlm.nih.gov/pnr

Development of this publication was a collaborative effort between NN/LM, PNR and:

Office of Health Information Programs Development
National Library of Medicine
8600 Rockville Pike
Bethesda, Maryland 20894
www.nlm.nih.gov

Table of Contents

Preface
Foreword
Acknowledgments
Introduction
  How is this document organized?
  What are the benefits of evaluation?
  How realistic is planning and evaluation for small scale outreach programs?
  Why are health behavior theories important?
  Challenges for evaluation
Stage 1: Conducting a Community Assessment
  Identify the targeted community
  Conduct a community assessment
  Obtain user input
    Methods of data collection
  Utilize results
  Tool Kit
    References
    Selected readings
    Tips for questionnaire development
    Gowan Library case example
Stage 2: Developing Goals and Objectives
  Setting goals
  Identifying objectives based on outcomes and indicators
    Indicators
  Constructing objectives
    Process Objectives
    Educational Objectives
    Behavioral and Environmental Objectives
    Program Objectives
  Tool Kit
    References
    Goals and objectives workform
    Gowan Library case example
Stage 3: Planning Activities and Strategies
  Theories about behavior change
    Social Learning Theory
    Extended Parallel Process Model (EPPM)
    Stages of Change Model
    Diffusion of Innovations Theory
    Community Organization
  Planning for activities
  How does an audience assessment fit in?
  How is an audience assessment conducted?
  Tool Kit
    References
    Selected readings
    Sample outreach strategies
    Planning outline workform
    Task list workform
    Gowan Library case example
Stage 4: Planning Evaluation
  Developing an evaluation plan
  Establishing evaluation objectives
    Process (formative) evaluation objectives
      Accountability
      Program improvement
      Replication
    Summative evaluation objectives
      Overall program effectiveness
      Program effects - What else happens as a result of outreach?
  Evaluation Methods
    Quantitative method
    Qualitative method
  Selecting an evaluation design
    Experimental design
    Quasi-experimental design
    Non-experimental design
  How much evaluation is feasible?
  Tool Kit
    References and selected readings
    Workform for process evaluation objectives
    Gowan Library case example
Stage 5: Gathering Data and Assessing Results
  What does evaluation measure?
  Methods of data collection
  Quality of data collection
    Reliability
    Validity
    Cultural appropriateness
  Data analysis
    Coding
    Quality control
    Types of analysis
    T-tests
    Univariate analysis
    Bivariate analysis
    Multivariate analyses
  Tool Kit
    References and selected readings
    Workform for measuring process
    Workform for measuring outcomes
    Gowan Library case example
Stage 6: Utilizing and Reporting the Results
  Utilize results
  Report preparation
  Report structure
  Dissemination of results
  Tool Kit
    References
    Gowan Library case example
Appendices by Stage
Stage One
  A. Online Access Survey
  B. Question Formats
  C. Sampling
Stage Two
  D. Sample Goals and Objectives
Stage Three
  E. Diffusion of Innovations Theory
  F. Self Efficacy Measure
  G. Sample Measures for Behavior Change Theories
  H. Audience Assessment
  I. Sample Planning Outline
  J. Sample Task List
Stage Four
  K. Sample Process Evaluation Objectives
  L. Sample Ways to Measure Process
  M. Sample Exit Questionnaire
  N. Sample Ways to Measure Outcomes
  O. Sample Measures of Behavior Outcomes
General
  P. Bibliography
Index

Preface

The National Library of Medicine (NLM) maintains an enduring interest in and places great value on evaluation as a tool to enable important management decisions and to assess the quality and impact of its programs and services. Some noteworthy examples:

• In the early 1980s, NLM closed the card catalog, and management was faced with the decision to install one of two very early online systems. A comparative evaluation was undertaken in the reading room as a controlled field experiment; one system was found preferable, and it provided exceptional service to our users and staff for many years (1).

• In the late 1980s, NLM helped usher in the era of CD-ROM technology with nationwide field tests in library and clinical settings. Countless new end-users had their first introduction to easy MEDLINE searching (2).

• At about the same time, NLM adapted a novel methodology, the Critical Incident Technique, once used to evaluate the performance of World War II bomber pilots. In the present instance, the intent was to document and assess the impact of using MEDLINE-derived information on professional activities, especially on clinical decisions and patient outcomes. We found that MEDLINE does, indeed, make a difference (3).

• NLM has sponsored the development of evaluation frameworks for telemedicine and for health information privacy (4), and has asked its contractors to apply these frameworks where appropriate (5).

During this past decade, outreach to underserved populations, including those in minority or rural communities, became one of NLM's highest priorities. Yet effectively evaluating outreach has also been one of our toughest challenges. A five-year review carried out in the mid-1990s of literally hundreds of outreach projects had among its recommendations that "NLM and the Regional Medical Libraries (RMLs) should work together to develop further expertise in evaluation methodology... [and that]... evaluation components should be an integral part of all NLM-sponsored outreach" (6).

With this objective in mind, NLM and the Pacific Northwest Regional Medical Library, along with a stellar group of advisors, undertook to develop an evaluation guide for the health sciences library community. The underlying theme is that planning and evaluating an outreach initiative is one and the same process, and that asking the right questions at the beginning is essential for getting useful results at the end.
Moreover, the guide would be practical in purpose, theory-based, and offer a range of methodological possibilities and strategies that can be adapted to the most simple or complex of outreach projects. Not an easy task. To what extent we have succeeded remains to be evaluated. We hope that the guide will be used in the field, as a true "field manual," by RML and other librarians, health information professionals, and, in general, persons from the varied organizations that conduct outreach to users of health information. The "field" that we have in mind ranges from rural to urban to inner city, and spans a diversity of racial, ethnic, and cultural community settings. We very much need and welcome your feedback on use of the guide.

Elliot R. Siegel, Ph.D.
Associate Director for Health Information Programs Development
National Library of Medicine

Donald A.B. Lindberg, M.D.
Director
National Library of Medicine

References

1. Siegel, E.R., Kameen, K., Sinn, S.K., and Weise, F.O.: A comparative evaluation of the technical performance and user acceptance of two prototype online catalog systems. Information Technology and Libraries. March 1984, 3(1), 35-46.

2. Rapp, B.A., Siegel, E.R., Woodsmall, R.M., and Lyon-Hartmann, B.: Evaluating MEDLINE on CD-ROM: An overview of field tests in library and clinical settings. Online Review. June 1990, 14(3), 172-186.

3. Lindberg, D.A.B., Siegel, E.R., Rapp, B.A., Wallingford, K.T., and Wilson, S.R.: Use of MEDLINE by physicians for clinical problem solving. Journal of the American Medical Association. June 23/30, 1993, 269(24), 3124-3129.

4. Marilyn J. Field, Editor, and Committee on Evaluating Clinical Applications of Telemedicine, Institute of Medicine, Telemedicine: A Guide to Assessing Telecommunications in Health Care (Washington, DC: National Academy Press, 1996); Committee on Maintaining Privacy and Security in Health Care Applications of the National Information Infrastructure, Computer Science and Telecommunications Board, National Research Council, For the Record: Protecting Electronic Health Information (Washington, DC: National Academy Press, 1997).

5. See NLM National Telemedicine Initiative, http://www.nlm.nih.gov/research/telemedinit.html.

6. Wallingford, K.T., Ruffin, A.B., Ginter, K., Spann, M.L., Johnson, F.E., Dutcher, G.A., Mehnert, R., Nash, D.L., Bridgers, J.W., Lyon, B.J., Roderer, N.K., and Siegel, E.R.: Outreach activities of the National Library of Medicine: A five-year review. Bulletin of the Medical Library Association. April 1996 (Supplement), 84(2), 1-60.

Foreword

Health science librarians strive to ensure that health professionals and those who use health care services are knowledgeable about health information resources, and that anyone who needs access to library services can get it. This endeavor often requires reaching out to groups who are not our typical users. However, after conducting an outreach program we are often left wondering what impact, if any, we have had. In the absence of a comprehensive guide to outreach planning and evaluation, each of us is left to develop our own strategies. The result is published studies whose outcomes cannot be compared. National Network of Libraries of Medicine staff, with outreach as a core mission, have been especially concerned about this gap for a number of years.
Recognizing this need, in 1997 the National Library of Medicine began a collaborative project with the Pacific Northwest Regional Medical Library to conduct a multidisciplinary study of outreach planning and evaluation. Elliot Siegel, National Library of Medicine's Associate Director for Health Information Programs Development, provided the impetus for this work. He and Fred Wood, project officer, provided leadership in the conceptualization and realization of the study and the development of this guide.

A multidisciplinary expert advisory committee provided content as well as assisted with the development process. All Pacific Northwest Regional Medical Library staff contributed to the refinement and testing of the guide. Catherine Burroughs, librarian with the Pacific Northwest Regional Medical Library and principal author of the guide, directed the project. She took a vision of what we wanted to achieve and shaped it into reality. Demonstrating a special interest in this area, Catherine is now training and consulting about planning and evaluating outreach programs.

We hope that this guide will prove helpful to librarians and others engaged in health information outreach activities, and we look forward to hearing about your experiences using it. We thank all who contributed to this work.

Sherrilynne Fuller, M.L.S., Ph.D.
Director, National Network of Libraries of Medicine, Pacific Northwest Region
Director, Health Sciences Libraries and Information Center, University of Washington

Neil Rambo, M.L.S.
Associate Director, National Network of Libraries of Medicine, Pacific Northwest Region

Acknowledgments

The National Library of Medicine (NLM) conceived, funded, and oversaw the study conducted by the National Network of Libraries of Medicine, Pacific Northwest Region (NN/LM, PNR) upon which this manual is based. An integral part of the NLM's vision was to convene a group of 18 national experts to advise on its development and content. Among the advisory group, seven contributed white papers that review best practices and research in their fields most relevant to the mission and goals of health information outreach among minority communities. For full-text versions of each paper, see http://www.nnlm.nlm.nih.gov/pnr/eval/reviews.html. Much of this manual is based on these white papers as well as on feedback from the entire advisory panel, invited reviewers, and NLM and NN/LM, PNR staff. This work was partially supported by funding from the National Institutes of Health Evaluation Set-Aside Program.

Advisory Panel
Sherrilynne S. Fuller, Ph.D., Chair (University of Washington)
George D. Baldwin, Ph.D. (California State University, Monterey)
Judith Bendersky, M.P.H. (Alaska Regional Assistance Center/SRCC)
John E. Bowes, Ph.D. (University of Washington)
Laura Cailloux (Skagit Valley Community College)
Josephine Dorsch, M.A.L.S. (University of Illinois at Chicago)
William H. Dutton, Ph.D. (University of Southern California)
Mark P. Haselkorn, Ph.D. (University of Washington)
Walter B. Hollow, M.D. (University of Washington)
Peter J. House, M.H.A. (University of Washington)
Carol G. Jenkins, M.L.S. (University of North Carolina, Chapel Hill)
Stephen T. Kerr, Ph.D. (University of Washington)
Frances Marcus Lewis, Ph.D. (University of Washington)
Ted Mala, M.D. (Association of American Indian Physicians)
Joanne G. Marshall, Ph.D. (University of North Carolina, Chapel Hill)
Carrie Paton (Alaska Native Medical Center, Anchorage)
Everett M. Rogers, Ph.D. (University of New Mexico)
Kim Witte, Ph.D. (Michigan State University)
Invited Reviewers
Deborah Lines Andersen, Ph.D. (University at Albany, State University of New York)
James Andrews, Ph.D. (University of Kentucky)
Lynda Baker, Ph.D. (Wayne State University)
Marion Ball, Ed.D. (First Consulting Group)
Carol Barry, Ph.D. (Louisiana State University)
Donald Case, Ph.D. (University of Kentucky)
Keith Cogdill, Ph.D. (University of Maryland)
Pat Fisher, Ph.D. (University of Tennessee)
Peter Hernon, Ph.D. (Simmons College)
Carol Hert, Ph.D. (Syracuse University)
Silvia Patrick, Ph.D. (San Jose State University)
Verna Pungitore, Ph.D. (Indiana University)
Paul Solomon, Ph.D. (University of North Carolina at Chapel Hill)
Bor-sheng Tsai, Ph.D. (Pratt Institute)
Barbara Wildemuth, Ph.D. (University of North Carolina at Chapel Hill)

National Library of Medicine
Ione Auston, M.L.S.; Becky Lyon, M.L.S.; Nancy Roderer, M.L.S.; Angela Ruffin, Ph.D.; Mary Beth Schell, M.L.S.; Elliot Siegel, Ph.D.; Karen Wallingford, M.L.S.; Fred Wood, D.B.A.

National Network of Libraries of Medicine, Pacific Northwest Region
Susan Barnes, M.L.S.; Maryanne Blake, M.S.L.S.; Michael Boer; Catherine Burroughs, M.L.S.; Del Chafe; Pat Chinn-Sloan; Patricia Devine; Sherrilynne Fuller, Ph.D.; Linda Milgrom, M.S.L.S.; Nancy Press, M.Libr.; Neil Rambo, M.L.S.; Roy Sahali

[Flow chart: Planning and Evaluating Outreach]
Stage One: Identify target community; conduct community assessment.
Stage Two: Establish goals and objectives.
Stage Three: Develop activities and strategies; conduct audience assessment to tailor strategies; develop a plan to implement activities.
Stage Four: Establish evaluation objectives (process or summative); select design and data collection methods; develop evaluation instruments.
Stages Four and Five: Carry out plan to implement outreach activities; carry out plan for process evaluation to monitor and improve activities.
Stage Five: Carry out plan for summative evaluation to assess progress and impact; analyze results for answers to evaluation objectives.
Stage Six: Share results and modify program as needed.

Introduction

Health information outreach programs are based on the commonly held assumption that access to information results in improved delivery of health care. Even as electronic technologies continue to advance, adequate access to and exchange of health information are far from universal, especially among minority and underserved populations and the health providers who serve them. Thus, the overall goals of outreach are to build the capacity of the individual, organization, or community to use health information resources effectively and to address problems and barriers to accessing them.

Many types of institutions share goals to bridge the health information gap through outreach activities, including community organizations, churches, social service agencies, and public libraries, as well as hospitals, clinics, and health sciences libraries. This guide presents ideas for planning and evaluating these outreach programs to help improve and document their success.

How is this document organized?

This guide presents a programmatic and goal-oriented approach to outreach, in which activities are directed toward the accomplishment of thought-out goals and objectives. A fundamental premise of this approach is that evaluation is an integral part of program development, beginning with an understanding of the needs and perspectives of the targeted audience and the priorities for outreach considered most important.
Priorities might be difficult to shape because it seems that there is so much to be done. However, outreach programs cannot do everything; by setting a strategic direction and incorporating evaluation into the process, activities are leveraged for effective impact.

There are several stages in planning and evaluation that contribute to the process called program development. Some textbooks describe program development as 1) identifying a target audience and conducting a community needs assessment, 2) developing written goals and objectives, 3) implementing activities to accomplish those objectives, and 4) evaluating the overall quality and success of those activities vis-a-vis the stated objectives. However, the implication of this model should not be that evaluation occurs only after the program has started or, worse yet, after it is completed. Evaluation starts with assessing and understanding audience needs, which becomes the cornerstone for setting goals and objectives, from which activities and strategies are determined, upon which their implementation is monitored for progress, and finally their ultimate impact is assessed. The six stages described in this manual show how the various phases of evaluation are integrated into the whole process of planning and implementing outreach activities. Please refer to the flow chart Planning and Evaluating Outreach for an overview. Various "tool kits" are provided at the end of each stage, such as lists of additional resources, fill-in-the-blank workforms, and a case example about the fictitious Gowan Library outreach program that illustrates key points of the respective stage.

What are the benefits of evaluation?

Evaluation research has been done in several outreach programs (1), mostly to assess needs and improve practice. This manual adds an emphasis on outcomes-based evaluation to determine what changes have been effected. That is, even if evaluation shows that activities are implemented and processes are monitored and perhaps even improved, what is accomplished as a result of all that work? Tracking outcomes helps answer that "so what?" question.

Overall, evaluation helps programs refine and sharpen their focus; provide accountability to funders, managers, or administrators; improve quality so that effectiveness is maximized; and better understand what is achieved and how outreach has made a difference. Limited attention to evaluation can result in continuation of outreach activities that are ineffective or inefficient, failure to set priorities, or an inability to demonstrate to funding agencies that the outreach activities are of high quality. It's true that planning requires time and resources, and evaluation adds another layer to that process. But the time and effort spent to do even a minimum of planning and evaluation will provide many benefits.

How realistic is planning and evaluation for small scale outreach programs?

The scale of work implied in the planning and evaluation process may seem daunting or unrealistic for settings with limited resources and staff. In reality, there are different levels of expectations that planners can assume when using this manual. It is not intended as a prescription for what must be done to plan and evaluate a program.
Even when comprehensive evaluation is not feasible, an understanding of the basic principles involved in all phases of planning and evaluation can help direct useful small scale assessments so they derive many of the benefits evaluation has to offer. Just the steps of identifying the target audience and prioritizing the program goals and objectives with input from the community will help ultimate effectiveness. Developing several objectives that address 1) what outreach will do (e.g., conduct x number of workshops) and 2) the effect these activities may have in changing information seeking behaviors will help maintain a clear focus. Baseline data about skills, attitudes, knowledge, or beliefs can be compared with post-outreach data on the same variables. Gathering data after outreach has been completed will be important to understand sustained impact.

Thus, with a basic roadmap to evaluation, there is much discretion left to planners about what will be useful and doable in their specific programs. For example, one might choose not to evaluate the skill, attitude, knowledge, and behavior change outcomes resulting from every outreach activity. Rather, several representative activities might be selected to get an overall impression of results.

It is also not necessary to use this planning and evaluation manual only when beginning a new program or selecting a new audience. It could be a guide for reassessing what you are currently doing: the audience you are targeting and the program goals and objectives you may be following, if only informally. For example, one outreach program decided to re-evaluate the audience they assumed to be part of their target community after conducting a very informal and non-rigorous poll of visitors to exhibit booths at several conferences over the course of a year. There was a consistent finding that the majority of visitors already knew about PubMed, though they were interested in updates or improved skills. While improving skills is a valid outreach objective, the staff began to rethink whether the awareness-raising objectives primary to exhibit activities were being well executed with these audiences. Perhaps there was a need to retarget the types of conferences chosen for future exhibits.

Why are health behavior theories important?

In Stage Three, this manual introduces several theories from the fields of health education and health communications that explain what can motivate or influence changes in behaviors, including:

• Social Learning Theory
• Extended Parallel Process Model
• Stages of Change Model
• Diffusion of Innovations Theory
• Community Organization

The premise for introducing these theories is that successful outreach requires sustained adoption of new information seeking behaviors by the targeted audience. Thus, outreach often involves interventions (i.e., activities) to influence and change attitudes, skills, and behaviors in using electronic health information systems and resources. Outreach studies have already identified several barriers to effective use of electronic information sources, and ways that successful outreach can increase certain skills and motivate sustained use of those skills. Behavior change theory enhances that knowledge by explaining the factors that shape behavioral action. Outreach planners need not be experts in the theories introduced here, but the principles discussed can be effectively used in both planning and evaluating outreach activities.
According to Witte, the key to successful outreach activities is the use of a theory to guide the intervention and evaluation. Theories cut the guesswork, increase efficiency, and allow one to understand why an intervention is or is not working (2).

Challenges for evaluation

The evaluation designs, methods, and tools described in this guide are meant to provide an overall picture of what can be involved in an evaluation process. There will be exceptions and difficulties in carrying out or using some of the methods and techniques. For example, the rigor required for experimental designs with randomized control groups will be beyond the resources or need of most projects. However, a discussion of the experimental design, with comparison to less rigorous approaches, is provided as a point of departure for those who can apply it to their situations. Similarly, though surveys are frequently used in evaluations and needs assessments, other types of data collection (such as focus groups, interviews, or feedback forms) may be appropriate depending on the purposes of the research. Developing and conducting survey research is resource intensive, especially when statistical validity is crucial to obtaining data truly representative of the targeted population. If exploratory research is the focus (such as getting a better understanding of an audience or pilot testing a new program), making generalizations from a sample survey to the larger population will probably not be necessary or appropriate.

References

1. Marshall JG. A review of health sciences library outreach and evaluation. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/marshall.html, 1997.

2. Witte K. Theory-based interventions and evaluation of outreach efforts. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/witte.html, 1998.

Stage 1: Conducting a Community Assessment

Topics
• Identify Target Community
• Conduct a Community Assessment
• Obtain User Input
• Methods of Data Collection
• Utilize Results

Figures
Figure 1: Sample Focus Group Questions

Tool Kit
• References
• Selected Readings
• Tips for Questionnaire Development
• Gowan Library Case Example

[Flow chart: Stage One - Conduct Community Assessment]

Identify target community
What to find out: What populations do you serve? Which communities are most in need? Which communities can you best reach and influence? Etc.
How to find out: Analyze demographics, health status, and patterns of health care. Use secondary sources. Use national and local data sources. Ask stakeholders.

Conduct community assessment
What to find out: Who are the users of health information? What health information is needed and used? What are the barriers to getting the information? Can outreach help? How? Etc.
How to find out: Consult the literature. Work with the community:
• Get feedback from key contacts and leaders.
• Ask users directly.
• Conduct exploratory interviews or focus groups.
• Distribute questionnaires.
Considerations for questionnaires:
• Can you adopt questions from tested surveys?
• If a new survey is needed, how will you use the results?
• Will you want to generalize? If so, consider developing a valid and reliable survey instrument.
Analyze results
Review the results when setting the agenda of goals and objectives.

Although the term "outreach" is used frequently in the library and information science literature, it is by definition not limited to a library setting. Instead, outreach tends to be defined by the specific activities undertaken by librarians and others vested in the public's social and health well-being as they attempt to reach beyond the boundaries of their traditional on-site services and address the problems or needs of a targeted clientele (1). The general public, as well as the personnel and organizations that create the public's social and health network, share the need for access to quality health information. The growing capability of electronic information storage and retrieval technologies has helped surpass the boundaries of traditional information services delivered within library walls. However, the availability of electronic health resources also creates a need for outreach activities to promote, train, and facilitate online health information access, exchange, and use.

A basic assumption of this guide is that outreach activities are most effectively planned and conducted when based on an overall outreach program. That means that specific outreach efforts are parts in a "package" of activities that together are intended to produce a specific result. To be successful, outreach programs require goals and objectives combined with methods for satisfying the objectives and thereby reaching the goals (2). The methods selected to reach outreach objectives might include activities such as the following:

• Promoting a local public library as a place to find health information through resources such as MEDLINEplus;
• Staffing an exhibit to promote health information resources at an annual meeting of environmental health officers, public health nurses, veterinarians, school nurses, podiatrists, optometrists, physicians, nurses, or other health professional groups;
• Developing a cooperative effort among partner organizations to create a website with links to local health resources and other reputable medical Web sites;
• Conducting train-the-trainer programs for health care and social services personnel who will teach their patients, students, or clientele effective skills in accessing health information;
• Assisting with Internet connectivity and training for a migrant worker clinic, long term care facility, or community agency;
• Assisting Hispanic American or American Indian/Alaska Native communities to improve technology infrastructure and learn self-sustaining health information skills.

The activities listed above share the common goal of facilitating effective access, use, and exchange of health information for health providers and the public. Reaching this goal requires objectives to develop or improve individuals' information seeking skills. Theories that help reach these types of objectives are described in Stage 3. But skills will not be adopted as information seeking behavior unless accompanied by conditions that help sustain or support their use, such as convenient access to relevant and valued information resources and the support or influence of gatekeepers, opinion leaders, or peers in the work or community environment. Outreach programs thus are more effective if objectives to improve the information seeking skills of individuals are accompanied by objectives addressing the social or environmental factors in the community that may facilitate or impede access.
For example, conducting training classes for an audience without understanding the value that their social or work environment places on computerized resources, or without building a foundation of technical capability (e.g., adequate hardware or connectivity with onsite or local expertise), will introduce search skills that are unlikely to be sustained. The outreach planning process thus begins with a community assessment to understand the context of the group being reached, and to develop mutual goals for ways that outreach can help.

Stage 1 of outreach program development includes the process of identifying and discovering the needs of a targeted community, referred to as a community assessment. This process is a critical beginning to planning and evaluating a health information program, as it sets the stage for developing overall program goals and objectives. A community assessment provides answers to questions such as:

• What will be the target community?
• What are the health information needs of that community?
• What are their access problems and needs?
• What problems should have the highest priorities?
• What groups within the community can outreach best reach and influence?

For the health information outreach planner, a community assessment helps test, revise, or refine assumptions about the need for and priorities of the program. Outreach programs that do not conduct community assessments are basing their activities on what is assumed to be needed, not necessarily on what is most needed.

Note to the reader: Another form of assessment, the audience assessment, is discussed in Stage 3. The difference between a community and an audience assessment is purpose and scope. The community assessment helps set the stage for determining the goals and objectives of an overall program that might include any number of outreach activities. An audience assessment, conducted prior to a specific outreach activity, gathers data about the specific information needs, behaviors, and attitudes of the activity participants (e.g., registrants for a training workshop). Data from the audience assessment help refine the content and strategies used in promoting and conducting that activity.

Identify the Target Community

Before developing a community assessment, a decision needs to be made about which community will receive outreach. A community represents a group of individuals who share functional or structural characteristics. Functional characteristics are non-geographic, such as age, occupation, culture, or special interest (e.g., a health condition). Structural communities are organized by spatial boundaries, such as an inpatient hospital setting, neighborhood, parish, or ghetto; or legally established communities, such as a village, town, city, county, state, or nation (3).

Before narrowing to a community, first consider the population your organization serves. For example, the populations served by a public library can be defined by the demographics of the library service area. Clientele served by a hospital library may include hospital staff and patients, as well as the public in the hospital's local area. Organizations with state or regional responsibilities will cover a wide range of populations within a large geographic area.

Given the probability that the population(s) served by your organization are numerous or large, the next step is to prioritize the communities most in need of outreach. Populations that would likely benefit from improved access to and use of health information resources include those experiencing a disproportionate lack of access to health services or those at risk for health disparities, such as AIDS. You can identify communities lacking access to health services by minority or socioeconomic status, such as ethnic and cultural communities, sexual minorities, or low income communities in rural or urban areas.

To discover the populations most in need, you can avoid wasting time and resources on extensive data collection efforts by finding out what is already known. Depending on the scope of the population your library serves, socioeconomic and health status data might be found in city, county, regional, state, or Federal health sources (e.g., look for federally designated Medically Underserved Areas).
Populations that would likely benefit from improved access to and use of health information resource include those experiencing a disproportionate lack of access to health services or those at risk of health disparities, such as AIDS. You can identify communities lacking access to health services by minority or socioeconomic status, such as ethnic and cultural communities, sexual minorities, or low income communities in rural or urban areas. To discover populations most in need, you can avoid wasting time and resources on extensive data collection efforts by finding out what is already known. Depending on the scope of population your library serves, socioeconomic data and health status might be found in city, county, regional, state, or Federal health sources (e.g. look for federally designated Medically Underserved Areas). National health data Conducting a Community Assessment 3 sources provide a general idea of the extent and pattern of healthcare, including the availability of manpower and the organization of service delivery. Health status indicators allow you to compare national with state averages to obtain an overall picture of the health disparities most prevalent in your state or region. Once you have identified the populations you serve that will likely benefit from outreach, establish priority community(ies) you might target. As defined above, the term "commu- nity" is broad and can be defined by common interests or by spatial or legal boundaries. The communities you choose for outreach may be the social and health occupations that target underserved populations or populations with health disparities, such as: • rural primary care professionals • school nurses • health or school educators • local agency personnel • health promotion departments • state and local health departments • community health associations If you include the public as part of your service population, the communities you choose may be health consumers in underserved neighborhoods or rural areas, or those individuals that have or are at risk for the health disparities prevalent in your state or regional populations. With a list of candidate communities, consider which of these can you most effectively reach. Think about your potential strengths and weaknesses of working with each community. What do you have to offer that will be relevant to their situation and need? What are the types of organizations that address the communities' health concerns? What key groups will be important targets or partners in your efforts? Selecting the community(ies) for your outreach efforts is an important first step in planning your outreach efforts. A reasonable and rationale approach does not mean extensive research, but will require some thinking about where you are both most needed and can be most effective. Part of the final selection decision will include matching available time, resources and staff with the level of outreach effort that is needed. Example: The medical library at a large state university received funding to extend its out- reach to health providers throughout the state. Realizing a systematic approach toward plan- ning and evaluating this effort would benefit the program, the library decided to prioritize the candidate communities for outreach. First, they reviewed the goals of the funding agency which were to bring all health professional within easy reach of health information resources, espe- cially those that do not currently have direct access. 
With this in mind, outreach staff reviewed population areas in the state that have low socioeconomic status and are designated Medically Underserved Areas (MUA). Several parts of the state are considered MUA, and the library needed to select among them. Staff then consulted morbidity/mortality rates for indicators of poor health status and narrowed their choice to the underserved area containing the county with the state's highest incidence of several poor health status indicators, including AIDS and tuberculosis. Health provider communities that address these health issues were identified as primary care providers, local public health workers, and school nurses. With these candidate communities narrowed by health issue and geographic location, staff decided to target primary care providers in clinics designated as Community Health Centers under section 330 of the Public Health Service (PHS) Act.

Conduct a Community Assessment

With a community identified for outreach, a community assessment will provide a deeper understanding of the needs and problems an outreach program might address and the intermediaries to work with. A primary objective in conducting a community assessment is to develop a mutual agreement with the community about the types of outreach activities needed and the hoped-for outcomes. To begin, establish a broad understanding of the targeted group of health information users and their environment, including:

• Type of health information needed and for what purpose
• Numbers and types of health providers
• Sources of information used
• Availability and adequacy of information technology and infrastructure
• Availability and adequacy of information services
• Environmental, political, or social barriers to technology or information use

The literature is an excellent resource when researching a community's information needs. Chimoskey studied rural physicians in the state of Washington to determine use of MEDLINE (4). Dorsch cites several studies that specifically address the information needs of rural health professionals (5). Marshall lists studies of the information needs of a variety of health professionals, including nurses in the work environment, physicians in office practice, and primary care physicians and their opinion leaders (1), (6). Baird et al. published an annotated bibliography about the needs assessments of health professionals (7). Rambo published a report on a study to understand the varied use of and need for information resources and technology by different segments of the public health workforce (8).

The Environment of Local Public Health Departments
Adapted from Dragonfly, the newsletter of the NN/LM PNR

So you want to work with your local public health department? As with reaching out to serve and collaborate with any group, it pays to know something about who they are and what they do. What do you know about your local public health department? Who are their "customers"? Who funds them? To whom do they report?

What does a local health department do? Many health departments do provide some patient care (e.g., immunizations, STD clinics, prenatal screening, and nutrition counseling). But local public health has become much more than that. It is a mix of services designed to meet the needs of communities in preventing the spread of disease, protecting people from unsafe drinking water, air, and hazardous waste, and ensuring that people have the information and resources needed to live healthy lives.
Who are the health professionals on staff? You may find physicians and nurses who also care for patients at the hospital or clinic. There are public health nurses who work in a variety of roles with childcare centers and school districts, mental health and drug and alcohol treatment programs, and law enforcement agencies. There are environmental health specialists who inspect drinking water, who work with solid waste programs, who inspect restaurants and train food workers. In larger jurisdictions there will be epidemiologists and others trained in tracking infectious disease outbreaks. The list is a long one, and it depends on local needs and programs. Information needs are very broad and overlap with subject areas that we don't usually think of as being health-related.

Local health departments are strongly oriented toward the state health department. It's a good idea to spend some time combing through the state department's web site to get an idea of what resources and data are there. This will be a limited view because it's only what is publicly available; nevertheless, the web site will give you a glimpse of what's happening, and some of that will be reflected at the local level.

Obtain User Input

After reading the literature, it is helpful to conduct some sort of study particular to your community. You might confirm or reject the needs identified in other studies, and identify needs unique to your targeted community. Direct user input is preferred when trying to establish a basic understanding about problems, satisfaction, and unmet information access needs of a community. If possible, get feedback from key contacts and leadership within the community to help gather facts and establish a mutual agreement about the need for outreach.

Methods of Data Collection

The methodology you use to gather data will vary according to the purpose of your assessment and how you want to apply the results. See Stage 5 for additional discussion of the evaluation methodology introduced here. There are two basic approaches to data collection:

• Extensive data collection
• Intensive data collection

These two approaches vary quite a bit, and the choice between them will depend on the purpose of your research and how you intend to use the results. With extensive data collection, much is already known about the situation and the possible variables or factors involved. The purpose is to collect data about a community that can be considered truly representative of the entire user population. Data collected can be both qualitative and quantitative (described below). Statistical validity and reliability are key criteria, meaning that the research instrument measures exactly what was intended and, if repeated, results would be the same or very similar. Random sampling is also important, so that all people being researched have an equal chance of responding. (For more discussion of random sampling, see Appendix C.)
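When a complete list of the population is available, drawing a random sample is mechanically simple. The sketch below is a minimal illustration of the idea; the mailing list and sample size are hypothetical stand-ins, not data from the guide:

```python
import random

# Hypothetical mailing list covering the whole target population, e.g.
# every primary care provider in the counties being assessed.
population = [f"provider_{n:03d}" for n in range(1, 251)]

sample_size = 60    # how many questionnaires the budget allows
random.seed(42)     # fixed seed so the draw can be reproduced and audited

# random.sample() gives every person on the list an equal chance of
# selection and never chooses the same person twice.
sample = random.sample(population, sample_size)

print(f"Surveying {len(sample)} of {len(population)} providers")
```

The equal-chance property holds only if the list really covers the whole population; a list that omits, say, providers without e-mail would bias the sample before any statistics are computed.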
In situations where little is known about the audience, it may be helpful to use a more exploratory data gathering approach called intensive data collection. The purpose here is to understand patterns of behavior or to identify particular impacts or problems impeding desired results. With intensive data collection, you want a practical understanding of what is happening, not a basis for generalizations. You can get both qualitative and quantitative feedback that does not strive for statistical validity but does provide data to help understand your audience.

Each approach mixes two methods of data collection traditionally termed quantitative and qualitative. Quantitative methods provide a systematic and standardized way of gathering data through the use of predetermined categories into which all responses must fit. Written questionnaires are typically used to gather quantitative data, whether informally via a feedback questionnaire or through a statistically valid survey. Quantitative methods produce hard data expressed in numbers, such as the number of computers in a worksite, the percentage of respondents with Internet experience, or scores about attitudes toward computers.

Qualitative methods are concerned with recording feelings, experiences, and impressions in the subjects' own words, either spoken or written. To understand users from their own perspectives, qualitative methods use open ended questioning techniques such as:

• Focus groups
• Open-ended survey questions
• Critical incident surveys
• Internal staff feedback
• User interviews

Other qualitative methods include observations, diaries, or a review of records and documents.

As mentioned earlier, the approach you choose for data collection will depend on the purposes of your assessment. If you have worked with a user population and have noticed patterns of behaviors and needs that you hope to confirm or disprove through statistically valid research, the extensive data collection approach should be considered. A study by Bowden et al. is an example of extensive data collection in a community assessment. A questionnaire was mailed to all physicians in five Texas counties to determine differences between those with access to medical libraries and those practicing in remote areas without local access to medical information. Demographic variables, professional practice characteristics, and patient characteristics were compared. Information resource use, particularly reasons for use and non-use of MEDLINE, was explored. Questions also were asked about the availability of various types of information technology. The results indicated that statistically significant differences did exist between the two groups in the use of MEDLINE and libraries (9).

Should you decide to conduct extensive data collection using statistically valid methods, there is greater assurance that other outreach programs can rely on your results. However, developing a well-designed data collection instrument requires considerable training and skill. If possible, seek assistance from survey research experts within your institution or local area. For a classic resource on survey development, please refer to Dillman (10).

You may prefer methods of intensive data collection to gain a practical understanding of the community needs your program will address. There are several ways to do this, including developing and distributing informal questionnaires. Following principles of question development (see Appendix B), feedback can be collected that may not be generalizable (statistically valid) but will provide a thoughtful understanding of the community's needs. Informal pre-testing of the questionnaire will help to improve its reliability, as described in Stage 5. Or, adopt questions from already developed questionnaires.
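Whether the questionnaire is formal or informal, closed-ended answers reduce to counts and percentages once the responses are coded. The sketch below shows one way such "hard data" might be tallied; the question and response values are hypothetical examples, not drawn from any study cited here:

```python
from collections import Counter

# Hypothetical coded answers to one closed-ended question:
# "Do you have Internet access at your worksite?"
responses = ["yes", "no", "yes", "unsure", "yes", "no", "yes", "yes",
             "no", "yes", "unsure", "yes"]

counts = Counter(responses)
total = len(responses)

# Report each answer as a count and a percentage of all respondents.
for answer, n in counts.most_common():
    print(f"{answer:>7}: {n:2d} ({100 * n / total:.0f}%)")
```

For an informal assessment these percentages simply describe the people who happened to respond; only with random sampling and an adequate response rate can they be read as estimates for the whole community.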
Selected needs assessment studies with published questionnaires, standard sources for identifying needs assessments, and tips on question development are described in the Tool Kit at the end of this chapter. Also, see the online access survey in Appendix A, which assesses a local public health department's access to computers and electronic communications and its need for training.

Another intensive data collection method is to interview community stakeholders. Stakeholders are those with a vested interest in the availability of health information resources. Depending on your community, stakeholders might be:

• Health providers
• Health care administrators
• Continuing education officers
• Public or rural health officials
• Faculty
• Consumers
• Health educators
• School nurses
• Public librarians
• State and local health personnel

Local medical societies, public health associations, and other associations or collegial networks can help identify major stakeholders and opinion leaders. In American Indian communities, it is especially important to contact tribal leaders directly or through an individual who has established contact with tribal leadership. By simply asking stakeholders how health information is used, what information resources they believe are needed, what types of outreach activities are needed, or similar questions, issues and assumptions can be quickly discovered. Though the results are not generalizable to the whole population, this can be the simplest and most effective way to gather information (11).

The focus group is another intensive data collection technique. According to Biblarz, focus groups have the advantage of obtaining perceptions in a permissive, non-threatening atmosphere. Questions are asked in a non-directive way, allowing information to surface that a structured interview might block.
Identifying "what is" in a community assessment does not automatically make clear "what should be." When examining results, organize the data to fill in answers to the following questions: 1. What is the targeted community (as specific as possible)? What does this community need (or what are they lacking) according to your perspec- tive? 3. What does the community need (or what are they lacking) according to their perspective? 4. What does the community need (or what are they lacking) according to (funding source, management, etc) perspective? 5. Are outreach resources adequate to deal with the problem? 6. Will outreach make a difference in the problem? 7. Is the group responsive to solutions or ready for change? 8. What work is already underway? CO 8 Conducting a Community Assessment 9. What is the political landscape of the problem in this group? If planners focus on describing a community's information seeking problems and then examine a) the types of changes that outreach can facilitate and b) information resources and services that offer solutions relevant to the needs of the population, community assessment becomes a very useful tool for planning. Tool Kit - References 9 References: 1. Marshall JG. A review of health sciences library outreach and evaluation. Seattle, WA: National Network of Libraries of Medicine/Pacific Northwest Region Web site, http:// www.nnlm.nlm.nih.gov/pnr/eval/marshall.html, 1997. 2. Dignan MB, Carr PA. Program planning for health education and promotion. Philadelphia: Lea&Febiger, 1992:164. 3. Cassel JC. Community diagnosis. In: Omran AR, ed. Community Medicine in Developing Countries. New York: Springer Publishing, 1974:18. 4. Chimoskey S, Norris T. Use of MEDLINE by rural physicians in Washington state. Journal of the American Medical Informatics Association 1999;6(4):332-3. 5. Dorsch J, Pifalo V. Information needs of rural health professionals: a retrospective use study. Bulletin of the Medical Library Association 1997;85(4):341-7. 6. Marshall JG. Using evaluation research methods to improve quality. Health Libraries Review 1995;12:159-172. 7. Baird L, Meakin F, Bailey M, Shipman J. Assessing the information needs of health profes sionals: an annotated bibliography. Baltimore, MD: National Network of Libraries of Medi cine, Southeastern/Atlantic Region, University of Maryland at Baltimore, 1991. 8. Rambo N, Dunham P. Information needs and uses of the public health workforce—Washing ton, 1997-1998. MMWR Weekly 2000;49(6): 118-120. 9. Bowden V, Kromer M, Tobia R. Assessment of physicians' information needs in five Texas counties. Bulletin of the Medical Library Association 1990;82(2): 189-96. 10. Dillman DA. Mail and telephone surveys : the total design method. New York: Wiley, 1978. 11. Biblarz D, Bosch S, Sugnet C. Guide to library user needs assessment for integrated informa tion resource management and collection development. American Library Association, prepress. 12. Glitz B. Focus groups for libraries and librarians. New York: Forbes, 1998. 13. Mullaly-Quijas P, Ward DH, Woelfl N. Using focus groups to discover health professionals' information needs: a regional marketing study. Bulletin of the Medical Library Association 1994;82(3):305-11. 10 Tool Kit - Selected Readings Selected library research articles with published questionnaires Bowden VM, et al. Assessment of physicians' information needs in five Texas counties. Bulletin of the Medical Library Association 1990;82(2): 189-96. Burnham JF, Perry M. 
Burnham JF, Perry M. Promotion of health information access via Grateful Med and Loansome Doc: why isn't it working? Bulletin of the Medical Library Association 1996;84(4):498-506.

D'Alessandro D. Barriers to rural physician use of a digital health sciences library. Bulletin of the Medical Library Association 1998;86(4):583-93.

Dorsch JL. Equalizing rural health professionals' information access: lessons from a follow-up outreach project. Bulletin of the Medical Library Association 1997;85(1):39-47.

Hall EF. Physical therapists in private practices: information sources and information needs. Bulletin of the Medical Library Association 1995;83(2):196-201.

Haug JD. Physicians' preferences for information sources: a meta-analytic study. Bulletin of the Medical Library Association 1997;85(3):223-32.

Huber JT, et al. Assessing the information needs of non-institutionally affiliated AIDS service organizations in Texas. Bulletin of the Medical Library Association 1995;83(2):240-3.

Obst O. Use of Internet resources by German medical professionals. Bulletin of the Medical Library Association 1998;86(4):523-33.

Pifalo V. The impact of consumer health information provided by libraries: the Delaware experience. Bulletin of the Medical Library Association 1997;85(1):16-22.

Shelstad K. Information retrieval patterns and needs among practicing general surgeons: a statewide experience. Bulletin of the Medical Library Association 1996;84(4):490-7.

Urquhart CJ, et al. Comparing and using assessments of the value of information to clinical decision-making. Bulletin of the Medical Library Association 1996;84(4):482-9.

Verhoeven AA. Use of information sources by family physicians: a literature survey. Bulletin of the Medical Library Association 1995;83(1):85-90.

Wood FB, et al. Transitioning to the Internet: results of a National Library of Medicine user survey. Bulletin of the Medical Library Association 1997;85(4):331-40.

Additional Sources for Needs Assessments:

Soriano FI. Conducting needs assessments: a multidisciplinary approach. Thousand Oaks: Sage Publications, 1995.

Databases
Health and Psychosocial Instruments Database (HaPI)
Cumulative Index to Nursing and Allied Health Literature (CINAHL)

Print Sources
Anderson JG, et al. Evaluating health care information systems: methods and applications. Thousand Oaks: Sage Publications, 1994.
Cork RD, Detmer WM, Friedman CP. Development and initial validation of an instrument to measure physicians' use of, knowledge about, and attitudes toward computers. Journal of the American Medical Informatics Association 1998;5:164-176.
Marshall JG. Evaluation instruments for health sciences libraries. Chicago: Medical Library Association (MLA Dockit #2), 1990.
The Bulletin of the Medical Library Association publishes questionnaires with some articles reporting survey results.

Tool Kit - Tips for Questionnaire Development

The following tips provide some general guidelines for presenting, sequencing, and choosing types of questions. The questionnaire or interview should begin by explaining the purpose of the study and why the individual's responses are important. Include a cover letter and a stamped, addressed return envelope with mailed questionnaires, explaining the need for the information and how to supply it. Udinsky, Osterlind, and Lynch (1981) have developed the following guidelines for writing a cover letter:
1. The letter should contain a clear, brief, yet adequate statement of the purpose and value of the questionnaire.
2. It should be addressed to the respondent specifically.
3. It should provide good reason for the respondent to reply.
4. It should involve the respondent in a constructive and appealing way.
5. Appeals to the respondent's professional responsibility, intellectual curiosity, personal worth, etc., are typical.
6. The letter should establish a reasonable but firm return date.
7. An offer to send the respondent a report of the findings is often effective, though it carries with it the ethical responsibility to honor such a pledge.
8. The use of a letterhead, signature, and organizational endorsements lends prestige and official status to the letter.
9. The letter should guarantee anonymity and confidentiality.
10. Each letter should be signed individually by the researcher.
11. The researcher should include a stamped, self-addressed envelope for the return of the instrument.

From Evaluation Resource Handbook: Gathering, Analyzing, Reporting Data (p. 120), by B.F. Udinsky, S.J. Osterlind, and S.W. Lynch, 1981, San Diego, CA: EdITS Publishers. Reprinted by permission of EdITS Publishers.

For telephone or face-to-face interviews, the introduction about the purpose of the study can be followed by general questions to put the respondent at ease or to develop a rapport between the interviewer and the respondent. For written questionnaires, start with interesting questions that will draw the respondent in. Leave questions about demographics for the end.

The response rate for written questionnaires is typically low. Short questionnaires and those that clearly explain the need for the information are more likely to be returned. Questionnaires should be attractive, easy to read, and offer ample space for the respondent's answers.

• Write clear and unbiased questions. Avoid leading questions ("How have you enjoyed the class?") that might guide the answer.
• Keep a question close to direct experience (i.e., avoid the need for extensive recall). Give a specific time frame whenever possible.
• Avoid two-part (double-barreled) questions. For example, "Using PubMed is easy and fun" (Strongly disagree to Strongly agree) is a double-barreled question because it assesses (1) whether PubMed is easy and (2) whether PubMed is fun. What happens if the respondent thinks PubMed is fun but not easy? He or she cannot accurately answer the question.
• The most structured or closed types of questions have yes-no or multiple-choice responses, typically used for knowledge questions. These are the easiest to tabulate, but they also force the respondent into a choice that may not reflect his or her own perceptions. Use an "other" category to give the person another option. Involve several targeted audience members in the testing and formation of the questions to ensure that the most common responses are included in the multiple choices.
• Attitude questions generally use less structured formats. Scales, such as Likert or semantic differential scales, are often used. The respondent chooses a response along a continuum, generally on a five- to seven-point scale.

Likert scale example:
I am at risk for falling behind current medical knowledge
Strongly Disagree 1 2 3 4 5 6 7 Strongly Agree

Semantic differential example:
PubMed is:
Undesirable 1 2 3 4 5 6 7 Desirable

• Unstructured or open-ended questions, such as short-answer questions, journals, or logs, may be used to gain descriptive information. They are generally not used for quantitative data because the response categories are not specific and may be difficult to code for analysis. However, they can provide impressions, in-depth information, and outcomes that you may not have anticipated.
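Scale responses like those above become quantitative data once they are coded and summarized. The following Python sketch is an illustrative example only, not part of the original guide; the seven-point ratings are invented, and the convention of treating 5-7 as "agreement" is a common but arbitrary choice.

    # Illustrative sketch: summarizing seven-point Likert responses.
    # The ratings below are hypothetical (1 = Strongly Disagree, 7 = Strongly Agree).
    likert = [5, 6, 3, 7, 6, 2, 5]

    mean_score = sum(likert) / len(likert)
    # Count responses of 5 or higher as "agreement" (an assumed cutoff)
    agree_share = sum(1 for r in likert if r >= 5) / len(likert)

    print(f"Mean rating: {mean_score:.1f} out of 7")
    print(f"Respondents agreeing (5-7): {agree_share:.0%}")

Reporting both the mean and the share of respondents above a cutoff gives a fuller picture than either number alone, since the same mean can hide very different distributions.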
Tool Kit - Gowan Library Case Example

You are library director of the Gowan Library, a state university medical center library. Outreach to the university includes outreach to statewide constituents. You want to extend the library's services to select rural health professionals not affiliated with, or located within, the library's immediate service area. In choosing the community you will target, you decide to focus on a rural area with the poorest health indicators in the state.

Rural health provider settings in your selected area include Geneva Health, which has 4 primary care clinics serving a four-county district. There are 46 health providers, including 16 physicians, 6 nurse practitioners, 6 physician assistants, 12 LPNs, 1 outreach counselor, 1 health educator, and 1 migrant outreach coordinator. Their patient population reflects the demographics of the area:
• 80% of the population are Caucasian
• 20% are Hispanic
• 38% live at or below the poverty level, most without health insurance

The administrator at Geneva Health is contacted and sounds enthusiastic about discussing an outreach program with Gowan Library. Among other facts, you find out that:
• Few clinicians use electronic resources, including email or video communication for consultations, mostly relying on the telephone
• Health provider recruitment and retention is low, due to rural isolation
• Geneva Health does not yet have desktop Internet access for staff
• The nearest library is 50 miles away

You also talk with other stakeholders, such as several clinic health professionals, the state's rural health organization, and the local chapter of the American Academy of Family Practice Physicians. You want to know:
• Current information needs of clinic health professionals
• Barriers to accessing health information
• What information resources are known about and used
• How outreach could help
• What these stakeholders want from an outreach program—what "successful outreach" would mean to them

From these conversations, you are able to obtain a snapshot of the telecommunications infrastructure at the various health settings, the types of information needed, and the sources currently used. This information helps you understand the context of information needs and discover what these stakeholders want to get from the outreach program. With this data, your next step is to identify the mutual goals and objectives that will address the problems or factors that contribute to inadequate access to information.

Stage 2: Developing Goals and Objectives

Topics
• Setting Goals
• Identifying Objectives Based on Outcomes and Indicators
• Indicators
• Constructing Objectives
• Process Objectives
• Educational Objectives
• Behavioral and Environmental Objectives
• Program Objectives

Figures
Figure 2: Sample Outcomes and Indicators

Tool Kit
• References
• Goals and Objectives Workform
• Gowan Library Case Example

[Stage diagram: Stage One (Conduct Community Assessment) leads to Stage Two (Develop Goals and Objectives). What to establish: What are the priority needs and problems in accessing health information? What effect do you hope that outreach will have? What does the community want to see happen as a result of outreach? Identify outcomes: What will need to change or happen (the hoped-for outcomes) to achieve outreach goals? What indicators will provide evidence of these changes? Construct objectives: Which objectives will address what outreach will do (e.g., conduct x classes, recruit y participants)? What objectives will address hoped-for outcomes? What outcomes will be measured (e.g., knowledge, skill, attitudes, and behavior)? Tips for best results: work with the community; use the results of the community profile; come to mutual agreement; be realistic.]
Stage 2 describes the process of constructing goals and measurable objectives—important steps in developing an outreach agenda. Goals allow you to prioritize the needs of your targeted audience and develop relevant objectives. Once goals and objectives are identified, it is easier to plan the necessary activities and strategies, as described in Stage 3. If well developed, objectives will specify outcomes, or expected results, and the ways they can be measured (the indicators). Objectives provide criteria for measuring outreach and are useful for both the process and summative evaluation phases described in Stage 4.

Outreach evaluations have typically measured outcomes such as the number of exhibits or training sessions conducted and the number of audience members reached (e.g., training class participants). These number counts do not reflect the impact of outreach on participants' learning and behavior, such as gained knowledge, changed attitudes, changed beliefs, developed skills, or increased use. Nor do number counts measure other factors that can influence access, such as adequate technology or the attitudes of decisionmakers and opinion leaders. The factors that objectives target in order to change or influence information-seeking behaviors are more fully described in Stage 3, but they are important elements to consider when developing outreach objectives and will be introduced in this chapter (see Figure 2).

Setting Goals

Goals are long-range statements describing a desired condition or future that outreach is working toward fulfilling. Goals describe, in general terms, the conditions that will exist when outreach has been successful. To formulate goals, ask yourself and key contacts from your targeted audience:
• In the long run, what effect do I hope to have on information access problems for this community?
• What is the overall improvement I want to achieve?
• What are the goals of my targeted audience - what do they want to achieve or see happen as a result of the outreach program?

The concept of setting goals with input from the outreach audience is an important principle borrowed from health education. Rather than unilaterally deciding what you think should happen, develop an agenda based on the community's needs and concerns. You will be far more likely to achieve change if plans are based on the community's perceived needs and concerns rather than a personal or agency agenda (1). For example, goals for an outreach program to the public might be:
• Residents of XYZ county will have access to current and relevant health information resources with ease and convenience.
• Staff of local hospital and public libraries will have a sustainable working partnership.

In the above example, the goals reflect the mutual priorities of the target audience and the outreach program. For residents, "ease and convenience" of access is paramount to use. For the outreach staff, the ultimate goal of improving access requires cooperation among partners with similar interests.
Identifying Objectives Based on Outcomes and Indicators

Goals describe an ultimate ideal. To reach that ideal, however, smaller steps are implied. These steps include various types of objectives that are considered essential to realizing the goals and the outcomes that will hopefully result (2). For example, typical goals for an outreach program are to improve access, use, and exchange of health information. The objectives to reach these goals would ideally include outcomes that influence changes in information-seeking behavior, including:
• Cognitive outcomes, such as awareness of Internet-based health resources
• Affective outcomes, such as attitudes toward Internet-based health resources
• Skills outcomes, such as knowledge and ability to find health information
• Behavior outcomes, such as utilization of Internet-based health resources
• Environmental outcomes, such as sustained commitment to maintain information services
• Social and community outcomes that support initial and sustained behavior changes
• Quality of care outcomes, such as improved patient care decisionmaking

As discussed under "Constructing Objectives," there can be process objectives that state what the outreach staff will do (e.g., conduct X number of skills training workshops). Consider also developing outcome-based objectives that measure the impact of outreach on participants' learning, behavior, and environment. There are learning, behavioral, and environmental objectives that are measured not by what the staff has done (e.g., facilitate Internet connectivity), but by how that new technology has affected outreach participants or their environments. In other words, outcome-based objectives are linked to results.

Indicators

In considering possible objectives, it is important that they be both realistic and measurable. Making them measurable means identifying the indicators that provide some type of logical evidence that the intended outcome has occurred. For example, a desired outcome of outreach might be a change in attitude toward the Internet. But what can indicate an attitude change? Asking the audience if their attitudes have improved after outreach is not precise enough. Something needs to be identified as an "indicator" of an attitude, such as "fear of information overload."

Be realistic about the indicators you choose. For example, you may want to measure an outcome related to improved quality of health care. You hope that outreach can influence this outcome, given the assumption that more informed decisions ultimately lead to better health care. The indicator of interest here would not be some long-term measure of improved health, such as changes in morbidity or mortality rates; these measures would be very difficult to link to your outreach activities. However, you could measure indicators for quality of care by gathering data about the use of online resources for patient care decision making. See Figure 2 for more examples.

Figure 2: Selected Sample Outcomes & Indicators

Outcome: Environmental support to enable access
• Worksite funding for professional librarian/library
• Worksite policies allow Internet access at work
• Adequate hardware and software for Internet connectivity
• Interlibrary loan services

Outcome: Awareness of choices in finding health information
• Beliefs or thoughts that useful health information on the Internet exists
• Ability to name specific sources

Outcome: Online information seeking skills
• Knowledge of search skill concepts
• Knowledge of criteria to evaluate websites
• Self-confidence in skill to find health information

Outcome: Attitudes about Internet-based resources
• Feelings about online resources

Outcome: Use of Internet resources
• Frequency of online use
• Repeated use of online resources
• Information found online is discussed with doctor or between health care professionals

Outcome: Support of social network
• Ongoing promotion of online health resources by opinion leaders
• Repeat requests for outreach activities

Outcome: Quality of care
• Information found online used for patient care decision making
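As a concrete illustration of turning an indicator into a measurement, the hypothetical Python sketch below compares the share of respondents who report that online information contributed to a patient care decision before and after outreach. It is not part of the original guide, and all response values are invented for the example.

    # Illustrative sketch: measuring one quality-of-care indicator
    # (use of online resources in patient care decisions) before and
    # after outreach. All responses below are hypothetical.
    pre = ["no", "no", "yes", "no", "no", "no"]
    post = ["yes", "no", "yes", "yes", "no", "yes"]

    def share_yes(responses):
        """Fraction of respondents answering 'yes'."""
        return sum(r == "yes" for r in responses) / len(responses)

    print(f"Before outreach: {share_yes(pre):.0%}")   # 17%
    print(f"After outreach:  {share_yes(post):.0%}")  # 67%

Collecting the same indicator with the same question before and after outreach is what makes the comparison meaningful; changing the wording between rounds would make the two percentages incomparable.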
Constructing Objectives

As stated earlier, objectives can be defined as the steps required to reach a goal, and outcomes specify the results you hope to achieve. Having prioritized the overall outcomes you hope to achieve, the next step is to develop objectives that include indicators to measure progress toward your intended outcomes. Include several types of objectives that together contribute to the outcomes you envision. In the health education literature, these types of objectives are hierarchical, leading to the ultimate objectives of a program (program objectives). The following discussion presents the four types of objectives as described by McKenzie et al. (3).

A. Process Objectives
The process objectives are what you do to accomplish all other levels of objectives. Think of them as the inputs and process components needed to carry out the program. For a very comprehensive process evaluation, you may choose to create specific objectives that will track all possible components, which could include:
• Program resources (materials, funds, space)
• Type and appropriateness of activities
• Target population exposure and attendance

B. Educational Objectives
Educational outreach objectives can be divided into four general categories: awareness, knowledge, attitudes, and skill development. The premise of this hierarchy is that if the targeted audience is to adopt and maintain information-seeking behaviors that alleviate health information needs, they first must be aware of the need or of the value of current information. Second, they must expand their knowledge of available and appropriate resources. Third, they must adopt and maintain beliefs in the effectiveness of these resources and in their own ability to use them. And fourth, they need to possess the actual skills to obtain information efficiently.

C. Behavioral and Environmental Objectives
The third level of objectives includes the behavioral changes that resolve health information needs, thus moving toward the ultimate program objectives for improved health care. Environmental objectives can be loosely defined as those that remove physical and social barriers to enacting the behavioral changes.
D. Program Objectives
Program objectives are the ultimate objectives of an outreach program, expressed as the outcomes of individual and community change in using or providing health information.

Although it may seem burdensome to develop four types of objectives, it is important for getting a complete picture of what is happening and why. For example, you may be able to detect an increase in the use of health information resources, but it might be less than your stated behavioral objectives predicted. If you use this as your only criterion for success, you will have missed the possibility of measuring other outcomes, such as:
• Increased awareness about the value and effectiveness of using Internet resources to answer questions; or
• A strengthened social network of modeling and support from opinion leaders or community resources that will encourage eventual adoption and maintenance of new behaviors.

Much of the health education literature recommends developing objectives that are specific, time-limited, and measurable. The clarity of your objectives will provide direction for planning pertinent activities. According to McKenzie (1994), an objective should include the following elements:
1. The outcome to be achieved, or what will change.
2. The conditions under which the outcome will be observed, or when the change will occur.
3. The criterion for deciding whether the outcome has been achieved, or how much change.
4. The target population, or who will change.

The first element - outcome - is the consequential action or behavior that will change as a result of the program. Outcomes are usually identified as the verbs of the sentence, such as cause, connect, convert, demonstrate, develop, eliminate, reorganize, and supply. McKenzie emphasizes that outcome verbs must refer to something measurable and observable; thus appreciate, know, internalize, or understand by themselves are not good choices for outcomes.

The second element - conditions - describes how or when the outcome will be observed. Typical conditions might be "upon completion of the class," "as a result of participation," "by the year 2005," "three months after the program," or "during the class session."

The third element of an objective is the criterion for deciding when the outcome has been achieved or how much change has occurred. This element is the standard by which you measure whether the outcome is performed in an appropriate or successful manner. Examples might include "30% of class participants," "100 flyers," "ten opinion leaders," "five follow-up classes," etc.

The last element of an objective is mention of the target audience, or who will change. Examples are "all professional clinic staff" or "constituents of the Miloxi tribal reservation."

Sample objectives, constructed according to McKenzie's four elements, are provided in Appendix D. A work form to fill in goals and objectives for your program is provided in the Tool Kit at the end of this chapter.
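Planners who track objectives in a spreadsheet or database may find it useful to treat McKenzie's four elements as the fields of a record. The Python sketch below is a hypothetical illustration, not part of the original guide; the example objective paraphrases one of the sample objectives discussed in this chapter.

    # Illustrative sketch: representing an objective's four elements
    # (outcome, conditions, criterion, target population) as a record
    # and checking that none are missing. The example text is hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Objective:
        outcome: str            # what will change
        conditions: str         # when the change will be observed
        criterion: str          # how much change counts as success
        target_population: str  # who will change

        def is_complete(self) -> bool:
            # An empty string for any element means the objective
            # is not yet fully specified.
            return all([self.outcome, self.conditions,
                        self.criterion, self.target_population])

    obj = Objective(
        outcome="correctly answer a true/false question about Medline Plus access",
        conditions="by the end of the workshop",
        criterion="at least two out of three class participants",
        target_population="rural clinic health providers",
    )
    print(obj.is_complete())  # True

A completeness check like this mirrors the advice above: if any of the four elements is blank, the objective cannot yet be measured.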
If you are accustomed to objectives that use action verbs, the structure of the objectives presented in Appendix D may seem awkward. For example, outreach planners may be accustomed to an objective such as:
• To provide training in the use of medical bibliographic databases, with emphasis on PubMed.

Consider revising the above objective to focus less on what outreach staff do (conduct classes) and more on what the audience does that provides evidence of progress toward improved information access, thus:
• During the next twelve months, at least 50% of health providers in each of four rural clinics will participate in one outreach promotional or educational activity.

Then develop additional objectives that focus on the learning and behavioral outcomes you hope to achieve, such as:
• By the end of the workshop, at least two out of three class participants will correctly answer a true/false question about how to access Medline Plus.
• By the end of the year, at least 30% of class participants will consult PubMed for answers to clinical questions.

These revised objectives emphasize greater accountability for outcomes that predict or demonstrate changes in information access.

Tool Kit - References

References
1. Nyswander D. The open society: its implications for health educators. Health Education Monographs 1966;1:3-13.
2. Dignan MB, Carr PA. Program planning for health education and promotion. Philadelphia: Lea & Febiger, 1992.
3. McKenzie JF, Smeltzer JL. Planning, implementing, and evaluating health promotion programs: a primer. Boston: Allyn and Bacon, 1997.

Tool Kit - Goals and Objectives Workform

Outreach Goal:

Process Objective(s):
Outcome (what):
Target population (who):
Conditions (when):
Criterion (how much):

Educational Objective(s):
Outcome (what):
Target population (who):
Conditions (when):
Criterion (how much):

Behavioral Objective(s):
Outcome (what):
Target population (who):
Conditions (when):
Criterion (how much):

Environmental Objective(s):
Outcome (what):
Target population (who):
Conditions (when):
Criterion (how much):

Program Objective(s):
Outcome (what):
Target population (who):
Conditions (when):
Criterion (how much):

Tool Kit - Gowan Library Case Example

In Stage 1, you were able to obtain useful data for the Gowan Library outreach program from a community assessment. Your next step is to develop goals and objectives that are relevant to and of mutual interest with your targeted community. Your staff reviews the interviews and data collected in the community assessment and develops a list of hoped-for outcomes. Some examples include:
• The interest of the rural health organization in having up-to-date Internet technology at clinics that could lure students and new professionals
• The interest of primary care providers in continuing education, current diagnosis and treatment information, and credible patient information in Spanish at a low literacy level
• The interest of the health care administrator in showing use of current health care practice guidelines that impact patient care decisions and improve patient outcomes

Since the outreach program is limited by time (one year) and funding, outreach staff identified several other hoped-for outcomes. One hope is that Geneva clinics will develop collaborations with other agencies or community-based organizations to fund and maintain the technology infrastructure initiated by this outreach program. Also, the library staff know from previous studies that outreach has a greater impact on continued use (after outreach ends) when there are onsite library resources or personnel.
The Geneva clinic sites do not have an information resources "advocate" (ideally, a librarian), so your outreach staff identify another hoped-for outcome: to train personnel at each outreach site who could provide continued information access support or services.

In reviewing the needs and hoped-for outcomes, your staff notices that some outcomes require environmental objectives to improve technology infrastructure, while other outcomes require educational and behavioral objectives to motivate and reinforce use of electronic resources. Staff want to develop measurable objectives that describe what should happen to meet the objectives. They decide to measure certain indicators such as awareness, attitude, knowledge, satisfaction, use, and impacts on health care decisions or behaviors. They also identify the criteria (how much or what) and the conditions (when) that will guide them in determining their accomplishments. The draft list of goals and objectives thus developed is presented in Appendix D. You then decide to share the list of goals and objectives with the key contacts made during the community assessment to confirm that the list is both relevant and realistic.

Stage 3: Planning Activities and Strategies

Topics
• Theories About Behavior Change
• Social Learning Theory
• Extended Parallel Process Model (EPPM)
• Stages of Change Model
• Diffusion of Innovations Theory
• Community Organization
• Planning for Activities
• How Does an Audience Assessment Fit In?
• How Is an Audience Assessment Conducted?

Figures
Figure 3: Social Learning Theory
Figure 4: Techniques to Encourage Self-Efficacy
Figure 5: Definitions from the Extended Parallel Process Model
Figure 6: Outreach Messages Using EPPM
Figure 7: Stages of Change Model
Figure 8: Diffusion of Innovations Theory
Figure 9: Community Organization
Figure 10: Theory-based Variables

Tool Kit
• References
• Selected Readings
• Sample Outreach Strategies
• Planning Outline Workform
• Task List Workform
• Gowan Library Case Example

[Stage diagram: Stage One (Conduct Community Assessment) and Stage Two (Develop Goals and Objectives) lead to Stage Three (Plan Activities and Strategies). What to establish: Will there be promotional activities? Will there be logistical activities? Will there be educational activities? Steps: plan to use best practices from previous outreach studies; select behavior change theories for strategies that seem feasible; conduct an audience assessment; refine strategies and identify specific activities to implement them; develop a timeline to schedule related tasks. Consider for best results: questions in an audience profile assess variables important to the theories you choose; include questions that provide a baseline for comparison post-outreach. Then implement activities.]

With goals and objectives identified, Stage 3 includes several steps for selecting and developing effective outreach strategies and planning the activities to implement them. Three topics are covered in this stage:
1. Theories about factors and strategies that influence behavioral and environmental changes;
2. Use of an implementation plan as an important tool for effective planning;
3. Use of evaluation to tailor outreach activities and to obtain baseline data for comparison with post-outreach measures.

The major thrust of strategy and activity planning is finding the strategies and activities that will best address the outreach program's objectives.
No single activity is likely to solve the problems of information access; there are too many levels of need and too many factors contributing to the problems. According to Marshall (1), research and evaluation studies on health sciences library outreach have identified the following barriers to effective information seeking and use:
• Lack of time
• Lack of financial resources
• Lack of interest in conducting literature searches as a basis for clinical decision-making
• Preference for synthesized information ready for application to patient care
• Lack of search skills
• Lack of equipment
• Lack of telecommunications infrastructure
• Lack of computer skills
• Lack of an onsite library
• Slow turnaround time for document delivery
• Need for non-literature types of information (networking with colleagues, statistical data, program planning, directory and referral information)
• Increased demand on local resources without increased support

Outreach activities to address these problems, needs, and barriers generally fall into three broad categories:
• Promotional activities to persuade or motivate interest and awareness (e.g., exhibits, brochures);
• Logistical activities to facilitate adequate onsite resources (e.g., equipment, connections, development of local resources, search services, document delivery); and
• Educational activities to develop knowledge and skills in effective access (e.g., training classes, demonstrations).

Theories about Behavior Change

Reaching outreach objectives for improved access to health information can be challenging. Changing behavior patterns, such as information-seeking behavior, requires more than just information. Strategies are needed to help motivate, facilitate, and reinforce change. Outreach studies have identified several factors found successful in outreach initiatives, as cited by Burnham and Perry (2). These include:
• Train one-on-one
• Provide a variety of follow-up interventions
• Change information seeking behavior
• Focus on patient care
• Stress education/CME
• Provide money for computer equipment
• Identify and cultivate a site liaison

Personal contact between the target audience and librarians has also been shown to help develop and sustain changes in information-seeking habits (3).

The health education theories described in this chapter both reinforce and expand upon knowledge gained from library research about what works when trying to influence behaviors and facilitate effective access. In adapting health communications theory to information-seeking behaviors, there are three factors that shape behavioral action.
• Predisposing factors provide the motivation or reason behind a behavior. They include knowledge, attitude, beliefs, and readiness to change.
• Enabling factors make it possible for a motivation to be realized; that is, they "enable" persons to act on their predisposition. Enabling factors include available resources, skills, and technology.
• Reinforcing factors come into play to reward a behavior, thereby increasing the probability that it will continue. Community or institutional support, peer influence, and opinion leader involvement are factors that reinforce and predispose behavior change.
According to these factors, if outreach planners hope to change behaviors, outreach strategies should address the following objectives:
• Increase awareness
• Increase knowledge
• Influence attitudes
• Influence beliefs
• Facilitate technology access
• Develop skills
• Reinforce behaviors
• Build community or institutional support

The following sections summarize five selected theories and models that will help guide strategies to address these objectives:
• Social Learning Theory
• Extended Parallel Process Model
• Stages of Change Model
• Diffusion of Innovations Theory
• Community Organization

These health education theories offer more than strategies to use when planning or conducting activities. Each theory identifies important variables and how they work together. As will be discussed, assessing these variables in an audience assessment, and then again after outreach is completed, can help explain why outreach was successful (hopefully) or why it did not work as planned.

Social Learning Theory

In the 1970s, Albert Bandura published a comprehensive framework for understanding human behavior, which he named Social Cognitive Theory, often called Social Learning Theory (4). According to Social Learning Theory, the factors that play a role in behavior change are behavioral capability, outcome expectations, self-efficacy, and observational learning (Figure 3).

Behavioral capability maintains that a person needs to know what to do and how to do it; thus, clear instructions and/or training may be needed. Outcome expectations are the outcomes that a person thinks will occur as a result of a recommended action. Self-efficacy, which Bandura considers the single most important aspect of efforts to change behavior, is self-confidence in one's ability to successfully perform a specific type of action.

Figure 3: Social Learning Theory
Behavioral Capability. Concept: knowledge and skills about the recommended action. Outreach application: provide information and training about the recommended action (e.g., online searching).
Expectations. Concept: beliefs about the likely results of action. Outreach application: demonstrate searches that provide relevant results.
Self-Efficacy. Concept: confidence in the ability to take action and persist in action. Outreach application: teach skills in small steps; give feedback and encouragement; give in-class exercise problems that provide challenge.
Observational Learning. Concept: beliefs based on observing others like oneself. Outreach application: point out others' experience; provide demonstrations by role models (e.g., clinician, senior citizen, member of a minority population).

Example: In order for health workers to adopt the use of electronic health resources, they need to know what online resources work best and how to use them properly (behavioral capability); to believe that the information they need is potentially available (expectations); and to have the confidence in themselves to refine or adjust their search queries if they face initial difficulties in getting what they need (self-efficacy).

With today's overabundance of available information, people can easily feel overwhelmed and have low self-confidence in their search abilities. Without self-efficacy, people who experience failure or difficult challenges are apt to readily abandon skills they have been taught (4). The advantages of greater self-efficacy include higher confidence in the face of obstacles and better chances of persisting over time outside a situation of formal instruction. Specific to electronic search skills, people of high efficacy are quicker to discard or refine failed strategies, do not give up as easily, are good at time management, and know how to learn from mistakes and avoid feeling deflated (5).

How can outreach activities increase self-efficacy?
Self-efficacy can be nurtured through skill development, using the techniques presented in Figure 4. Observational learning is often referred to as "modeling"; that is, people learn what to expect through the experience of others. People can gain a concrete understanding of the consequences of their actions by noting whether modeled behaviors are desirable or not. Observational learning is most powerful when the person being observed is respected or considered to be like the observer.

Figure 4: Techniques to Encourage Self-Efficacy
Guided mastery or modeling: A person who is held in respect and is similar to the observer (student) gives a hands-on demonstration of an online search. This helps persuade students that if someone similar to them can do it, so can they. Because searching is also an intellectual skill, it is important that the model verbalize aloud how decisions are made about the search process. It is efficient and just as effective to videotape a guided mastery session geared to a specific targeted audience (e.g., senior citizens) so that live models need not be recruited for every outreach session.
Proximate goals: Class exercises are designed to help students master skills progressively. Depending on the student's level of ability and "stage of change," assigned tasks may range from learning to use the mouse to finding a specific answer to a clinical question. When students reach proximate goals, they benefit from self-satisfaction about their progress.
Feedback: Feedback can enhance self-efficacy by providing clear information about how best to perform a skill and by strengthening beliefs in personal capability. Feedback may be self-demonstrated by successfully performing an assigned task. And if students are assisted in finding alternative solutions for ineffective searches, their ability to learn from search mistakes is enhanced.

Example: When conducting an outreach program for seniors about the use of online resources for accessing health information, have a senior citizen from a local senior center or the local chapter of the American Association of Retired Persons model a prototypical search in a live or videotaped demonstration.

The Extended Parallel Process Model (EPPM)

The EPPM is a model for motivating action through both cognition (thoughts) and feelings (primarily fear). It is formally called a "fear appeal theory" because it focuses on the use of fear as a motivator to action. Most risks are inherently fear-producing. For example, fear might be induced by feelings of not knowing how to use the Internet, not having adequate or up-to-date information regarding patients' conditions, or being perceived as ignorant or behind-the-times (6). The EPPM specifies how to channel that fear into productive, adaptive action. If underlying fears are not addressed in outreach messages, they may cause one to engage in maladaptive actions such as denial of the need to learn the Internet. Thus, fear can either motivate or inhibit productive action, depending on the type of message given to clients or audience members. According to the EPPM, some fear needs to be induced to motivate action.
The theory suggests that if people do not believe there is a consequence of failing to use Internet resources (for example), they will not be motivated to use them. If, however, individuals feel sufficiently threatened by the possible consequences of not using available resources (e.g., potential malpractice suits, falling behind in current medical knowledge, being embarrassed because everyone else has used the Web), then they will be motivated to act.

Perceived efficacy of the recommended action determines how people act (in outreach, the recommended action is to use the Internet to access health information). If people are motivated to act because they feel threatened in some way, and believe they are able to perform an effective recommended response to diminish this threat, then they will control the danger and engage in the recommended action. In this case, a person's fear motivates them to act in an adaptive, protective manner (i.e., they attend a class on how to use the Internet). In contrast, if people feel motivated to act because they feel threatened in some way but do not believe they are able to engage in an effective response that would diminish the threat, they will be motivated to control their fear (because they feel unable to control the danger). In this case, clients or audiences might deny they need Internet resources and engage in reactance (a type of defensive reaction in which individuals lash out in anger, e.g., "this is just another time waster, we want no part of it").

Figure 5 shows important definitions in the EPPM and how they might relate to outreach.

Figure 5: Definitions from the Extended Parallel Process Model

THREAT
Severity of Threat. Definition: the severity or seriousness of the problem. Outreach application: individuals don't believe that lack of information is a serious problem; your message should outline the hazards of not being up-to-date on medical information.
Susceptibility to Threat. Definition: the degree to which one is at risk of experiencing the problem. Outreach application: individuals don't think they themselves will experience negative consequences if they don't use the Internet; your message should give examples of people just like them who experienced negative consequences (e.g., were sued because they didn't use up-to-date medical information).

EFFICACY
Self-Efficacy. Definition: the degree to which one feels able to do what's recommended to avert the problem. Outreach application: individuals may not know where Internet resources are or how to use the Internet; messages should state where classes are held and/or give relevant sites.
Response Efficacy. Definition: the degree to which one feels that what's recommended to avert the problem works. Outreach application: individuals may not believe the information on the Web is accurate or useful; messages should give examples of how and where useful information is found and how it can be life-saving.

OUTCOME
Danger Control. Definition: adaptive, protective actions taken when one is motivated to act and believes s/he can act. Outreach application: individuals take courses and use the Internet regularly.
Fear Control. Definition: maladaptive, defensive actions taken when one is motivated to act but doubts s/he can do anything (a sense of futility, hopelessness). Outreach application: individuals deny they need to use resources and/or respond defensively (and sometimes angrily) at the suggestion that these resources might be helpful; this type of response usually suggests a need to increase perceived efficacy.

Overall, research on the EPPM has demonstrated that high threat/high efficacy messages motivate substantial and long-lasting behavioral change. See Figure 6 for examples of how outreach activities can use the EPPM. (Message "A" is the threat portion of the message; B-D address self-efficacy perceptions by increasing one's perceived ability to perform a recommended response; and E addresses perceived response efficacy by focusing on whether or not the recommended response "works" in averting the threat.)

Figure 6: Outreach Messages Using EPPM
Convey outreach "messages" in promotional materials, or during discussion in classes or demonstration workshops:
(A) about the threat of not using the Internet;
(B) about how easy it is to use the Internet;
(C) about specific skills-training classes offered;
(D) about where Internet-connected computers are located in the work setting or community; and
(E) about the effectiveness of Internet usage in avoiding a threat (i.e., "resources on the Internet provide the most up-to-date information on how best to treat your patients").

Please note that threatening messages motivate action - whether positive or negative - while audience perceptions of self-efficacy and response efficacy toward the recommended response determine whether that action is adaptive or maladaptive. For most effective outreach, develop high threat/high efficacy messages to motivate long-lasting and consistent behavioral changes. Caution: if it is difficult or impossible to promote strong perceptions of efficacy (e.g., that PubMed has the answers you need), you probably should not use fear-arousing messages, which may backfire. Decisions about using the EPPM will depend on your ability to convey motivational messages and on the relevance of using fear appeal messages with your audience.

Messages can be delivered in printed educational materials, through electronic media, or in classes and demonstrations. Promote your messages through channels that are credible sources to your audience. For consumers, get cooperation for promotional messages on grocery bags, radio, or TV, or through doctors' offices or clinics. Channels that are credible sources for those in a clinical setting might be employers or colleagues, a department chair, a noted expert, a professional association or publication, or a conference exhibit.

The Stages of Change Model

The Stages of Change Model provides a framework for explaining how behavior change occurs (7). As displayed in Figure 7, there are five stages of change. People at different points in the change process can benefit from different interventions, matched to their stage at that time (8). The principles of this theory are easily incorporated into any strategy development. Using the Stages of Change helps remind you that change is a process and not an event. For example, outreach activities may falter if you assume that your audience wants to change their information-seeking behaviors and is willing to use computer resources for their work. If your assumption is incorrect and the audience is still in the Contemplation stage, they might respond better to awareness/promotional activities (e.g., a lively demonstration) that help persuade further action.
At the other end of the Stages of Change process, if outreach is not designed to include efforts for building infrastructure or follow-through, the process of change may not be maintained.

Figure 7: Stages of Change Model
Precontemplation. Concept: not thinking of changing a behavior. Outreach application: introduce awareness of health information sources.
Contemplation. Concept: thinks about using the Internet for information access. Outreach application: increase awareness of the need for change.
Preparation. Concept: makes plans to learn information-seeking skills via the Internet. Outreach application: facilitate computer access; offer skills training in varied formats personalized to local need.
Action. Concept: uses Internet sources when seeking new information. Outreach application: assist with technical support; publish articles about search tips; train an onsite liaison to offer support or provide intermediary searches.
Maintenance. Concept: continues new information-seeking behaviors. Outreach application: offer advanced and refresher classes; continue to partner with opinion leader advocates to reinforce new behaviors.

Example: Dr. Wu, a busy physician practicing in rural Montana, has not learned to use Internet resources and wonders if it would be worth his time (precontemplation). At a recent conference, he saw a demonstration of PubMed and was impressed by how easy it is to use. In his rural practice, Dr. Wu misses the opportunity to stop colleagues in the hall for a quick consult and worries that sometimes he might not have enough information for quick decisions. He wonders if it would be worth his time to learn how to use the Internet (contemplation). He decides to look into Internet training about PubMed and signs up for a class (preparation). On the day of the training, Dr. Wu hears from the instructor that the president of his local medical society took the same class and continues to use the skills gained almost daily. Dr. Wu was asked to bring a recent patient problem. He brings a question about the accuracy of prenatal ultrasound in determining congenital hydrocephalus. The instructor shows him how to use PubMed's clinical queries and finds the information in a relevant abstract right away. Armed with this positive experience, Dr. Wu resolves to take the time in the future and begins using his computer (action). However, several weeks pass and Dr. Wu tends to put off trying it again on his own (relapse). Then he makes a phone call to a respected colleague for a quick consult. She says she has recently taken a course on computers, and that Dr. Wu could have gotten the answer more quickly by looking on PubMed than by waiting for her return phone call. With this friendly reminder, Dr. Wu tries his own search and succeeds (success). With this success, Dr. Wu now regularly uses the Internet for questions (maintenance).

Diffusion of Innovations Theory

Based on social science research conducted in the 1940s by Everett Rogers, Diffusion of Innovations Theory addresses how new ideas or products spread within a society or from one society to another (9). Key principles of the diffusion process are:
• Most people consider adopting an innovation, not on the basis of scientific research by experts, but because people they respect (opinion leaders or early adopters) endorse it.
• Innovation is adopted first by people who are considered innovators (2.5% of individuals in a system). The next 13.5% to adopt an innovation are considered "early adopters."
• Critical mass is the point at which enough individuals have adopted an innovation that any further rate of adoption becomes self-sustaining. Early adopters and opinion leaders are instrumental in getting an innovation to the point of critical mass.

If the use of technology to answer health information questions is considered an innovation, Diffusion of Innovations Theory describes the pattern of adoption an outreach audience will follow. Outreach activities should target innovators and early adopters because they can help persuade others about the benefits of using these resources, encourage continued use, and might even promote the role of the library for consultation, training, or resource access.

Example: When planning your skills training classes, contact opinion leaders and early adopters from your audience to encourage them to help influence the success of your efforts to train end-user information-seeking behaviors. Suggestions for participation by opinion leaders could include:
• Attending a training session or providing a testimonial about their experience in using the Internet;
• Offering their endorsement for use in promotional literature;
• Agreeing to "spread the word" in conversations with colleagues about the message you want to convey (e.g., "making time to learn how to find and share useful information will help you and your patients").

Another principle of Diffusion of Innovations Theory states that innovations perceived by individuals as having greater relative advantage, compatibility, trialability, and observability, and less complexity, will be adopted more rapidly than other innovations. For illustrations of how outreach can apply this principle, see Figure 8 and other examples in Appendix E.

Figure 8: Diffusion of Innovations Theory
Relative Advantage. Concept: the degree to which an innovation is seen as better than the idea, practice, program, or product it replaces. Outreach application: point out unique benefits of the product (e.g., PubMed), such as getting time-sensitive information faster or having access in a remote area miles from a library.
Compatibility. Concept: how consistent the innovation is with the values, habits, experience, and needs of potential adopters. Outreach application: promote products that have relevant information needed by the targeted audience (e.g., AIDSLINE for an AIDS outreach program).
Complexity. Concept: how difficult the innovation is to understand and/or use. Outreach application: tailor training to the level of computer experience.
Trialability. Concept: the extent to which the innovation can be experimented with before a commitment to adopt. Outreach application: provide hands-on training for trial practice in a very safe environment (e.g., a presentation at a health fair).
Observability. Concept: the extent to which the innovation provides tangible or visible results. Outreach application: use relevant examples tailored to the actual needs of the targeted audience (e.g., safety in sports for a group of teens).

Community Organization

Community Organization is not a theory in itself, but a process by which community groups are helped to identify common problems or goals, mobilize resources, and develop and implement strategies for reaching their goals. The sense of group identity promotes motivation for change. Outreach planning may not literally strive to "organize" a community to change at a grassroots level.
However, principles of community organization will help outreach planners consider a community-level perspective, with measures that account for social or cultural factors of the community involved.

The conceptual framework for community organization in the public health literature is that health promotion initiatives are designed to serve communities and targeted populations, not just single individuals (8). Similarly, outreach programs with a community perspective recognize that successful outcomes involve more than individual change. There are various community approaches that have key concepts in common (see Figure 9). The process of empowerment is intended to stimulate problem solving and activate community members. Community competence is building the confidence and skills to solve problems effectively. Participation and relevance involve citizen activation and a collective sense of readiness for change. Issue selection concerns identifying "winnable battles" as a focus for action, and critical consciousness stresses the active search for root causes of problems (8).

Figure 9: Community Organization

Empowerment
  Definition: Process of gaining mastery and power over oneself/one's community, to produce change
  Outreach application: Give individuals and communities tools and responsibility for making decisions that affect them

Community Competence
  Definition: Community's ability to engage in effective problem solving
  Outreach application: Work with community to identify problems, create consensus, and reach goals

Participation and Relevance
  Definition: Learners should be active participants, and work should "start where the people are"
  Outreach application: Help community set goals within the context of pre-existing goals, and encourage active participation

Issue Selection
  Definition: Identifying winnable, simple, specific concerns as focus of action
  Outreach application: Assist community in examining how they can communicate the concerns, and whether success is likely

According to Bowes (10), success in courting community participation can result in labor savings (through volunteers and local supervision), linking of influential community leaders to project goals, and adapting programs to local idioms. This type of "localization" can help sustain the effect of an outreach program long after outreach funding has expired.

Example: An outreach program in the Pacific Northwest called Tribal Connections works with the communities of 16 American Indian/Alaska Native tribes. The goal is to help tribes reach their own tribal-wide health information access goals (empowerment), interpreting health in the broadest sense according to the needs of each community (relevance). The methodology is community-based, encouraging development of a sense of involvement within and across tribes (competence). It is hoped that the project will broaden its focus beyond improved network connections to improved human connections. For example, the tribes will share development of a project website that will provide access to firsthand tribal information as well as links to credible secondary resources, thereby promoting better communication between tribal communities. One of the objectives will be to create a sustainable online community of individuals interested in the promotion of tribal health. So far, one tribe reports that involvement in this project has opened doors between tribal agencies in its community; for example, it has greatly increased communication between the tribe's Department of Health and Human Services and the school.
Planning for Activities

Using one or more of the above-described theories in your outreach activities will help make your efforts "theory-based." But before deciding what theories to use, think again about your outreach objectives and the results you hope to achieve. Then, develop a written plan that will provide a roadmap for implementing your intended activities. The plan should summarize information gathered about the community, its members, and their needs, and include a program implementation outline and a timeline for the various activities. A written plan holds the outreach program accountable and ensures that steps are not taken randomly.

First, review your list of objectives and notice that the process and educational objectives provide an outline of the overall activities and outcomes to be achieved while the outreach is ongoing. If the process and educational objectives are accomplished, the behavioral, environmental, and program objectives should follow; these are typically measured when outreach is completed, or during follow-up. So, operational planning for your program means identifying the activities and related strategies to reach the process and educational objectives.

Creating the plan helps you think through the rationale, or logic, of how the activities will achieve the intended results. For example, your educational objectives may be to affect the motivation and ability of your targeted audience to access health information. To do that, you will want to plan what educational activities you will conduct and what theories or best practices you will use as strategy. Thus, when developing an implementation plan, each process and educational objective must be thought through in terms of activities and strategies. Look at best practices documented in outreach studies and also study the health education theories discussed in this chapter. Sample Outreach Strategies in the Stage 3 Tool Kit presents a summary of sample strategies for factors related to outreach objectives, based on selected theories and best practices identified in this chapter.

You may want to select a theory you think would make sense and then get audience feedback on variables important to the theory before deciding if and how you can apply it. One way of getting that feedback is to conduct an audience assessment, described in the next section. Tasks to obtain feedback should be included in your implementation plan.

An implementation plan in Stage Three should:

• Describe the overall community and its needs
• List program goals
• List process objectives
• List learning, behavioral, environmental, and overall program objectives
• Specify activities related to each process and educational objective
• Specify theory-based strategies and best practices to carry out each activity
• Identify interim tasks to be accomplished (e.g., design and conduct audience assessment)
• Include a timeline
• Identify who is responsible for each activity

Workforms with fill-in steps to develop an outline and a task list by activity are included in the Stage 3 Tool Kit. See Appendix I for a sample planning outline and Appendix J for a sample timeline by task and person responsible.

How Does an Audience Assessment Fit In?

In the library science literature, an audience assessment is typically called a "needs assessment," gathering data about:

• Types of information needed
• Purpose
• Frequency
• Sources used (colleagues, journal articles, etc.)
• Factors determining sources used
• Previous computer experience
• Barriers to gaining access

Some of the above information may already be gathered in a community assessment (described in Stage 1) to help inform the outreach program goals and objectives developed in Stage 2. For the purposes of this manual, an audience assessment is different from a community assessment; it is a type of formative evaluation that gathers data to refine the strategy selected for a particular outreach activity. The audience assessment may collect data about variables typically studied in "needs assessments," but it will also profile the audience according to variables relevant to the theory or theories you hope will motivate, facilitate, or reinforce information-seeking behavior (see Figure 10). Thus, the audience assessment is discussed here in Stage 3 as a tool for helping to plan and develop specific outreach activities.

For example, prior to scheduling a training activity, you could ask potential class participants about their attitudes or beliefs regarding Internet use, or their stage of readiness in adopting new information-seeking behaviors. Based on their responses, you would then develop strategies based on the EPPM and Stages of Change models.

Collecting data on variables relevant to selected outreach theories prior to an outreach activity also provides baseline data for comparison with measurements taken after outreach has happened. For example, suppose you will be conducting a training activity to improve Internet search skills and plan to use theory about self-efficacy. You might create a self-efficacy rating scale about Internet searching by adapting questions from the survey example in Appendix F, originally created to rate self-efficacy in conducting a CD-ROM literature search. The factors you choose to rate self-efficacy are assessed prior to outreach to determine areas of focus needed in skills training. Based on Social Learning Theory, ways to increase self-efficacy, such as guided mastery, proximate goals, and feedback, are used in the outreach session. Then, self-efficacy is measured again at the end of the workshop to determine if there has been any change (hopefully an increase). Called a pretest/posttest, this type of evaluation design is typically used to assess changes that may have resulted from an outreach activity. However, it is a weak design if there is not also a control or comparison group. Please see Stage 4 for further discussion of evaluation designs.

Example: To tailor an upcoming training workshop to the needs of participants, outreach staff conducted an audience assessment. Questions were based on several theories of behavior change. For example, outreach staff wanted to determine whether demonstrations about PubMed would be more appropriate than starting immediately with hands-on skills training. Survey responses revealed that many had not heard of PubMed, or thought about using it, so a lively demonstration seemed a better start. The survey also asked questions to determine baseline levels of confidence on a variety of computer and Internet skills, ranging from 1 ("Barely Confident") to 5 ("Very Confident"). The questions were designed with the intention of asking them again at the completion of outreach. With that data, outreach staff developed a follow-up hands-on workshop that focused on skills needing attention.
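Identifying "skills needing attention" from the baseline ratings is simple arithmetic on the survey responses. The following sketch is illustrative only; the skill names, ratings, and the 3.0 cutoff are hypothetical and are not drawn from the survey in Appendix H.

```python
# Hypothetical baseline responses: each participant rated confidence
# in each skill from 1 ("Barely Confident") to 5 ("Very Confident").
baseline = {
    "Using a web browser":         [4, 5, 3, 4, 4],
    "Searching PubMed":            [1, 2, 1, 2, 3],
    "Judging website credibility": [2, 3, 2, 1, 2],
}

def mean(ratings):
    """Average confidence rating for one skill."""
    return sum(ratings) / len(ratings)

CUTOFF = 3.0  # arbitrary midpoint chosen for illustration

# Flag skills whose average confidence falls below the cutoff.
needs_attention = [skill for skill, r in baseline.items()
                   if mean(r) < CUTOFF]

for skill, r in baseline.items():
    print(f"{skill}: mean confidence {mean(r):.1f}")
print("Focus the follow-up workshop on:", needs_attention)
```

Repeating the same questions after outreach and comparing the two means for each skill yields the pretest/posttest change discussed above.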
The follow-up workshop also included demonstration searches by a local health worker from the community clinic (following the principle of observational learning in Social Learning Theory). Finally, using the Diffusion of Innovations principle that people are more likely to adopt an innovation if there is a perceived advantage, another audience assessment question asked for specific examples of a recent time when information was needed but not found. These responses were later used to develop search examples based on actual need and to show where Internet resources would have helped.

Figure 10: Theory-based variables

Social Learning Theory
• How much skill and knowledge does the audience have about finding health information on the Internet? (behavioral capability)
• Do they expect that the information they need exists and is available? (expectations)
• How effective do they feel they are themselves in finding health information on the Internet? (self-efficacy)

Extended Parallel Process Model
• Does the audience perceive any negative consequences for being misinformed or lacking information? (perceived threat)
• Does the audience believe that using information technology works in accessing accurate health information? (perceived response efficacy)
• Does the audience believe they have the access, skills, and knowledge needed to effectively use information technology? (perceived self-efficacy)

Stages of Change Model
• At what stage of readiness is the audience in using the Internet or email (precontemplation, contemplation, preparation, action, maintenance)?

Diffusion of Innovations Theory
• Who are their opinion leaders?
• What people or groups might be influential or motivate their use of electronic resources?
• How might Internet-based resources be presented so that the audience perceives them to have greater advantage, compatibility, trialability, observability, and less complexity than alternative sources?

How is an Audience Assessment Conducted?

Decisions about how to gather data for an audience assessment will depend on how that data will be used. Most of the time, outreach programs will not have the resources or need to conduct rigorous survey research, where generalizations are made to a larger population based on statistically valid results. Results from an audience assessment are used to tailor a specific outreach activity, so gathering generalizable data is really not needed or appropriate. Informal feedback questionnaires or exploratory research, such as open-ended questions, interviews, or focus groups, will serve the purpose of gaining a better understanding of your specific audience to help improve the strategy you plan to use. See "Methods of Data Collection" in Stage 1 for further discussion of ways to gather data. If you plan to repeat the audience assessment questions post-outreach, conducting pre- and post-interviews or feedback questionnaires might be easier than pre- and post-focus groups.

Appendix G presents sample questions to ask for each theory to be used. Appendix H provides a sample audience assessment survey. On the sample survey, note that some questions are designed to be asked again on a post-outreach evaluation.

Tool Kit - References

1. Marshall JG. A review of health sciences library outreach and evaluation. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/marshall.html, 1997.

2. Burnham J, Perry M.
Promotion of health information access via Grateful Med and Loansome Doc: why isn't it working? Bulletin of the Medical Library Association 1996;84(4):498-506.

3. Dorsch J. Equalizing rural health professionals' information access: lessons from a follow-up outreach project. Bulletin of the Medical Library Association 1997;85(1):39-47.

4. Bandura A. Self-efficacy: the exercise of control. New York: W.H. Freeman and Co., 1997.

5. Debowski S, Wood R, Bandura A. Impact of guided mastery and enactive exploration on self-regulatory mechanisms and knowledge construction through electronic inquiry, in press.

6. Witte K. Theory-based interventions and evaluation of outreach efforts. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/witte.html, 1998.

7. DiClemente CC, Prochaska JO. Processes and stages of change: coping and competence in smoking behavior change. In: Shiffman S, Wills TA, eds. Coping and substance abuse. San Diego: Academic Press, 1985:319-334.

8. Glanz K, Rimer BK. Theory at a glance: a guide for health promotion practice, http://rex.nci.nih.gov/NCI%5FPub%5FInterface/Theory%5Fat%5Fglance/HOME.html. U.S. Public Health Service: National Institutes of Health, September 1997:17.

9. Rogers EM, Scott KL. The diffusion of innovations model and outreach from the National Network of Libraries of Medicine to Native American communities. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/rogers.html, 1997.

10. Bowes JE. Communication and community development for health information: constructs and models for evaluation. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/bowes/, 1998.

Tool Kit - Selected Readings

Behavior Change Theories

Diffusion of Innovations Theory

Rogers EM. Diffusion of innovations. Glencoe, Ill.: The Free Press, 1962.

Rogers EM. Communication of innovations. (2nd ed.) New York: The Free Press, 1971.

Rogers EM, Scott KL. The diffusion of innovations model and outreach from the National Network of Libraries of Medicine to Native American communities. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/rogers.html, 1997.

Community Organization

Baldwin GD. Planning and evaluating information outreach among minority communities: model development based on Native Americans in the Pacific Northwest. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/baldwin.html, 1998.

Bowes JE. Communication and community development for health information: constructs and models for evaluation. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/bowes/, 1998.

Bracht N. Health promotion at the community level. Newbury Park, CA: Sage Publications, 1990.

Steckler A, Allegrante JP, Altman D, et al. Health education intervention strategies: recommendations for future research. Health Education Quarterly 1995;22(3):307-328.

Extended Parallel Process Model

Witte K. Theory-based interventions and evaluation of outreach efforts. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/witte.html, 1998.
Additional articles about EPPM are on Kim Witte's Web site, under "Research," at: http://www.msu.edu/~wittek/index.htm.

Social Learning Theory

Bandura A. Self-efficacy: the exercise of control. New York: W.H. Freeman and Co., 1997.

Debowski S, Wood R, Bandura A. Impact of guided mastery and enactive exploration on self-regulatory mechanisms and knowledge construction through electronic inquiry, in press.

Stages of Change Model

DiClemente CC, Prochaska JO. Processes and stages of change: coping and competence in smoking behavior change. In: Shiffman S, Wills TA, eds. Coping and substance abuse. San Diego: Academic Press, 1985:319-334.

Tool Kit - Sample Outreach Strategies

Objectives: Increase awareness; increase knowledge; influence attitudes; influence beliefs

Sample strategies from theory and best practices:

Based on the Stages of Change Model, assess audience awareness and readiness for learning new skills or adopting new technology. Then determine priority activities. For example:
> If a site has little technology and technical support but great motivation and interest in accessing information resources, the outreach priorities might be to first facilitate access and then motivate and train individuals to use the access effectively.
> However, if technology is lacking and users are not aware of the benefits that access can provide, your first focus would be on activities to promote awareness and interest in outreach products and services.

Based on the Extended Parallel Process Model, influence attitudes and beliefs by first assessing the audience on threat and efficacy variables. Then, convey messages about the threat of being misinformed or out-of-date and about effective ways to cope, such as learning easy-to-use and convenient Internet resources.
> Messages can be delivered in print or electronic media, or in classes and demonstrations.
> Use channels credible to the audience, e.g., employers, colleagues, department chair, community leader, tribal elder, noted expert, professional association, conference exhibit. For consumers, channels could be grocery bags, radio, TV, or doctors' offices or clinics.

Based on Diffusion of Innovations Theory, identify opinion leaders and early adopters who will recruit outreach participants by way of mutual influence and respect, and who can help generate attitudes that electronic access can provide a better and easier way to get relevant information.

Based on library outreach research, use a variety of promotion methods.

Objectives: Develop skills; facilitate access

Sample strategies from theory and best practices:

Based on Social Learning Theory, provide training that will increase self-perception of ability by:
> Having someone who is respected or similar to the student give hands-on demonstrations, verbalizing aloud as decisions for search formulation are made;
> Using proximate goals designed to help students master skills progressively, and feedback to encourage self-efficacy;
> Demonstrating searches that are very relevant to audience needs;
> Assisting students in refining searches, thereby learning from mistakes.

Based on the Stages of Change Model, support the "taking action" stage by providing or training onsite technical support, publishing search tips, or providing intermediary searches.

Objectives: Reinforce behaviors; build community

Sample strategies from theory and best practices:

Based on library outreach research, provide money for computer equipment.
Based on Community Organization, involve stakeholders in decisions about hardware use and location.

Tool Kit - Planning Outline Workform

Fill in goals, objectives, activities, and strategies:

Outreach goal #__: ____________
  Process objective #__: ____________
    Activity: ____________
    Strategy: ____________
  Process objective #__: ____________
    Activity: ____________
    Strategy: ____________
  Process objective #__: ____________
    Activity: ____________
    Strategy: ____________
  Educational objective #__: ____________
    Activity: ____________
    Strategy: ____________
  Educational objective #__: ____________
    Activity: ____________
    Strategy: ____________
  Educational objective #__: ____________
    Activity: ____________
    Strategy: ____________
  Educational objective #__: ____________
    Activity: ____________
    Strategy: ____________

Tool Kit - Task List Workform

Fill in tasks by activity, with person responsible, and according to a timeline:

Task | Person | Month: 1 2 3 4 5 6 7 8 9 10 11 12

Tool Kit - Gowan Library Case Example

In Stage 2, the goals and objectives that were carefully constructed with stakeholders interested in outreach to the Geneva Health clinics provide a useful outline from which to continue your planning process. In reviewing your objectives you note that reaching them will mean conducting promotional, logistical, and educational outreach activities that will:

• Implement connectivity at Geneva Health sites
• Provide training on use of online health information resources
• Develop site liaisons to promote and advocate outreach activities, and train them as future online trainers and technical support for their sites
• Establish primary library relationships for access through Loansome Doc to full-text resources
• Maximize collaboration between organizations interested in improving health services infrastructure, of which information access is a component

The above list provides a rough outline of what Gowan staff will do based on the process objectives. But thinking through what the library staff will do to meet the process objectives is only part of the plan. Staff need to figure out how to meet the educational and behavioral objectives. These objectives have to do with impact (what happens as a result of outreach), such as the numbers and types of people reached and changes in awareness, attitude, knowledge, and skill levels. It's important to keep this in mind, as the educational and behavioral objectives help to shape planning for what needs to be done. It's one thing to say that outreach will influence behavior change, but making that happen requires more than disseminating information. Strategies are required to help influence behaviors.

You and your staff at Gowan Library consult the library literature to see what best practices have been documented from other outreach studies. You also review theories from the fields of health education and communications that are described in Stage 3. Several of the best practices documented in outreach studies are substantiated by these theories. For example, outreach studies show the importance of a local advocate for promoting outreach and for sustaining access to information resources after outreach is completed.
According to Diffusion of Innovations Theory, if the advocate is also an opinion leader, he or she will help to increase the adoption and sustained use of an innovation. Using the Internet for health information is an innovation, so you decide that identifying and including opinion leaders is a good strategy for increasing outreach participation and reinforcing the use of skills learned.

So, having consulted knowledge sources about best practice and theory, you and your staff develop an outline of the activities and strategies used to reach each process and educational objective. Here are some examples:

Process objective: During the next 18 months, outreach staff will conduct at least two educational activities at sites of Geneva Health clinics to increase motivation, skill, use, and exchange of electronic health information resources.
Activity: Based on audience assessment results, schedule appropriate demonstration or training workshops at each clinic.
Strategy: Based on theories of behavior change (e.g., the Stages of Change Model), include questions in the audience assessment to determine stage of readiness, such as level of ability and interest in training.

Educational objective: During the next 18 months, at least 50% of health providers at Geneva Health will participate in at least one educational outreach activity conducted by outreach staff at each site.
Activity: Develop and distribute promotional flyers with endorsements from opinion leaders about the usefulness of Internet resources for patient care decisions, encouraging health care providers to participate in outreach educational activities.
Strategy: Based on Diffusion of Innovations Theory, identify opinion leaders and early adopters who will endorse the use of Internet resources.

Educational objective (skill level): During the next 18 months, at least one out of three outreach training participants will correctly answer a true/false question based on a simple search of a National Library of Medicine online resource.
Activity: Demonstrate search skill techniques, followed by progressively difficult hands-on exercises and a question to test understanding.
Strategy: Based on using proximate goals to increase self-efficacy (from Social Learning Theory), develop hands-on exercises designed to help students master skills progressively.

You realize that some of the theories require feedback from your targeted audience about key variables, such as degree of confidence in their abilities (self-efficacy in Social Learning Theory) and readiness to adopt a change (Stages of Change Model). And there are other questions your staff want to ask their potential outreach participants to help tailor the trainings. To gather this type of feedback, you decide to develop an informal questionnaire that would be distributed by the clinics to their staff to promote the trainings, to help tailor the upcoming trainings, and to gather some baseline data for comparison with post-tests. A sample of the questionnaire is provided in Appendix H.

Before conducting the assessment and developing the training, you decide to construct a task timeline. This tool will be very helpful for tracking your progress throughout the project and for planning when and how you will do the audience assessment. An example of the task timeline covering steps through promotion of the training is provided in Appendix J.
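Because each objective above states a measurable target, tracking progress reduces to comparing observed counts against those targets. A minimal sketch follows; the counts are hypothetical, and only the 50% and one-in-three targets come from the objectives above.

```python
# Each tuple: (objective description, target proportion,
#              number meeting the criterion, number assessed).
# Counts are hypothetical; targets come from the objectives above.
objectives = [
    ("Providers attending at least one session", 0.50, 38, 90),
    ("Trainees answering the search question correctly", 1 / 3, 12, 30),
]

for description, target, met, total in objectives:
    observed = met / total
    status = "met" if observed >= target else "not yet met"
    print(f"{description}: {observed:.0%} observed "
          f"(target {target:.0%}) -> {status}")
```

The same comparison, run during the project rather than only at its end, is the kind of progress monitoring discussed under process evaluation in Stage 4.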
In the next Stages (4 and 5), your staff will think about the evaluation component of the project and develop a plan for when and how that data will be collected, analyzed, and acted upon.

Stage 4: Planning Evaluation

Topics
• Developing an Evaluation Plan
• Establishing Evaluation Objectives
• Process (Formative) Evaluation Objectives
• Accountability
• Program Improvement
• Replication
• Summative Evaluation Objectives
• Overall Program Effectiveness
• Program Effects - What Else Happens as a Result of Outreach?
• Evaluation Methods
• Quantitative Method
• Qualitative Method
• Selecting an Evaluation Design
• Experimental Design
• Quasi-experimental Design
• Non-experimental Design
• How Much Evaluation is Feasible?

Figures
Figure 11: Program Evaluation Flow Chart
Figure 12: Evaluation Designs
Figure 13: Level of Resources for Various Evaluation Designs

Tool Kit
• References and Selected Readings
• Workform for Process Evaluation Objectives
• Gowan Library Example

Stage One: Conduct Community Assessment
Stage Two: Develop Goals and Objectives
Stage Three: Plan Activities and Strategies
Stage Four: Plan Evaluation

What to establish:

Should process evaluation be conducted?
• to demonstrate accountability
• to monitor progress
• to make "mid-project" adjustments
• to replicate a pilot project

Should summative evaluation be conducted?
• to document what was achieved
• to find out what else happened
• to research effectiveness of specific strategies

Determine the independent and dependent variables of interest. Choose an evaluation design that balances time, resources, and method with the desired level of validity.

A typical model for program development includes the following phases:

1. Identifying a target audience and conducting a community needs assessment,
2. Developing written goals and objectives,
3. Implementing activities to accomplish those objectives, and
4. Evaluating the overall quality and success of those activities vis-a-vis the stated objectives.

In reality, planning and conducting a program and its evaluation is more complex than a four-step process. Different types of evaluation correspond to different phases of program development. Thus, as seen in Figure 11, the model should be at least a six-step process that integrates various types of evaluation throughout. The manual thus far has discussed ways to conduct evaluation for a community and audience assessment, as part of program development phases I-III in Figure 11. This chapter provides an overview of evaluation planning to assess a program's implementation and outcomes. For further information on evaluation planning, several sources are listed in the Tool Kits at the end of Stages 4 and 5. One outstanding and comprehensive source is the nine-volume Program Evaluation Kit, edited by Joan L. Herman (Newbury Park, CA: Sage Publications, 1987).

Developing an Evaluation Plan

The three major components that should be addressed in an evaluation plan are:

1. Questions or issues you will address in the evaluation
2. What you will measure and how
3. Resources needed to accomplish the evaluation tasks

Figure 11: Program Evaluation Flow Chart

I. Identify Problem/Need
  Questions to ask: What is the targeted community? To what extent are information needs being met?
  Evaluation phase: Community Assessment

II. Develop Goals and Measurable Objectives
  Question to ask: What changes will address unmet needs?

III. Select Activities and Strategies and Design Implementation Plan
  Questions to ask: What kinds of activities/strategies will produce the changes desired? How will activities and strategies be tailored to the needs of the targeted group? How should the program be put into operation?
  Evaluation phase: Audience Assessment

IV. Program Implementation
  Questions to ask: Is the program operating as planned? Are participants learning what is expected? Is the audience satisfied with results? Is the program reaching the intended audience?
  Evaluation phase: Process Evaluation

V. Program Outcomes
  Questions to ask: Were objectives reached? Are there impacts regarding health information use? What other impacts have occurred?
  Evaluation phase: Summative Evaluation

VI. Feedback
  Questions to ask: How realistic were initial goals? What programmatic changes need to be made?

To be most effective, plans for evaluation should be in place before outreach activities begin. Thinking ahead will make it easier to plan whether and what baseline data to collect. Data collection instruments, such as surveys, may need to be developed and pilot tested in advance. If there are plans to compare a specific strategy with an alternative to see which is more effective, time is needed to work out the logistics of when and with whom the two strategies will be tested. And even though an evaluation report is completed at the end of the program, it is difficult, ineffective, and not very objective to begin thinking about evaluation after the program is over. Therefore, it is best to plan ahead, before activities begin, about what will be measured and how.

In developing the plan, the following issues require consideration:

1. Outreach goals and objectives
2. Plans for implementation, or what is currently happening if the program is already in place
3. Evaluation objectives - purpose of the evaluation and its role
4. Evaluation questions to be addressed
5. Methods and types of information that will be accepted as evidence of the effects of the program
6. Design - when and from whom data will be collected
7. Data collection - what and how data will be collected
8. Resources
9. Timeline for evaluation

The first two steps in evaluation planning involve clarifying the goals and objectives of your outreach program and plans for implementation. Both of these steps are described in detail in Stages 2 and 3. Equally important is establishing objectives for the evaluation, as described in the next section. Evaluation objectives will help determine the specific issues or questions the evaluation will address. Decisions about how to gather measurements will include considering what types of information (qualitative or quantitative) will be most appropriate and accepted as evidence. Decisions about the research design - when and from whom data will be collected - will follow. Each of these considerations is addressed in this chapter, with a brief discussion of how much evaluation is realistic for your program. Issues of data collection - what and how data will be collected - are discussed in Stage 5.

Establishing Evaluation Objectives

One of the most challenging aspects of evaluation is clarifying what it is you want to find out. A good first step is to identify the "stakeholders" who will have an interest in the evaluation results. They might include:

• Funding agency
• Targeted community
• Your boss
• Outreach staff

When planning what data to collect, think about what these stakeholders will look for in the evaluation report.
For example, although information about the overall results of the program might be needed by the funding agency, key contacts of the targeted community may want to know the reactions and comments of outreach participants in order to make a decision about future outreach efforts. Other outreach programs with similar audiences may be interested in how you conducted your program and what worked best. Or, your outreach staff may be interested in determining whether one particular strategy is more effective than another.

Ask stakeholders about their criteria for success: what outcomes from the project are most important to them? Do they also want to know if it was successful compared to an alternative (such as another type of outreach program, or no program at all)? Is the program being evaluated as a pilot study for possible replication?

One way of prioritizing the evaluation questions is to ask yourself and those interested in the evaluation how the information gained about a particular question will make a difference. What decisions will be made as a result of the data? Or, how will the information help improve the program? It will be important to refine the broad purpose or objectives of an evaluation into specific questions. Questions addressed by evaluation during and after outreach can be categorized as process and summative, respectively. [Note: some evaluation textbooks treat process evaluation as part of formative evaluation, and use summative evaluation as another term for outcome/impact evaluation.]

Process (Formative) Evaluation Objectives

Process evaluation helps to keep track of an outreach program as it is happening so that modifications or improvements can be made on an ongoing basis. Very generally, process evaluation questions address:

• Is outreach working as intended?
• How can it be improved (while it is going on)?

To focus the types of data you may want to address in a process evaluation, use the "Workform for Process Evaluation Objectives" in the Stage 4 Tool Kit. A sample filled-in workform is provided in Appendix K, "Sample Process Evaluation Objectives." Appendix L, "Sample Ways to Measure Program Process," provides selected measures for several of the evaluation objectives in Appendix K. There are many possible questions for a process evaluation, and choosing which ones to ask will depend on how the data will be used. The following section provides examples, by purpose, for process evaluation data, based in part on a more thorough discussion by King, 1987 (1).

Accountability: did you do what you said you would do? To provide accountability to stakeholders such as funders, partners, or directors, first decide what characteristics are important to the success of the program (and do not forget the perspective of your targeted audience: what do they think is important?). Some might be:

• Costs (staff, materials, equipment, facilities)
• Relevance of equipment, resources (e.g., PubMed), and services (e.g., interlibrary loan) provided or promoted with respect to user need; for example, are resources useful in terms of content, understandability, language, or cultural relevance?
• On-site administrative support
• Facilities (location, size, and number of computers allotted for training)
• Time allotted to activities
• Staff responsiveness to participants' needs

The above characteristics are just examples. Modify the list according to the characteristics most important to the success of your outreach program and decide how each will be monitored.
Appendix K, under Accountability, provides an example list of characteristics important to one outreach program. Note that it is helpful to review the objectives, outcomes, and overall plan for implementing the program when selecting characteristics to monitor.

Program improvement: assessing progress toward objectives so that adjustments can be made that are targeted and effective. Planners need to decide in advance what indicators to measure, which will depend on the outcomes identified in each objective (see Appendix D, "Sample Outreach Objectives"). Some indicators could be:

• Numbers or percentage of target audience reached
• Evidence that promotional activities increase awareness of information resources
• Evidence that participants increase their level of self-efficacy (confidence) in search skills
• Evidence of quality (relevant, useful, or efficient) search results
• An increase in ILL requests
• Evidence of intended or actual use of electronic resources (e.g., Web site hits, if relevant, or survey responses about intentions to use electronic resources)

The data collected to measure these indicators will give valuable feedback about what might be working and what needs adjustment. This type of evaluation measures the effectiveness of specific strategies. You can look to the implementation plan you developed in Stage 3 to help clarify what assumptions you may want to test about causal links between strategies and outcomes.

Another way of thinking about what causal links to measure is by identifying the independent and dependent variables. An independent variable is what the planner has control over (e.g., the intervention). The dependent variable is the outcome, or what changes (e.g., use of PubMed) as a result of the independent variable. For example, if assessing the effect of an outreach activity (e.g., skills training) on outcomes of interest such as attitudes, beliefs, and behavior, the independent variable is the skills training and the dependent variables are the changes in attitudes, beliefs, and behavior. Thus, dependent variables are typically the outcomes identified in the outreach objectives.

If one is conducting a theory-based evaluation, it is important to track the variables identified in the theory to determine whether or not the intervention is operating effectively. For example, if a strategy based on Diffusion of Innovations Theory is used to change information-seeking behavior, you may want to test the assumption that the strategy actually caused the behavior change. By focusing your data collection on variables that are critical to the theories you use, your evaluation can identify those strategies that seem to make the most difference, so you can explain rather than just describe the outcome.

Say that the Extended Parallel Process Model was used to develop the intervention and evaluation. In a process evaluation, researchers would measure perceptions of threat (severity, susceptibility) and efficacy (response efficacy, self-efficacy) to determine whether the intervention was promoting danger control actions (i.e., adoption of the recommended response) or fear control actions (i.e., defensive avoidance, reactance against the recommended response). If the results of a survey indicated high threat and low efficacy, then according to this theory the intervention would be failing.
However, if the survey indicated high threat and high efficacy, then one could be fairly confident that the intervention was producing the actions desired (2). For a more detailed example of theory-based process evaluation, see Appendix K, Program Improvement.

Keep in mind that, ultimately, the outreach objectives themselves may need modification if they are not being reached. Meanwhile, monitoring progress during the outreach program will provide opportunities to make changes that might affect the overall level of success. Appendix M, Sample Exit Questionnaire, provides sample questions for an end-of-class survey to assess progress toward educational and behavioral objectives. Results from the exit questionnaire can be compared to the audience assessment (Appendix H), conducted prior to the training class, which provided a baseline for comparison.

Replication: If your outreach program is a pilot project, process evaluation will be important for effective replication of the program in other communities or locations. Here, the role of the process evaluation is to document the day-to-day operation of the program. If the results of your outreach are successful and you can say, "It works!", the descriptive information you gather here will answer the question, "What works?" The description might be informal, such as a written outline generated from the implementation plan that is periodically updated to describe what actually happens. This serves as a historical record and a realistic picture of the time, staff, resources, problems, and successes involved. See the Stage 4 Tool Kit, "Workform for Process Evaluation Objectives," for sample evaluation questions regarding replication.

Summative Evaluation Objectives

While process evaluation questions help determine how well outreach is working while it is ongoing, summative evaluation helps determine what outreach accomplished. Very generally, summative evaluation questions address:

• Did outreach meet its objectives?
• What differences (i.e., outcomes) resulted?
• Are the outcomes beneficial or deleterious? To whom?
• Are the outcomes those originally envisioned?

The purposes of a summative evaluation can range from making judgments about overall program effectiveness (were objectives reached?) to discovering program effects (whether or not predicted by objectives).

Overall program effectiveness: Monitoring and compiling a final tally of whether goals and objectives have been achieved is one of the basic purposes of a summative evaluation. Note that monitoring progress toward objectives is also one purpose of process evaluation; however, in the process evaluation this progress need only be spot-checked. For a summative evaluation, data should be collected from a representative sample of outreach sites or participants so that staff will have good information to describe what the program achieved, and documentation about whether it met its goals. See Appendix N, "Sample Ways to Measure Outcomes," for an illustration of how objectives might be tracked. Appendix O, "Sample Measures of Behavior Outcomes," provides sample questionnaire items that will measure outcomes for objectives related to behavior.

Program effects - what else happens as a result of outreach: Summative evaluation questions might also help determine the impact of outreach on variables not addressed by objectives, to provide a broader perspective.
For example, one objective might be: "at least 25% of participants will report that outreach training influenced the way they subsequently obtain information for patient care decisions." Note that this objective does not specify what type of patient care decision is influenced. Data about the type of decision might be collected in a summative evaluation and reported to a hospital administrator or other interested party. Another example of variables not included in program objectives that could be assessed in a summative evaluation is impact on worklife, such as job productivity (see Anderson et al. (3) for survey examples to measure impacts on worklife).

The point is that summative evaluation can be designed to measure whatever outcomes are of interest. Planners may want to collect information about unintended outcomes, to provide a rich picture of the impact of outreach. For example, an open-ended question might ask, "What happened that was not expected (either positive or negative)?"

Evaluation Methods

Discussions of evaluation methods are typically characterized by the definition of two types of data: quantitative and qualitative. Each type of data is useful in both the extensive and intensive data collection approaches introduced in Stage 1 and reviewed here.

With extensive data collection, much is already known about the situation and the possible variables or factors involved. The purpose is to collect data about a community that can be considered truly representative of the entire user population. Data collected can be both qualitative and quantitative (described below). Statistical validity and reliability are key criteria, meaning that the research instrument measures exactly what was intended and, if repeated, would yield the same or very similar results. Random sampling is also important, so that all people being researched have an equal chance of responding. (For more discussion of random sampling, see Appendix C.)

In situations where little is known about the phenomena being studied, it may be helpful to use a more exploratory data-gathering approach called intensive data collection. The purpose here is to understand patterns of behavior or identify particular impacts or problems impeding desired results. With intensive data collection, you want a practical understanding of what is happening, but not to make generalizations. You can get both qualitative and quantitative feedback that does not strive for statistical validity, but does provide data to help understand your audience. Each approach can use a mix of quantitative and qualitative methods, described next.

Quantitative method

Quantitative methods produce numerically based data, such as counts, ratings, scores, or classifications. Examples of quantitative data would be numbers of outreach participants reached, percentage of users satisfied with class instruction, pretest scores about attitudes toward computers, or percentages of users who indicate increased use in a follow-up survey. Quantitative methods provide a systematic and standardized way of gathering data, through the use of predetermined categories into which all responses must fit. Surveys are typically used to gather quantitative data.

Extensive data collection approaches might use quantitative data in an experimental research design to compare results of the intervention group with those of other programs or groups. The components of an experimental research design are described in the next section.
It provides a way to aggregate results statistically and make generalizations from a carefully selected research group to a larger population. It is difficult to generalize results from one outreach evaluation to another program, however, unless the independent variable is consistent across programs. An independent variable is what the planner has control over (e.g., the intervention). The dependent variable is the outcome, or what changes (e.g., use of PubMed) as a result of the independent variable. For example, if assessing the effect of class participation by opinion leaders (the independent variable) on behavior outcomes, a count of PubMed use in the following month is the dependent variable.

In programs that have a standardized curriculum, such as the curriculum for K-12 public schools, outcomes (such as standardized test results) can be measured with high validity and reliability using quantitative methods based on experimental design. However, outreach programs tend to be tailored and customized to the unique and specific needs of the target audience and are not based on a standardized outreach curriculum. Therefore, what might be measured with high validity and reliability for one outreach program may not be important or indicative for all programs (4).

Qualitative method

The qualitative approach is based on the need to discover, rather than to test, the impact of programs (5). The goal is to develop an understanding of what is happening during implementation of a program and how, as well as why, results are or are not achieved. Qualitative methods consist of at least three kinds of data collection:

1. In-depth, open-ended interviews or focus groups
2. Direct observation
3. Written documents, such as open-ended survey questions, personal diaries, and outreach records

The descriptive information collected is then organized into major themes, categories, and case examples through content analysis and other methods. Qualitative research is a good method for understanding the meaning of a program and its outcomes based on the participants' own words instead of predefined responses. Using qualitative methods will help gain a better and perhaps more genuine understanding of participants' opinions or behaviors. The credibility of qualitative methods depends on the methodological skill, sensitivity, and training of the evaluator. As with quantitative methods, achieving valid and reliable measures involves systematic and rigorous techniques. For a thorough and easy-to-use discussion of qualitative methods, see "How to Use Qualitative Methods in Evaluation" by Michael Quinn Patton (6).

Combining quantitative methods with a qualitative approach can provide information in greater depth than use of either method alone. In a 1989 evaluation by the National Library of Medicine (NLM), researchers used qualitative data as the primary descriptive information, with quantitative data as a supplement. NLM used the Critical Incident Technique (CIT), in which 552 users of MEDLINE responded to a highly structured set of open-ended questions via telephone interviews. The purpose of the study was to develop a detailed understanding of the impact of MEDLINE-derived information: in what ways it is used, and with what effect.
The interview technique provided a detailed understanding of user motivation and behavior, which can be determined only very generally using traditional survey methodology with quantitative techniques (pre-defined response categories). Quantitative techniques in the CIT study included pre-coded responses to characterize interviewees on such variables as specialty, work setting, community size, and the nature and extent of MEDLINE searching experience (7). Thus, the CIT study shows how qualitative methods can be usefully combined with quantitative techniques, offering ways to better understand the needs, opinions, or experiences of study participants.

Selecting an Evaluation Design

A consideration in planning an evaluation will be whether you want to base your analysis of the data on a particular design. An evaluation design structures how one will assess or measure the effect of an independent variable on a dependent variable(s); it dictates when and from whom measurements will be gathered during the course of an evaluation (8). In the health sciences, randomized controlled clinical trials use the experimental design, which is quite rigorous (as explained below). Recognizing the difficulties of this approach in studying human behavior, the field of social science research offers several alternative designs that are considered by many to be preferable.

One consideration when determining design is when measurements are conducted. Options usually include a pretest/posttest, posttest only, or a time series where measurements are taken at multiple times before and after the intervention. The advantage of a pretest/posttest or time series design is that one can determine how much change there was from before to after the intervention, especially if results are compared between the intervention group and a control or comparison group. However, some prefer to use a posttest-only design because they are afraid a pretest will sensitize individuals to respond in a certain way and may result in socially desirable responses, where people indicate change because "they're supposed to" (2).

Decisions about from whom data are gathered will dictate whether the design is non-experimental, quasi-experimental, or purely experimental, as seen in Figure 12. Some of these designs focus exclusively on outreach participants, while others compare participants (called the intervention group) with similar persons or groups (called the comparison or the control group, depending on whether random assignment is used). A common and practical approach is to focus only on the intervention group, collecting data after the intervention, or both before and after (the "non-experimental design"). A more rigorous way to determine the effects of a treatment is to compare results of those who receive outreach with similar persons who do not receive it (the "quasi-experimental design"). The experimental design requires that participant and non-participant groups be made comparable by assigning people randomly to the intervention group and the comparison (or "control") group.

Figure 12: Evaluation Designs

I. Experimental design
  1. Pretest-posttest design
     Intervention group: ® O X O
     Control group: ® O O
  2. Posttest-only design
     Intervention group: ® X O
     Control group: ® O
  3. Time series design
     Intervention group: ® O O O X O O O
     Control group: ® O O O O O O

II. Quasi-experimental design
  1. Pretest-posttest design
     Intervention group: O X O
     Comparison group: O O
  2. Time series design
     Intervention group: O O O X O O O
     Comparison group: O O O O O O

III. Non-experimental design
  1. Pretest-posttest design
     Intervention group: O X O
  2. Time series design
     Intervention group: O O O X O O O

Key: ® = random assignment; O = measurement; X = intervention

Experimental design

The most rigorous design is the powerful comparison between individuals or groups randomly assigned to intervention and control conditions. The advantage of this design is that random assignment ensures a valid and accurate comparison of results. The disadvantage of this design is the practical difficulty of achieving random assignment.

In random assignment, it is presumed that any pre-existing differences among subjects (skill level, intelligence, race, etc.) will be evenly distributed between the intervention and control groups. Random assignment avoids the "selection bias" that may be an issue when, for example, individuals self-select into one or another group based on pre-existing characteristics such as familiarity with computers. Random assignment also controls "threats" to the validity or accuracy of results. For example, how do you know that your intervention alone caused increased usage of PubMed? Perhaps a new promotion by America Online featuring free Internet access caused the increase in usage, and not your persuasive message.

How random assignment is achieved

Random assignment can occur at the individual level (i.e., each person may or may not receive the intervention) or at the group level (i.e., different groups may or may not receive an intervention). If there is concern that members of a group will talk to each other about an intervention, then it is best to randomly assign by group instead of by individual. Otherwise, if those in the control group were exposed to the intervention through friends or colleagues, you will not get a clear picture of how the intervention worked. Typically, each subject or group is given a number from one on up, and then a random numbers table (which may be found in the back of any basic statistics text) is consulted to place subjects in either the intervention or control group. An arbitrary decision is made beforehand about which numbers in the table will designate the control group and which the intervention group (e.g., odd entries = intervention, even entries = control). Alternatively, one can simply place each person or group's name on a piece of paper, throw the names into a hat, and designate the first 20 draws as the intervention group and the next 20 draws as the control group, and so on. (A short code sketch of this procedure appears below.)

Quasi-experimental design

Random assignment is the key feature of an experimental design, distinguishing it from a quasi-experimental design, in which a comparison group is included but participants, though they are as similar as possible to the intervention group, are not randomly assigned. In most outreach situations, it may not be possible or ethical to randomly assign participants to a control group, so the quasi-experimental design is a good option. For example, one can create comparison groups by dividing potential participants into several groups and staggering the intervention. Individuals or groups should still be matched on various characteristics (like demographics) and then compared for results.
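As an aside on the random assignment procedure described above: the "names in a hat" method amounts to shuffling the list of participants and splitting it. A minimal sketch follows, with Python's random module standing in for a random numbers table; the participant names are hypothetical.

```python
import random

# Hypothetical pool of participants; these could equally be clinic
# sites if assigning by group rather than by individual.
participants = ["Adams", "Baker", "Chen", "Diaz", "Evans",
                "Fox", "Garcia", "Hall", "Ito", "Jones"]

random.seed(42)               # fixed seed so the assignment is reproducible
random.shuffle(participants)  # the "hat"

half = len(participants) // 2
intervention = participants[:half]  # first draws -> intervention group
control = participants[half:]      # remaining draws -> control group

print("Intervention group:", intervention)
print("Control group:", control)
```

Recording the seed (or the shuffled order itself) preserves the assignment for the evaluation record.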
A quasi-experimental design yields interpretable and supportive evidence of outreach effectiveness, but usually cannot control for all factors that affect the validity of results. For example, if variations exist between the groups, it may be because of the intervention (you hope), or it may be because of other unique, idiosyncratic factors (e.g., one group has unrestricted access to the Internet, the other does not). There are ways to statistically control for known covariates (influences on outcomes), but it is best to randomly assign groups or individuals to either the intervention or control group.

For either the experimental or quasi-experimental design, the size of the intervention and control or comparison groups is determined according to "power" estimates. Specifically, you want enough people per group to detect significant differences between the groups, if in fact significant differences exist. Usually a minimum of 20 per group can provide an adequate degree of power for attitudes toward an intervention; however, it is best to consult power tables when determining how many individuals or groups you need per condition, given a specific outcome (2). A rough illustration follows at the end of this section.

Non-experimental design

If it is impossible to assign a control or comparison group for your research, you can use the one-group pretest/posttest approach. This design is relatively inexpensive and easy to administer. However, it is a weak design if you are trying to answer questions such as:

1. How good are the results? Could they have been better? Would they have been the same if the outreach had not been carried out?
2. Was it the outreach that brought about these results, or was it something else?

Time series measurements of a single intervention group can provide better information than a simple pretest/posttest. For example, surveys may be administered to a sample of randomly selected individuals of an intervention group at multiple times before and after an intervention.

How Much Evaluation is Feasible?

A number of factors may affect the feasibility of an evaluation, including:
• Costs
• Staffing
• Timing
• Political or ethical considerations

A good baseline rule is that five percent or more of a program's budget should be allotted to program evaluation activities (9). Different evaluation designs require different levels of resources, as seen in Figure 13. Reisman describes key implementation factors that influence the amount of resources required, including:
• Number of participants
• Frequency of data collection
• Length of time for which data will be collected
• Number of data collection instruments involved
• Availability of existing sources of data
• Availability of staff with data analysis skills or access to computers and statistical consultants
• Ease of administering data collection instruments
• Willingness of outreach participants to contribute to the evaluation

Decisions about an evaluation design should consider these implementation factors as well as timing and staffing requirements. Political or cultural considerations of your targeted audience are also important (see page 62 for further discussion of cultural factors in data collection).
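As a rough, hypothetical illustration of the power estimates mentioned above (the guide itself recommends consulting power tables), the standard normal-approximation formula for comparing two group means can be computed directly. The effect size and 80% power target below are assumptions chosen for the example, not recommendations from the guide:

    import math
    from scipy.stats import norm

    def n_per_group(effect_size, alpha=0.05, power=0.80):
        """Approximate sample size per group for a two-sided comparison
        of two group means; effect_size is Cohen's d."""
        z_alpha = norm.ppf(1 - alpha / 2)   # critical value for the significance test
        z_beta = norm.ppf(power)            # value needed to reach the desired power
        return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

    print(n_per_group(0.8))   # a large effect -> about 25 people per group

A result in the mid-20s per group for a large effect is consistent with the rule of thumb of at least 20 per group; smaller expected effects drive the requirement up quickly (d = 0.5 needs roughly 63 per group).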
Figure 13: Level of Resources for Various Evaluation Designs

Post-Outreach Measures (resource intensity: Low)
Description: Use of evaluation tools to describe outcomes (e.g., behavior, attitudes, or knowledge) following outreach.
Disadvantages: No comparison with people not exposed to outreach. No certainty that the outcome has changed (it may have been the same prior to outreach).
Advantages: Simple to administer. Inexpensive.

Post-Outreach Measures with a Control Group (resource intensity: Moderate)
Description: Same as above, with the addition of collecting similar scores for a control group.
Disadvantages: Using a control group requires additional research participants. Additional participants will not receive the outreach (unless it is offered to them at a later point). It is difficult to randomly assign outreach participants.
Advantages: Avoids pretest sensitization. Strong basis for comparison: if there are differences in outcomes between the groups, you can have confidence that outreach had some effect.

Pre- and Post-Outreach Measures (resource intensity: Moderate)
Description: Describes participants' "scores" on expected outcome variables (e.g., behavior, attitudes, or knowledge) both prior to and following outreach.
Disadvantages: Changes in scores could be due to some other source (e.g., media promotion of health resources). No comparison with people not exposed to outreach.
Advantages: There is some basis for comparison (before and after). Every participant receives outreach.

Pre- and Post-Program Measures with a Control Group or Comparison Group (resource intensity: High)
Description: Same as above, but with the addition of collecting similar scores for a control group or a comparison group.
Disadvantages: Using a control or comparison group requires additional research participants. Additional participants will not receive the outreach (unless it is offered to them at a later point). It is difficult to randomly assign outreach participants to a control group. If a comparison group is used (not randomly assigned), you cannot control all factors affecting validity.
Advantages: Strong basis for comparison: if there are differences in outcomes between the groups, you can have confidence that outreach had some effect.

Multiple Pre- and Post-Outreach Measures (Time Series) (resource intensity: High)
Description: Same as the pre- and post-outreach measure approach, with additional scores obtained several times before and several times after the intervention.
Disadvantages: Additional measures must be obtained. If obtaining behavioral measures, you need to allow sufficient time to measure behaviors before the intervention can occur.
Advantages: Helps to validate whether changes in outcomes are sustained over time. Helps to obtain a more complete picture of dependent variables before the intervention occurs.

Tool Kit - References and Selected Readings

References

1. King JA, Morris LL, Fitz-Gibbon CT. How to assess program implementation. (2nd ed.) Newbury Park: Sage Publications, 1987. (Herman JL, ed. The Program Evaluation Kit; vol 3).
2. Witte K. Theory-based interventions and evaluation of outreach efforts. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/witte.html, 1998.
3. Anderson JG, Aydin CE, Jay SJ. Evaluating health care information systems: methods and applications. Thousand Oaks: Sage Publications, 1994.
4. Dignan MB, Carr PA. Program planning for health education and promotion. Philadelphia: Lea & Febiger, 1992:164.
5. Glitz B. Focus groups for libraries and librarians. New York: Forbes, 1998.
6. Patton MQ. How to use qualitative methods in evaluation. Newbury Park, CA: Sage Publications, Inc., 1987. (Program Evaluation Kit; vol 4).
7. Siegel E, Rapp B, Lindberg D. Evaluating the impact of MEDLINE using the Critical Incident Technique. Proceedings of the annual symposium on computer applications in medical care 1991:83-87.
8. Fitz-Gibbon CT, Morris LL. How to design a program evaluation. Newbury Park: Sage Publications, 1987. (Fitz-Gibbon CT, Morris LL, eds. Program Evaluation Kit; vol 3).
9. Reisman J. A field guide to outcome-based program evaluation. Seattle: Organizational Research Services, Inc., 1994.

Selected Readings

Berg BL. Qualitative research methods for the social sciences. Boston: Allyn and Bacon, 1995.
Herman JL. Evaluator's handbook. (2nd ed.) Los Angeles: Center for the Study of Evaluation, 1987. (Program Evaluation Kit; vol 1).
Herman JL. Program evaluation kit, vol 1-9. (2nd ed.) Newbury Park, CA: Sage Publications, 1987.
Hernon P, McClure CR. Evaluation and library decision making. Norwood, NJ: Ablex Publishing Corporation, 1990.
Isaac S, Michael WB. Handbook in research and evaluation: a collection of principles, methods, and strategies useful in the planning, design, and evaluation of studies in education and the behavioral sciences. (3rd ed.) San Diego, CA: EdITS Publishers, 1995.
Patton MQ. Utilization-focused evaluation. Beverly Hills: Sage Publications, 1978.

Tool Kit - Workform for Process Evaluation Objectives

See Appendix K for a filled-in example.

ACCOUNTABILITY

Will I be accountable for documenting what occurred as the program happened? If so, what is most important to document?

a. Briefly describe the program's goals and objectives. (Ask evaluation stakeholders to verify or modify.)
b. What do you see as the most important results or outcomes of the program? (Ask evaluation stakeholders to verify or modify.)
c. How will the program be implemented? Describe the resources, activities, services, and administrative arrangements that constitute the program.

Accountability measures: Obtain periodic updates on characteristics of the program (context, activities, and best practices) that will most determine its success. (Determine in advance what the report questions will include. Ask evaluation stakeholders to verify or modify.)

Context: tangible features of the outreach program and its site
Activities: how the program is being implemented
Best practices: what is being done to leverage success?

PROGRAM IMPROVEMENT

Will there be an opportunity to make adjustments to the activities and strategies targeted at program objectives? If so, how can progress toward objectives be tracked? Ask yourself and your staff:

a. What are the outcomes listed in each objective?
b. What indicators will provide measurable evidence of those outcomes?
c. How can those indicators be tracked?
d. What variables can be measured to show whether the theory-based strategies are working? (Review the objectives and strategies identified in the implementation plan outline developed in Stage 3; see Appendix I for an example.)

REPLICATION

Is the outreach program considered a pilot project, or is it likely to be replicated at another site? If so, what types of information would be most useful to track for eventual documentation? Check off the types of information to track from the following list, and ask relevant stakeholders to add other data you may want to collect:

□ Where exactly has the outreach program been implemented and what was done?
□ How many and what sorts of people participated in the outreach? (e.g.,
age, sex, health profession)
□ What are the characteristics of their information needs? (e.g., type of practice, types and purposes of information needed, frequency of information needed, sources used)
□ What are the socioeconomic characteristics of the setting?
□ What does (do) the outreach site(s) look like?
□ What are the program's greatest successes? What facilitated each one?
□ What are the program's biggest challenges (frustrations, barriers, or disappointments)?
□ What sociopolitical factors may have impacted the outreach?
□ What were the outreach costs in staff time, materials, equipment, and facilities?
□ Other questions?

Tool Kit - Gowan Library Case Example

In Stage 3, your library staff at Gowan Library thought about their strategies and activities for reaching the objectives of the outreach program. At this point, you are on the way to beginning the program. However, you know this is the best time to begin thinking about the project evaluation. Careful consideration at this early stage will help make sure that the right data will be collected. For example, it is soon time to conduct the audience assessment discussed in Stage 3 that will help to tailor the educational activities planned. Staff already have some ideas about what they want to find out in the audience assessment. But before conducting the assessment, think through the questions to be asked for the project evaluation. Is the audience assessment an opportunity to collect baseline data before the outreach training that can then be compared to results or outcomes at the end?

To begin considering what your project evaluation will assess, you list who would be interested in evaluation results, including:
• Geneva Health administrator
• State chapter of the primary care association
• Regional rural health association
• Funding agency
• Gowan Library outreach staff
• Gowan Library director
• Health librarian community

With this list in mind, you consider what these individuals might want from an outreach evaluation. For example, the evaluation question "were objectives reached?" may be of interest to several people, such as the funding agency and you, the director. This phase of evaluation is called the summative evaluation: asking questions about what happened in the overall picture, such as whether outreach met its objectives and what the outcomes were. The types of data collected for this phase might include a comparison of pre- and post-measures of attitudes, awareness, skills, and behaviors, measured both during the audience assessment and in a follow-up after outreach training is completed. Other outcomes are tallied throughout the program (such as the number of classes conducted). These measures also contribute to an overall summative assessment.

In addition to evaluating results, much is learned by tracking ongoing progress, so that you can identify what works well, what does not, and what can be improved while the project is under way. This phase of evaluation is called the process evaluation.

You find that the task of figuring out what evaluation questions to ask takes careful consideration before you can specifically define what you will measure. A general question such as "were we successful?" is not meaningful until you define your criteria for success very specifically. Fortunately, you can look at the objectives you constructed in Stage 2, which include measurable indicators.
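If you do compare pre- and post-measures as described above, a paired t-test (one of the analyses listed under Stage 5) is a simple option. The sketch below is illustrative only and not part of the original guide; the 1-7 attitude ratings from ten participants are made up:

    from scipy import stats

    # Hypothetical ratings of "the Internet is an essential tool for my work"
    # from the same ten participants, before and after training (1-7 scale).
    pre  = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
    post = [5, 6, 4, 6, 5, 5, 4, 4, 6, 5]

    t, p = stats.ttest_rel(post, pre)   # paired t-test on the before/after scores
    change = sum(post) / len(post) - sum(pre) / len(pre)
    print(f"mean change = {change:+.1f} points, t = {t:.2f}, p = {p:.4f}")

A small p-value suggests the shift in average ratings is unlikely to be chance alone, although, as noted earlier, a one-group design cannot rule out other causes for the change.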
But you also want to evaluate other interesting data that will help you improve another, similar outreach program in the future. You think about how you designed this outreach program: there were several assumptions you made in thinking through the whole process. For example, your plan to develop onsite expertise for information services support is a worthy objective. But what if it doesn't work? How will you know what went wrong? You realize you must think about what data might be helpful to collect along the way to help examine the reasons for whatever results transpire.

You also realize that data collection requires effort, and it is important to avoid asking evaluation questions if the answers will not be useful to you for making decisions or improvements. Too many measures might dilute your evaluation resources, so you will avoid asking questions just because they are "interesting." You have decided that you do not plan to use the results to make generalizations about any outreach program targeted to primary care clinics. You want practical results that will help you understand what appears to be happening in your project only. Going any farther than that means using highly structured techniques or methods designed for statistical validity, such as control or comparison groups. At this exploratory level of research, you do not want to expend the evaluation resources necessary to conduct that type of rigorous research.

Finally, after figuring out what you really want to know from an evaluation and what you will do with the answers, your next step is deciding the types of data you need to collect and how you will do that. Stage 4 provides a discussion of various evaluation methods, some more rigorous than others. There is a range of possibilities, and the planning tools in the Tool Kits for Stages 4 and 5 and Appendices K through O help you think through what will be measured.

Stage 5: Gathering Data and Assessing Results

Topics
• What Does Evaluation Measure?
• Methods of Data Collection
  • Surveys
  • Interviews
  • Observations
  • Records
  • Meetings
• Quality of Data Collection
  • Reliability
  • Validity
  • Cultural appropriateness
• Data Analysis
  • Coding
  • Quality control
• Types of Analyses
  • T-tests
  • Univariate analysis
  • Bivariate analysis
  • Multivariate analyses

Figures
Figure 14: Indicators of Selected Outreach Objectives
Figure 15: Methods for Collecting Data

Tool Kit
• References
• Workform for Ways to Measure Process
• Workform for Ways to Measure Outcomes
• Gowan Library Case Example

Stage One: Conduct Community Assessment
Stage Two: Develop Goals and Objectives
Stage Three: Plan Activities and Strategies
Stage Four: Plan Evaluation
Stage Five: Gather Data and Assess Results

What variables or outcomes will each evaluation (either process or summative) measure?

go to either preparation or contemplation stage)

Diffusion of Innovations Theory

Critical mass: the point at which enough individuals have adopted an innovation that any further rate of adoption becomes self-sustaining. Early adopters and opinion leaders are critical in getting an innovation to the point of critical mass.

Please list the people or groups who you consider to be local opinion leaders in your [community, profession]:
Appendix H: Audience Assessment

1. Circle the category which describes your profession:
a. physician
b. nurse
c. dentist
d. administrator
e. pharmacist
f. physical therapist
g. other health care provider ____________
h. other ____________

2. When you think about negative consequences you may face if lacking access to health information, what comes to mind?

3. How likely is it that you will experience the negative consequence?

4. Accessing health resources on the Internet will keep me from experiencing the negative consequences identified above. Why or why not?

5. I am easily able to access health resources on the Internet. Why or why not?

6. Choose the statement that best represents your thoughts and actions:
a. Yes  No   I have yet to think about using the Internet for health information.
b. Yes  No   I have thought about using the Internet for health information but have not taken any steps to use it yet.
c. Yes  No   I have not yet used the Internet for health information, but have taken steps so that I will be able to use it soon (e.g., obtained Internet access, signed up for training, sent away for information).
d. Yes  No   I have used the Internet for health information.
e. Yes  No   I regularly use the Internet for health information.
f. Yes  No   I have used the Internet for health information before, but currently do not use it.

7. The Internet is an essential tool for my work:
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

8. On a scale of 1-5, please rate your ability to do the following tasks (1 = I don't know how, 3 = I think I can, 5 = I'm sure I can):
a. I can use a computer keyboard _____
b. I can use a computer mouse _____
c. I can send or receive email _____
d. I can use bookmarks _____
e. I can find medical research about diabetes on at least one Internet site _____
f. I know what PubMed is _____
g. I can narrow results of a Web search to find relevant hits _____

9. In the past month, how often have you used the Internet to gain needed health care information?
___ Daily   ___ Weekly   ___ Monthly   ___ Rarely   ___ Never

10. What are your reasons for NOT searching the Internet for health information? (Circle all that apply):
a. lack of equipment
b. cost of searching
c. lack of training
d. lack of time
e. not needed
f. prefer others to do my searches
g. dislike of computers
h. unsatisfactory past results
i. no access to journals
j. other ____________

11. Please list 3 local or regional opinion leaders in your work (people or organizations).
a.
b.
c.

12. Was there a time during the past week when you needed an answer or a piece of information and couldn't find it readily? If so, please describe the question or kind of information you needed.

13. Is there anything that you particularly want covered in this workshop?
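Question 6 above maps onto the Stages of Change Model discussed in Stage 3. As an illustrative, unofficial sketch (the item-to-stage mapping and the sample responses are assumptions made for the example), tallying those answers can show how ready an audience is for training:

    from collections import Counter

    # Assumed mapping of question 6 items (a-f) to Stages of Change;
    # item f is treated here as relapse (used before, not using now).
    STAGE_FOR_ITEM = {"a": "precontemplation", "b": "contemplation",
                      "c": "preparation", "d": "action",
                      "e": "maintenance", "f": "relapse"}
    ORDER = "abcdef"

    def classify(yes_items):
        """Use the most advanced item a respondent marked 'Yes' as their stage."""
        latest = max(yes_items, key=ORDER.index)
        return STAGE_FOR_ITEM[latest]

    # Hypothetical responses: the items each person circled "Yes" on.
    responses = [{"a"}, {"b"}, {"b", "c"}, {"d"}, {"d", "e"}]
    print(Counter(classify(r) for r in responses))
    # A count dominated by the early stages would argue for awareness-building
    # and motivation before hands-on skills training.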
Appendix I: Sample Planning Outline

Name of Outreach Program: Outreach to Geneva Health Community.

Program Goal: Geneva Health clinic sites will establish and maintain Internet connectivity to access and share clinical and patient resources that benefit patient care.

Process objective #1
During the next 18 months, adequate hardware, software, and connectivity will be purchased and installed for sufficient Internet capacity at Geneva Health.
Activity: Develop and conduct interviews or a survey of stakeholders regarding wishes/needs for information access and technology requirements. Order and install equipment and the telecommunications network.
Strategy: Based on Community Organization, involve stakeholders in a technology needs assessment and subsequent decisions about where and what hardware and software should be installed and how connectivity will be provided.

Process objective #2
During the next 18 months, collaborations with local, state, regional, or federal organizations or agencies will be established for sustained Internet connectivity at Geneva Health.
Activity: Work with stakeholders interested in improved health care for the counties in identifying and negotiating partnerships or funding sources to support continued Internet connectivity.
Strategy: Based on Community Organization, use principles of community capacity development: maximizing the community's resources and empowering problem solving.

Process objective #3
During the next 18 months, outreach staff will conduct at least two educational activities at sites of Geneva Health to increase motivation, skill, use, and exchange of electronic health information resources.
Activity: Based on audience assessment results, schedule appropriate demonstration or training workshops at each clinic.
Strategy: Based on theories of behavior change (e.g., the Stages of Change Model), include questions in the audience assessment to determine stage of readiness, such as level of ability and interest in training.

Process objective #4
During the next 18 months, at least one person at each site will be trained as the designated site expert and trainer.
Activity: Work closely with key contacts in clinics to identify and support a designated staff person who will receive "train the trainer" training for an ongoing role in helping troubleshoot local information access problems or questions.
Strategy: Follow lessons learned from outreach studies showing that personal contact between the target audience and librarians helps sustain changes in information-seeking habits (Dorsch, 1997; Burnham and Perry, 1996).

Process objective #5
During the next 18 months, outreach staff will facilitate strategies or partnerships between the clinic, professional associations, and the state medical school to encourage student rotations at the clinic.
Activity: Schedule interviews or a meeting with stakeholders (including a student representative) interested in recruitment for medical school student rotations. Determine resources, skills, or services that outreach can address.
Strategy: Follow lessons learned from outreach studies and principles from community organization showing that collaboration and partnering provide more opportunities for reaching shared goals.

Process objective #6
During the next 18 months, outreach staff will establish "primary library" relationships for Geneva Health clinicians.
Activity: In training activities, include in-class demonstrations, plus a handout with step-by-step instructions, about how to use Loansome Doc. Include the Lib ID number for the Gowan Library.
Strategy: Based on Diffusion of Innovations principles, demonstrate the ease and convenience of getting full-text information, even in remote and rural areas.

Educational objective #1
During the next 18 months, at least 50% of health providers at Geneva Health, the health district, and the K-12 schools will participate in at least one educational outreach activity conducted by outreach staff at each site.
Activity: Develop and distribute promotional flyers, with endorsements from opinion leaders, about the usefulness of Internet resources for patient care decisions, encouraging health care providers to participate in outreach educational activities.
Strategy: Based on Diffusion of Innovations Theory, identify opinion leaders and early adopters who will endorse the use of Internet resources.

Educational objective #2
Awareness level: During the next 18 months, at least two out of three outreach training participants will be able to describe a National Library of Medicine online resource.
Activity: Demonstrate example searches from National Library of Medicine resources that are tailored to the actual needs of the audience.
Strategy: Based on the observability variable in Diffusion of Innovations Theory (the extent to which the innovation provides tangible or visible results), add questions to the audience assessment to determine the specific information needed by the audience.

Educational objective #3
Attitude level: During the next 18 months, at least one out of three outreach training participants will rate one online resource as an essential resource for their work.
Activity: In training activities, add threat-based messages to motivate access to current health information via the Internet.
Strategy: Based on the Extended Parallel Process Model, use an audience assessment to assess threat and efficacy variables and develop a message about effective ways to avoid the negative consequences of being misinformed (e.g., "Stay ahead of your patients with easy access to current clinical care information on PubMed").

Educational objective #4
Skill level: During the next 18 months, at least one out of three outreach training participants will correctly answer a true/false question based on a simple search of a National Library of Medicine online resource.
Activity: Demonstrate search skill techniques followed by progressively difficult hands-on exercises and a question to test understanding.
Strategy: Based on using proximate goals to increase self-efficacy (from Social Learning Theory), develop hands-on exercises designed to help students master skills progressively.

Appendix J: Sample Task List

(Each task is assigned to a responsible person and scheduled with an X in one of twelve month columns.)

• Consult literature about information needs of rural health professionals.
• Hold meetings or interviews with key contacts and stakeholders. Discuss information needs of health intermediary communities and how outreach might help.
• Develop goals and objectives for outreach based on mutual interests.
• Review process objectives and develop activities and strategies to implement them.
• Review educational objectives and develop strategies and activities for each one, identifying what audience feedback will be needed in advance of outreach.
• Develop draft audience assessment questionnaire.
• Revise questionnaire based on review by program stakeholders (including a representative member of the audience).
• Conduct audience assessment among a sample of health providers from all sites.
• Gather and analyze survey results.
• Based on results, tailor outreach activities to needs of audience.
• Develop post-test questions or end-of-activity evaluation.
• Schedule activity, time, and place for demonstrations or training workshops.
• Identify opinion leaders or early adopters who will endorse and promote outreach activities.
• Develop promotional flyers about outreach activities with endorsements and persuasive messages.
Appendix K: Sample Process Evaluation Objectives

ACCOUNTABILITY

Think through: Will I be accountable for documenting what occurred as the program happened? If so, what is most important to document?

a. Briefly describe the program's goals and objectives. (Ask evaluation stakeholders to verify or modify.)

EXAMPLE:
Goal 1: Geneva Clinic sites will establish and maintain Internet connectivity to access and share clinical and patient resources that benefit patient care.
Objectives (brief):
• To improve information access infrastructure through increased connectivity and/or hardware
• To provide effective skills training
• To raise awareness, skills, beliefs, and attitudes of health providers about Internet resources for exchange of and access to health information
• To increase professional use of Internet resources for health information
• To increase community-based involvement in and support of health information access needs

b. What do you see as the most important results or outcomes of the program? (Ask evaluation stakeholders to verify or modify.)
• Optimal leveraging of current infrastructure
• Technology improvements implemented and functioning
• Ensured Internet access after NN/LM funding expires
• Designated onsite advocate and support for health information access
• Increased capability to recruit health providers or students
• Effective educational activities
• Significant participation in outreach educational activities
• Increased use of Internet resources to access health information
• Increased use of health information resources for patient care decisions
• Increased recognition of the value of librarian and/or access services

c. How will the program be implemented? Describe the resources, activities, services, and administrative arrangements that constitute the program.

EXAMPLE: Each clinic site will define its current resources and technology needs for new or enhanced telecommunications access. Objectives for technology implementation will be agreed upon and listed per site. A timeline for equipment and connectivity implementation will be established for each clinic. NN/LM staff will work with each outreach site to identify opportunities for effective promotional and educational activities about the availability of networked health information sources relevant to their needs.

Determine accountability objectives to obtain periodic updates on characteristics of the program (activities and best practices) that will most determine its success. (Determine in advance what the report questions will include. Ask evaluation stakeholders to verify or modify.)

Activities: how is the program being implemented?
• Procedures staff follow to understand participants, including their number, why and how they are being targeted (understanding of need), and level of readiness. Are these procedures working?
• Procedures staff follow to leverage effective and timely implementation of equipment and connectivity. Are these procedures working?
• Promotional activities: What is being done?
• Educational activities: What is being done?
• Other ____________

Best practices: what evidence is there that best practices are being used, such as:
• Identify mutual outreach objectives with the targeted community
• Involve opinion leaders in planning and promotion
• Coordinate with the site liaison to plan and promote promotional and educational activities. Are contacts effective?
• Provide follow-up feedback or training
• Motivate interest in conducting literature searches as a basis for clinical decision-making (see process evaluation measures for theory-based strategies below)
• Promote at least minimal onsite information services
• Partner with agencies or organizations with mutual interests to support or improve information access capability
• Determine readiness to use computers to access health information
• Promote successful service modules, such as circuit librarian programs and Area Health Education Centers (AHECs)
• Focus educational efforts on individuals and the institutions where they practice
• Promote Loansome Doc or other ways to access full-text resources (may need to be subsidized)
• Promote local, regional, or cooperative arrangements to improve telecommunications infrastructure
• Other? ____________

PROGRAM IMPROVEMENT

Determine measures for program objectives. Will there be an opportunity to make adjustments to the activities and strategies targeted at program objectives (if progress is inadequate)? If so, how can progress toward objectives be tracked?

Think through:
a. What are the outcomes listed in each objective?
Example from the Sample Plan for Measuring Outcomes (Appendix D):
Objective: At least 30% of outreach participants will be able to identify a National Library of Medicine online resource.
Outcome: Will be able to identify a National Library of Medicine resource.
b. What indicators will provide measurable evidence of those outcomes?
Indicator: Correct answer to a multiple choice question matching an online resource with an information need.
c. How can that indicator be tracked?
Measure: Question on end-of-class survey.

Think through: What variables can be measured to show whether the theory-based strategies are working? (Review the objectives and strategies identified in the implementation plan outline developed in Stage 3.)

Example from the Sample Planning Outline (Appendix I):
Educational objective: During the next 18 months, at least one out of three outreach training participants will rate one online resource as an essential resource for their work.
Strategy: Based on the Extended Parallel Process Model, use an audience assessment to assess threat and efficacy variables and develop a message about effective ways to avoid the negative consequences of being misinformed (e.g., "Stay ahead of your patients with easy access to current clinical care information on PubMed").
To measure: Conduct a post-survey (end of class) to track scores about perceptions of threat and efficacy. Results will determine whether the intervention was promoting danger control actions (i.e., adoption of the recommended response) or fear control actions (i.e., defensive avoidance). Desired results would be high threat and high efficacy, because high threat motivates action when accompanied by a sense of effectiveness in averting the threat. If results show high threat but low efficacy scores, the strategy might fail, because people are more likely to use avoidance behavior to control the fear when it is accompanied by a low sense of efficacy.

Following are examples of questions for each of these constructs:

Perceived Threat

Perceived Susceptibility
1. I am at risk for falling behind current medical knowledge.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

Perceived Severity
2. It is dangerous to fall behind current medical knowledge.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

Perceived Efficacy

Perceived Response Efficacy
3. Using PubMed prevents me from falling behind current medical knowledge.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

Perceived Self-Efficacy
4. I am easily able to use PubMed to avoid falling behind current medical knowledge.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

Suppose that the EPPM was used to theoretically guide the intervention and evaluation. If the average scores of one's class on the above four measures were #1 = 5.6, #2 = 6.1, #3 = 6.9, and #4 = 6.2, then one could see that the intervention was promoting high levels of threat (5.6 and above) and extremely high levels of efficacy (6.2 and above). With these scores one could be confident that the intervention was working well because, according to the guiding theory, high threat/high efficacy interventions promote adoption of the recommended response. On the other hand, suppose the average scores were #1 = 6.2, #2 = 6.7, #3 = 2.1, and #4 = 3.0. These scores would indicate that the intervention was promoting very high threat perceptions and low efficacy perceptions. According to the guiding theory, an intervention producing this type of response would fail, because it would be promoting fear control responses (such as defensive avoidance and reactance), resulting in no behavioral changes.

REPLICATION

Think through: Is the outreach program considered a pilot project, or is it likely to be replicated at another site? If so, what types of information would be most useful to track for eventual documentation? Check off the types of information to track from the following list, and ask relevant stakeholders to add other data you may want to collect:

□ Where exactly has the outreach program been implemented and what was done?
□ How many and what sorts of people participated in the outreach? (e.g., age, sex, health profession)
□ What are the characteristics of their information needs? (e.g., type of practice, types and purposes of information needed, frequency of information need, sources used)
□ What are the socioeconomic characteristics of the setting?
□ What does (do) the outreach site(s) look like?
□ What are the program's greatest successes? What facilitated each one?
□ What are the program's biggest challenges (frustrations, barriers, or disappointments)? What caused each one?
□ What sociopolitical factors may have impacted the outreach?
□ What were the outreach costs in staff time, materials, equipment, and facilities?
□ Are there any assumptions that should be checked? (e.g., level of readiness to learn new skills; level of technical and administrative support at the site; cooperation of the outreach site in scheduling and promoting training; cooperation with collecting data for assessment)
□ Other questions?
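The interpretation rules illustrated in the EPPM scoring example above are easy to mechanize. The sketch below is an illustrative reading of that logic, not an official scoring tool; in particular, the 4.0 cut-off for "high" on a 1-7 scale is an assumption chosen for the example:

    def eppm_profile(susceptibility, severity, response_efficacy, self_efficacy,
                     midpoint=4.0):
        """Classify class-average scores on the four 1-7 items above."""
        threat = (susceptibility + severity) / 2          # items 1 and 2
        efficacy = (response_efficacy + self_efficacy) / 2  # items 3 and 4
        if threat > midpoint and efficacy > midpoint:
            return "danger control: audience likely to adopt the recommended response"
        if threat > midpoint:
            return "fear control: high threat, low efficacy; expect defensive avoidance"
        return "low threat: the message may not motivate action"

    print(eppm_profile(5.6, 6.1, 6.9, 6.2))   # first class average above -> danger control
    print(eppm_profile(6.2, 6.7, 2.1, 3.0))   # second example -> fear control; strategy failing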
Appendix L: Sample Ways to Measure Program Process

What to track (program characteristics, theory-based variables, progress toward objectives), and how to measure it:

• Procedures expected to work (e.g., coordination with onsite technical support)
  Measures: observation/journal; project timeline compared with initial action plan; feedback from site
• Assumptions about how plans will be implemented (e.g., level of onsite support and cooperation, administrative impact at site)
  Measures: observation/journal; feedback from site personnel; comparison between plans and what happened; numbers of promotional materials distributed
• Assumptions about how objectives would be discussed with site contacts
  Measures: observation/journal; feedback from site personnel
• Strategies for recruiting opinion leader participation
  Measures: observation/journal; feedback from site personnel; numbers of leaders recruited
• Identification of NLM online resources by health providers
  Measure: exit measure (e.g., end-of-class survey) asking participants to identify an NLM resource and whether they had heard of NLM before training
• Attitudes about the threat of being misinformed and the efficacy of PubMed
  Measure: exit measures (e.g., end-of-class survey) about perceptions of threat and efficacy
• Participants' level of knowledge and skills in searching NLM resources
  Measure: in-class exercise with a true/false question based on a simple search of a National Library of Medicine online resource
• Intended use of Internet resources
  Measure: exit measure regarding intended use on end-of-class survey
• Assumptions about components or characteristics expected to work; unanticipated factors contributing to success or problems
  Measures: exit measures of satisfaction with activity or service; feedback from site personnel; feedback from project personnel

Appendix M: Sample Exit Questionnaire

This questionnaire is designed to help us better understand ways to improve our class. Your responses will be anonymous and confidential. Thank you!!

1. Circle the category which describes your profession:
a. physician
b. nurse
c. dentist
d. administrator
e. pharmacist
f. physical therapist
g. other health care provider ____________
h. other ____________

2. I am at risk for falling behind current medical knowledge.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

3. It is dangerous to fall behind current medical knowledge.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

4. Using PubMed prevents me from falling behind current medical knowledge.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

5. I am easily able to use PubMed to avoid falling behind current medical knowledge.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

6. On a scale of 1-5, please rate your ability in the following areas (1 = None, 2 = Some, 3 = Moderate, 4 = Above average, 5 = Super):
I can narrow results of a Web search to find relevant hits _____
I can find evidence-based research articles on PubMed _____

7. True or False? "To use PubMed, I need to sign up for a password."
True ______   False ______

8. In the next month, how often do you anticipate using the Internet to find health information?
___ Daily   ___ Weekly   ___ Monthly   ___ Rarely   ___ None

9. About the workshop: Please rate the following statements by circling your choice (SD = Strongly Disagree, D = Disagree, N = Neutral, A = Agree, SA = Strongly Agree).
The information was presented in an understandable manner   SD  D  N  A  SA
The instructors were effective in explaining the material   SD  D  N  A  SA
The computer screen was easy to see                         SD  D  N  A  SA
There was enough hands-on practice                          SD  D  N  A  SA
I received adequate help during the hands-on session        SD  D  N  A  SA

10. What was the most valuable part of the workshop? What was the least valuable?

11. What, if any, improvements (e.g., content, presentation, logistics) would you recommend?

12. Would you recommend this workshop to a colleague?
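Tallying a stack of returned exit questionnaires like Appendix M's is mostly bookkeeping. A minimal sketch (the item labels and the responses below are invented for illustration) converts the SD-SA choices from question 9 to a 1-5 score and averages each item:

    # Assumed numeric coding of the five rating choices in question 9.
    SCORE = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}

    # Hypothetical circled answers collected from five participants.
    responses = {
        "Presented understandably": ["A", "SA", "A", "N", "SA"],
        "Instructors effective":    ["SA", "SA", "A", "A", "SA"],
        "Enough hands-on practice": ["N", "A", "D", "A", "N"],
    }

    for item, answers in responses.items():
        mean = sum(SCORE[a] for a in answers) / len(answers)
        print(f"{item}: mean {mean:.1f} of 5 (n={len(answers)})")

Low-scoring items (here, the hands-on practice item) point directly at program improvements, the purpose of the process evaluation described in Stage 4.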
Appendix N: Sample Ways to Measure Program Outcomes

What outcome will we measure, and how will we measure it?

• Infrastructure improvements as designated by each clinic (e.g., connectivity)
  Measure: functional testing
• Collaborative efforts to continue Internet connectivity
  Measures: journal of contacts made; written agreements
• Implementation of activities
  Measure: log of activities scheduled and conducted
• Appeal of clinic facility to medical students for rotation
  Measure: medical school criteria for a student rotation site
• Participation in outreach activities
  Measures: tally of outreach activities; attendance counts
• Development of onsite personnel as liaison or technical support
  Measures: feedback from site and outreach staff; interview with site liaison
• Intention to use Internet resources
  Measure: baseline and comparison measure before and after outreach
• Feelings about the value of online resources
  Measure: baseline and comparison measure regarding attitude
• Numbers of Loansome Doc requests
  Measure: baseline and follow-up data on numbers of Loansome Doc requests
• Continued use of Internet resources
  Measure: follow-up measures of use
• Value or usefulness of information obtained
  Measure: follow-up measures about satisfaction with results
• Impact on actions or decisions
  Measure: follow-up measures about how information was used

Appendix O: Sample Measures of Behavior Outcomes

Knowledge
1. To log onto PubMed, I need special software. True / False
2. To use PubMed, I must be connected with a university. True / False
3. PubMed is only for health care professionals. True / False

Attitudes
1. Compared to other Internet sources for health information, PubMed is:
1  2  3  4  5  6  7   (1 = Not Beneficial, 7 = Beneficial)
2. PubMed is an essential tool for my work:
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

Intentions
1. I intend to use PubMed weekly.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)
2. If I need an answer to a clinical problem, I intend to consult PubMed.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

Behaviors
1. I use PubMed weekly.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)
2. If I need an answer to a clinical problem, I consult PubMed.
1  2  3  4  5  6  7   (1 = Strongly Disagree, 7 = Strongly Agree)

Appendix P: Bibliography

Adams MS. Evaluation. In: Branch K, Dusenbury C, eds. Sourcebook for bibliographic instruction. Chicago: Association of College and Research Libraries, 1993:89.
Anderson JG, Aydin CE, Jay SJ. Evaluating health care information systems: methods and applications. Thousand Oaks: Sage Publications, 1994.
Arkin EB. Making health communications programs work: a planner's guide. Bethesda, MD: U.S. Dept. of Health and Human Services, Public Health Service, National Institutes of Health, Office of Cancer Communications, National Cancer Institute, NIH Publication No. 92-1493, 1992.
Ash J. Factors affecting the diffusion of online end user literature searching. Bulletin of the Medical Library Association 1999;87(1):58-66.
Awani A. Project management techniques. New York: McGraw-Hill, 1983.
Baird L, Meakin F, Bailey M, Shipman J. Assessing the information needs of health professionals: an annotated bibliography. Baltimore, MD: National Network of Libraries of Medicine, Southeastern/Atlantic Region, University of Maryland at Baltimore, 1991.
Baldwin GD. Planning and evaluating information outreach among minority communities: model development based on Native Americans in the Pacific Northwest. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/baldwin.html, 1998.
Bandura A. Self-efficacy: the exercise of control. New York: W.H. Freeman and Co., 1997.
Berg BL. Qualitative research methods for the social sciences. Boston: Allyn and Bacon, 1995.
Biblarz D, Bosch S, Sugnet C. Guide to library user needs assessment for integrated information resource management and collection development. American Library Association, prepress.
Bowden V, Kromer M, Tobia R. Assessment of physicians' information needs in five Texas counties. Bulletin of the Medical Library Association 1990;82(2):189-96.
Bowes JE. Communication and community development for health information: constructs and models for evaluation. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/bowes/, 1998.
Bracht N. Health promotion at the community level. Newbury Park, CA: Sage Publications, 1990.
Bradigan PS, Mularski CA. End-user searching in a medical school curriculum: an evaluated modular approach. Bulletin of the Medical Library Association 1989;77(4):348-353.
Burnham J, Perry M. Promotion of health information access via Grateful Med and Loansome Doc: why isn't it working? Bulletin of the Medical Library Association 1996;84(4):498-506.
Campbell DT, Stanley JC. Experimental and quasi-experimental designs for research. Chicago: R. McNally, 1963.
Carlson B. HITECS! Health Information Technology Enterprise Community Students. MUSCLS Newsletter: Medical University of South Carolina Library Systems and Services 1998;9(2):1.
Cassel JC. Community diagnosis. In: Omran AR, ed. Community Medicine in Developing Countries. New York: Springer Publishing, 1974:18.
Chimoskey S, Norris T. Use of MEDLINE by rural physicians in Washington state. Journal of the American Medical Informatics Association 1999;6(4):332-3.
Clark NM, McLeroy KR. Creating capacity through health education: what we know and what we don't. Health Education Quarterly 1995;22(3):273-289.
Cohen J. Who's out there and what do they want? Public & Access Services Quarterly 1995;1(3):49-54.
Cork RD, Detmer WM, Friedman CP. Development and initial validation of an instrument to measure physicians' use of, knowledge about, and attitudes toward computers. Journal of the American Medical Informatics Association 1998;5:164-176.
Debowski S, Wood R, Bandura A. Impact of guided mastery and enactive exploration on self-regulatory mechanisms and knowledge construction through electronic inquiry, in press.
DiClemente CC, Prochaska JO. Processes and stages of change: coping and competence in smoking behavior change. In: Shiffman S, Wills TA, eds. Coping and substance abuse. San Diego: Academic Press, 1985:319-334.
Dignan MB, Carr PA. Program planning for health education and promotion. Philadelphia: Lea & Febiger, 1992:164.
Dillman DA. Mail and telephone surveys: the total design method. New York: Wiley, 1978.
Dimitrof A. Survey design. Continuing Education Course, Medical Library Association, May 1997.
Dorsch J. Equalizing rural health professionals' information access: lessons from a follow-up outreach project. Bulletin of the Medical Library Association 1997;85(1):39-47.
Dorsch J, Pifalo V. Information needs of rural health professionals: a retrospective use study. Bulletin of the Medical Library Association 1997;85(4):341-7.
Fitz-Gibbon CT, Morris LL. How to design a program evaluation. Newbury Park: Sage Publications, 1987. (Fitz-Gibbon CT, Morris LL, eds. Program Evaluation Kit; vol 3).
Fitz-Gibbon CT, Morris LL. How to analyze data. Newbury Park, CA: Sage Publications, 1987. (Herman JL, ed. Program Evaluation Kit; vol 8).
Friedman CP, Wyatt JC. Evaluation methods in medical informatics.
New York: Springer-Verlag, 1997.
Glanz K, Rimer BK. Theory at a glance: a guide for health promotion practice. http://rex.nci.nih.gov/NCI%5FPub%5FInterface/Theory%5Fat%5Fglance/HOME.html. U.S. Public Health Service: National Institutes of Health, September 1997:17.
Glitz B. Focus groups for libraries and librarians. New York: Forbes, 1998.
Gorman PN, Helfand M. Information seeking in primary care: how physicians choose which clinical questions to pursue and which to leave unanswered. Medical Decision Making 1995;15:113-119.
Green L. Program planning and evaluation guide for lung associations. New York: American Lung Association, 1987.
Hafner AW. Descriptive statistical techniques for librarians. (2nd ed.) Chicago: American Library Association, 1998.
Haselkorn MP. Technical communication: perspectives on planning and evaluating information outreach. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/haselkom.html, 1997.
Herman JL. Evaluator's handbook. (2nd ed.) Los Angeles: Center for the Study of Evaluation, 1987. (Program Evaluation Kit; vol 1).
Herman JL. Program evaluation kit, vol 1-9. (2nd ed.) Newbury Park, CA: Sage Publications, 1987.
Hernon P, McClure CR. Evaluation and library decision making. Norwood, NJ: Ablex Publishing Corporation, 1990.
Isaac S, Michael WB. Handbook in research and evaluation: a collection of principles, methods, and strategies useful in the planning, design, and evaluation of studies in education and the behavioral sciences. (3rd ed.) San Diego, CA: EdITS Publishers, 1995.
Israel BA, Cummings KM, Dignan MB, et al. Evaluation of health education programs: current assessment and future directions. Health Education Quarterly 1995;22(3):364-389.
Jenkins C, McDuffee D, Mayer J. Information services to off-campus ambulatory care training sites: a prototype. Chapel Hill, NC: Health Sciences Library, University of North Carolina, 1994.
Kanouse DE, Kallich JD, Kahan JP. Dissemination of effectiveness and outcomes research. Health Policy 1995;34:167-192.
Kerr ST. Identifying and applying best practices in educational technology to information outreach efforts for medical personnel. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/kerr.html, 1997.
King JA, Morris LL, Fitz-Gibbon CT. How to assess program implementation. (2nd ed.) Newbury Park: Sage Publications, 1987. (Herman JL, ed. The Program Evaluation Kit; vol 3).
Klein MS, Ross FV, Adams DL, Gilbert CM. Effect of online literature searching on length of stay and patient care costs. Academic Medicine 1994;69(6):489-495.
Marshall JG. Evaluation instruments for health sciences libraries. Chicago: Medical Library Association (MLA DocKit #2), 1990.
Marshall JG. Using evaluation research methods to improve quality. Health Libraries Review 1995;12:159-172.
Marshall JG. A review of health sciences library outreach and evaluation. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/marshall.html, 1997.
McKenzie JF, Smeltzer JL. Planning, implementing, and evaluating health promotion programs: a primer. Boston: Allyn and Bacon, 1997.
McKillip J. Need analysis: tools for the human services and education. Newbury Park, CA: Sage Publications, 1987. (Bickman L, ed. Applied Social Research Method Series; vol 10).
Minkler M.
Improving health through community organization. In: Glanz K, Lewis FM, Rimer BK, eds. Health behavior and health education: theory, research, and practice. San Francisco: Jossey-Bass Publishers, 1990.
Mullaly-Quijas P, Ward DH, Woelfl N. Using focus groups to discover health professionals' information needs: a regional marketing study. Bulletin of the Medical Library Association 1994;82(3):305-11.
Nyswander D. The open society: its implications for health educators. Health Education Monographs 1966;1:3-13.
Orlandi MA. Cultural competence for evaluators: a guide for alcohol and other drug abuse prevention practitioners working with ethnic/racial communities. Rockville, MD: U.S. Dept. of Health and Human Services, Public Health Service, Alcohol, Drug Abuse, and Mental Health Administration, Office for Substance Abuse Prevention, Division of Community Prevention and Training; distributed by OSAP's National Clearinghouse for Alcohol and Drug Information, 1992.
Osheroff JA, Bankowitz RA. Physicians' use of computer software in answering clinical questions. Bulletin of the Medical Library Association 1993;81(1):11-19.
Pancer SM, Westhues A. Developmental stage approach to program planning and evaluation. Evaluation Review 1989;13(1):56-77.
Patton MQ. Utilization-focused evaluation. Beverly Hills: Sage Publications, 1978.
Patton MQ. How to use qualitative methods in evaluation. Newbury Park, CA: Sage Publications, Inc., 1987. (Program Evaluation Kit; vol 4).
Phillips L. Washington State rural health databook. Olympia, WA: Washington State Department of Health, 1997.
Powell RR. Basic research methods for librarians. Norwood, NJ: Ablex Publishing Corp., 1991.
Rambo N, Dunham P. Information needs and uses of the public health workforce—Washington, 1997-1998. MMWR Weekly 2000;49(6):118-120.
Reisman J. A field guide to outcome-based program evaluation. Seattle: Organizational Research Services, Inc., 1994.
Rogers EM. Diffusion of innovations. Glencoe, IL: The Free Press, 1962.
Rogers EM. Communication of innovations. (2nd ed.) New York: The Free Press, 1971.
Rogers EM, Scott KL. The diffusion of innovations model and outreach from the National Network of Libraries of Medicine to Native American communities. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/rogers.html, 1997.
Rossi PH, Freeman HE. Evaluation: a systematic approach. Beverly Hills, CA: Sage Publications, 1989.
Shonrock DD. Evaluating library instruction: sample questions, forms and strategies for practical use. Chicago: American Library Association, 1996.
Siegel E, Rapp B, Lindberg D. Evaluating the impact of MEDLINE using the critical incident technique. Proceedings of the annual symposium on computer applications in medical care 1991:83-87.
Soriano FI. Conducting needs assessments: a multidisciplinary approach. Thousand Oaks: Sage Publications, 1995.
Steckler A, Allegrante JP, Altman D, et al. Health education intervention strategies: recommendations for future research. Health Education Quarterly 1995;22(3):307-328.
Thompson B, Kinne S. Health promotion at the community level. In: Bracht N, ed. Newbury Park: Sage, 1990:45-65.
Udinsky BF, Osterlind SJ, Lynch SW. Evaluation resource handbook: gathering, analyzing, reporting data. San Diego: EdITS, 1981.
Walton L. Outreach: just do it. 3 Sources: Newsletter of the NN/LM, GMR 1996;14(6):5.
Witte K.
Putting the fear back into fear appeals: the extended parallel process model. Communication Monographs 1992a;59:329-349.
Witte K. Theory-based interventions and evaluation of outreach efforts. Seattle, WA: National Network of Libraries of Medicine, Pacific Northwest Region Web site, http://www.nnlm.nlm.nih.gov/pnr/eval/witte.html, 1998.

Index

Locators for figures are in italic. Locators for appendices are in boldface.

A
Accountability, 45
Activities
  categories of, 23
  planning of, 23-37
Alternative hypothesis, 65
Analysis of data. See Data analysis
Analysis of variance (ANOVA), 65
Audience assessment, H1-H3
  conducting, 32-34
  in implementation plan, 32-33
  theory-based variables in, 33
  vs. community assessment, 2

B
Bandura, Albert, 24
Behavioral change
  behavioral capacity in, 24
  community organization in, 30-31, 31
  diffusion of innovations theory, 29, 29-30, E1
  extended parallel process model (EPPM), 26-28
  factors in, 23-24
  learning theories, 23-30
  observational learning, 25-26
  outcome expectations, 24
  sample measures of, G1-G2
  self-efficacy in, 24-25, 25, F1-F2
  social learning theory, 24, 24-26
  stages of change model, 28, 28-29
  theory-based variables, 33
Behavioral objectives, 18, D2-D3
Bivariate analysis, 65

C
Chi square, 65
Community assessment
  case example, 14
  conducting, 1-8
  data collection, 5-7
  defined, 2
  readings, 10-11
  references, 9
  target community, 2-3
  user input, obtaining, 5-7
  utilization of results, 7-8
  vs. audience assessment, 2
Community organization, 30-31, 31
Competence, community, 30
Correlation, 65
Critical consciousness, 30
Cultural appropriateness in data collection, 62-63

D
Data analysis, 63-65
  analysis of variance (ANOVA), 65
  bivariate, 65
  chi square, 65
  coding, 63
  correlation, 65
  F-test, 64-65
  inferential, 65
  multivariate, 65
  quality control, 63
  software, 64
  statistics in, 64-65
  T-test, 64-65
  types of, 63-65
  univariate, 65
  variables in, 64
Data collection, 59-65
  for audience assessment, 32
  case study, 69-70
  for community assessment, 5-7
  cultural appropriateness in, 62-63
  extensive vs. intensive, 5-7, 48
  focus groups, 6-7
  instrument design, 6, 62
  methods, 5-7, 60, 61
  open-ended questions, 5
  outcome measurement workform, 68
  planning, 44
  process measurement workform, 67
  for program evaluation, 44
  qualitative methods, 5, 48-49
  quality control, 60-62
  quantitative methods, 5, 48-49
  questionnaires, 6, 12-13
  readings, 66
  references, 66
  reliability in, 62
  sampling in, 5, C1-C3
  stakeholder interviews, 6
  testing of instrument, 62
  validity in, 62
Descriptive data, 65
Diffusion of innovations theory, 29, 29-30, E1

E
Educational objectives, 18, D1-D2
Empowerment, 30
Environmental objectives, 18, D2-D3
EPPM. See Extended parallel process model
Evaluation
  benefits of, vii-viii
  budget, 52
  components, 43
  criteria, 59
  data analysis, 63-65
  design of, 49-52, 50
    experimental, 51
    non-experimental, 52
    quasi-experimental, 51-52
    randomization, 51
    time factors, 49-50
  extent of, 52
  flow chart, vi, 43
  issues in, 44
  methods, 47-49
  objectives, 44-47
  outcome evaluation, 47
  planning, 43-54, 57-58
  process evaluation
    methods, L1
    objectives sample, K1-K5
    objectives workform, 55-56
    planning, 45-47
  readings, 54
  references, 54
Executive summary, 72
Exit questionnaires, sample, M1-M2
Extended parallel process model (EPPM), 26-28
  definitions, 27
  outreach messages using, 27
  evaluation, use in, 46

F
F-test, 64-65
Fear appeal theory. See Extended parallel process model
Focus groups, 6-7, 7

G
Goals
  case example, 22
  defined, 15
  development of,
H
Hypothesis, 65

I
Implementation plans, 31-32
Inferential data analysis, 65
Information access, 1, 23
Innovation, diffusion of, 29-30
Instrumentation, in data collection, 62
Interviews, 6, 60

M
Mean, arithmetic, 65
Median, 65
Meetings, for data collection, 60
Mode, 65
Multiple regression, 65
Multivariate analysis, 65

N
Needs assessment. See Audience assessment
Null hypothesis, 65

O
Objectives
    case example, 22
    development of, 15-19
    elements in, 18-19
    indicators, 16, 17, 59
    outcomes in, 15-16, 17
    references, 20
    samples, D1-D3
    types of, 18
    workform, 21
Observation, 60
Observational learning, 25-26
Outcomes
    behavior, measures for, O1
    evaluation of, 47
    expectations in, 24
    program, measures for, N1
Outreach programs
    activities, 1, 23
    assumptions concerning, vii
    behavior theories in, viii-ix, 24-34
    evaluation, vi, viii, 43-54
    goals, generally, vii
    librarian contact in, 23
    objectives, 1-2 (See also Objectives)
    planning, vi, viii, 23-34
    reporting results, 71-73
    theory-based strategies, 24, 37
    success factors, 23

P
Planning
    activities and strategies, 23-34
    audience assessment, 32-34
    case example, 40-41
    implementation, 31-32
    readings, 36
    references, 35
    sample outline, I1-I3
    task list sample, J1
    theory-based, 31
    workform, 38-39
Process evaluation, 45-47, K1-K5
Process evaluation objectives, 45-47
Process objectives, 18, D1
Program development, vii, 43
Program improvement, 45-46
Program objectives, 18, D3

Q
Quality control, 61, 63
Questionnaires
    for data collection, 6, 33-34, 60, 75
    development of, 12-13
    sample question formats, B1-B2
    sample surveys, A1-A4, H1-H3, M1-M2
Questions, open-ended, 5

R
Randomization, 51, C2
Records, in data collection, 60
Reliability, 62
Replication, 46-47
Reports, 71-73
    case example, 75
    dissemination of, 73
    preparation of, 72
    structure of, 72-73
Results, utilization, 71
Rogers, Everett, 29

S
Sampling
    in data collection, 5
    design of, C1-C3
    randomization in, C2
    response rate, C3
    sample size, C2-C3
    types of, C1-C2
Self-efficacy, 24-25, 25, 26, F1-F2
Social learning theory, 24, 24-26
Software, for data analysis, 64
Stages of change model, 28, 28-29
Stakeholders, 6, 44
Statistics, 64-65
Summative objectives, 47
Surveys, 60, A1-A4. See also Questionnaires; Exit questionnaires, sample

T
T-test, 64-65
Target community, identification of, 2-3

U
Univariate analysis, 65

V
Validity, 62
Variables, 46, 64

Comment Form

Your comments or suggestions will help us evaluate and improve this publication. Please take a few minutes to complete and mail this response form.

1. How much of this book did you read?
   ________ All of it
   ________ Some of it
   ________ Did not read it

2. If you read some of the guide, please mark the parts that you read:
   ________ Introduction
   ________ Stage 1: Conducting a Community Assessment
   ________ Stage 2: Developing Goals and Objectives
   ________ Stage 3: Planning Activities and Strategies
   ________ Stage 4: Planning Evaluation
   ________ Stage 5: Gathering Data and Assessing Results
   ________ Stage 6: Utilizing and Reporting Results
   ________ Tool Kits

3. Did you find the guide to be
   ________ very useful
   ________ somewhat useful
   ________ not useful?

4. Please circle those sections listed above that you found most useful.

5. How have you used this guide? (check as many as apply)
   ________ For background and reference
   ________ As a planning and evaluation tool for a specific project
   ________ As a curriculum resource
   ________ For staff development
   ________ Other ____________________________________________
6. Your job title? ____________________________________________

7. Affiliation or employer? ____________________________________________

8. How might this guide be improved?

Thank you! Please send to:

"Measuring the Difference"
NN/LM, PNR
Box 357155
University of Washington
Seattle, Washington 98195-7155