2002 Annual Meeting on Successful Statewide
Roundtable Topic: Systematic evaluation of statewide EHDI programs

Summary

Participants: Tracy Jebo, Amy Fass, Terri Mohren, Jayesh Shah, Cynthia Gaudin, Angie Mister, Vitalian Adjei, Scott Grosse, Dave Ross, Roy Ing, Gloria Reyes, Lisa Payne

CDC is preparing a document to be circulated to EHDI programs that outlines the planning and evaluation process. It was emphasized that evaluation should be part of the planning process from the beginning, not something done after the fact. The planning process starts with goals, with the national EHDI goals as a starting point. Objectives are specific statements of measurable outcomes with timetables. An easy way to remember this is that objectives should be S.M.A.R.T.: Specific, Measurable, Achievable, Realistic, and Timeframed.

The handout described the CDC framework for public health program evaluation, available on the CDC website. The first two steps are engaging stakeholders and describing the program. Examples of important groups of stakeholders identified by roundtable participants included hospital staff responsible for managing screening and health department staff from neighboring states to deal with out-of-state births. It is essential to build relationships with these stakeholders, with a two-way flow of information: if one wants to obtain data for evaluation, it is important to provide feedback. Describing the program can be broken down into three components:
The planning process needs to include identification of data sources for the information needed to measure outcomes. This can include special surveys or focus groups of parents to determine their perceptions and experiences with the screening, diagnostic, and intervention processes and to identify barriers to timely diagnosis and entry to intervention. Part of the program description consists of laying out a 'logic model' of the structure, process, and outcomes. The logic model will help to determine which questions need to be answered in an evaluation of the program.

The participant from Georgia concurred, stating that in his experience a good evaluation process begins with a flowchart of causes and effects, which determines the data elements needed for the database. Texas has an implementation protocol that defines specific objectives for hospitals to meet in order to be certified, including a 90% test rate and a 95% pass rate. Virginia is in the process of developing a logic model, evaluating hospital screening data and providing feedback, building relationships with hospital staff, and developing patient and audiologist surveys.