Ireys H, Zickafoose J, Brach C, Bergofsky L, Devers K, Burton R, Simpson L, Albritton E
Abstract
Within the broad mandate of the CHIPRA legislation, demonstration States pursued a variety of activities, projects, and approaches. This summary, which draws from products produced throughout the evaluation, highlights program objectives, the strategies States used, and the lessons learned about: 1) reporting and using the core set of quality measures for children; 2) transforming service delivery to promote quality of care; 3) improving service systems for youth with serious emotional disorders; 4) applying health information technology (IT) for quality improvement; 5) building partnerships to improve quality of children’s healthcare; and 6) using Federal grants to build intellectual capital at the State level.
Insights/Results
Key takeaways/implications
Evaluation methods: the national evaluation team used qualitative and quantitative data about the demonstration projects, including information from reports, interviews, claims data, and surveys
Insight: to make progress in transforming service delivery systems, States need a combination of strategies (e.g., learning collaboratives, direct facilitation of practice-level changes, payments to practices to support staff time for QI efforts)
Lessons learned for transforming service delivery: 1) Learning collaboratives are useful when structured by provider input; 2) Most practices lacked the technical competencies to gather data needed to track QI efforts; 3) School based health centers (SBHCs) may have limited experience engaging youth in discussions about their own health and healthcare; 4) Developing sustainable methods for engaging families and youth is challenging; and 5) Allowing practices to hire their care coordinators directly better supported integration of these staff members in daily operations
Lessons learned from improving systems for youth with serious emotional disorders and their families: 1) Broad stakeholder involvement is critical for coordination and curriculum development; 2) Assistance from experienced consultants can help States understand the array of options for designing their CMEs; and 3) Analyzing data on service use, cost and eligibility from multiple agencies can help inform CME design decisions
Lessons learned from using the Child Core Set of Quality Measures: 1) Reporting capacity was influenced by Medicaid data availability, technical expertise, past experience with quality measurement, demand for measures, and lack of provider EHR infrastructure; 2) Stakeholders value State reports on the performance of health plans and child-serving practices; and 3) While the majority of child physicians believe quality reports are effective for QI, only one-third use them in their QI activities
Lessons learned from applying health IT for QI: 1) This process took far longer and required more resources than anticipated; 2) Differences in EHR functionality, system incompatibility, and poor internet connections for SBHCs created challenges; 3) Implementing the EHR Format for Children into current EHRs will be difficult; and 4) Developing electronic screening methods helped identify health risks and track quality
Lessons learned from building cross-State partnerships to improve quality of children’s healthcare: States in partnerships can: 1) Share tools, training resources, and expertise; 2) Leverage several funding sources; 3) Apply lessons learned to avoid repeated mistakes and improve quality; and 4) Expand the spread and potential impact of the project