MEDICAID POLICY BRIEF
Brief 22 • August 2015

Assessing the Usability of Encounter Data for Enrollees in Comprehensive Managed Care, 2010–2011

Vivian L.H. Byrd and Allison Hedley Dodd

As growing numbers of Medicaid enrollees receive health benefits through comprehensive managed care (CMC), researchers and policymakers seeking to understand the service use of these enrollees must rely on encounter data that states receive from managed care plans. However, not all states report the encounter data submitted by their plans to the Medicaid Statistical Information System (MSIS), and, until recently, little was known about the data's usability for research. This issue brief discusses the availability, completeness, and quality of encounter data for physician, clinic, and outpatient services (OT); inpatient hospital services (IP); and prescription drug services (RX) in the Medicaid Analytic eXtract (MAX) data, which are derived from MSIS. Knowing this information can help researchers and policymakers judge the usability of the 2010 and 2011 encounter data in MAX.

About This Series
The MAX Medicaid policy issue brief series highlights the essential role MAX data can play in analyzing the Medicaid program. MAX is a set of annual, person-level data files on Medicaid eligibility, service utilization, and payments that are derived from state reporting of Medicaid eligibility and claims data into the Medicaid Statistical Information System (MSIS). MAX is an enhanced, research-friendly version of MSIS that includes final adjudicated claims based on the date of service and data that have undergone additional quality checks and corrections. CMS produces MAX specifically for research purposes. For more information about MAX, please visit: https://www.cms.gov/Research-Statistics-Data-and-Systems/ComputerData-and-Systems/MedicaidDataSourcesGenInfo/MAXGeneralInformation.html.
Introduction

The percentage of full-benefit Medicaid enrollees in CMC grew steadily—from 41 to 54 percent—between 2004 and 2010 (Borck et al. 2014). In CMC arrangements, states contract with health maintenance organizations (HMOs)/health insuring organizations (HIOs) to deliver comprehensive services to Medicaid enrollees and pay capitation payments, a set fee to cover an enrollee. States fairly consistently report enrollment in managed care plans and the capitation payments they make to HMOs/HIOs. However, capitation claims, unlike fee-for-service (FFS) claims, contain no information about service use.1 Instead, service use is captured through encounter data, managed care records that contain information on utilization but not Medicaid expenditures. Encounter data, unlike FFS data, do not undergo extensive quality and validation checks either in MSIS or in the production of MAX.

By 2010, more than half of all full-benefit Medicaid enrollees were in CMC plans.2 With the arrival of the Affordable Care Act of 2010, the percentage of nondisabled adult enrollees is expected to climb as states eliminate the categorical requirements for Medicaid eligibility (such as pregnancy and dependent children) and expand eligibility to everyone below 138 percent of the federal poverty level (Kaiser Family Foundation 2012). Given that most full-benefit adult enrollees are in CMC (65 percent in 2010), researchers and policymakers will need to rely on encounter data—records that contain information on service use but not on expenditures—to assess service use in this population.

This brief discusses the availability, completeness, and quality of the encounter data in MAX 2010 and 2011, expanding on a previous brief that discussed MAX encounter data from 2007 to 2009 (Byrd and Dodd 2012a). Our analysis focuses on the encounter data submitted in the MAX 2010 and 2011 physician, outpatient, and clinic services (OT) files; inpatient hospital services (IP) files; and prescription drug services (RX) files, highlighting trends in the availability and usability of these data.3

Methods

MAX is designed to enable research on Medicaid enrollment, service use, and expenditures by calendar year. Analysis by calendar year is particularly important with encounter data because some states that submit these data to MSIS do not do so every quarter (Byrd et al. 2011). We limited our analysis to managed care HMO/HIO plans that are fully capitated (comprehensive) because they cover the widest range of services and are thought to have the highest quality and most complete encounter data.4 The goal of our analysis was not to validate the encounter data but rather to assess their availability, completeness, and quality. To be usable, data needed to be of comparable completeness and quality to FFS data.

Our goal was to assess the availability and usability of encounter data from states with a notable portion of their full-benefit, nondual Medicaid population enrolled in CMC. We excluded enrollees with dual eligibility—that is, people eligible for both Medicaid and Medicare—because many of their services are covered by Medicare, and thus they have less encounter data than nondual enrollees (Young et al. 2012). We considered a state to have CMC if at least 1 percent of its full-benefit, nondual enrollees participated in CMC at some point during the year.

The mix of Medicaid populations enrolled in CMC programs varies widely by state. Many states rolled out CMC to child and adult enrollees first, but fewer states have enrolled people eligible on the basis of age or disability, where service use is often higher. Due to the differences in eligibility criteria and benefit packages for CMC programs, as well as potential differences in service use, we analyzed data using each enrollee's basis-of-eligibility (BOE) classification in Medicaid—adult, child, disabled, or aged. Because states with low enrollment in managed care are less likely to devote resources to producing high-quality encounter data, we only analyzed data for a particular BOE group if at least 10 percent of full-benefit Medicaid enrollees in that group were enrolled in an HMO/HIO plan. We also limited our analysis to BOE groups that had at least 200 encounter records, as assessments based on a small number of records could skew our estimates.

We analyzed several types of services to gauge the completeness and quality of the MAX claims files. The OT files contain up to 22 types of services, IP contains up to 4, and RX up to 2. Of these, we chose to focus on the following:

• For the OT analysis, we chose physician (type of service = 08), outpatient hospital (type of service = 11), and clinic (type of service = 12) services because they are frequently used by Medicaid enrollees and are covered by Medicaid in all states. Managed care plans are also accustomed to collecting and reporting these data for Healthcare Effectiveness Data and Information Set (HEDIS) measures. Analyzing these services separately and together yielded similar results, so the services are presented together in this brief.

• Inpatient hospital (type of service = 01) services were chosen from the IP file because they represent the vast majority of services provided in the inpatient setting.

• Prescribed drugs (type of service = 16) were chosen from the RX file; durable medical equipment was excluded.

Because MAX data are derived from the MSIS data that states submit, data for some states are available before others. At the time of our analysis, MAX 2010 data were available for all states. However, MAX 2011 did not include data from Arizona, Colorado, the District of Columbia, Hawaii, Idaho, and Louisiana because the corresponding MSIS files were unavailable or had significant data problems. Of these six states, Arizona, Colorado, the District of Columbia, and Hawaii enroll more than 1 percent of their Medicaid population in CMC, and almost all Medicaid recipients in Arizona and Hawaii were enrolled in CMC in 2011.

Metrics

Table 1 shows the measures we used to assess completeness and quality. We calculated a value for each measure using encounter data and compared it to a reference range created from the FFS data. Because managed care coverage varies by state and type of enrollee, we evaluated the completeness and quality measures for OT, IP, and RX data separately for each BOE category, for each state, and in each year.

Table 1. Data elements and reference ranges used to analyze Medicaid encounter data in MAX 2011, by BOE category
(Reference ranges for MAX 2011 are listed in the order: Adult; Child; Disabled; Aged.)

Physician, Clinic, and Outpatient Visits (OT)
  Completeness measures
    Average number of OT encounter records per enrollee: 1.66–13.08; 1.40–10.42; 7.55–30.78; 0–22.18
    Percentage of enrollees with OT encounter records: 38.73–91.91; 42.86–93.38; 41.94–100; 5.79–89.02
  Quality measures
    Percentage of OT encounter records with place of service code: 83.44–100; 78.16–100; 82.08–100; 83.32–100
    Percentage of OT encounter records with primary diagnosis code: 89.61–100; 82.37–100; 90.79–100; 91.98–100
    Percentage of OT encounter records with a primary diagnosis code length greater than three characters: 85.68–100; 78.15–100; 86.46–100; 87.41–100
    Percentage of OT encounter records with procedure (service) code: 70.19–100; 79.01–100; 79.98–100; 84.93–100
    Percentage of OT encounter records with a procedure code in CPT-4 or HCPCS format: 62.44–100; 69.43–100; 72.09–100; 80.82–100

Inpatient Hospital (IP)
  Completeness measures
    Average number of IP encounter records per enrollee: 0–0.36; 0.03–0.12; 0.09–0.56; 0.01–0.40
    Percentage of enrollees with IP encounter records: 0–30.39; 2.24–10.69; 5.88–27.07; 1.53–22.96
  Quality measures
    Average length of stay: 2.54–5.74; 3.70–6.43; 5.37–10.65; 4.82–10.44
    Average number of diagnosis codes: 2.56–7.55; 1.97–4.77; 3.73–10.07; 3.76–11.00
    Percentage of IP records with procedure codes: 53.51–96.96; 23.89–74.51; 30.70–74.00; 29.85–75.62
    Percentage of IP records with UB accommodation codes: ≥90; ≥90; ≥90; ≥90

Prescription Drugs (RX)
  Completeness measures
    Average number of RX encounter records per enrollee: 1.34–14.25; 0.71–7.81; 13.62–50.94; 0–53.23
    Percentage of enrollees with RX encounter records: 29.22–87.23; 33.05–78.11; 51.46–97.51; 16.68–89.69
  Quality measures
    Percentage of RX records with date prescribed: ≥90; ≥90; ≥90; ≥90
    Percentage of RX records with quantity: ≥90; ≥90; ≥90; ≥90

Source: Mathematica's analysis of MAX 2011 data.
Note: UB = uniform billing; CPT-4 = Current Procedural Terminology version 4; HCPCS = Healthcare Common Procedure Coding System.

To create comparison metrics, we used data for FFS participants who were similar to our "encounter data" population: that is, we looked at the full-benefit, nondual FFS population across all states with substantial FFS participation in 2010 and 2011. We then examined the completeness and quality of the FFS data, calculating an average value and standard deviation for each metric in each BOE category. We used the average FFS value as the midpoint of the reference range, and we set the top of the reference range at two standard deviations above the FFS average and the bottom at two standard deviations below the FFS average. This approach approximates the construction of confidence intervals typically used in statistical analysis. We considered the FFS reference range to be the acceptable range of values for each year's encounter data for that metric.

To judge the completeness of the data, we examined each file type (OT, IP, and RX) using two measures that captured the volume of encounter data: (1) the average number of encounter records per person and (2) the percentage of enrollees with encounter records.

To evaluate quality, we created metrics for each file type that assessed data elements in the records. For the OT files, we first selected two data elements to examine: diagnosis codes and procedure codes. We then chose two quality measures to assess each element: one measure indicated whether the data element was filled, and the second indicated the format of the data. We expected many of the diagnosis codes to be filled because few OT claims are paid without a diagnosis code. To determine whether the diagnosis codes in encounter records had a level of specificity comparable to those reported in FFS claims, we evaluated the length of the code; the more characters it had (beyond the three requisite characters), the more specific the diagnosis. Similarly, we expected many of the procedure code elements to be filled, but the heavy reliance of some states on state-specific procedure codes made a national analysis more complicated. We also examined whether the reported procedure codes were in a standard national format (Current Procedural Terminology version 4 (CPT-4) or Healthcare Common Procedure Coding System (HCPCS)). Lastly, we evaluated what percentage of records had a valid place of service code comparable to those reported in FFS claims.

To assess the quality of the IP files, we looked at four data elements (length of stay, diagnosis codes, procedure codes, and Uniform Billing (UB) accommodation codes) that are scrutinized during the quality and validation checks for FFS claims. For the RX file, we created a quality measure for each of the two data elements (date prescribed and quantity) that we expect to see routinely filled on FFS claims.

For certain measures that assessed whether a data element was provided, state values were highly skewed: typically they were either close to 100 percent or close to 0 percent for both FFS and encounter data. Rather than use the reference range based on the average value, we defined a "good" value as 90 percent or greater for these measures.

For each BOE category that met the analysis criteria, we compared the state's value to the FFS reference range constructed for the same year to determine whether it fell within that acceptable range. The ranges for 2011 are shown in Table 1 (2010 ranges are not shown).5 A state's encounter data did not have to meet all completeness and quality measures to be considered usable. For the OT, IP, and RX data, we defined "complete" as having values within the acceptable range for at least one of the two completeness metrics for that data type. For example, if there was a high enough percentage of enrollees with encounter records, but the average number of records per enrollee was too low, the state's data for that BOE were still considered complete.

To meet our quality standard, the OT data had to satisfy at least four of the five quality measures, the IP data had to satisfy at least three of the four quality measures, and the RX data had to satisfy at least one of the two quality measures. A BOE category within a state was considered to have usable data if the encounter data for that BOE met both the "complete" and "comparable quality" criteria.

Findings

Since 2007, there has been a notable increase in the number of states that met our CMC threshold for the disabled and aged BOE groups (Byrd and Dodd 2012a). This is consistent with the shifting Medicaid landscape, in which more states are bringing more of their traditional FFS Medicaid populations into CMC. Also of note is that, for the first time, encounter data for Massachusetts and Ohio appeared in the MAX data for services delivered in 2011. Although these two states did not meet all of our thresholds of usability, they do have many Medicaid enrollees in managed care, and it is encouraging that they have begun to submit their encounter data to CMS.

We saw a continued increase in the availability of data as well (Byrd and Dodd 2012a). Of the states that met our CMC enrollment threshold, the number that submitted more than 200 encounter records stayed the same or increased over the two years for each of the BOE groups. The vast majority of states that met at least one completeness measure met both of them. Data quality within states did not uniformly improve, however, which is consistent with what states have reported to Mathematica through other contracts that are providing technical assistance to the states. Although some states have seen data quality improvements, others have had mixed experiences due to flaws in internal system processing, such as converting to a new Medicaid management information system, as well as problems with the quality of data received from individual plans (Byrd et al. 2013). The quality measures for which the fewest states fell within the reference range were the percentage of OT encounter records with a place of service code and the percentage of OT encounter records with procedure codes in a nationally standard format.

The percentage of states submitting encounter data comparable in completeness and quality to FFS data—and thus usable for research—increased for OT and RX file data for the adult, child, and disabled groups from 2010 to 2011 (see Tables 2 and 4). The most notable change over time was an increase in the number of states submitting usable RX data for the disabled (from 10 states in 2010 to 18 states in 2011). Generally, states that enrolled at least 10 percent of one BOE category in an earlier year continued to meet the enrollment threshold for the same BOE categories in later years. Furthermore, states whose data were usable in one year often had usable data the next year.

OT Encounter Data

Table 2 summarizes the availability and usability of the OT encounter data in MAX 2010 and 2011 for each state, by BOE category. Most states that met our CMC enrollment threshold submitted data of comparable completeness and quality to FFS data in both years, and the usability of the data improved for each BOE category. Twelve states submitted usable data in each year for all four BOE categories: California, Delaware, Kentucky, Michigan, Minnesota, Nebraska, New Jersey, New York, Oregon, Tennessee, Texas, and Virginia. An additional five states submitted usable data for each BOE category in which they met the CMC threshold in both years: Connecticut, Georgia, Indiana, Missouri, and Washington. The number of encounter records increased to above 200 for at least one BOE category in Massachusetts, Ohio, South Carolina, and Utah.

In 2011, 30 states met the CMC enrollment threshold for their adult Medicaid population. Of those states, 23 submitted OT encounter records that met completeness and quality thresholds and were deemed usable for research. Of the remaining 7 states, 3 submitted 200 or fewer OT encounter records, and 4 submitted more than 200 records, but the records were unusable. The number of states that met the CMC enrollment threshold for every BOE category and the number of states submitting usable data grew between 2010 and 2011.

IP Encounter Data

Table 3 shows the number of states that met the CMC enrollment threshold as well as the availability and usability of the IP encounter data for each state, by BOE. The number of states submitting encounter data increased between 2010 and 2011 for the adult, child, and disabled BOE categories. In 2011, 27 of the 30 states meeting the CMC enrollment threshold for children submitted more than 200 encounter records, and 19 of those submitted usable data.

RX Encounter Data

States that use CMC to deliver comprehensive services sometimes choose to exclude, or "carve out," prescription drug services from the CMC arrangements. However, the number of states that submitted data for each BOE category rose from 2010 to 2011. Also noteworthy is the fact that, of all states that submitted RX encounter data for 2011, all but one submitted usable data for each BOE. Table 4 summarizes the availability and usability of the RX encounter data for each state, by BOE.

Table 2. Usability of OT encounter data from MAX 2010–2011, by state and BOE category
(Entries run left to right: 2010 Adult, Child, Disabled, Aged; then 2011 Adult, Child, Disabled, Aged. States listed without entries did not meet the CMC enrollment threshold.)

Alabama
Alaska
Arizona (a)      Y Y Y Y NR NR NR NR
Arkansas
California       Y Y Y Y Y Y Y Y
Colorado (a)     N N 0 NR NR NR NR
Connecticut      Y Y Y Y
Delaware         Y Y Y Y Y Y Y Y
DC (a)           Y Y N N NR NR NR NR
Florida          Y Y Y Y N Y Y N
Georgia          Y Y Y Y
Hawaii (a)       Y Y Y Y NR NR NR NR
Idaho (a)        NR NR NR NR
Illinois         N Y
Indiana          Y Y Y Y Y Y
Iowa
Kansas (b)       NR NR NR NR Y Y
Kentucky         Y Y Y Y Y Y Y Y
Louisiana        NR NR NR NR
Maine (a, b)     NR NR NR NR NR NR NR NR
Maryland         Y N N N
Massachusetts    0 0 0 Y Y Y
Michigan         Y Y Y Y Y Y Y Y
Minnesota        Y Y Y Y Y Y Y Y
Mississippi      Y Y
Missouri         Y Y Y Y
Montana
Nebraska         Y Y Y Y Y Y Y Y
Nevada           0 0 0 0
New Hampshire
New Jersey       Y Y Y Y Y Y Y Y
New Mexico       Y Y Y Y Y Y Y
New York         Y Y Y Y Y Y Y Y
North Carolina
North Dakota
Ohio             0 0 0 0 Y Y Y Y
Oklahoma
Oregon           Y Y Y Y Y Y Y Y
Pennsylvania     0 0 0 0 0 0 0 0
Rhode Island     Y Y Y N N N
South Carolina   0 0 0 Y Y Y
South Dakota
Tennessee        Y Y Y Y Y Y Y Y
Texas            Y Y Y Y Y Y Y Y
Utah             N N 0 Y Y Y Y
Vermont
Virginia         Y Y Y Y Y Y Y Y
Washington       Y Y Y Y Y
West Virginia    0 0 0 0
Wisconsin        N Y N Y
Wyoming

States meeting CMC enrollment threshold: 32 33 26 20 (2010); 30 30 25 18 (2011)
States submitting data: 26 27 23 16 (2010); 27 27 24 17 (2011)
States submitting usable data: 24 24 18 16 (2010); 23 25 21 16 (2011)
Of states meeting CMC threshold, percentage that submitted usable data: 75% 73% 69% 80% (2010); 77% 83% 84% 89% (2011)

Source: MAX 2010 and 2011.
Notes: Blank cells indicate the state's enrollment in CMC did not meet the enrollment threshold in that BOE category. 0 indicates the state met the enrollment threshold but submitted 200 or fewer encounter records in that BOE category. N indicates the state met the enrollment threshold and submitted more than 200 encounter records in that BOE category, but the data did not meet completeness and quality thresholds. Y indicates the state met the enrollment threshold, submitted more than 200 encounter records in that BOE category, and the data met completeness and quality thresholds (and were therefore usable). NR indicates that the files were not available in MAX.
(a) Arizona, Colorado, DC, Hawaii, Idaho, Louisiana, and Maine were not included in the analysis because the corresponding MSIS files were unavailable or contained significant data problems in 2011.
(b) Kansas and Maine were not included in the analysis because the corresponding MSIS files were unavailable or contained significant data problems in 2010.
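The reference-range construction described under "Metrics" can be made concrete with a short sketch. The code below is illustrative only (hypothetical state values and function names, not the code used in the analysis): each range runs from the FFS mean minus two standard deviations to the mean plus two, and fill-rate measures with highly skewed values use the fixed 90-percent-or-greater rule instead.

```python
# Illustrative sketch of the reference-range construction (not Mathematica's
# actual code). State values below are hypothetical.
import statistics

def reference_range(ffs_state_values, skewed=False):
    """Return (low, high) bounds for judging encounter data against FFS data.

    ffs_state_values: per-state FFS values for one metric, BOE category, and year.
    skewed: for fill-rate measures that cluster near 0 or 100 percent, the brief
            uses a fixed "90 percent or greater" rule instead of mean +/- 2 SD.
    """
    if skewed:
        return (90.0, 100.0)
    mean = statistics.mean(ffs_state_values)
    sd = statistics.pstdev(ffs_state_values)
    # The FFS average is the midpoint; the bounds sit two standard deviations
    # on either side, floored at zero since these metrics cannot be negative.
    return (max(0.0, mean - 2 * sd), mean + 2 * sd)

def within_range(value, bounds):
    """True if a state's encounter-data value falls in the acceptable range."""
    low, high = bounds
    return low <= value <= high

# Hypothetical example: average OT encounter records per adult enrollee.
ffs_values = [5.2, 7.4, 6.1, 8.0, 4.9, 6.8]
bounds = reference_range(ffs_values)   # roughly (4.16, 8.64)
print(within_range(6.5, bounds))       # True
```

A state value is then judged simply by whether it lands inside the bounds, one metric at a time.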
Table 3. Usability of IP encounter data from MAX 2010–2011, by state and BOE category
(Entries run left to right: 2010 Adult, Child, Disabled, Aged; then 2011 Adult, Child, Disabled, Aged. States listed without entries did not meet the CMC enrollment threshold.)

Alabama
Alaska
Arizona (a)      Y Y Y Y NR NR NR NR
Arkansas
California       Y N N N N N N N
Colorado (a)     0 0 0 NR NR NR NR
Connecticut      N N
Delaware         Y 0 Y Y N 0
DC (a)           Y Y Y NR NR NR NR
Florida          Y Y Y Y N Y Y
Georgia          Y Y Y Y
Hawaii (a)       Y Y Y Y NR NR NR NR
Idaho (a)        NR NR NR NR
Illinois         0 0
Indiana          Y Y Y Y Y Y
Iowa
Kansas (b)       NR NR NR NR Y Y
Kentucky         Y Y Y 0 Y N Y N
Louisiana        NR NR NR NR
Maine (a, b)     NR NR NR NR NR NR NR NR
Maryland         Y Y Y Y Y Y
Massachusetts    0 0 0 N N N
Michigan         Y Y Y 0 Y Y Y 0
Minnesota        Y Y Y Y Y Y Y Y
Mississippi      0 0
Missouri         Y Y Y Y
Montana
Nebraska         Y Y Y 0 N N N 0
Nevada           0 0 0 0
New Hampshire
New Jersey       Y Y Y Y Y Y Y Y
New Mexico       Y Y Y 0 Y Y Y
New York         Y Y Y Y N Y Y
North Carolina
North Dakota
Ohio             0 0 0 0 Y Y Y Y
Oklahoma
Oregon           Y Y Y 0 Y Y Y 0
Pennsylvania     0 0 0 0 0 0 0 0
Rhode Island     N N N
South Carolina   0 0 0 Y Y Y
South Dakota
Tennessee        Y Y Y 0 Y Y Y 0
Texas            Y 0 N Y N 0
Utah             Y 0 0 Y Y Y 0
Vermont
Virginia         Y Y Y Y Y Y Y Y
Washington       Y Y Y Y Y
West Virginia    0 0 0 0
Wisconsin        Y Y Y Y
Wyoming

States meeting CMC enrollment threshold: 32 33 26 20 (2010); 30 30 25 18 (2011)
States submitting data: 26 26 21 8 (2010); 27 27 22 8 (2011)
States submitting usable data: 23 19 17 7 (2010); 21 19 16 6 (2011)
Of states meeting CMC threshold, percentage that submitted usable encounter data: 72% 58% 65% 35% (2010); 70% 63% 64% 33% (2011)

Source: MAX 2010 and 2011.
Notes: Blank cells indicate the state's enrollment in CMC did not meet the enrollment threshold in that BOE category. 0 indicates the state met the enrollment threshold but submitted 200 or fewer encounter records in that BOE category. N indicates the state met the enrollment threshold and submitted more than 200 encounter records in that BOE category, but the data did not meet completeness and quality thresholds. Y indicates the state met the enrollment threshold, submitted more than 200 encounter records in that BOE category, and the data met completeness and quality thresholds (and were therefore usable). NR indicates that the files were not available in MAX.
(a) Arizona, Colorado, DC, Hawaii, Idaho, Louisiana, and Maine were not included in the analysis because the corresponding MSIS files were unavailable or contained significant data problems in 2011.
(b) Kansas and Maine were not included in the analysis because the corresponding MSIS files were unavailable or contained significant data problems in 2010.

Table 4. Usability of RX encounter data from MAX 2010–2011, by state and BOE category
(Entries run left to right: 2010 Adult, Child, Disabled, Aged; then 2011 Adult, Child, Disabled, Aged. States listed without entries did not meet the CMC enrollment threshold.)

Alabama
Alaska
Arizona (a)      Y Y Y Y NR NR NR NR
Arkansas
California       Y Y Y Y Y Y Y Y
Colorado (a)     0 0 0 NR NR NR NR
Connecticut      0 0 0 0
Delaware         0 0 0 0 N 0 0 0
DC (a)           N N 0 NR NR NR NR
Florida          Y Y Y Y Y Y Y Y
Georgia          Y Y Y Y
Hawaii (a)       0 0 0 0 NR NR NR NR
Idaho (a)        NR NR NR NR
Illinois         Y Y
Indiana          0 0 0 0 0 0
Iowa
Kansas (b)       NR NR NR NR Y Y
Kentucky         Y Y Y Y Y Y Y Y
Louisiana        NR NR NR NR
Maine (a, b)     NR NR NR NR NR NR NR NR
Maryland         Y Y Y Y Y Y
Massachusetts    0 0 0 Y Y Y
Michigan         Y Y Y Y Y Y Y Y
Minnesota        N Y Y Y Y
Mississippi      Y Y
Missouri         0 0 0 0
Montana
Nebraska         0 0 0 0 0 0 0 0
Nevada           0 0 0 0
New Hampshire
New Jersey       Y Y N Y Y Y Y Y
New Mexico       Y Y Y Y Y Y Y
New York         N N N Y Y Y Y Y
North Carolina
North Dakota
Ohio             0 0 0 0 Y Y Y Y
Oklahoma
Oregon           Y Y Y Y Y Y Y Y
Pennsylvania     0 0 0 0 0 0 0 0
Rhode Island     Y Y Y Y Y Y
South Carolina   0 0 0 Y Y Y
South Dakota
Tennessee        0 0 0 0 0 0 0 0
Texas            0 0 0 0 0 0 0 0
Utah             0 0 0 0 0 0 0 0
Vermont
Virginia         Y Y Y Y Y Y Y Y
Washington       Y Y Y Y Y
West Virginia    0 0 0 0
Wisconsin        0 0 0 0
Wyoming

Total meeting CMC threshold: 32 33 26 20 (2010); 30 30 25 18 (2011)
States submitting data: 15 16 14 11 (2010); 19 18 18 12 (2011)
States submitting usable data: 13 13 10 10 (2010); 18 18 18 12 (2011)
Of states meeting CMC threshold, percentage that submitted usable encounter data: 41% 39% 38% 50% (2010); 60% 60% 72% 67% (2011)

Source: MAX 2010 and 2011.
Notes: Blank cells indicate the state's enrollment in CMC did not meet the enrollment threshold in that BOE category. 0 indicates the state met the enrollment threshold but submitted 200 or fewer encounter records in that BOE category. N indicates the state met the enrollment threshold and submitted more than 200 encounter records in that BOE category, but the data did not meet completeness and quality thresholds. Y indicates the state met the enrollment threshold, submitted more than 200 encounter records in that BOE category, and the data met completeness and quality thresholds (and were therefore usable). NR indicates that the files were not available in MAX.
(a) Arizona, Colorado, DC, Hawaii, Idaho, Louisiana, and Maine were not included in the analysis because the corresponding MSIS files were unavailable or contained significant data problems in 2011.
(b) Kansas and Maine were not included in the analysis because the corresponding MSIS files were unavailable or contained significant data problems in 2010.
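The completeness and quality decision rules described in the Methods section reduce to a small amount of logic. The sketch below is a hypothetical illustration (our own function and variable names, not the analysis code): a state's data for one BOE category and file type are usable when at least one of the two completeness measures falls within the reference range and enough quality measures do (four of five for OT, three of four for IP, one of two for RX).

```python
# Illustrative sketch of the usability determination (hypothetical inputs,
# not the actual analysis code).

# Minimum number of quality measures that must fall in the reference range.
QUALITY_MINIMUM = {"OT": 4, "IP": 3, "RX": 1}

def is_usable(file_type, completeness_in_range, quality_in_range):
    """Apply the brief's usability rule for one state, BOE category, and year.

    completeness_in_range, quality_in_range: lists of booleans, one per
    measure, indicating whether the state's value fell within the FFS
    reference range for that measure.
    """
    complete = any(completeness_in_range)  # one of two measures is enough
    quality_ok = sum(quality_in_range) >= QUALITY_MINIMUM[file_type]
    return complete and quality_ok

# Hypothetical state: one OT completeness measure in range and four of five
# quality measures in range, so the OT data are deemed usable.
print(is_usable("OT", [True, False], [True, True, True, True, False]))  # True
```

The same rule, applied per state, BOE category, file type, and year, produces the Y entries in Tables 2 through 4.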
Caveats

Our analysis shows that a reasonable volume of encounter data is available in MAX and that the data appear to be of good quality on basic measures. We assumed that, like the FFS data, the encounter data falling within acceptable ranges accurately depict what is happening in the state. Our analysis is limited, however, by its assumption that FFS data provide a reasonable benchmark for judging the encounter data, which may not be the case, depending on the populations a state chooses to enroll in managed care. One issue is that people who are moved to CMC may be healthier than those who are not, or vice versa, within all BOE categories.

People who are enrolled in CMC plans likely do differ from FFS populations in important ways. To control for this, we used metrics within two standard deviations to account for variations in service use that may reflect differences in the populations or in the FFS system versus the managed care delivery systems. The use of two standard deviations is consistent with confidence intervals typically used in statistical analyses, but for measures with a lot of variation in the FFS data, this sometimes resulted in a wide reference range. Researchers interested in the full scope of Medicaid service use within states should still examine the encounter data that were not deemed usable for research based on our analysis.

Conclusions

This brief is intended to shed light on the availability and usability of the 2010 and 2011 MAX OT, IP, and RX encounter data. Our analysis provides information that will help researchers and policymakers determine which states' encounter data to analyze. In many states, the quality and availability of the encounter data improved over the two study years. This is an encouraging trend for researchers and policymakers, who can use this larger volume of data to assess service use across the variety of Medicaid delivery systems.

Endnotes

1 Fee-for-service claims account for dollars paid by states to providers for Medicaid services.
2 A full-benefit Medicaid enrollee is defined here as an enrollee with a restricted benefits flag equal to 1 for any month of enrollment in the calendar year, meaning the person is eligible for Medicaid or the Children's Health Insurance Program (CHIP) and entitled to the full scope of Medicaid or CHIP benefits.
3 Encounter records in the LT file are clustered among very few states in MAX data. After imposing our analysis criteria, there were too few encounters for a cross-state analysis of LT data.
4 Another issue brief in this series discusses the availability and usability of encounter data for prepaid behavioral health plans in MAX 2009. See Nysenbaum et al. (2012).
5 The reference ranges for MAX 2007, 2008, and 2009 data appear in previous issue briefs; see Dodd et al. 2012 and Byrd et al. 2012a and 2012b.

References

Borck, R., L. Ruttner, V. Byrd, and K. Wagnerman. "The Medicaid Analytic eXtract 2010 Chartbook." Washington, DC: Centers for Medicare & Medicaid Services, 2014.
Byrd, V.L., and A.H. Dodd. "Assessing the Usability of Encounter Data for Enrollees in Comprehensive Managed Care Across MAX 2007–2009." Washington, DC: Centers for Medicare & Medicaid Services, December 2012a.
Byrd, V.L., A.H. Dodd, R. Malsberger, and A. Zlatinov. "Assessing the Usability of MAX 2008 Encounter Data for Enrollees in Comprehensive Managed Care." Washington, DC: Centers for Medicare & Medicaid Services, July 2012b.
Byrd, V.L., J. Nysenbaum, and D. Lipson. "Encounter Data Toolkit." Washington, DC: Centers for Medicare & Medicaid Services, November 2013.
Byrd, V.L., J. Verdier, J. Nysenbaum, and A. Schoettle. "Technical Assistance for Medicaid Managed Care Encounter Reporting to the Medicaid Statistical Information System, 2010." Washington, DC: Mathematica Policy Research, February 2011.
Dodd, A.H., J. Nysenbaum, and A. Zlatinov. "Assessing the Usability of the MAX 2007 Inpatient and Prescription Encounter Data for Enrollees in Comprehensive Managed Care." Washington, DC: Centers for Medicare & Medicaid Services, April 2012.
Kaiser Family Foundation. "Medicaid's Role for Women Across the Lifespan: Current Issues and the Impact of the Affordable Care Act." Washington, DC: Kaiser Family Foundation, December 2012. Available at https://kaiserfamilyfoundation.files.wordpress.com/2013/01/7213-04.pdf. Accessed July 31, 2015.
Nysenbaum, J., E. Bouchery, and R. Malsberger. "The Availability and Usability of Behavioral Health Organization Encounter Data in MAX 2009." Washington, DC: Centers for Medicare & Medicaid Services, December 2012.
Young, K., R. Garfield, M. Musumeci, L. Clemans-Cope, and E. Lawton. "Medicaid's Role for Dual-Eligible Beneficiaries." Washington, DC: Kaiser Commission on Medicaid and the Uninsured, April 2012.

For further information on this issue brief series, visit our website at www.mathematica-mpr.com

Princeton, NJ • Ann Arbor, MI • Cambridge, MA • Chicago, IL • Oakland, CA • Washington, DC

Mathematica® is a registered trademark of Mathematica Policy Research, Inc.