Our impact analysis finds significant evidence that ELE implementation increased Medicaid enrollment. Across a series of model specifications, estimated impacts of ELE were consistently positive, ranging between 4.0 and 7.3 percent, with most estimates statistically significant at the 5 percent level. Overall, these estimates had a central tendency of about 5.5 percent. The analyses also find evidence that ELE increased Medicaid/CHIP enrollment. Across a series of models, estimated impacts were again consistently positive, though less often statistically significant, with a central tendency of about 4.2 percent. Although the multivariate analysis finds consistent evidence that ELE had a positive effect on enrollment, some questions remain about the magnitude of the impact.
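Because the dependent variables are log transformed, coefficients convert to approximate percentage effects via the standard exponential transformation (our illustration, not a calculation stated in the source); for example, applying it to the main-model Medicaid coefficient reported in Table IV.6:

$$
\%\Delta \approx 100\,(e^{\beta} - 1), \qquad 100\,(e^{0.0562} - 1) \approx 5.8\%
$$

which is consistent with the roughly 5.5 percent central tendency described above.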
The less robust evidence of an effect of ELE on combined Medicaid/CHIP enrollment is not surprising given how modestly ELE has been implemented for CHIP. Indeed, at the time of this analysis, only four states had implemented ELE through CHIP, one of which (Iowa) had an ELE-like policy in effect before the period of analysis. We would also expect the effects of Oregon's and Georgia's ELE programs to be heavily weighted toward Medicaid, because each state's Express Lane agency—the Special Supplemental Nutrition Program for Women, Infants and Children (WIC) and SNAP, respectively—has income eligibility levels that encompass the Medicaid threshold but are below the CHIP threshold. In other words, these findings do not mean that ELE policies cannot affect CHIP enrollment, but rather that the existing ELE programs are targeted more toward Medicaid than toward CHIP enrollment.
Table IV.6. Estimated ELE Effect for Regressions That Model the ELE Effect over Time, 2007-2011 Quarterly SEDS Data
| Dependent Variable (log transformed) | Total Medicaid/CHIP Enrollment | Medicaid Enrollment Only |
|---|---|---|
| Main Regression Model | 0.0420* (0.024) | 0.0562** (0.026) |
| *Number of Quarters Since ELE Implementation* | | |
| ELE | 0.0279 (0.024) | 0.0374 (0.024) |
| ELE × Number of Quarters Since ELE Implementation | 0.00401 (0.003) | 0.00509* (0.003) |
Source: CMS Statistical Enrollment Data System (SEDS) as of March 30, 2012, verified and provided by CMS
Notes: (1) Robust standard errors clustered at the state level are in parentheses. (2) *p<.1, **p<.05, ***p<.01. (3) All models include state and quarter fixed effects (coefficients not shown). All other right-hand side variables are the same as those in the Table 3 main results. (4) Total enrollment includes children who were ever enrolled in Medicaid or CHIP during the fiscal quarter. Medicaid enrollment only includes children who were ever enrolled in Title XIX (Medicaid) or Title XXI Medicaid expansion CHIP programs during the fiscal quarter. (5) The Medicaid/CHIP model includes 660 and the Medicaid model includes 820 state-quarter observations.
CHIP = Children's Health Insurance Program; CMS = Centers for Medicare & Medicaid Services; ELE = Express Lane Eligibility; SEDS = Statistical Enrollment Data System.
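The specification summarized in the table and notes—log enrollment regressed on an ELE indicator and its interaction with quarters since implementation, with state and quarter fixed effects and state-clustered standard errors—can be sketched as follows. This is an illustrative sketch, not the authors' code: the data are synthetic, the state names, adoption timing, and "true" effect sizes are invented for demonstration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for s in range(20):                      # 20 hypothetical states
    adopt_q = 10 + s if s < 8 else None  # 8 hypothetical ELE states, staggered adoption
    for q in range(20):                  # 20 quarters (e.g., 2007-2011)
        ele = int(adopt_q is not None and q >= adopt_q)
        qtrs_since = (q - adopt_q) if ele else 0
        log_enroll = (10.0 + 0.05 * s + 0.01 * q   # state and quarter effects
                      + 0.04 * ele                 # level shift at ELE adoption
                      + 0.005 * ele * qtrs_since   # effect phasing in over time
                      + rng.normal(0, 0.02))       # idiosyncratic noise
        rows.append({"state": f"S{s}", "quarter": q, "ele": ele,
                     "qtrs_since": qtrs_since, "log_enroll": log_enroll})
df = pd.DataFrame(rows)

# State and quarter fixed effects via C(); standard errors clustered by state
fit = smf.ols("log_enroll ~ ele + ele:qtrs_since + C(state) + C(quarter)",
              data=df).fit(cov_type="cluster", cov_kwds={"groups": df["state"]})
print(fit.params[["ele", "ele:qtrs_since"]])
```

Clustering at the state level (note 1 in the table) accounts for serial correlation within states over the quarterly panel; the point estimates are unaffected, only the standard errors.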
Although our results suggest that ELE can have a positive effect on Medicaid enrollment, it is uncertain how this finding might generalize to a particular state or state program. We find that ELE had an above-average effect on enrollment in Iowa and Oregon, where ELE primarily functioned through SNAP, and in Maryland and New Jersey, where ELE functioned through the tax system as an outreach tool. However, differences across states or ELE approaches are not statistically significant, and the experience of any individual state could vary widely due to differences in policy design, implementation, or target population.
As with any quasi-experimental impact analysis, unobservable factors might bias our estimated ELE effects. Specifically, unless accounted for in our models, any factors correlated with the timing of ELE adoption that also affect enrollment might bias our estimates of ELE effects. For example, some states might have upgraded their information technology systems or implemented targeted outreach programs, subsequently increasing enrollment, at the same time they carried out ELE.58 Should such factors increase enrollment in ELE states and not be accounted for in our set of policy covariates, they could upwardly bias our estimates. Alternatively, should non-ELE states pursue such unmeasured initiatives, that could bias our impact estimates toward zero.
Acknowledging this bias risk, we have conducted a series of robustness checks that raise confidence in our findings. The estimated effects of ELE vary only slightly across sensitivity tests and are not driven by the inclusion of a single variable (or set of variables) or the inclusion of a single ELE state. We also find that the average ELE effect remains statistically significant and similar in magnitude to what we find in the main regression model when we exclude different sets of comparison states.
Our findings further suggest that ELE might have an extended effect over time (rather than a one-time increase), though this finding should be viewed with caution given the short post-ELE period available at the time of this analysis. (Most of the ELE policies were approved in 2010 or later and this analysis of the SEDS data was finalized in May 2012.) Unlike other eligibility and enrollment simplification strategies that might diffuse slowly, ELE policies were implemented quickly and it is possible that the effect could phase out over time, depending on the details of state policy. We rely on quarterly data to obtain the longest possible window of post-ELE data over the analysis period and will reassess impacts in 2013, when a longer post-ELE experience will be available.
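A mechanical reading of the over-time point estimates in Table IV.6 (our illustration, not a claim made in the source, and subject to the caution about the short post-ELE period) gives the implied Medicaid effect after $q$ quarters of ELE experience as the level coefficient plus $q$ times the interaction:

$$
\hat{\beta}(q) = 0.0374 + 0.00509\,q, \qquad \hat{\beta}(4) \approx 0.0578 \;\;(\approx 5.9\%)
$$

so the estimated effect grows with time since implementation rather than remaining a one-time level shift.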
Finally, more research is needed to assess the effects of the non-ELE policy variables on Medicaid or Medicaid/CHIP enrollment. Although we included several of these variables in our models, this analysis cannot conclude whether they had an effect on enrollment because we did not subject them to robustness analyses. Also, a number of them showed little variation in the analysis period, leading to imprecisely estimated effects. A separate and more extensive analysis, focused on individual non-ELE policy variables, is needed to assess their effects rigorously.
58 For example, in New Jersey, ELE was the centerpiece of a broader initiative to increase coverage of uninsured children eligible for Medicaid/CHIP and to ensure retention of enrollees in these programs (State of New Jersey 2009). The initiative included broader changes to information technology, staffing, public awareness and media outreach, and application simplification.