The end of the year is quickly approaching, and if you’re an Accountable Care Organization (ACO) you’ve likely already started planning for your Medicare Shared Savings Program (MSSP) submission. This process is critical to helping your organization demonstrate you’ve met the necessary quality measures to receive reimbursement from CMS at year-end.
MSSP submission, while necessary, can be somewhat complicated. To help your organization successfully navigate this complex process, Jake Aleckson, director of Optimization Solutions, and Kyle McAllister, a senior consultant on the optimization team, break down the key areas of focus for your population health and quality management teams. They offer tips and advice around the tools, timelines, and data extraction options your team should consider when putting together your submission for CMS.
If you’re interested in learning more about the MSSP submission process, or need our assistance with your submission, please contact us.
[02:45] Definition of MSSP ACO submission
[04:25] Overview of tools for data capture
[07:28] MSSP submission process timeline and preparation
[12:10] Who should be involved in your submission process?
[17:26] Options for data extraction, and key considerations
[26:40] Tips for successful data extraction
Jake Aleckson: Hey everybody. Kyle, thanks for joining me. Appreciate you taking the time here before you catch a flight back to Kansas City here.
Kyle McAllister: Yeah. Happy to join.
Jake: Welcome to the home office here at Nordic. Today, we've been doing a lot of exciting things down in Atlanta with the Healthy Planet Team, with the quality team down there. One of the hot topics that they're thinking about — and that other organizations are — is around MSSP submission. We were hoping to get together and talk through the process today. Hopefully others will appreciate listening to us.
Jake: I'll do introductions here quick. Jake Aleckson, I'm director of Optimization Solutions here at Nordic. I've been in the Epic space for about 10 years, most of which has been spent on Healthy Planet, population health-related projects. Looking forward to talking with you today.
Kyle: I'm Kyle McAllister. I'm a consultant with the Nordic team. I think I've been in healthcare IT for eight or nine years, and the last five or six of those have focused on population health IT, mostly with Epic-based solutions, but I've also worked with Cerner, and with Aetna in their population health IT space as well. I've worked with a few clients on MSSP submissions, so I have some good successes from previous clients to share.
Jake: A few lessons learned as well, that we're looking to talk through. Awesome. Let's talk a little bit about what MSSP ACO submission is and then why is it important.
Kyle: Well, let's maybe define MSSP ACO for the audience. MSSP is the Medicare Shared Savings Program, and ACO is Accountable Care Organization. If you've been in the pop health space, it's probably a pretty familiar term. MSSP ACO submission, from an Epic and Healthy Planet team perspective, really ties back to a lot of the ACO quality measures that are expected for that program. Participating in a CMS MSSP ACO, you have some cost-related goals that you have to hit. That's less of the Epic team's concern. Then there are quality measures that you have to meet as well to even qualify to make money at the end of the year. To actually achieve reimbursement, you have to meet certain quality measures. I don't remember the exact number of them off the top of my head. It's something like 22 or 23, I think. Some of those are captured for you through claims and surveys, so you don't have to report on them directly. Then, 15 or so of those you do report on in Epic. Those 15 that you report from Epic, the ones you have to submit data on to CMS, are what we're going to focus on.
Jake: A big part of the challenge here is first capturing the data: getting the infrastructure built in Epic, or in the EHR, to capture it. Then you have to get it out at the end of the year to ultimately send it to CMS for the reimbursement.
Kyle: Right. Yeah.
Jake: Let's take a step back in the submission process. What tools need to be in place within the system to be able to capture that?
Kyle: Like you said, there's a lot of workflow that underlies this. There's a lot of your typical EpicCare Ambulatory and other workflow tools and data that need to be captured. Then, and this is where the Healthy Planet piece comes in, there are a lot of tools that gather that up. Think of registries, and the quality measure build those registries feed into, to actually build the performance around each of the 15 ACO measures. There are dashboards that can be created to display them. If providers, or executives, or other folks at your institution want to be able to see their performance over time, so it's not a surprise when you get to the end of the year, there are dashboards to monitor that throughout the year.
Kyle: Some others that sit behind the scenes are things like roster management engine, which helps you load your patients that CMS requires you to report on. They give you an actual roster of folks to report on, and roster management engine helps you load that. There are ACO clarity tables and a tool called the abstraction assistant, which at the end of the year helps you pull that data out to actually send it to CMS.
Jake: Awesome. You mentioned workflow. That's sometimes part of the struggle is getting end users to document information in the correct place within the workflows. That's something that we've worked with organizations on as far as optimizing quality measure decision support tools that we put into place. Can you touch on any of a few examples there as far as how organizations can improve on capturing that information?
Kyle: Yeah, it often becomes an iterative process year over year. You do your best the first time around to capture your workflows as they stand. There's Epic model released content built around those quality measures and the stuff you need to report on, but a lot of times, things are going to be different at your organization. To take Piedmont as an example, one of the measures is around diabetic eye exams, and Piedmont doesn't have ophthalmology specialists on staff, so their workflow is drastically different than others'. It requires changes to the quality measures, to the groupers, and to all the underlying build that goes into the quality metrics you have to report on, based on what your specific workflow is. There's a lot of validation of your site's workflow versus the standard Epic workflow, and then changing a lot of that quality metric build, the CER rules and other grouper-type build, to match your workflow.
Jake: Yeah, there's a lot of work that goes into prepping for this. Let's talk about that timeline. I don't know if you want to start with the end, as far as I think it's that February timeframe, or if we want to start in the summer when we receive updates to the measures. Let's talk through that timeline, just to give folks a sense of what needs to be done, when to feel prepared for this.
Kyle: Yeah, yeah, let's start with the end, I guess. The end goal is in February. That's when you're actually going to be submitting your data to CMS. If we take 2018 as an example, February 2019 is when you'll actually finish the whole process; your data will all be zipped up and sent off to CMS. Working back from there, CMS tells you the population and the group that you need to report on in January, usually early January. That's how you know who you actually need to pull out with the abstraction assistant and other data extraction tools, and then you start doing some manual abstraction: actual, literal checking of the data that gets pulled out of the system.
Kyle: As you approach the end of the year, as you're approaching January, there's a lot of work to prep the data. Make sure that everything in your system is all set to go. When you do finally get to that January point, you need to pull the data out. There's an end of the year data gathering metric finalization, population finalization process.
Jake: Is one way to look at it that you almost need a freeze frame of that 2018 data at the end of the year to be able to then use the abstraction assistant and then submit in February? Is that a good way to think about it?
Kyle: Yeah, exactly, exactly. I think we'll probably talk a little bit later about some of the ways to do that. There are a couple of different options. There are some that are better than others. Some that just depend upon where you're at in the process. You're basically trying to take a freeze frame of your 2018 data at the end of the year. There's a whole lot of work that goes into prepping, making sure your registries are run before the end of the year, your clarity tables are updated at the end of the year, all that stuff.
Kyle: In preparation for that time, though, basically the whole rest of the year you're validating your metrics. You're making changes to measures as they come out. For example, this year CMS didn't really make a lot of changes to the metrics you're required to report on. In the previous year, 2017, they made drastic changes: there were new measures, some measures fell off, and there were big changes to the ones they already had. There's a lot of work that goes into taking updates from Epic, because they do respond to some of this stuff and update the rules in model settings for you. If you have some of those customized workflows, there's a lot of work that goes into fixing them yourself.
Jake: Timeframe wise, I would say, what, middle of the year 2018 or so we received the special updates from Epic as far as what changes were made from CMS and then what organizations need to be looking for in their build to update?
Kyle: That's right. Going to that yearly timeline, in February, you end up submitting your data. As you can imagine, CMS is spending a bunch of time gathering all of that, crunching it, figuring out how you actually performed for the year. Then in roughly March or April, they tend to get their stuff together around what the metrics are going to be for the coming year. Basically, every spring you get a new set of metrics from CMS, which sets you off the rest of the year making sure all of your Epic build and measure build in the system is up-to-date for that. There's actual build that goes into it. Validation of that build, then there's validation of the data that comes from that build once it's in to the system with your providers. Then, you get into the whole end of year process.
Jake: You mentioned providers. I think that's an important call out that we've had success with organizations that have really involved the physician champ in the special updates, in the grouper updates. I know that in our most recent experience, there was also a goal to even look at other programs as well and do some consolidation of measures. Are there any other call outs as far as at this time in the process of updating the measures and kind of getting them ready to be able to start validating?
Kyle: Yeah, that's a good point. The provider champion is a very important one. I haven't been part of a submission or that yearly process where a provider wasn't involved, and it wasn't extremely necessary for that provider to be involved. It's also really important to have your quality management team involved. They're going to be the ones who know exactly how the organization wants to report on a certain quality measure. Even though CMS gives you specific criteria, there are still decisions to be made within that, so the quality management team is going to know how you actually want to do things for that quality program. They're also likely going to be involved in the validation process. After you get your build out there, you know, into the wild in production, you need to make sure it's collecting data the right way before you get to the end of the year and find out that it's wrong. So they're very involved from a validation perspective.
Jake: Yeah, and the tool that they're using, both the quality team and the physician champs, is probably that dashboard. They're using that not only to see how they're doing measure-wise, but also to dig in. Can you talk through the process of going into each of those measures, seeing the patients, and actually validating to make sure that it's set up correctly?
Kyle: Yeah, definitely. The dashboard is a great tool for that. It's great for seeing how you're performing at a high level, but then, based on how you set it up, oftentimes that dashboard has some drill-in reports that can show you whether patients fell into the numerator or the denominator for any given measure, or whether they were excluded. Once you get down to that patient-level information, you can actually look at a patient's chart and see, you know, did they get the eye exam that the measure says they did or didn't get. So it's using that dashboard and some of those drill-down reports, going patient by patient, and doing some random spot checking to see how the build turned out. From that perspective, it's really important, as an analyst team working with that quality management team on validation, to make sure they're doing a very good job of documenting the outcomes. It's one thing to know whether a patient met the measure or not, but it's another to know very specifically what procedure record was or wasn't there when they checked.
Jake: And this gets at what we were talking about earlier, the relationship between the Healthy Planet team and the quality team, because there's going to be a lot of back and forth there to ensure those measures are accurate at that point in the game. Alright, so let's recap: we received the special updates from Epic, we made changes to the measures, we included physician champs to make sure our groupers were up to date, and then we got our dashboard running. We talked about validation with the quality team, making sure everything looks accurate and everyone is comfortable with where the quality measures are at. What's next? Where are we in the year here? Are we getting close to submission time?
Kyle: Yeah, it depends on how quickly you get through that stuff, but it's pretty much a year-round process, for lack of a better way to say it. It ends up taking quite a bit of time to do that validation, get those updates into the dashboard, and make sure your metrics are really fine-tuned and ready to go at the end of the year. One thing we didn't talk about is that throughout that whole process, you're also getting quarterly updates of your roster file from CMS. They release their measure updates, but they also release the eligibility roster every single quarter, and it's up to you as an analyst team to use the roster management engine to actually load that up, so when you're looking at the dashboard, the correct patients are underlying it and calculating into those metrics, and you're looking at the correct patients while you're doing validation with that team.
Kyle: So that's happening all year long, and that really kind of leads you into the end of the year because at the end of the year you're going to get a final roster from CMS for quarter four. Hopefully you've gone through all that validation process and everybody feels really good about the metrics, and then you really get into that end-of-year prep with getting the data set, getting the clarity tables backfilled and everything.
Jake: Good point on roster management. I've worked with multiple organizations now, and that tool is so important, but it's also challenging. So for organizations that haven't implemented or started using roster management yet: don't underestimate the time it takes to get that accurate.
Kyle: Absolutely, yeah.
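To make the quarterly roster idea concrete, here's a rough Python sketch of the kind of diff an analyst team ends up reasoning through when a new CMS roster file arrives: which patients were added, and which fell off. The column name (`mbi`) and file layout are illustrative assumptions, not the real CMS format or the roster management engine's actual behavior.

```python
import csv
import io

def diff_roster(current_ids, roster_csv_text):
    """Return (added, removed) patient ID sets for a quarterly roster update.

    Hypothetical sketch only: "mbi" is an assumed column name, not the
    real CMS roster layout.
    """
    reader = csv.DictReader(io.StringIO(roster_csv_text))
    new_ids = {row["mbi"].strip() for row in reader}
    return new_ids - set(current_ids), set(current_ids) - new_ids

# Patients loaded from last quarter's roster (made-up IDs).
current = ["1AA0XX", "2BB0YY"]
# The new quarterly file: one patient stays, one is new, one dropped off.
q2 = "mbi,first_name\n1AA0XX,PAT\n3CC0ZZ,SAM\n"
added, removed = diff_roster(current, q2)
# added == {"3CC0ZZ"}, removed == {"2BB0YY"}
```

The point of the sketch is that a roster load is a reconciliation, not just an import, which is part of why Jake warns it takes longer than teams expect.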
Jake: So, you started diving into it. Now let's get maybe a little more technical with the two options that customers have for capturing that freeze frame at the end of the year.
Kyle: Yeah, the end of the year is really crunch time for MSSP ACO. You've got to have your data ready, and it has to be ready before you hit basically the last moments of 2018 and roll into 2019. We can get into specifics on that in a second. The other hard part is that as the end of the year approaches, CMS also has a lot of last-second updates. They realize they maybe had some things slightly off in the way they configured their metrics, or they made slight errors in the way your last patient roster came in. I've never had an experience where there weren't last-second updates from CMS that needed to be included in the metrics, or new roster files that needed to be loaded in December, in the last month of the year, that really challenged the timeline. It's not a knock on CMS; it's just how the process works, and it can be challenging. So when metrics change, and when the roster changes, there's potential to need some last-second validation, last-second build changes, and moves to prod, and a lot of times the end of the year means build freezes and all kinds of other things. There's a lot to coordinate at the end of the year, so you have to be prepared and know that there will be curveballs from CMS at the last second.
Jake: Well said. Let’s, for our Healthy Planet folks out there and for our ... almost our little more technical side of the house, let’s talk about these two options, the backfill versus maybe the date option that we've seen over time?
Kyle: Yeah, so there are two options, like Jake said. Option one, which is probably the more recommended path in all the Epic documentation, is really making sure all of your data is up-to-date, all of your registries are run, and all the data from those registries is backfilled into the corresponding Clarity tables, which is where the data actually gets pulled from at the end of the year by your quality team to report to CMS. You're preparing all this stuff in Chronicles and the registries that you create, and then it actually gets pulled out to Clarity to do the reporting. The reason this matters is that things turn over at the first of the year. Everything switches to 2019, and from a CMS MSSP perspective, that's a clean slate for a lot of these measures. So if you got your flu shot this year, in 2018, you're still going to need a flu shot in 2019. You go from meeting the measure to no longer meeting the measure.
Jake: On January 1.
Kyle: On January 1. Yep. So you have to do a lot of prep to make sure you freeze frame the 2018 data, so that when you flip to 2019, it doesn't write over that data in Clarity, for lack of a better way to put it. So option one is doing registry runs right at the end of the year, working with your Clarity team to make sure they're doing the backfill the minute that registry completes, as close to the end of the year as possible, so you're really getting all the data you can into Clarity before you backfill it. Then you literally turn off all updates to that Clarity table from that point on, so it stays as 2018 — it looks like 2018 performance on all of those measures. So that's — oh, go ahead.
Jake: I was just going to say, so before I ask about some of the challenges with that first option, are you telling me that at the end of the year you might be out on the town December 31 and you gotta make sure that you stop and run that registry one last time, is that how it works?
Kyle: Hopefully it's all scheduled out and you set up your registry to actually run at a specific time, but I can't pretend that that hasn't happened to me in the past, so it's definitely something to know that could happen. Especially if a lot of those CMS type updates come at the last second, but yeah, it's kind of a bummer because it's the time where a lot of people are off and on vacation, so a lot of times that can be a really big challenge at the end of the year.
Jake: So that's probably one of the downfalls of option one: the coordination is challenging because there are multiple teams involved. You have DBAs, the Clarity team, the Healthy Planet team, the quality team. I'd think that's one of the reasons this option is tough.
Kyle: Right. Yep. I've had conversations with my Clarity counterpart at 11 p.m. on December 31 in the past, so it's not unheard of. So yeah, that's option one, and honestly there are drawbacks like we just mentioned, but it's the most stable option, and it ends up being the recommended one because it doesn't require you to make any changes to your metrics after the first of the year hits. That's important because the dashboard we've been talking about also flips over to 2019 with the registry, so that dashboard will begin showing your performance on ACO measures for 2019 the moment the clock strikes the new year. It's important to get all that set up and that data freeze-framed off into the Clarity table, so Chronicles can keep running as is and there's no disruption to the dashboard for end users.
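The freeze-frame idea can be sketched in miniature with a throwaway SQLite database: snapshot the final 2018 results into a separate table, then let the live table roll into the new year. This is illustration only; the table and column names are invented and bear no relation to real Clarity schema or the actual backfill mechanism.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Stand-in for a live registry-fed reporting table (made-up schema).
conn.execute("CREATE TABLE registry_metrics (pat_id TEXT, measure TEXT, met INTEGER)")
conn.executemany(
    "INSERT INTO registry_metrics VALUES (?, ?, ?)",
    [("P1", "flu_shot", 1), ("P2", "flu_shot", 0)],
)

# "Backfill" the final 2018 run into a frozen table, then leave it untouched
# while the live table flips to the new measurement year.
conn.execute("CREATE TABLE registry_metrics_2018 AS SELECT * FROM registry_metrics")

# New-year reset: the live table resets, but the snapshot does not.
conn.execute("UPDATE registry_metrics SET met = 0 WHERE measure = 'flu_shot'")

frozen = conn.execute(
    "SELECT COUNT(*) FROM registry_metrics_2018 WHERE met = 1"
).fetchone()[0]
# frozen == 1: the 2018 snapshot still shows the patient met the measure
```

The snapshot table is what the quality team reports from in February, while the live table keeps the dashboard current for 2019.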
Jake: So, if I'm understanding correctly, we get that freeze frame of data. I think another downfall of that first option is if there are any issues with that validation, with measures as far as something maybe being wrong that's discovered during abstraction, then you may need to actually go back and rerun that. So is that another downfall to option one? You have to have all your ducks in a row?
Kyle: Yeah, that's a really good point. You can feel a lot better about using option one if you did all that data validation really solidly during the year and you feel very confident in your metric build and the data sitting behind it. It does require a lot of confidence in your build, and a lot of times, if it's your first year participating in the program, you're not going to feel that, so you should know that going in. Aim for feeling really confident and good, and aim for option one, but know that you might have to fall back on other options when things change in the new year. CMS can also send updates after the year changes over, so there's a chance your metrics change and your data needs to change, meaning you need to rerun your registry after the new year. So you have to have option two, which we haven't talked about yet, to fall back on to actually make changes.
Kyle: So option two is basically changing your registry to force it to look back at the previous year. You have the ability through a single registry metric — it's actually pretty slick the way Epic put it together — that controls whether the entire registry looks at the current year or some other year in the past. You can set that specific rule — it's called last day of measurement period, for anyone who wants to go look it up in their system — so that instead of looking at the current year, which is Y, the registry looks at the previous year, Y-minus-1. That literally forces your entire registry to look back at the previous year.
Kyle: So we already talked a little bit about the impact of that. It's something that I’ve actually had to utilize because of updates from CMS, and maybe last second changes to metrics and things, literally every single time I've done it. So it's a good fall back option, it's a good idea to try to meet option one and do the Clarity backfill option if you can because it helps the dashboard, but know and set expectations that there is a chance that if things change that you might need to fall back on changing that registry metric, re-running your entire registry on 2018, and know that your dashboard will be looking at 2018 data for whatever time period you actually force that registry to run on the previous year. Hopefully that makes sense.
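The Y versus Y-minus-1 behavior above is easy to illustrate in a few lines of Python. This is a sketch of the concept only; the function name and parameter are invented, and the real registry setting is the last-day-of-measurement-period rule Kyle mentions, not anything shown here.

```python
from datetime import date

def measurement_period(today, years_back=0):
    """Return (start, end) of a calendar-year measurement period.

    Conceptual sketch: shifting one setting (years_back) moves the whole
    evaluation window, the way the registry rule shifts Y to Y-minus-1.
    """
    year = today.year - years_back
    return date(year, 1, 1), date(year, 12, 31)

# Current-year run (Y): early January 2019 evaluates calendar 2019...
start, end = measurement_period(date(2019, 1, 10))
# ...versus the fallback re-run forced onto the previous year (Y-minus-1).
prev_start, prev_end = measurement_period(date(2019, 1, 10), years_back=1)
# prev_start == date(2018, 1, 1), prev_end == date(2018, 12, 31)
```

The trade-off Kyle describes falls out of this: while the registry is forced onto 2018, everything downstream of it, including the dashboard, reflects 2018 data.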
Jake: I have worked closely enough with you, Kyle, that I'm following, so hopefully folks out there are as well. I think that sounds like a great recommendation: shoot for option one and have option two as a fallback. So now that the data is out and we're feeling good, there's an abstraction process prior to the actual submission. Do you want to touch on that?
Kyle: Yeah, yeah, so we talked about the abstraction assistant as one of the tools to use. That tool is an Epic-produced Excel spreadsheet with a bunch of SQL queries sitting behind it that go out and find the data in Epic, based on the patients you're supposed to report on. Really quickly: CMS provides you with that list, usually about the middle of January. You get a list of 600-something patients, I think it's actually 600 even, and you enter all those patient IDs into your abstraction assistant, run some of the fancy SQL queries that sit behind it, and it goes out, finds all the patients you have to report on, and pulls in the data in the format CMS requires. So that's one option for actually pulling the data out.
Kyle: You can also just write the SQL queries yourself. If you have some really talented reporting folks, or other folks who are really good with SQL, and you don't want to rely on the abstraction assistant, you can write it yourself. I've used the abstraction assistant every time in the past, and it has its unique challenges as well, but it ends up working at the end of the day.
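For teams considering writing the SQL themselves, here's a minimal sketch of the shape of such an extraction query, run against an in-memory SQLite database standing in for Clarity. Every table, column, and measure name here is hypothetical; the real queries depend on your actual Clarity schema and the CMS reporting format.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Made-up stand-in for a year-end measure-results table.
conn.execute("CREATE TABLE measure_results (pat_id TEXT, measure TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO measure_results VALUES (?, ?, ?)",
    [
        ("P1", "diabetic_eye_exam", "met"),
        ("P2", "diabetic_eye_exam", "not_met"),
        ("P3", "diabetic_eye_exam", "met"),  # not on the CMS roster, so excluded
    ],
)

# The key step: restrict extraction to only the patients CMS told you to
# report on (the roughly 600-patient list Kyle describes).
cms_roster = ["P1", "P2"]
placeholders = ",".join("?" for _ in cms_roster)
rows = conn.execute(
    f"SELECT pat_id, status FROM measure_results "
    f"WHERE measure = ? AND pat_id IN ({placeholders}) ORDER BY pat_id",
    ["diabetic_eye_exam", *cms_roster],
).fetchall()
# rows == [("P1", "met"), ("P2", "not_met")]
```

Whether you use the abstraction assistant or hand-rolled queries, the logic is the same: filter to the CMS roster, then emit per-patient measure outcomes in the required format.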
Jake: Okay, awesome. And I assume the quality team is leading that abstraction charge, and it's probably similar to the validation process they went through during the year?
Kyle: Yeah, I would say with the abstraction assistant, the Healthy Planet team usually ends up being the quarterback driving that, just because they have the most vested interest across the technical teams, but you end up pulling in Clarity folks or other SQL folks who can help you set the abstraction assistant to pull from the correct places, from the correct servers essentially. So there's a lot of coordination in actually pulling the data. And after you've pulled the data, yeah, you're definitely working with the quality management team as they do manual abstraction of that data, literal manual validation of all the data in that Excel spreadsheet. They're often the ones running that work, but it's worth checking in with your quality management team well in advance to make sure they're planning for the fact that they'll need staff to actually do that manual validation.
Kyle: Those folks, depending on who's hired, are sometimes clinical staff who know Epic well and don't need any training on how to look at patients' charts, but sometimes they're students or other folks who haven't seen Epic much before, and I have, in the past, actually trained those folks in advance too. So one note: if it's your first go-round, make sure you're working with your quality management team to be prepared for the fact that they'll need staff to do this work, and those staff might need some training, which oftentimes the Healthy Planet analyst team will provide, or your training team.
Kyle: After you get that staff on board and doing the actual manual validation, what that looks like is they're going through the Excel spreadsheet kicked out by the abstraction assistant and checking it against charts in Epic: actually making sure the meds it says the patient got, or the eye exam the patient should have received, are in the system, so you're not submitting incorrect data to CMS at the end of the day, which is very important from an audit perspective.
Kyle: So they're going to go through that, and it's going to take weeks, maybe even months. Usually you end up reporting that data to CMS via their web interface; last year was the first time that new web interface came out. The way you actually load it up to that web interface is via an Excel document that is extremely specifically formatted. I'm really bringing this up to say: make sure you have a plan, because depending on how your abstraction team works, oftentimes they'll have their own saved-off versions of Excel spreadsheets that you need to bring back together.
Kyle: You have to have someone who is pretty handy with Excel to bring it all back together and make sure it's formatted in the appropriate manner for CMS. Throughout that whole process, which is very similar to the validation we did throughout the year, it's really important that you're getting really good feedback from the abstraction team as they abstract the data, to make sure it's all accurate. So make sure you have a solid communication method: if they found an error, what was the specific error, and on what specific patient? Did they find inconsistencies in how things were documented? There can be a large impact on how your metrics are set up, and there might be impact on how your workflows are set up or trained. If all of your providers aren't trained to document the correct way, you're not going to get the data documented in the right manner.
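Bringing the abstractors' saved-off spreadsheets back together is essentially a merge-and-validate job. Here's a hedged sketch in Python, using CSV text for simplicity; the required columns are invented for illustration, since the real CMS upload template dictates the actual layout.

```python
import csv
import io

# Hypothetical required columns; the real CMS template defines the real ones.
REQUIRED = ["pat_id", "measure", "result"]

def merge_abstraction_files(csv_texts):
    """Merge per-abstractor CSVs into one row list, failing fast on layout drift."""
    merged = []
    for text in csv_texts:
        reader = csv.DictReader(io.StringIO(text))
        missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
        if missing:
            raise ValueError(f"missing columns: {missing}")
        # Keep only the required columns, in a consistent shape.
        merged.extend({c: row[c] for c in REQUIRED} for row in reader)
    return merged

# Two abstractors' saved-off files (made-up data).
a = "pat_id,measure,result\nP1,flu_shot,met\n"
b = "pat_id,measure,result\nP2,flu_shot,not_met\n"
rows = merge_abstraction_files([a, b])
# rows has one dict per patient, ready to be written out in the required format
```

The fail-fast column check is the useful part of the sketch: catching a renamed or missing column when merging is much cheaper than having CMS's upload reject the file.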
Kyle: So it's a good opportunity to build in feedback on your build and the way your organization is executing its workflows around those quality measures. Ideally, if you're wrapping this entire process together, you're doing validation of the new metrics as CMS puts them out, and then doing validation again as you do abstraction at the end of the year, so you're tightening those metrics and those workflows year over year.
Kyle: With organizations I've worked with where it was their first year doing it, it was a little bit uglier and harder, and things weren't super clear, but then year over year it got tighter and tighter, to the point that some of them are having really great success and are among the higher-achieving quality-performing organizations in the last year or so.
Jake: Yeah, this is really helpful, Kyle. We went pretty deep here, and I think it's good because a lot of organizations out there are going through this and are maybe struggling. So I think we provided a lot of good details here that I hope folks are finding helpful. Just to summarize, there are a lot of teams that are involved, a lot of people that are involved in this process, and it's an important one that is going to impact reimbursement. So we really appreciate you swinging by before you are flying back and I hope people found this podcast helpful.
Kyle: Yeah, I hope so too. Thank you!
Jake: Awesome. Thanks Kyle.