Change Architecture

Adoption Architecture for a Healthcare System Transition

How we mapped resistance patterns, designed incentive structures, and built adoption engines during a private hospital's migration from a legacy billing system to an open-source EMR.

February 17, 2026 · 7 min read
systems · healthcare · adoption

A private hospital had been paying for a new EMR system for months. Almost no one was using it. We were brought in to figure out why, and to design something that would actually change behavior.

Context

The hospital was running a legacy proprietary billing system that had been in place for years. Staff knew its quirks. Workarounds had calcified into process. The system was expensive, inflexible, and aging, but it worked, in the way that duct tape works.

Leadership had contracted a vendor to implement an open-source EMR as a replacement. The plan was a parallel-run phase: both systems would operate simultaneously while staff transitioned to the new platform. After sufficient adoption, the legacy system would be decommissioned.

We were engaged in an advisory capacity. Not to manage the implementation, but to assess operational readiness, document adoption patterns, and build the feedback infrastructure the transition was missing.

The problem

The surface problem was low adoption. Staff weren't using the new system. But that framing obscured the structural issue.

There was no adoption architecture. No formal User Acceptance Testing had been completed or approved, yet the system was live and the hospital was paying monthly fees against a deliverables schedule. Many of those deliverables were incomplete or missing entirely. Staff had been told to use the new system but hadn't been given reasons, training pathways, or feedback mechanisms that made compliance rational.

The real problem was a misalignment between contractual milestones and operational reality. The vendor's progress was measured in technical deliverables. The hospital's success depended on behavioral change. No one was bridging that gap.

Discovery

We started with observation, not documentation. The documented workflows and the actual workflows were two different systems.

We ran structured shadowing sessions across multiple departments (billing, front desk, clinical intake, pharmacy), watching how staff actually processed patients, generated invoices, and reconciled charges. What we found:

Resistance wasn't ideological. It was mechanical. Staff weren't opposed to the new system in principle. They were blocked by missing features, incomplete module configurations, and workflows that required more steps than the legacy system for the same output.

Workarounds had become invisible. Billing staff had developed informal reconciliation processes: cross-referencing printouts, maintaining side spreadsheets, manually checking charges between systems. These workarounds consumed hours daily but were never surfaced to management because staff assumed this was just how the transition worked.

Trust had eroded quietly. Several departments had tried the new system early, encountered errors or missing functionality, and reverted. No one had followed up. The implicit message was that adoption was optional and problems wouldn't be fixed.

We also conducted a detailed reconciliation between the vendor contract and actual deliverables. The gap was significant. Modules that were contractually due had not been delivered, yet invoices continued to be paid against the original schedule. This wasn't malice; it was a monitoring failure. No one on the hospital side had the technical context to evaluate whether deliverables met spec.

Architecture

We designed a layered adoption system rather than a single rollout plan. The core insight: you can't train people into using a system that doesn't work yet. Adoption and implementation had to be synchronized, not sequential.

First, we built a competency tracking system. A mobile-accessible tool that let supervisors document staff interactions with the new EMR in real time. Each observation captured the task attempted, the system used (legacy vs. new), whether the task completed successfully, and any friction encountered. This gave us quantitative adoption data by department, role, and task type.
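A minimal sketch of what one observation record might look like, assuming a simple in-memory model. The field names, enum values, and `adoption_rate` helper are illustrative, not the tool's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class System(Enum):
    LEGACY = "legacy"
    NEW_EMR = "new_emr"

@dataclass
class Observation:
    """One supervisor-logged staff interaction with either system."""
    department: str          # e.g. "billing", "pharmacy"
    role: str                # e.g. "front desk clerk"
    task: str                # e.g. "generate invoice"
    system: System           # which system the task was attempted on
    completed: bool          # did the task finish successfully?
    friction: str = ""       # free-text note on what got in the way
    logged_at: datetime = field(default_factory=datetime.now)

def adoption_rate(observations, department):
    """Share of a department's logged tasks attempted on the new EMR."""
    dept = [o for o in observations if o.department == department]
    if not dept:
        return 0.0
    return sum(o.system is System.NEW_EMR for o in dept) / len(dept)
```

Keeping the friction note free-text matters: the quantitative fields show where adoption stalls, but the notes explain why.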

Second, we designed a billing reconciliation tool that automated the comparison between legacy and new system outputs: charge codes, invoice totals, patient records. This replaced the informal side-spreadsheet process and surfaced discrepancies systematically rather than anecdotally. It also worked as a quality gate. If the new system couldn't match legacy output for a given workflow, that workflow wasn't ready for transition.
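The comparison logic can be sketched as a keyed diff between the two systems' outputs. This is a simplified illustration assuming invoice totals keyed by invoice ID; the real tool also compared charge codes and patient records:

```python
def reconcile(legacy_invoices, new_invoices, tolerance=0.01):
    """Compare invoice totals from both systems and flag discrepancies.

    Each input maps invoice ID -> total amount. Returns a list of
    (invoice_id, legacy_total, new_total, kind) tuples for anything
    that doesn't match: records missing on either side, or totals
    that diverge beyond the tolerance.
    """
    issues = []
    for inv_id, legacy_total in legacy_invoices.items():
        if inv_id not in new_invoices:
            issues.append((inv_id, legacy_total, None, "missing_in_new"))
        elif abs(new_invoices[inv_id] - legacy_total) > tolerance:
            issues.append(
                (inv_id, legacy_total, new_invoices[inv_id], "amount_mismatch")
            )
    for inv_id, new_total in new_invoices.items():
        if inv_id not in legacy_invoices:
            issues.append((inv_id, None, new_total, "missing_in_legacy"))
    return issues
```

The quality-gate property falls out naturally: a workflow is transition-ready when its reconciliation run returns an empty list.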

Third, we created a tracking matrix that mapped every contractual deliverable to its current status, with evidence requirements for completion. This gave the hospital a tool to have informed conversations with the vendor. Not adversarial, but grounded in specifics.
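One way to model a matrix row, assuming a deliverable is only "closed" when it has both a delivered status and attached evidence. The field names and status values here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Deliverable:
    name: str            # contractual deliverable, e.g. "pharmacy module"
    contract_due: str    # due date per the original schedule, e.g. "2025-08"
    status: str          # "delivered", "partial", or "missing"
    evidence: list       # artifacts proving completion, e.g. UAT sign-off

def contractual_gap(deliverables):
    """Deliverables that cannot be considered complete: either not
    delivered, or delivered without completion evidence."""
    return [d.name for d in deliverables
            if d.status != "delivered" or not d.evidence]
```

The evidence requirement is the important design choice: it converts "the vendor says it's done" into "here is the artifact that proves it's done."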

Finally, rather than mandating adoption top-down, we proposed a phased approach: identify the workflows where the new system was already better than legacy, transition those first, and let early wins build momentum. Staff who adopted early became informal trainers, not through a formal program, but because colleagues asked them how they did it.

The tradeoff was speed vs. integrity. We could have pushed for faster adoption by ignoring system gaps, but that would have produced compliance without competence: people clicking through the new system while still relying on legacy for actual work.

Execution

Rollout followed the friction map, not the org chart.

Phase one targeted billing reconciliation, the highest-pain, most-measurable workflow. We deployed the reconciliation tool and began generating daily comparison reports. This immediately surfaced specific discrepancies: charge codes that didn't map correctly, tax calculations that diverged, patient categories that the new system handled differently.

These reports became the vendor conversation. Instead of "the system isn't working," the hospital could now say "invoice #4721 shows a $340 discrepancy in surgical charges due to an unmapped modifier code." The specificity changed the dynamic.
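Turning a raw reconciliation finding into that kind of sentence is mechanical. A hypothetical formatter, with illustrative numbers echoing the example above:

```python
def discrepancy_line(invoice_id, category, legacy_amount, new_amount, cause):
    """Render one reconciliation finding as a specific, actionable line."""
    delta = new_amount - legacy_amount
    direction = "over" if delta > 0 else "under"
    return (f"invoice #{invoice_id}: ${abs(delta):,.2f} "
            f"{direction}charge in {category} ({cause})")
```

Every line names an invoice, an amount, and a suspected cause, which is exactly the specificity that changed the vendor dynamic.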

Phase two expanded observation tracking across departments. We trained department leads on the tool and began collecting structured adoption data. Within weeks, we had enough data to identify which departments were progressing, which were stalled, and why, at the task level rather than the department level.
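Task-level stall detection can be sketched as a simple aggregation over the observation log. This assumes records are (department, task, completed) tuples for attempts on the new system; the thresholds are illustrative:

```python
from collections import defaultdict

def stalled_tasks(records, min_attempts=5, threshold=0.5):
    """Flag (department, task) pairs where new-system completion lags.

    A pair is 'stalled' when it has at least min_attempts observations
    and a completion rate below threshold.
    """
    counts = defaultdict(lambda: [0, 0])  # (dept, task) -> [attempts, successes]
    for dept, task, completed in records:
        counts[(dept, task)][0] += 1
        counts[(dept, task)][1] += completed
    return sorted(
        key for key, (attempts, successes) in counts.items()
        if attempts >= min_attempts and successes / attempts < threshold
    )
```

Aggregating by task rather than by department is what distinguishes "pharmacy is resisting" from "pharmacy's dispensing workflow is blocked."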

What broke: the assumption that all departments would follow the same adoption curve. Pharmacy had fundamentally different workflow constraints than billing. Clinical intake had regulatory documentation requirements that the new system's templates didn't yet accommodate. We adjusted by creating department-specific transition roadmaps rather than a single timeline.

Phase three was the contractual reconciliation. This was sensitive. We presented findings in a neutral, evidence-based format: here's what was contracted, here's what's been delivered, here's the gap. The goal wasn't to create conflict but to reset expectations and establish a realistic completion timeline.

Outcome

By the end of the assessment period, the hospital had something it had never had: visibility.

Adoption rates were tracked by department, role, and task, not estimated. Billing discrepancies were identified and categorized systematically rather than discovered ad hoc. The contractual gap between paid and delivered was documented with specifics sufficient to renegotiate terms.

Staff resistance decreased, not because anyone gave a motivational speech, but because the system gaps that caused the resistance were being addressed with evidence. When billing staff saw that their reconciliation pain points were being tracked and reported upward, the dynamic shifted from "no one cares" to "this is being worked on."

The hospital moved from passive dependency on the vendor to informed oversight. They could evaluate progress against deliverables, prioritize which system gaps to escalate, and make data-driven decisions about transition timing.

Full adoption wasn't achieved during our engagement. That was never the realistic goal. The goal was to build the infrastructure that makes adoption measurable and manageable, and that infrastructure is now operational.

Reflection

The biggest lesson: adoption is not a training problem. It's a systems design problem.

When a new system isn't being used, the instinct is to train harder, communicate more, mandate compliance. But if the system has genuine gaps (missing features, broken workflows, unmet contractual deliverables), training just teaches people to be frustrated more efficiently.

The intervention that mattered most wasn't any single tool or framework. It was making the invisible visible. The workarounds staff had developed, the deliverables that hadn't been met, the discrepancies that were being absorbed as manual labor: all of it was happening in silence. Once it was documented, tracked, and surfaced, the organization could actually respond.

If we ran this engagement again, we'd push earlier for the contractual reconciliation. The technical adoption work and the vendor accountability work are the same work. You can't assess whether staff should be using the system until you've assessed whether the system is ready to be used.

The hardest part of a system transition isn't the technology. It's the six months where everyone pretends the old way and the new way are both working.
