
The Defibrio CLI Was an Organizing Surface, Not Just a Tool

At Andrews-Cooper, the most important part of the Defibrio work was the protocol boundary. The host CLI, the firmware, the cellphone UI, and the product itself were organized around it.

John Sambrook, TOC Jonah Certified

TL;DR

At Andrews-Cooper, I was contracting with a product development company on the Defibrio defibrillator project. The part of the work that mattered most was not just the firmware. It was the protocol boundary between a Python CLI on the host and the embedded firmware in the device emulator. That boundary became the organizing surface for the product: the commands, queries, and error behavior had to be hammered out there before the rest of the system could settle down.


[Figure: A protocol boundary diagram showing a Python CLI, a defibrillator emulator, firmware, and a cellphone UI connected by a structured command layer]

The most interesting part of the Defibrio work was not the firmware by itself.

It was the interface between the pieces.

I was contracting with Andrews-Cooper, the product development company behind the Defibrio defibrillator project, and my part of the work centered on the USB-serial protocol between a Python CLI on the host and the embedded firmware in the device emulator.

The defibrillator was entirely powered and controlled from an Android cellphone.

That protocol was not an afterthought. It was the interface control document between two teams: the defib hardware and firmware developers on one side, and the Android app developers on the other. It was the place where the implementation got organized.

The protocol was the product boundary

If you only looked at the hardware, you would miss what was really happening.

The defibrillator emulator was part of a larger system. There was the embedded firmware. There was the host-side Python CLI. There was the cellphone that powered the system and served as the UI. And there was the command-and-control layer that made all of those pieces behave like one product instead of a pile of parts.

That protocol boundary had to answer a lot of questions:

  • What commands does the device accept?
  • What queries does the product expose?
  • What happens when something fails?
  • How do the host and firmware stay in sync?
  • What is the right behavior when a packet is corrupted?
  • How do we make the interface stable enough to verify?

Those were not just technical questions. They were product-definition questions.

In practice, the protocol spec became the organizing surface for the whole project. The shape of the interface drove what the software could do, how it could fail, and how the team could reason about it.
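To make the "organizing surface" idea concrete, here is a minimal sketch of what one entry in such a spec can look like when it is captured as data rather than prose. The command names, opcodes, and field layouts below are hypothetical illustrations, not Defibrio's actual command set.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Command:
    """One entry in the protocol spec: the contract both teams build to."""
    opcode: int               # wire-level identifier
    name: str                 # stable symbolic name shared by host and firmware
    request_fields: tuple     # (field_name, struct_format) pairs in the request
    response_fields: tuple    # (field_name, struct_format) pairs in the response
    errors: tuple             # error codes this command may return


# A tiny, illustrative slice of a command table.
SPEC = [
    Command(0x01, "GET_FW_VERSION", (), (("major", "B"), ("minor", "B")), ("ERR_BUSY",)),
    Command(0x02, "GET_BATTERY_STATUS", (), (("millivolts", "H"),), ("ERR_BUSY",)),
    Command(0x10, "SET_TEST_MODE", (("enabled", "B"),), (), ("ERR_BAD_ARG",)),
]
```

Once the spec lives in one structure like this, questions about commands, queries, and error behavior all have a single place to be answered, and both teams can build to the same table.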

What the system looked like

The project had a few important pieces:

  • a defibrillator emulator
  • embedded firmware on ARM Cortex microprocessors
  • a Python CLI on the host
  • USB-serial communication between the two
  • a cellphone powering the system and acting as the UI
  • safety-related and regulated-device work around the whole thing

The firmware was bare-metal super-loop code, not an RTOS-based system. I also helped with SPI-based device-driver work and with error-checking and profiling tools in NXP’s MCUXpresso IDE.

But the most important architectural decision was that the host tool and the firmware were not independent islands. They were paired.

The protocol spec forced the implementation to take shape

A lot of embedded work gets described as “build the firmware” or “write the app.” That is too small for this kind of system.

On Defibrio, the protocol spec itself was where the implementation details took shape. The commands, queries, and error behavior were hammered out there so the hardware/firmware team and the Android app team could build to the same contract.

That meant the protocol wasn’t just moving bytes. It was organizing how the capabilities would be implemented.

That kind of boundary matters because it forces discipline. It keeps the project honest. It also gives both teams one place to discuss what the product behavior really is.

That matters even more in a regulated context, where vague interfaces are a liability.

The hard parts were about trust

The protocol had to be reliable enough that the team could trust it.

So it used:

  • length fields
  • per-packet CRCs
  • sequence numbers
  • resync/restart behavior on error

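The framing described above can be sketched in a few lines. The field sizes and the CRC choice here (CRC-32, because it is in the Python standard library) are illustrative assumptions, not the project's actual wire format.

```python
import struct
import zlib


def frame(seq: int, payload: bytes) -> bytes:
    """Wrap a payload as [len:2][seq:1][payload][crc:4], little-endian."""
    header = struct.pack("<HB", len(payload), seq & 0xFF)
    body = header + payload
    return body + struct.pack("<I", zlib.crc32(body))


def unframe(packet: bytes):
    """Validate and unpack a frame; raise ValueError to trigger a resync."""
    if len(packet) < 7:
        raise ValueError("short packet")
    body, crc = packet[:-4], struct.unpack("<I", packet[-4:])[0]
    if zlib.crc32(body) != crc:
        raise ValueError("CRC mismatch")   # receiver drops the frame and resyncs
    length, seq = struct.unpack("<HB", body[:3])
    payload = body[3:]
    if len(payload) != length:
        raise ValueError("length mismatch")
    return seq, payload
```

The point of the sequence number is that the host can detect dropped or duplicated frames; the point of raising on any inconsistency is that the error path is explicit and testable rather than silently tolerated.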
The command set eventually grew to roughly 100 commands.

That is a lot of surface area for drift. If you hand-maintain that kind of interface in two places, it will eventually get out of sync. So I wrote Python code that auto-generated the protocol definitions for both the Python client and the firmware. I also auto-generated the protocol documentation and the verification test plan, so the protocol, docs, and tests stayed aligned instead of drifting apart.
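The single-source-of-truth idea can be sketched like this: one command table, from which both a C header (for the firmware) and a Python constants module (for the host client) are emitted. The command names are hypothetical, and the real generator also produced documentation and a verification test plan from the same source.

```python
# One table drives every generated artifact.
COMMANDS = {
    "GET_FW_VERSION": 0x01,
    "GET_BATTERY_STATUS": 0x02,
    "SET_TEST_MODE": 0x10,
}


def emit_c_header() -> str:
    """Emit an enum for the firmware side."""
    lines = ["/* AUTO-GENERATED - do not edit by hand */", "enum cmd_id {"]
    lines += [f"    CMD_{name} = 0x{op:02X}," for name, op in COMMANDS.items()]
    lines.append("};")
    return "\n".join(lines)


def emit_python_module() -> str:
    """Emit matching constants for the host-side client."""
    lines = ["# AUTO-GENERATED - do not edit by hand"]
    lines += [f"CMD_{name} = 0x{op:02X}" for name, op in COMMANDS.items()]
    return "\n".join(lines)
```

Because both outputs come from one table, an opcode can never be right on the host and wrong in the firmware; drift is structurally impossible rather than merely discouraged.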

That is the real leverage story.

Why the host tool mattered

The host CLI was not a nice extra. It was part of the system’s organizing layer.

The operator used it to drive the device emulator. The project used it to define what the system could do. The team used it to decide what belonged in the interface. And the verification work depended on it because the command set and the error behavior had to be testable.

In other words, the tool was part of the product architecture. Not just part of the developer convenience stack.

That distinction matters. A lot of embedded projects treat host tooling as a thin wrapper around firmware. Here, the wrapper was doing architectural work. It was organizing the product around a stable and inspectable boundary.

The regulated context mattered

This was a defibrillator project, so the work was not casual.

Early in the project, I reviewed system architecture with Jim, George, and Lindsey, and worked with Curtis Wichern on safety-related concerns. I participated in system hazard analysis, helped assemble the system block diagram, and contributed to the regulatory plan in the project development plan. I also worked through PC-lint warnings and aligned the code with MISRA C-style guidance.

That is the kind of work where you do not want hidden ambiguity. You want the interface to be explicit, stable, testable, and documented. The protocol helped create that clarity.

The thing I want people to notice

The point of this story is not that I wrote a Python script.

The point is that I helped create the organizing surface for a regulated embedded product. The protocol boundary defined the product’s behavior, kept the firmware and host tooling aligned, and gave the team a place to reason about commands, queries, and failure modes.

That is the sort of work I still like best:

  • make the boundary clear
  • keep the pieces in sync
  • reduce drift
  • make the product easier to trust
  • make the system easier to verify

If you are building something with real constraints, that usually matters more than cleverness.

What this means for the rest of my work

Defibrio is not an isolated story. It sits alongside the SonoSite device-model tooling, the Verasonics build and licensing work, and the broader Common Sense Systems work around automation and operational improvement.

The common thread is this: I like the places where systems become legible. Sometimes that means a protocol. Sometimes it means a build system. Sometimes it means an offer. Sometimes it means a website.

The work is the same shape. Make the boundary clear. Then make the rest of the system easier to use.

If that is the kind of help you need, I’m available.