Shadows in EOCs

Seeing Shadows in EOCs — Or, The Time We Almost Got Raided by the FBI

Organizational behaviorists, sociologists, and philosophers as foundational as Plato could have predicted this: lock people in a room long enough, even without the pressure of a disaster bearing down on them, and they will begin to believe the shadows on the wall are real.

We have poured enormous innovation into the storytelling inside EOCs. We have invested far less in the quality of the data that storytelling is built on. The result is a kind of high-stakes telephone game. Someone catches the flap of a butterfly’s wing, whispers it to the next layer, and by the time it passes through enough reports and briefings, it arrives on the GIS dashboard as a hurricane — rendered beautifully, with confident colors, clean borders, and a legend in the corner. A polished portrait that bears little resemblance to the event that inspired it.

I witnessed this on every major disaster I worked. Sometimes I even caused it myself. (I just wanted to talk about a pretty butterfly.) Nothing illustrated this for me quite like the time we almost got raided by the FBI in Puerto Rico.


After Hurricanes Irma and Maria knocked out power across most of the island, the need for food was unlike anything modern US emergency management had encountered. After several iterations, the large interagency response settled on FEMA-contracted grocery boxes — roughly 30 pounds each, about a few days’ worth of groceries. The plan made sense on paper, like any point of distribution site: pick up boxes from a central warehouse, drive them to distribution sites around the island, and people come collect them.

One afternoon I was at the San Juan Convention Center doing my usual rounds with the FEMA mass care staff when they pulled me aside with the energy of people who feel like they’ve finally solved something. A facility had been flagged for stockpiling FEMA grocery boxes. The FBI was involved. A raid was imminent.

I pulled up our Red Cross Incident Action Plan to check if the address was ours. It wasn’t. I congratulated them and went on my way.

Ten minutes later I was back at our disaster relief operations headquarters, hitting the punchline of that same story, when I looked up and saw a Post-it note on the wall with that exact address on it.

We called FEMA immediately, our logistics team warned the warehouse, and they scrambled whatever Red Cross signage they could find. Whether the raid wasn’t as imminent as advertised or we got the information out in time, the FBI never showed up.


What the EOC had seen was boxes accumulating at a location that didn’t match the official plan, against a backdrop of news coverage about hoarding. The dashboard rendered that shadow with full confidence. But the assumptions built into that model — about how far a driver could get and back before dark on roads without street lights, about whether someone without a car or fuel could reach a pickup site — didn’t reflect the reality of the situation. The suspicious warehouse had recently been added as a staging location to break the logistics into manageable legs, but that information hadn’t made it to me or the IAP yet.

After the adrenaline faded, we started to worry about the visual of warehouses with boxes, not just ours, but others we were partnering with. I asked my team to reach out to the municipalities and houses of worship giving out boxes and ask what they were seeing. What we heard was that they were wrestling with the same problem we were. Getting a 30-pound box to someone without a car, in a neighborhood without power — it was hard everywhere. People still needed food. The boxes we saw accumulating weren’t necessarily a story about what anyone was doing wrong. They were a signal that the model’s assumptions didn’t match the reality on the ground. But nothing in our collective system was built to surface that signal before the FBI got called.


I’m not telling this story to indict how we build situational awareness in EOCs. The tools we have are genuinely impressive, and the people working in those rooms are doing hard work in good faith. But the sophistication of what we can do with information once it’s inside the EOC has raced far ahead of the mechanisms for making sure the right information gets in at all.

The more we invest in the technology of rendering data — the dashboards, the GIS layers, the mapping tools that make everything look authoritative — the more we need to protect the informal, open channels that let ground-level reality compete with the model. The communities we serve aren’t just the subjects of our situational awareness. They are participants in the same problem we’re trying to solve, and they often have the clearest view of where the model has stopped matching what’s true.

The shadow on the wall looks most convincing right before someone walks outside.


If you’re finding this space useful, I’ve been pulling the larger argument together in a short ebook. It traces these coordination challenges from the Cold War era through FIRESCOPE and NIMS to today, draws on the disaster research, and presents a model for how we might build something designed for the realities of disaster.

Look out for the upcoming Strategic Disaster Coordination: The Missing Architecture of Disaster Management. If you want to hear directly when it is available, send an email to SDC@rivermac.com.
