In 2024, I was part of a team that published a retrospective epidemiological review of Mpox surveillance data from Imo State, Nigeria, covering 2017 to 2023. We analysed 231 suspected cases across 27 Local Government Areas. Of those, 49 were confirmed positive, a 21.2% confirmation rate, with a case fatality rate of 8%.
The numbers tell part of the story. What they do not capture is what the investigation revealed about the state of disease surveillance in Nigeria, and what it means for how we design better systems.
The most striking finding was geographic clustering. Eight districts accounted for 71% of suspected cases, despite representing only 32% of the population. This is not unusual in outbreak epidemiology. Clustering patterns point to underlying risk factors: proximity to wildlife reservoirs, specific occupational exposures, community behaviours, or healthcare access gaps. But the investigation also revealed something about our surveillance system: our ability to detect and report cases was uneven across the state.
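The clustering figure above can be expressed as a simple concentration ratio: the cluster's share of cases divided by its share of population. This is a minimal sketch; only the 71%-of-cases and 32%-of-population aggregates come from the study, and the absolute numbers fed in are illustrative.

```python
def case_concentration(cases_in_cluster, total_cases,
                       pop_in_cluster, total_pop):
    """Return (case share, population share, relative concentration)."""
    case_share = cases_in_cluster / total_cases
    pop_share = pop_in_cluster / total_pop
    return case_share, pop_share, case_share / pop_share

# Eight high-burden districts: ~71% of 231 suspected cases,
# 32% of the population (population units here are arbitrary).
case_share, pop_share, ratio = case_concentration(164, 231, 32, 100)
print(f"{case_share:.0%} of cases from {pop_share:.0%} of the population "
      f"-> concentration {ratio:.1f}x")  # -> concentration 2.2x
```

A ratio well above 1 is what flags a cluster worth investigating, whether the driver is true transmission or uneven detection.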
Districts with better health infrastructure reported more cases. This sounds obvious, but it has a critical implication. High case counts in well-resourced areas may reflect better detection, not higher transmission. Low case counts in under-resourced areas may reflect gaps in surveillance capacity, not absence of disease. Any surveillance system that treats reported cases as true prevalence, without adjusting for detection capacity, will generate misleading insights.
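The adjustment this paragraph argues for can be sketched in a few lines: divide reported counts by an assumed detection probability per district. The district names and probabilities below are illustrative placeholders, not measured values from the study.

```python
# Reported counts and assumed detection capacity (hypothetical figures).
reported = {"district_A": 40, "district_B": 10}
detection_prob = {"district_A": 0.8, "district_B": 0.2}

# Detection-adjusted estimate of the true case count per district.
adjusted = {d: reported[d] / detection_prob[d] for d in reported}
print(adjusted)  # {'district_A': 50.0, 'district_B': 50.0}
```

Note how the raw 4:1 gap between the two districts vanishes after adjustment: under these assumed detection rates, the low reported count in district_B is consistent with the same true burden as district_A.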
This is where digital health tools can make a transformative difference. A surveillance system that digitises case reporting, standardises data collection across all 27 LGAs, and tracks both case counts and reporting rates simultaneously gives you a fundamentally different picture. You can see not just where disease is being found, but where it might be hiding.
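One concrete form of "tracking reporting rates alongside case counts" is a completeness check: compare how many weekly reports each LGA actually submitted against how many were expected, and flag the silent ones. The LGA names and figures below are illustrative, and the 80% threshold is an assumption, not a standard from the study.

```python
def flag_silent_lgas(reports_received, weeks_expected=52, threshold=0.8):
    """Return LGAs whose reporting completeness falls below the threshold."""
    return {lga: round(n / weeks_expected, 2)
            for lga, n in reports_received.items()
            if n / weeks_expected < threshold}

# Weekly reports received per LGA over a year (hypothetical data).
received = {"Owerri Municipal": 50, "Oguta": 20, "Orlu": 45}
print(flag_silent_lgas(received))  # {'Oguta': 0.38}
```

An LGA reporting zero cases with 38% completeness is telling you something very different from an LGA reporting zero cases with 96% completeness, and a system that surfaces both numbers makes that distinction visible.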
The other lesson from Imo State was the value of historical data. Our dataset covered six years. That temporal depth allowed us to identify trends, seasonal patterns, and shifts in age distribution that a single-year snapshot would have missed. Sustaining surveillance systems across political cycles and budget changes is one of the hardest problems in public health. It is also one of the most important.
For any health system investing in surveillance infrastructure now, the priorities I would recommend are: digital case notification that works on basic smartphones, standardised case definitions enforced at the point of data entry, and a feedback loop that sends district-level data back to the health workers who collected it. That last point is underrated. Health workers who see their data used and returned to them become better surveillance officers. Those who report into a void eventually stop reporting carefully.
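"Standardised case definitions enforced at the point of data entry" amounts to validating a notification before it is accepted. A minimal sketch, assuming a hypothetical set of required fields (these are illustrative, not the fields of any official notification form):

```python
# Fields a suspected-case notification must carry before it is accepted.
# These field names are assumptions for illustration only.
REQUIRED_FIELDS = {"patient_age", "onset_date", "rash_present", "lga"}

def validate_notification(record: dict) -> list:
    """Return the missing required fields; an empty list means accept."""
    return sorted(REQUIRED_FIELDS - record.keys())

record = {"patient_age": 34, "onset_date": "2023-05-01", "lga": "Oguta"}
print(validate_notification(record))  # ['rash_present'] -> reject until complete
```

Enforcing this at entry, rather than cleaning data months later, is what keeps the 27 LGAs comparable: every accepted record meets the same definition by construction.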
The Mpox investigation reinforced my conviction that surveillance is not just a technical function. It is a system of trust between communities, health workers, and institutions. Digital tools can strengthen that system, but only if they are designed with the humans in the loop, not around them.