The AI hype is real. But for manufacturers, the path forward runs through unheralded work.
The manufacturing industry is asking a lot of questions about AI right now. Which platform. Which vendor. Which use case to pilot first. When to pull the trigger. These are reasonable questions, eventually. But none of them is the right first question. That question is whether the operation is legible enough for any intelligent system to learn from. Most manufacturers are not there yet, and the technology will not get them there for them.
A Meeting That Happens Every Week
Picture the scene. A cross-functional team sits down to solve a warehouse capacity problem. They are bursting at the seams and need a way to project utilization. The IT member explains, with genuine enthusiasm, that the current system can do exactly this if they have inbound package dimensions on file. Someone across the table confirms they have that data. Then the qualifiers. They have it for items from certain suppliers. For those suppliers, not all items are in the system. For those that are, some dimensions have changed since the data was entered. And for those with current dimensions, not all have a unit of measure attached.
They have the data. Except when they don’t. Or it’s old. Or the units don’t match.
The chokepoint wasn’t the technology. It wasn’t the report logic or the system configuration. It was the data the whole enhancement was predicated on: the quiet accumulation of years of exceptions, workarounds, and deferred cleanup that made it unreliable when it mattered. That meeting happens in manufacturing plants every week. It happened before ERP. It was replayed during every execution-system and management-system rollout since. It is now happening in AI pilots, with high visibility and expectations that leave little room for foundational gaps.
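The cascade of qualifiers from that meeting is easy to reproduce in code. A minimal sketch, using a hypothetical item-master extract with invented field names and records; the point is how quickly "we have the data" shrinks once each condition is actually applied:

```python
from datetime import date

# Hypothetical item-master extract; field names and values are illustrative,
# not drawn from any real system.
items = [
    {"item": "A", "supplier_known": True,  "dims": True,  "dims_updated": date(2024, 5, 1),  "uom": True},
    {"item": "B", "supplier_known": True,  "dims": True,  "dims_updated": date(2019, 3, 1),  "uom": False},
    {"item": "C", "supplier_known": True,  "dims": True,  "dims_updated": date(2024, 1, 15), "uom": False},
    {"item": "D", "supplier_known": True,  "dims": False, "dims_updated": None,              "uom": False},
    {"item": "E", "supplier_known": False, "dims": False, "dims_updated": None,              "uom": False},
]

STALE_CUTOFF = date(2022, 1, 1)  # dimensions older than this are treated as stale

# Each check mirrors one qualifier raised in the meeting.
checks = [
    ("from known suppliers",    lambda i: i["supplier_known"]),
    ("with dimensions on file", lambda i: i["dims"]),
    ("dimensions current",      lambda i: i["dims_updated"] is not None
                                          and i["dims_updated"] >= STALE_CUTOFF),
    ("with a unit of measure",  lambda i: i["uom"]),
]

# Apply the checks in sequence and record how the usable pool shrinks.
pool = list(items)
funnel = [("total items", len(pool))]
for label, passes in checks:
    pool = [i for i in pool if passes(i)]
    funnel.append((label, len(pool)))

for label, count in funnel:
    print(f"{label}: {count}")
```

In this toy extract, five items shrink to one usable record by the last qualifier. Run against a real item master, the funnel usually narrows faster than anyone in the room expects.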
What AI Actually Needs
Artificial intelligence (AI) learns from the representation of an operation that its data provides. Not the operation as it was intended to work. Not the operation as the system was originally configured to reflect. The operation as it actually exists in the data, including every stale record, every missing attribute, every ruleset left undefined, and every process step that lives in someone’s head rather than somewhere a system can find it.
That is the part of the AI conversation that tends to get skipped in the vendor presentations. AI does not define operational reality. It treats what is there as authoritative and learns accordingly. The tools that try to correct for gaps still depend on having enough clean data to know what a gap looks like. In the places where AI is working in manufacturing today, the common thread is that the underlying data is machine-generated, mathematically grounded, and not dependent on human entry: vision-based quality inspection, predictive maintenance, parameter optimization on process equipment. The system can observe directly, and humans can audit. Where AI runs into trouble is precisely where human judgment has been compensating for years: reason codes entered inconsistently, routings that haven’t reflected actual practice in a decade, planning parameters no one updated after the last product change. The more an operation has relied on people to bridge the gap between the system and reality, the more work there is to do before AI has anything solid to learn from.
What It Looks Like When the Work Gets Done
Early in my career I worked on an ERP implementation at a lipstick manufacturing facility. As part of the project, I examined material usage and scrap data on the bills of material, the kind of work I now know rarely makes the implementation sales pitch but ends up determining whether the system reflects how the plant runs, and runs it as intended. What we found was that planned production run sizes did not align with the yield of the bulk processing step. In a make-to-stock business where batch sizes are fixed by equipment and chemistry ratios are critical, we were routinely under-running and discarding expensive bulk. It was self-fulfilling: components with three-month lead times were procured to the stated run sizes, so there was never more material available to capture what the batch could yield. Once we surfaced the discrepancy and corrected the data, we right-sized the runs, yielded more finished goods from the same batch, reduced waste, and improved the planning metrics the business cared about. No new technology, just honest data.
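The run-size mismatch reduces to arithmetic. A sketch with invented numbers, since the actual batch sizes and yields are not part of the story: if the planning data understates what a bulk batch yields, the remainder is discarded on every run, and because components are procured to the stated run size, the waste repeats indefinitely.

```python
# Illustrative numbers only; the real figures from the facility are not in the record.
bulk_batch_yield_kg = 100.0   # what one bulk processing batch actually produces
bulk_per_unit_g = 5.0         # bulk consumed per finished lipstick
stated_run_size = 17_000      # units, per the incorrect BOM/planning data

# Bulk consumed by the stated run, and what gets thrown away each batch.
bulk_consumed_kg = stated_run_size * bulk_per_unit_g / 1000   # 85.0 kg
bulk_discarded_kg = bulk_batch_yield_kg - bulk_consumed_kg    # 15.0 kg discarded per batch

# Right-size the run to consume the full batch yield.
right_sized_run = int(bulk_batch_yield_kg * 1000 / bulk_per_unit_g)  # 20,000 units
extra_units_per_batch = right_sized_run - stated_run_size            # 3,000 more finished goods

print(f"discarded per batch: {bulk_discarded_kg} kg")
print(f"right-sized run: {right_sized_run} units (+{extra_units_per_batch} per batch)")
```

With these made-up values, 15% of every expensive bulk batch goes in the bin until someone corrects a single planning number.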
A separate problem at the same facility: right-first-time rates on lipstick batches were poor, and the root cause traced to natural variation in organic red pigment. The concentration of each incoming lot varied within specification, but the system didn’t account for it, so every batch was processed as if all pigment lots were identical. The fix turned out to be an underutilized feature already sitting in the ERP, a mechanism for incoming quality to record each lot’s concentration so formula quantities could adjust accordingly. No one had implemented it. Once they did, right-first-time metrics reached world-class levels and the solution went global. The capability was there the entire time. The awareness and discipline to use it wasn’t.
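The adjustment mechanism described above amounts to scaling a formula quantity by the ratio of nominal to actual pigment potency, so every batch receives the same amount of active colorant regardless of lot variation. A sketch with hypothetical values; the specific ERP feature and its parameters are not named in the text.

```python
def adjusted_quantity(nominal_qty_kg: float, nominal_potency: float, lot_potency: float) -> float:
    """Scale a formula quantity so each batch gets the same amount of active pigment.

    A lot stronger than nominal needs proportionally less material;
    a weaker lot needs proportionally more.
    """
    return nominal_qty_kg * nominal_potency / lot_potency

# Hypothetical formula: 2.0 kg of red pigment at a nominal 100% potency.
# Incoming lots vary within spec; the charge adjusts so delivered pigment is constant.
for lot_potency in (0.95, 1.00, 1.05):
    qty = adjusted_quantity(2.0, 1.00, lot_potency)
    print(f"lot potency {lot_potency:.2f} -> charge {qty:.3f} kg")
```

The calculation is trivial; what the story highlights is that the system could already do it, and nobody had switched it on.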
Both of those stories predate the AI conversation by many years. But they describe exactly the condition AI requires: an operation defined clearly enough, and maintained accurately enough, that an intelligent system has something real to work with.
The Unheralded Work
The AI hype will continue. Feature sets will expand. Vendor promises will get larger. And underneath all of it, the unheralded work will remain the same: defining how work actually gets done, capturing what actually transpires, and maintaining the integrity of the data those systems depend on. Not because some transformation readiness framework demands it. Because no system, AI or otherwise, can learn from what has not been defined, optimize what has not been captured, or execute well on what has not been maintained. And no data-driven decision, human or machine, can be trusted with dirty data.
The industry is debating which AI tool to choose. The more useful question is whether the operation is ready to be learned from. For manufacturers, that answer isn’t chosen. It’s built, one unheralded dataset at a time.
