Do your analysts in engineering, science, and business often solve problems that have already been addressed? Or do they fail to utilize and repurpose valuable insights gained by their colleagues? Are you vulnerable to delays and disruptions when experienced staff leave or are unavailable?
If you answered yes to any of these questions, your organization is likely experiencing knowledge loss. This vulnerability arises from tacitly storing knowledge in people’s minds rather than explicitly recording and maintaining it for easy search, transfer, and re-application.
In 2020 and 2021, we surveyed over 100 engineering and science-based companies to understand how frequently staff fail to leverage prior knowledge. Astonishingly, around half of them reported that 40% to 60% of the time, valuable insights and solutions were not being reused effectively. Why? Because staff could not find the prior work to begin with. As a result, highly paid employees expended valuable time and effort replicating knowledge that was already known, needlessly delaying objectives. This is waste.
Even more concerning, not a single company had hard data to support these estimates; they were all based on conjecture.
That means that knowledge is not being managed like an asset.
In contrast, when asked about percent utilization of equipment, downtime, yield and return-on-investment, most companies could quickly provide accurate figures.
Generating knowledge is expensive. Why lose it? Almost any initiative can benefit from prior knowledge, if it can be found.
Knowledge and information are often used interchangeably, but they are not the same. Information is data or facts, while knowledge allows for prediction. Simply knowing yesterday’s production and yield numbers does not predict output for the next day; that is the difference between information and knowledge. Knowledge predicts.
To put it another way, knowledge exists only if data are passed through analysis to generate insights that determine how best to operate and accurately predict performance, within reasonable limits.
Traditional data storage systems like data lakes and data warehouses are excellent for housing information, but they fall short in managing knowledge effectively. Electronic Lab Notebooks (ELNs) can be rigid and disconnected from actual work processes, making them inadequate for capturing and transferring full sets of knowledge.
Managing knowledge requires maintaining the interconnections between data, the analysis of that data, and the insights generated. We call this the bi-directional knowledge-generating pipeline. At the end of the pipeline, insights take the form of models, recommendations, standard operating procedures (SOPs), and the like. All of them predict.
The knowledge pipeline provides traceability, allowing analysts to identify and review the analytical methods and data used to generate insights. If a model comes into question, it is easy to locate and reconsider not only the analysis that produced the model but also the exact data points that were fed into that analysis. When, as is often the case, the interconnections have not been maintained, analysts must begin anew, capturing new data and analyzing it to regenerate the insights.
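One way to picture this traceability is as linked records, where each insight keeps a reference to the analysis that produced it, and each analysis keeps references to the exact datasets it consumed. The sketch below is purely illustrative; all of the names and structures are assumptions for this example, not a description of any particular product's implementation:

```python
from dataclasses import dataclass

# Hypothetical record types for a data -> analysis -> insight pipeline.
@dataclass
class Dataset:
    id: str
    points: list          # the raw observations

@dataclass
class Analysis:
    id: str
    method: str           # e.g. "linear regression"
    inputs: list          # the Dataset records this analysis consumed

@dataclass
class Insight:
    id: str
    kind: str             # "model", "SOP", "recommendation", ...
    produced_by: Analysis # link back up the pipeline

def trace(insight: Insight):
    """Walk backwards from an insight to the method and data that fed it."""
    analysis = insight.produced_by
    return analysis.method, [d.id for d in analysis.inputs]

# Example: a yield model traced back to its source data.
raw = Dataset(id="line3-2021w14", points=[0.91, 0.88, 0.93])
fit = Analysis(id="an-042", method="linear regression", inputs=[raw])
model = Insight(id="yield-model-v2", kind="model", produced_by=fit)

print(trace(model))  # → ('linear regression', ['line3-2021w14'])
```

Because the links run in both directions conceptually, the same structure answers "which data produced this model?" and, with an index over analyses, "which models depend on this dataset?"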
The final crucial part of managing knowledge is to capture the narratives and discussions among analysts and integrate them closely with the data and work conducted in the knowledge pipeline.
Analysts can record their thinking with respect to the modeling methodology they employed. They can note observations they excluded from their analysis and why. When other staff, perhaps years later, need to look back, they have important context for the work that was done.
Insights presented as models, SOPs, and the like require maintenance, much as equipment does. Insights carry lurking limitations, and sooner or later those limitations come to light. Essentially, knowledge, like equipment, possesses a finite lifespan and requires maintenance to extend it.
Echoing Dr. George Box’s words, “all models are wrong, but some are useful,” models that were once beneficial will eventually fail for various reasons.
When a model fails to make accurate predictions, analysts should record their concerns and link them to the model. This allows others to engage with these concerns, add their own thoughts, and so forth. Eventually, the model might undergo revisions, and those who raised concerns get a chance to assess the new version.
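This review loop can be sketched as an append-only concern log attached to a model record. Again, the names here are hypothetical and for illustration only, assuming nothing about any specific system:

```python
from datetime import date

# A model record with an attached concern log (illustrative structure).
model = {"name": "yield-model", "version": 2, "concerns": []}

def raise_concern(model, author, note):
    """Record a concern and link it to the model it questions."""
    model["concerns"].append({
        "author": author,
        "note": note,
        "logged": date.today().isoformat(),
        "resolved_in_version": None,   # open until a revision addresses it
    })

def revise(model):
    """Publish a new version; open concerns are marked as addressed by it."""
    model["version"] += 1
    for c in model["concerns"]:
        if c["resolved_in_version"] is None:
            c["resolved_in_version"] = model["version"]

raise_concern(model, "j.doe", "Under-predicts yield at high line speed")
revise(model)
print(model["version"])  # → 3
```

Because the concerns live with the model rather than in someone's inbox, the discussion survives staff turnover, and anyone reviewing version 3 can see exactly which observations prompted the revision.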
Such collaboration happens asynchronously. Notably, participants in this discussion might include analysts who have left the company. Without a structured approach to capturing these insights, and a framework to attach them to, the valuable experiences and concerns, generated at a high cost to the company, would dissolve from explicit to tacit states and, eventually, be lost.
Senior management must take immediate action to establish programs that safeguard their investment in knowledge. The annual expenses allocated to data collection, storage, analyst compensation, and associated costs constitute a significant portion of revenue. Don’t allow the results of this expenditure to become hard to locate or vanish when individuals move on. Knowledge is an investment that should accumulate and be readily found and repurposed at every opportunity.
By Wayne J. Levin, President and CEO at Predictum Inc
Wayne J. Levin is the President and CEO of Predictum Inc. He has been leading the Predictum team for more than 30 years. He is passionate about improving analytical productivity.
He started out on this mission by delivering integrated analytical applications for various industries to improve productivity and operational performance.
In the late 1990s, Wayne recognized that engineers, scientists, and business analysts were too often needlessly replicating prior work, regenerating insights, and re-solving problems. This happens largely because prior work was unstructured, lacked standards, and was ultimately scattered and, as a result, could not be found. Since 2007 this has been a major focus for Predictum, and a solution is now available in the form of a server-based knowledge framework known as CoBaseKRM™. As an indexed, structured repository, CoBaseKRM serves as an enduring treasury for collective intelligence.
Wayne also leads statistical training courses and workshops and consults across industries and speaks at conferences and events. Wayne holds an M.A.Sc. in Engineering with a focus on research methods from the University of Waterloo and a B.A.Sc. in Industrial Engineering from the University of Toronto.