As the 2020 COVID-19 pandemic unfolded, the global community was flooded with ‘shaky’ data and not enough evidence. The knowledge needed to support decisions was limited, and consequently our responses seemed to change constantly. Could this challenge our idea of evidence-based medicine?
Evidence and knowledge take time to generate. Data, in contrast, can be generated quickly, but data is just data until it is processed to become evidence. The distinction between data and evidence is often debated, although it is generally accepted that they are not the same. Understanding why is essential in evidence-based practice. Here I draw on the literature to explain what I have learnt so far and highlight some published work to encourage you to explore further.
But before we look more closely at this difference, why does it even matter? The volume of data and information is increasing at never-before-seen rates. Added to that, funding for the collection of data, and in particular ‘big data’, outstrips that for translating evidence. The notion of big data sets generated through the increasing power of computer technology appeals because it is seen as ‘cutting edge’ and of the future. Yet, as Car et al (2019) point out, clinical data is inherently messy and establishing causal relationships is challenging: ‘Data, while necessary, are insufficient to inform medical practice.’1 Can we sort out how data and information relate to evidence so that they inform best practice, and we can benefit rather than be overwhelmed?
The ‘data, information, knowledge, and wisdom’ or DIKW hierarchy, first proposed by Ackoff in 1989, has often been referred to in medicine to provide a conceptual structure around the question of data and knowledge. More recently a revised framework, ‘data, information, evidence, and knowledge’ (DIEK), has gained traction. While others have discussed how the two differ, the frameworks have much in common, including that data is simply data: numbers or observations. That is, until it is transformed into something useful. Numbers alone tell us nothing of the context, including how the data was collected, why it was collected, or how it is changing. An audit report stating that ‘five people experienced severe pain’ is meaningless unless we know the context. In the DIEK framework, understanding the context of data allows us to transform or process it into information.
Information is data in context. With this we understand the meaning and purpose of the data and can assess whether it is right or wrong. ‘Five out of 100 patients undergoing a specific surgical procedure with local anaesthetic experienced severe pain’ allows us to convert the numbers into meaningful information.
According to the DIEK framework, information must then be transformed to reach the level of evidence. It does this by association with an assertion: an argument, a hypothesis, or an opinion. Evidence is information that relates to the truth or otherwise of that assertion, a point reached by comparing information to reference values or standards so that it can be analysed. For evidence to be useful it must be robust, repeatable, and reproducible. We might ask what the incidence of severe pain associated with this procedure was in previous audits or in other clinics. How consistent are the findings, and if they are not, what factors might be influencing this outcome?
When evidence has achieved a state of being predictive, testable, and a consistently successful belief, then according to the DIEK framework it has reached the level of knowledge. In our example, knowledge that in comparable clinics five per cent of patients undergoing the procedure will experience severe pain alerts us to monitor patients, even though for most it is unlikely to be an issue. Knowledge justifies action; in best practice it justifies or denies the value of what we do in providing care.
Finally, a word on the framework structures. While both DIKW and DIEK propose a pyramid framework with wisdom or knowledge at the pinnacle, this linearity has been challenged. Questions also remain about how to incorporate the influence of knowledge on data collection, or of user expertise and experience on data transformation.
There is more to these conceptual frameworks than highlighted here, but it is worth remembering that in most of them the distinction between data and information or evidence is consistently upheld – context matters.
- Car J, Sheikh A, Wicks P, et al. Beyond the hype of big data and artificial intelligence: building foundations for knowledge and wisdom. BMC Med. 2019;17:143. doi: 10.1186/s12916-019-1382-x
- Matney S, Brewster PJ, Sward KA, Cloyes KG, Staggers N. Philosophical approaches to the nursing informatics data-information-knowledge-wisdom framework. ANS Adv Nurs Sci. 2011;34(1):6-18. doi: 10.1097/ANS.0b013e3182071813
- Dammann O. Data, information, evidence, and knowledge: a proposal for health informatics and data science. Online J Public Health Inform. 2019;10(3):e224. doi: 10.5210/ojphi.v10i3.9631
Suggested reading in addition to the references:
What’s the difference between data and evidence? Evidence-based practice, an Oxford Review blog by David Wilkinson
Dr Katrina Erny-Albrecht, Senior Research Fellow, CareSearch, College of Nursing and Health Sciences, Flinders University