I’d like to begin by posing you four questions. Can you think of a time when you’ve been frustrated by how data is used in your organisation? Have you ever been asked to report on something without seeing the results? Have you been shown dashboards that contain irrelevant or inaccurate information? Have you provided valuable data insights, only for them to be ignored? If you’ve answered yes to any of these questions, you’re not alone.
At Data Understood we’ve worked with customers in the public, private and charitable sectors, helping organisations to overcome barriers preventing them from experiencing data’s full potential. While every customer is unique, the pitfalls they experience tend to be the same. People need support to leverage meaningful data. Likewise, data needs attention to realise its benefits. Without resources to empower people and data in equal measure, meaningful data insights can be ignored, and organisations can be frustrated by the time and effort required to see relevant data.
I got my degree in applied statistics and data mining three years before Harvard Business Review published “Data Scientist: The Sexiest Job of the 21st Century”. Whilst the increased interest was welcome… and somewhat bizarre, it created new challenges. The industry flipped from trying to build engagement to managing expectations. Just last year Gartner published research revealing that more than half of marketing leaders were disappointed in their analytics results. Navigating these changing waters we’ve seen four common data pitfalls – poor communication, inadequate skills, inappropriate technology, and misaligned structures – and, more importantly, how to avoid them.
The most common misconception I hear is that data makes humans redundant. It is no wonder that this assumption results in people being scared of data. If communication is poor in an organisation, assumptions can kill engagement with the data initiative and an individual’s job satisfaction.
Communication helps people understand what data means for them. Actively listening to individuals impacted by data changes (i.e., listening without judgement, clarifying through questions, and replaying learnings to check understanding) helps to understand barriers to adoption and opportunities where data can help. Maintaining two-way communication builds trust and allows for issues to be raised and concerns unblocked.
Use storytelling to make data initiatives relatable and impactful. The amazing Nancy Duarte describes stories as fundamental frameworks consisting of three parts – scene-setting, introducing a conflict which needs to be overcome, and resolving the conflict through a transformation. Explaining data initiatives using this framework builds empathy, ignites action, and helps gain approval.
Clarifying how data will be made easy to see and act upon, helps employees and employers, as well as customers and partners, see the advantages in building a healthy data culture. Creating a matrix showing the relationships between the data generators and data recipients can assist this process. The personal reward for filling in a form, enabling cookie tracking, or providing an email address should outweigh the effort and time required to provide this data. Be open and transparent about how data is used and communicate the benefits.
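One way to start the matrix described above is as a simple table of generators against recipients. The sketch below is purely illustrative – the row and column names are hypothetical examples, not taken from any Data Understood engagement – and assumes the pandas library:

```python
# Illustrative sketch only: a data generator / data recipient matrix.
# Row and column names are hypothetical examples.
import pandas as pd

# Rows: who generates the data. Columns: who receives and uses it.
matrix = pd.DataFrame(
    [[True, True, False],
     [True, False, True],
     [False, True, True]],
    index=["Customer (sign-up form)",
           "Website visitor (cookie tracking)",
           "Partner (shared reports)"],
    columns=["Marketing team", "Product team", "Partners"],
)

print(matrix)
```

Reviewing each True cell against the question “does the personal reward outweigh the effort of providing this data?” makes gaps in the value exchange easy to spot.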
Research carried out in 2019 by Ipsos Mori, in partnership with National Numeracy and the Policy Institute at King’s College London, showed about half of the UK adult population have numeracy skills that are no better than those of a primary school child. Rather worryingly, the same research showed one in four would be put off from applying for a job if it listed using numbers and data as a requirement. While this research is shocking, consider how many dashboards you’ve seen but did not know how to act upon. Redundant dashboards are not only a waste of time for the people who created them, but they can have detrimental effects on individuals who believe data is not for them because they cannot understand what is being presented.
Providing data literacy training ensures that people can read graphs, work with Key Performance Indicators (KPIs) and metrics, analyse the performance of an experiment and tell stories with data. Good training should focus on making data interpretable for the person who will use it. Combining training with written resources and supportive colleagues helps those with anxiety about data learn in a safe environment. Encourage people to ask questions and provide time for learning.
Beyond establishing basic data literacy across an organisation, data often requires expert support. A common issue we see is an individual who is expected to be able to do everything – the data unicorn. Whilst such individuals exist, they tend to be rare. Reviewing what an organisation needs in line with its business strategy and its data maturity can simplify the recruitment process and set data resources up for success.
Expert data resources fall into three categories – strategists, translators, and practitioners. Strategists are the leaders and managers who drive the data initiatives in parallel with the business and data strategy. Practitioners are data scientists, engineers, analysts, etc. who build models and maintain the technology underpinning any data solution. Translators are the individuals who make sure any request from a strategist is understood and any abstract insights from a practitioner can be utilised in the business. Determining which of these roles can be upskilled, recruited, or temporarily contracted accelerates an organisation’s journey to getting valuable results.
Statistics, while a new discipline in mathematical terms, has been around for hundreds of years. Within the last 30 years, technology has revolutionised this field. I remember one of my lecturers informing me that a form of analysis called a PCA (Principal Component Analysis) used to be the topic of a PhD. Thanks to software, a PCA is now a single line of code which runs in a fraction of a second – fast enough to power applications such as facial recognition.
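To make the “single line of code” claim concrete, here is a minimal sketch using scikit-learn – the library choice and the random example data are our assumptions, not something specific to the lecture mentioned above:

```python
# Minimal sketch: PCA in one line of code using scikit-learn.
# The input data here is random noise, purely for illustration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))  # 100 observations, 4 variables

# The "single line": project the data onto its first 2 principal components.
components = PCA(n_components=2).fit_transform(X)

print(components.shape)  # one row per observation, one column per component
```

What once took a research career is now a library call – the real work has shifted to knowing when such an analysis is appropriate and how to interpret its output.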
A 2019 Freshworks survey showed software had the biggest impact on employee performance. Whilst software can be incredibly powerful, anyone who has had to use a poorly implemented application will know the feeling of technology working against them rather than with them. In addition to limiting people’s performance, applications can prevent data from being utilised to its fullest. Within Data Understood, we measure the effectiveness of applications against five criteria – their use, installation and/or execution, built-in analytical features, data accessibility, and the organisation’s current connectivity with the application. A good data application is user-friendly, collects meaningful data and promotes data sharing.
Technology is not just about applications. Computers enable data to be gathered more easily, stored in a way which is accessible to people around the world, analysed quickly, and made visible and interactive. For data to be utilised to its fullest, it requires a well-engineered platform to work from. Creating the appropriate infrastructure, data storage, data architecture, and information security enables data to be secure and accessible.
I recently worked with an organisation that could not figure out how to give me access to their data. Even with data sharing agreements in place and internal stakeholders signing off permissions, the roles and responsibilities in the organisation were not aligned to make data work. A hyper-secure culture is understandable given the average cost per lost or stolen record is 146 USD; however, inaccessible data has no value. Making an individual’s data responsibilities clear reduces the risks associated with data breaches and with gathering or spreading misleading information. Implementing information governance helps data thrive in a safe environment.
Beyond a structure to support data, the foundation of any data strategy is the business’s strategy. Most people have heard the phrase “What gets measured gets managed”. Focusing on what’s easy to measure rather than what’s meaningful often results in poor quality data, which, according to Gartner’s 2017 Data Quality Market Survey, costs on average 15 million USD annually.
Measure what matters rather than what is easy. Define your organisation’s goals and align SMART (Specific, Measurable, Assignable, Realistic, and Time-Bound) Key Performance Indicators to track whether these goals have been met. Then define SMART metrics to track progress in reaching these KPIs. Mapping metrics to roles helps individuals in those roles take control of their contribution. Transparent metrics and KPIs help an organisation build an experimental mindset where people can test ideas and learn from initiatives.
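The goal-to-KPI-to-metric mapping above can be sketched as a simple data structure. Every name below is a hypothetical example invented for illustration – substitute your own organisation’s goals and roles:

```python
# Illustrative sketch: mapping a goal to a SMART KPI, its metrics, and
# the roles that own them. All names are hypothetical examples.
kpi_map = {
    "goal": "Grow the newsletter audience",
    "kpi": "Reach 10,000 subscribers by 31 December",  # Specific, Measurable, Time-Bound
    "metrics": [
        {"name": "weekly sign-ups", "owner": "Marketing Lead"},
        {"name": "unsubscribe rate", "owner": "Content Editor"},
    ],
}

# Making ownership visible: each role can see exactly what it contributes.
for metric in kpi_map["metrics"]:
    print(f'{metric["owner"]} tracks {metric["name"]}')
```

Keeping this mapping written down and visible is what makes the metrics transparent – anyone can trace a number on a dashboard back to the goal it serves and the person accountable for it.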
In summary, you can avoid the common data pitfalls by:
- communicating openly, listening actively, and using storytelling to make data initiatives relatable;
- building data literacy and recruiting the right mix of strategists, translators, and practitioners;
- choosing user-friendly applications and a well-engineered, secure data platform;
- aligning roles, information governance, and SMART KPIs and metrics with your business strategy.
We hope this article has provided some useful insights and if you would like Data Understood’s support to avoid the four data pitfalls, please get in touch today.