Small data will be key to understanding your ‘new normal’

The saying ‘knowledge is power’ has, over time, become less potent. Not because comprehension is no longer beneficial but because, in a world that has become ever more complex, differentiating between tangible fact and postulation has become problematic. Big Data and analytical tools, combined with high-capacity virtual infrastructure, help to address this problem: vast swathes of data, incomprehensible to the human mind, can be stored in data centres and analysed by artificial intelligence (AI).

These technologies are combined with advanced data analysis methodologies to scrutinise quantitative information in a way that was not previously possible. The insights they have delivered have influenced how potential risks are countered, determined which routes to market should be leveraged and which discounted, informed decisions on expenditure, brought about revised supply chain management methodologies, and more. There is no organisational process that has not been influenced by Big Data in some way, shape or form.

Unlike its heftier equivalent, Small Data is typified by the fact that it is malleable and can be analysed by individuals, rather than AI, and without the need for specialised applications or infrastructure: the responses of 1,000 surveyed high-street shoppers would be considered Small Data, whereas high-street footfall data collected over a six-month period would be classed as Big Data. Owing to its ability to deliver insights that were previously out of reach, Big Data is habitually viewed as superior to Small Data, but such assessments are ultimately reductive.

Big Data can drive insights that Small Data cannot, but the same is true in reverse: large information sets are impersonal and cannot provide the granular, organisation-specific information that drives effective decisions. Small Data does not generate salient findings for industries as a whole, but it consistently provides businesses with the information and conclusions that bring about internal improvement.

The importance of Small Data post-pandemic

Recent events have brought about changes that are wholly unprecedented, and the algorithms used to extract findings from considerable quantities of data are highly intricate. Combined, these two factors create an environment in which existing frameworks and practices for analysing Big Data become ineffective.

As Michael Berthold, CEO of the open-source data analytics platform KNIME, recently told Computer Weekly, new realities are sometimes so different to what existed previously that the base assumptions built into systems simply no longer hold and need to be rebuilt in their entirety.1

Whilst few know how the world will look following the COVID-19 pandemic, there can be little doubt that it will be considerably different to the one we knew before the dawn of 2020. Reports have indicated that GDP in the UK fell by more than 20% in April 2020.2 Such a fall is the largest since records began, and it is simply not possible to determine what effects this will have on organisations.

Equally difficult is the task of determining how recent events will influence societies. In short, it is unlikely that Big Data will be able to deliver meaningful findings until developers are able to adapt relevant systems – something which could take several months.

Organisations that are able to comprehend the ‘new normal’ will nevertheless be well placed to ride out resultant difficulties and potentially even use testing circumstances to their advantage. Small Data, then – much of which organisations will already possess in some form – will be what is required to drive this understanding.

Audit and develop your data sources

A large amount of information can be gleaned from existing sources, but these must be identified before they can be leveraged. Most will already be well known, but locating others will require some investigation. This need not be strenuous, however: simply consider every platform or tool that collects data of any kind and ask whether insight can be extracted from what it generates. Google Analytics, for example, should not be seen merely as a means of analysing the performance of online properties.

An increase in traffic to a page centred on a particular product or service indicates growing demand, particularly if it correlates with a rise in related enquiries or sales. Likewise, a higher than normal number of impressions for a particular search term (observable in reports generated through Google’s Search Console or Ads facilities) indicates something similar. The key is to consider which platforms generate empirical data and which hypotheses can be derived from it.
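
To make this concrete, a minimal sketch in Python is shown below. It assumes a hypothetical weekly export from an analytics platform (analytics_export.csv, with week, page and sessions columns – the file name and columns are illustrative, not a standard Google Analytics format) and flags the pages whose traffic has risen most against their recent baseline:

```python
import pandas as pd

# Hypothetical weekly export: one row per page per week.
# Columns assumed: week (date), page (URL or title), sessions (count).
traffic = pd.read_csv("analytics_export.csv", parse_dates=["week"])

# Compare the most recent four weeks against everything before them.
cutoff = traffic["week"].max() - pd.Timedelta(weeks=4)
recent = traffic[traffic["week"] > cutoff].groupby("page")["sessions"].mean()
baseline = traffic[traffic["week"] <= cutoff].groupby("page")["sessions"].mean()

# Relative uplift per page; pages missing a baseline are dropped.
uplift = ((recent - baseline) / baseline).dropna().sort_values(ascending=False)
print(uplift.head(10))  # pages with the largest rise in average weekly sessions
```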

Once sources have been identified, organisations should then consider how to enhance existing data supplies, as well as locate new ones. A Customer Relationship Management (CRM) tool is, when leveraged correctly, an invaluable means of generating data on customer satisfaction via surveys.

Organisations are often unsuccessful in their attempts to harness CRMs and surveys because they fail to plan effectively and to create questionnaires that generate data that can be easily analysed. This can be addressed by requesting that feedback be provided in a numerical format. As 65% of companies that use CRMs hit their sales quotas, compared to just 22% of those that do not,3 it’s clear that these tools are a powerful means of generating both data and fiscal value.
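
As a brief illustration of why numerical feedback is easier to work with, the sketch below assumes a hypothetical CRM survey export (crm_survey_export.csv with submitted and satisfaction columns – both names are assumptions, not a particular CRM’s format) and summarises the scores by month:

```python
import pandas as pd

# Hypothetical CRM survey export: one row per response.
# Columns assumed: submitted (date), satisfaction (score from 1 to 10).
responses = pd.read_csv("crm_survey_export.csv", parse_dates=["submitted"])

# Numerical answers summarise directly: mean score per month and overall.
monthly = responses.groupby(responses["submitted"].dt.to_period("M"))["satisfaction"].mean()
print(monthly)
print(f"Overall mean satisfaction: {responses['satisfaction'].mean():.1f} / 10")

# Flag the share of low scores (5 or below) so problem areas stand out.
low_share = (responses["satisfaction"] <= 5).mean()
print(f"Share of low scores: {low_share:.0%}")
```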

Develop a data governance policy

Following the identification of viable data sources, organisations must standardise the processes they use to generate, store and analyse data by developing a data governance policy. This will outline how data is to be collected, the formats to be used in various instances, where findings are to be stored, and so on. Without such a policy, the manner in which data is recorded will be inconsistent. This, in turn, will result in information that is of diminished quality and from which it is difficult to draw meaningful and accurate conclusions.

A data governance policy should provide a central repository outlining every source of data and the findings the organisation hopes to glean from each. It should also make clear the formats to be used when storing data, where it is to be stored and how it is to be maintained.

Additionally, the policy should clearly state which stakeholders are to be given access to various sets of data, who is to manage them and, finally, who is to be responsible for enforcing the policy and ensuring it complies with GDPR legislation.
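
As a minimal sketch of what such a central register might contain, the Python structure below is purely illustrative – the DataSource fields and the example entry are assumptions, not a prescribed format – but it captures the points above: source, intended insight, storage format and location, ownership, access and retention:

```python
from dataclasses import dataclass, asdict, field
import json

@dataclass
class DataSource:
    """One entry in the central register described by a data governance policy."""
    name: str
    intended_insight: str
    storage_format: str
    storage_location: str
    owner: str                                    # who manages this dataset
    access: list = field(default_factory=list)    # stakeholders permitted to use it
    retention_months: int = 24                    # supports GDPR-style retention rules

# Illustrative entry; a real policy would list every source identified in the audit.
register = [
    DataSource(
        name="Customer satisfaction survey",
        intended_insight="Track satisfaction scores by month",
        storage_format="CSV",
        storage_location="shared-drive/data/surveys/",
        owner="Customer Services Manager",
        access=["Customer Services", "Marketing"],
        retention_months=24,
    ),
]

print(json.dumps([asdict(source) for source in register], indent=2))
```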

Whilst creating such a policy may prove challenging, it is a vital means of ensuring that all the data a company records can be scrutinised and used to glean insight; without one, value cannot reliably be derived from data.

Blend big and small data

As Big Data adapts to the new normal and becomes more capable of providing accurate assessments, organisations’ stakeholders may well opt to revert to an analytical model that disregards Small Data entirely. As we’ve discussed previously, though, drilling down into Small Data is typically a more effective means of identifying how an organisation can improve. Big Data identifies macro trends, but organisations can also leverage micro trends to great effect.

Consider how Uber identified that there was likely to be demand for an alternative means of hailing taxis not by analysing exabytes of data, but through its founders’ individual frustrations with existing models. Airbnb was formed by two friends who simply believed they could monetise an air mattress. Even two of the world’s largest companies that now rely on Big Data to generate billions in advertising revenue – Google and Facebook – were created on assumptions about internet users, not an analysis of macro trends.

Organisations can simply become too reliant on Big Data. Martin Lindstrom, author of Small Data: The Tiny Clues That Uncover Huge Trends, has claimed that the corporate world has been blinded by Big Data in recent years, arguing that this has, in numerous instances, brought about poor decision making. He cites Lego as an example, noting that trends derived from the analysis of large datasets had led the company to make its bricks larger.

This proved unsuccessful and, after the company reintroduced one-to-one interviews into its data collection strategies, the decision was reversed. The company has since returned to profitability.4

Big Data can provide valuable insight but, like so many things, it works best when teamed with other resources. By combining it with its smaller counterpart, organisations can develop nuanced understandings of trends and patterns that their competition cannot.
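
By way of a final, purely illustrative sketch – the two input files and their columns below are assumptions, not real datasets – blending the two can be as simple as joining a macro-level index against the organisation’s own monthly survey scores and checking whether they move together:

```python
import pandas as pd

# Hypothetical inputs, both keyed by month:
#   industry_demand_index.csv  -> month, demand_index  (Big Data: macro trend)
#   monthly_survey_scores.csv  -> month, satisfaction  (Small Data: own customers)
macro = pd.read_csv("industry_demand_index.csv")
small = pd.read_csv("monthly_survey_scores.csv")

combined = macro.merge(small, on="month")

# A simple correlation shows whether the organisation's customers are
# following or diverging from the wider market trend.
print(combined[["demand_index", "satisfaction"]].corr())
```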

References
  1. Computer Weekly (2020) Dealing in Data
  2. The Independent (2020) UK GDP fell by 20% in April – the largest slump since records began
  3. SuperOffice (2019) 18 CRM statistics you need to know for 2020 (and beyond)
  4. The Wharton School of the University of Pennsylvania (2016) Why Small Data Is the New Big Data
