
Data Intelligence: Are humans complicating everything?



By David Collins, CEO of First Derivative.

The problem with data is us humans: we tend to fixate on certain data points without reference to the circumstances driving that signal. I suspect it is that human tendency to default to reductionist thinking, where the complicated always has an answer, because we are uncomfortable with the inevitable ambiguity of the complex.

We all think of data as obviously useful, but the journey from creation to decision-enabling intelligence is complex. First, the data needs to be gathered, which in theory is the easy part, but it's not that simple. In large organisations, data is never straightforward; aggregating common data points across multiple systems raises compatibility problems. To be useful, data needs to be clean and produced consistently and reliably.

Then we can start to apply analytics, not just reporting on specific data sets but searching for the relationships between data points. For example, IoT data shows that Pump A is running hot; if we also know that running hot while the oil is low can lead to catastrophic failure, we can produce real insight and predictability from the data.
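
As a rough illustration, that kind of contextual rule might look something like the sketch below. The sensor fields, thresholds and pump identifier are hypothetical assumptions made purely for illustration, not real telemetry from any particular system.

```python
# A minimal sketch of the Pump A rule: two readings that are unremarkable on
# their own become an actionable warning once combined with domain context.
# Field names and thresholds are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class PumpReading:
    pump_id: str
    temperature_c: float   # casing temperature from the IoT sensor
    oil_level_pct: float   # oil reservoir level as a percentage


def assess(reading: PumpReading,
           hot_threshold_c: float = 80.0,
           low_oil_pct: float = 20.0) -> str:
    """Apply the contextual knowledge: running hot *while* the oil is low
    is what precedes catastrophic failure."""
    running_hot = reading.temperature_c >= hot_threshold_c
    oil_low = reading.oil_level_pct <= low_oil_pct

    if running_hot and oil_low:
        return f"{reading.pump_id}: risk of catastrophic failure, intervene now"
    if running_hot:
        return f"{reading.pump_id}: running hot, monitor"
    return f"{reading.pump_id}: normal"


print(assess(PumpReading("Pump A", temperature_c=92.0, oil_level_pct=12.0)))
```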

That simple example, x + y = meaningful, is a basic illustration of context, and this is where real insight is derived. For the analytics to be truly useful we need the real-world context, or a narrative, surrounding it. Where humans are involved, we may also need to understand emotions and motives.

To give another example of context: 15 degrees Celsius in New Delhi is damn near freezing, but 15°C in San Francisco is a nice day*. To make a judgement call on the merits of 15°C we need to understand the context with respect to the observer. A New Delhi resident who has just arrived in San Francisco is likely to dispute the local view on the merits of the weather.
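
A toy sketch of that point: the same reading is judged relative to the observer's own baseline, not in absolute terms. The baseline temperatures below are rough assumptions chosen only to make the example run.

```python
# The same 15°C reading, interpreted against each observer's baseline.
# Baseline figures are rough illustrative assumptions, not climate data.

TYPICAL_DAYTIME_C = {"New Delhi": 31.0, "San Francisco": 17.0}


def perceived(temp_c: float, observer_home: str) -> str:
    delta = temp_c - TYPICAL_DAYTIME_C[observer_home]
    if delta <= -10:
        return "damn near freezing"
    if delta >= 10:
        return "sweltering"
    return "a nice day"


print(perceived(15.0, "New Delhi"))      # damn near freezing
print(perceived(15.0, "San Francisco"))  # a nice day
```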

It is for exactly these reasons that KPIs can be dangerous; the lack of contextual narrative allows a ‘dumb’ number to take on more significance than it warrants, and people then start managing ‘it’ rather than achieving the intended result.

Locking in on a single target for a data point is like working in one dimension; you need more data points from additional angles, plus contextual narrative, to properly understand the signal. For example, in consulting, the correct size of our bench is a function of many factors, not least the size and timing of our sales pipeline, combined with the current contractual state of our existing book of work, plus attrition. Add in the fact that we operate in multiple countries with a delivery model that can be hybridised across regions, that we need multiple specific skill sets, and that our delivery pyramid adds a further dimension. Try putting that in an algorithm.
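
To give a flavour of why, here is a sketch of what such an algorithm would have to ingest before it could even start. The field names are hypothetical stand-ins for the factors listed above, and the sketch deliberately stops short of producing an answer, because the judgement call is exactly the part that resists reduction.

```python
# Hypothetical inputs a bench-sizing model would need; illustrative only.

from dataclasses import dataclass, field


@dataclass
class BenchSizingInputs:
    pipeline_value_by_close_month: dict[str, float]   # size and timing of the sales pipeline
    contract_end_dates_by_engagement: dict[str, str]  # contractual state of the existing book of work
    expected_attrition_rate: float                    # people leaving regardless of demand
    headcount_by_country: dict[str, int]              # multiple countries...
    cross_region_delivery_allowed: bool               # ...with a hybrid delivery model
    headcount_by_skill_set: dict[str, int]            # multiple specific skill sets
    headcount_by_pyramid_level: dict[str, int]        # the delivery pyramid adds another dimension
    notes: list[str] = field(default_factory=list)    # the contextual narrative that never fits a schema
```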

The contextual narrative is critical when turning well-analysed data into true intelligence. It would all be so much easier if our world were a series of complicated issues that could be resolved by reductionist thinking, but so much of what we do, and the environment we work in, is complex. There is ambiguity in complexity; as decision makers we need to interpret all of those inputs and make a call. If all of that could be reduced to an algorithm, then a machine could replace the human; we might be inherently uncomfortable with ambiguity, but it keeps us in work!
