Context brokering is a relatively new term that broadly refers to using data to provide a context around a particular issue (other terms, like topic modelling or ‘concept testing’, are sometimes used to describe a similar analysis process). Context is particularly valuable where people, in business or government, need to derive insights and understanding around an issue that can be highly complex and involve a large range of data. This is where context brokering has a strong link to strategy, where someone, generally a leader, has to take the data and do something with it.
Forming a strategy, or even a plan, generally requires an understanding of what’s going on. Having a good understanding of the context allows you to see both what decisions you might need to make and what assumptions you could be making. This really isn’t new. As an analyst, the first thing everyone tells you to do is start by understanding the problem or issue at hand. To do this we generally start by gathering data. There are many ways to do this, and the choice of method usually depends on the time and resources at our disposal. But however we do it, be it simple Google searches through to a detailed literature search, the aim is the same - to gather as much data as we can to inform our understanding of a particular issue or topic.
So, to form a context we need to first gather data and then decide on our approach to delivering a particular outcome (our strategy). In the modern, data-rich world, this is often quite a challenging thing to do - we now live in a time where the problem is not too little data, but too much. We continually face questions about how reputable our data is, so knowing how to refine and understand it is becoming increasingly important. Traditionally, the amount of data we could use was limited by how much the person gathering it could process. So, in a way, we were roughly limited to, say, ten reports of around 20 pages each - or perhaps a hundred, if you had an in-house team of people and some analysis processes to help you triage and summarise increasingly complex research data.
Today, though, we are far less limited by human processing. Many tools and dashboard solutions enable people to gather much more data and make sense of what is being said. Making sense of data is now increasingly important as its range and scale continue to grow. So in some ways, the data available to be understood is far greater, potentially less biased, and no longer constrained by the human processing bottleneck. But this creates a new range of issues - how accurate are the processing algorithms applied to the data, and how and where should the human intervene to select the most important aspects of it for context?
And this is the challenge we now face - balancing tools and techniques that allow us to gather and structure more data, whilst providing a useful, accurate and informed context that enables us to make better decisions and form policies and actions. This is where context brokering can help, but it can be a complicated process that yields deceptively simple outcomes, so to understand how it’s applied and how it can differ from traditionally applied techniques, it’s worth considering an example. Have a look at this post, which explains things a little more.