
Context brokering - understand the hype.

If you haven't heard the term 'context brokering', you soon will. The 2016 Gartner Hype Cycle highlighted it as a smart machine technology that will begin to mature over the next 10 years.

Gartner Hype Cycle 2016 - from http://www.gartner.com/newsroom/id/3412017


To understand this development a little more, it's probably worth defining what we think 'context brokering' could mean and its different levels of application, as at present it seems to cover a range of different problems and timescales, which can cause confusion. So, let's look at the first part.

Context

At the time of writing (19/07/2017), if you google 'context' you get the following definition: 'the circumstances that form the setting for an event, statement, or idea'.

In life (and business generally) we regularly make plans to deliver a particular outcome or to get to a particular point. To form such plans, we generally need some kind of understanding of what's going on. This is where the context part comes in.

Having a picture of the 'context' you're currently facing (to use the Google definition - the circumstances that form the setting for an event, statement or idea), or are likely to face in the future, allows you to understand both what decisions you might need to make and any assumptions that could be implicit in them. To understand this a little more, have a look at our post from January 2017.

Once you understand what context means, you start to appreciate the issues of 'timeliness' (when do you want to do something?) and 'relevance' (what sort of data is there to help you understand a context, and how regularly is it produced?) - and these two qualities relate to the 'brokerage' side of things.

Brokerage

A 'broker' is a person - or even a piece of computer code - that offers a particular service, generally to find or provide some kind of data to help you make a better decision. In the application of context brokering, the broker is (in theory) a type of service that finds and returns data relevant to the particular context required to form a plan or make a decision.

However, the decisions that people need to make vary greatly, both in complexity and timeliness, and this is crucial to understand when considering how context brokering could work. What's crucial about 'context' in the context of context brokering (I won't do that again, I promise) is that it currently has different meanings - and these generally relate to the timeliness of the data being used. By exploring what 'context brokering' could mean and how it could be applied, we've deduced that there are two key applications for how context can be useful in the real world:

  1. Immediate prediction of consumer behaviour
  2. Understanding of strategic insights

To understand these two different applications, we've unpacked them a little here.

1. Immediate prediction of consumer behaviour

The immediate predictive benefit of context brokering is probably in the form of brand and consumer insight generation. With advances in big data, many organisations generate and/or have access to a large amount of data across a wide range of sources. By managing and aggregating all these different data sources, organisations can start to generate particular contexts - perhaps for how a brand is performing or how consumer spending happens. For example, an online sales platform could use cookie data to track what associated web pages a shopper has viewed before and around a purchase. This could yield interesting predictive data: do people look for certain products during certain climatic conditions - in a heat wave, do sales of hats and fans go up? Would a person purchasing sun cream, mosquito repellent and beach towels be interested in new sunglasses? And does such an understanding of unique contexts confer a commercial advantage?
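As a minimal sketch of the kind of purchase-association analysis described above (all product names and basket data here are hypothetical), the core idea can be reduced to counting which items co-occur in the same transactions and estimating a conditional probability from those counts:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase baskets - each set is one shopper's transaction.
baskets = [
    {"sun cream", "mosquito repellent", "beach towel", "sunglasses"},
    {"sun cream", "beach towel", "sunglasses"},
    {"sun cream", "mosquito repellent", "hat"},
    {"fan", "hat"},
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Estimate P(sunglasses | sun cream): of the baskets containing
# sun cream, what fraction also contain sunglasses?
with_cream = [b for b in baskets if "sun cream" in b]
p = sum("sunglasses" in b for b in with_cream) / len(with_cream)
print(f"P(sunglasses | sun cream) = {p:.2f}")
```

A real system would work over millions of transactions and use proper association-rule mining rather than raw counts, but the brokering question is the same: which contexts make a recommendation worth acting on?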

To fully get value from immediate context brokering, there are a number of research questions to consider. For example, how does an organisation bring together and model data? For a large organisation with an established infrastructure, this isn't likely to be an issue - the sources and data collection mechanisms already exist, and it's mostly a question of making sense of the data and translating it into some form of action or insight. It is perhaps a bigger challenge for an organisation trying to understand the utility and applicability of this type of service to their business: if you don't have access to (or the need for) big data and you're not generating masses of data, what value will it have to your business?

Additionally, if context brokering is likely to add value to your business, how wide-ranging are your data sources? How accurate a context can you produce using data from Facebook alone? Can you get a more accurate context by looking across as wide a variety of social media channels as possible? If so, what are the cost implications of drawing data from such a wide variety of sources?

Another question to consider is what technologies exist to make context brokering a reality. This is currently a hot area. Predictive techniques are improving as people use more sophisticated statistical models. Machine learning can be applied to train algorithms to detect patterns and find specific terms in ever larger datasets - and the same goes for machine vision algorithms and visual data. At the same time, database and storage capacity continues to grow rapidly, as does processing power. All these technology trends make the immediate predictive benefits of context brokering increasingly enticing. However, it's still worth reflecting that however pure the model and the maths, at some point context has to translate into action for it to be of value - and this is something that can often be forgotten. Essentially, the final challenge is making sure the right kind of predictions are linked to the right kind of behaviours!

2. Context brokering for strategic insights

Another application of context brokering relates to less immediate decision making, closer to research and development. Timeliness is not such an issue here - it's the scale and breadth of the data used to inform a decision that matters. At present, it's the kind of activity that kicks off many large projects, especially in research, policy and academia. What such projects share with immediate prediction is that they are undertaken to determine what we believe the 'truth' is around a particular issue, idea or event.

This form of 'strategic analysis' is less time-pressured than immediate prediction, but the sources and range of the data used to inform our actions, plans and decisions are still important. In the past, organisations have generally done some kind of early activity like a literature review, or assigned an intern or student to summarise the research around a particular issue. From this, an assessment is produced on why we want to do something or follow a particular course of action. Context is really important here: we often base our first principles for a course of action on our belief around a certain event, yet paradoxically this is often the point where we do the least research, and it can be subject to a high degree of bias. For example, if too small a starting dataset is used, our assumptions and lack of research can be quickly exposed as the work is shared more widely. This is where context brokering can offer a decent alternative to the traditional techniques we see today, such as literature searching and workshops.

Using context brokering for strategic insights improves how we gather, store and map knowledge, giving us greater confidence in our initial assumptions and our understanding of complex problems. As a technique, it can also allow us to produce 'ontologies' for particular problems, enabling more specialist data gathering and improved research and network understanding - which in turn lets us learn and gather more data more efficiently.

However, as with immediate prediction, there are challenges. Selection bias can have a large impact on such a technique, especially if a small dataset is used: if particular terms are favoured, knowingly or unknowingly, the process can simply yield more things to confirm a particular view of the world. Additionally, the issue of perfection versus relevance still applies: however good our model is, it still needs to be communicated to decision makers, who need to be able to understand and interact quickly with its main findings while trusting that the model relates to 'real', trusted data.

Additionally, strategic insight generation is probably based on a more limited format of data. Where immediate prediction faces considerable conversion issues, strategic context brokering tends to rely on text-based analytics (perhaps reflecting the longer lead time of data used for research and development planning). This means it is, in some ways, a simpler area of study - one that could benefit greatly from further research into applying machine learning to speed up how the data is processed and used. However, it's still worth coming back to the potential bias a human can introduce in such analysis - does the intelligence and insight that human input provides outweigh the downsides? This is a key issue for further research, one that data scientists and analysts continually grapple with. How do you configure the optimum balance of machine learning to improve the efficiency and scale of human analysis? And what role does the human analyst have in the analysis process when, at least for the next 20 years, they are likely to remain the best predictor of context and its translation into specific insights, actions and implications?

With all these points on board, and to offer some kind of conclusion to this post, it’s probably worth defining and thinking about what 'context brokering' could mean in the future as we start to understand its applications a little more.

An updated definition of context brokering

Context brokering is a service that enables actions and insights to be generated from broad sources of data and information. It can be applied with different levels of timeliness, from the immediate to the strategic. Immediate context brokering applies advanced forms of computer science to provide actions and insights, either to another system or to a human. Strategic context brokering applies the same principles to wide-ranging problems with a considerable published literature (often scientific or research based), mapping the dataset to better inform the decisions and insights formed around it.

It's also worth reflecting on how context brokering works as a process - whatever the timeliness of the data, it tends to rely on the following steps.

  1. Definition of problem and sources

  2. Data gathering

  3. Data storage

  4. Mapping

  5. Action/insight generation

  6. Feedback to 1 (as required)
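The six steps above can be sketched as a simple pipeline. This is a hypothetical, minimal illustration - every function body here is a stub standing in for whatever gathering, storage and mapping technology is actually used:

```python
from collections import Counter

def define_problem():
    # Step 1: define the problem and the sources to draw on.
    return {"question": "future of healthcare", "sources": ["report_a", "report_b"]}

def gather(sources):
    # Step 2: pull raw text from each source (stubbed with fixed strings here).
    corpus = {"report_a": "obesity data healthcare data",
              "report_b": "healthcare policy data"}
    return [corpus[s] for s in sources]

def store(documents):
    # Step 3: persist the gathered data (here, just an in-memory list).
    return list(documents)

def map_terms(documents):
    # Step 4: map the data - a term-frequency count stands in for an ontology.
    return Counter(word for doc in documents for word in doc.split())

def generate_insights(term_map, top_n=2):
    # Step 5: turn the map into candidate insights for a decision maker.
    return [term for term, _ in term_map.most_common(top_n)]

# Step 6 (feedback) would loop these insights back into a refined
# problem definition and a fresh pass through steps 1-5.
problem = define_problem()
insights = generate_insights(map_terms(store(gather(problem["sources"]))))
print(insights)
```

The value of writing it down this way is that each step is a separate, swappable component - exactly the property a 'brokering' service needs if it is to serve both immediate and strategic timescales.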

Final thoughts

'Context brokering' is a newly emerging area, and it's exciting to be in it. Our own insights have come from smaller-scale applications of strategic context brokering, but what's interesting is how applicable many of the techniques are to different sources and timescales. However, it's still worth reflecting that certain principles of analysis hold true, and are perhaps more important than ever in the era of 'big data'. As well as the issues of timeliness and relevance, trust is still key. How much do you value and rely on your sources? It is your sources that will ultimately drive and sustain the validity and quality of whatever context you produce.

What do you think?  If you have any thoughts you’d like to share on context brokering, please either add them here or drop us a line at info@simplexity.org.uk!


Using short story competitions to predict the future.

The Economic and Social Research Council has just announced the winners of its 'World in 2065' writing competition, and very good they are too. The winning entry, by James Fletcher, is called 'City Inc' and is a future vignette about the rise of the city state. Other entries, from Josephine Go Jeffries and Gioia Barnbrook, explored themes relating to climate change and how it could impact the future.

Looking through these great entries, it does make you think about all the other submissions. By necessity, short story competitions have to have a winner - that's how these things go. But if you think about it, the reason we commission creative exercises about the future is to get ideas. So if you run a competition, what happens to all the ideas and insights contained in the entries that didn't make the shortlist?

Now, there could generally be a good reason that stories didn't make the shortlist. Perhaps they were difficult to read, implausible, or simply not entertaining enough. The reasons they didn't make the cut will be defined by the selection parameters of the judging panel - in this case a high-profile one (including Tash Reith-Banks from the Guardian) that really knows its stuff artistically.

But I can't help reflecting on how short story competitions could best be used to gather ideas about the future. We know that they don't really provide a more accurate view of what could happen, so what value do they have?

The value of ideas from short stories.

The real value of short story competitions comes in the ideas and possibilities they raise. This is where they are so valuable, and it's so important for them to be creatively unconstrained and to surface as wide a range of ideas as possible.

Unfortunately, this is where the traditional means of assessing competition entries fall down a little, as they only select and take forward a small proportion of the entries for further exploration.  It means all the ideas contained in the entries that don't make the shortlist aren't used.

As a futures analyst, I feel this is a shame as it wastes ideas.  But, I do understand, that for a competition, you do need to have a winner, and choosing an entry based on its artistic merit is a good way to go.  However, there is an alternative...

Data mining competition entries

In this data-led age, it's now possible to rethink how we run competitions. As well as choosing an overall winner, you can also filter the ideas contained in all the entries. This gives futures analysts what they want - as broad a range of ideas about the future as possible. When you take these ideas and map them, you get a real understanding of what people are collectively thinking when they write their entries. For example, take the following map.

Sample Map from the Scan of Scans.


This data visualisation is based on the content of around 300 foresight reports and summarises the most frequently occurring terms.
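A term-frequency map of this kind can be produced with very little code. Here's a minimal sketch (the report texts are hypothetical stand-ins; a real pipeline would also strip stop words and normalise terms before counting):

```python
import re
from collections import Counter

def term_frequencies(documents, top_n=10):
    """Count the most frequently occurring terms across a set of documents."""
    counts = Counter()
    for text in documents:
        # Lowercase and extract runs of letters as rough terms.
        counts.update(re.findall(r"[a-z]+", text.lower()))
    return counts.most_common(top_n)

# Hypothetical stand-ins for the ~300 foresight reports.
reports = [
    "Automation and AI will reshape work.",
    "Climate change and automation drive migration.",
    "AI, climate and health dominate future risk registers.",
]
print(term_frequencies(reports, top_n=3))
```

The resulting counts are exactly what feeds a word map or bubble chart - the visualisation layer is just presentation on top of this kind of tally.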

Now, if you take this approach with all of your competition entries, you have an additional source of ideas and material - which, for a futures analyst, is highly desirable, as any one of these could be a potential lead on a future trend...

Unfortunately, at present most competitions aren't designed to extract this added value from their entries. Perhaps, as we become used to being more data-driven, this could change in the future.

And for the sci-fi short story aficionados out there - the data-driven approach in journalism was actually predicted by Paolo Bacigalupi in his short story 'The Gambler', in which media agencies track the most popular stories using a live data visualisation called the 'Maelstrom' - a brilliant name for a living, evolving complex mess. Such things can be a little daunting to work through, but for the first time we have the tools and the know-how. It would be great to start using them to get more value from short story competitions...


A data-driven forecast for the future of healthcare.

Back in November 2014, we produced a forecast that was entirely data-derived. Unlike many futures reports, the analysis is based entirely on openly available data, analysed and visualised in a manner that shows all of the data used to derive judgements. We believe this is important: it means we can reduce bias in our assessments, and when we make predictions we can produce quantified assessments that reflect our belief in whether they will happen or not. We've since developed this analysis significantly and used it to test ideas and trends in a wide variety of areas - but if you're interested in the future of health, we've now made the analysis available here.

But for those of you too busy to read the actual report, the main 'Top 5' findings are detailed below [caveated appropriately!].

Top 5 trends in Healthcare 2015 - 2050.

1. Fee-paying healthcare is likely to increase out to 2050.

Insight - Because of greater demand on health systems (ageing, obesity and disease), the rise of new healthcare markets and strategies (from emerging markets) and increasing technologies and medications to promote and prolong life, fully funded state-based healthcare is unlikely to be sustainable out to 2050.

Judgement - There is a probability of 0.8 that by 2050, countries like the UK will deliver a far greater proportion of their healthcare through private agencies. State-based provision is likely to become increasingly difficult because of the continued evolution of diverse healthcare demands and increasingly complex technical requirements of future treatments. By such a point, states are likely to focus on facilitating access to affordable healthcare and promoting healthier lifestyles.

2. Global obesity rates are likely to increase over the next 30 years, prompting significant initiatives to address them.

Insight - Without coordinated intervention, global obesity rates are likely to increase out to 2050. Basic projections suggest that if global obesity continues at its current level - an estimated 2 billion people in a global population of 7 billion in 2013 (contrasting with 857 million in a global population of 4.5 billion in 1980) - then by 2050 around 30-60% of the global population will be obese. In total numbers, if the global population reaches 9.5 billion by 2050, this represents a range of roughly 2.9-5.7 billion obese people.

Judgement - There is a 0.95 probability that the levels of obesity in the global population will increase from 2014-2050. This trend will be driven by higher calorie diets as lower activity levels become the global norm. However, the problem may become so significant, so quickly, that policy reforms, new technologies and medicines may provide the necessary interventions to mitigate this trend.

3. Out to 2050, states are more likely to occupy the role of facilitating healthcare access as opposed to direct provision.

Insight - Over the next 30 years, the rising cost of healthcare and the increasing diversity of technologies and medicines to promote health and prolong life will mean state-based care strategies will be increasingly costly to maintain. This is likely to lead many countries to develop less costly models that promote and facilitate access to healthcare - guaranteeing a level of access to the least well-off citizens alone, whilst enabling access (through part-funded and tax incentive schemes) for the majority of their citizens. However, due to the variability of national strategies and priorities, there will be considerable variation in political attitudes toward state-based healthcare.

Judgement - There is a 0.65 probability that governments will move to roles based on facilitating access to healthcare, as opposed to being the direct provider.

4. The use of healthcare data will be increasingly important for healthcare treatments.

Insight - Out to 2050, improvements in sensor technology, data collection and increasingly available open data will drive metric collection and increasingly sophisticated trials and health strategies. Such developments will change many perceptions on the use/protection of health information and patient confidentiality.

Judgement - The use of healthcare data will increase out to 2050. It is a certainty that data (once it has been cleared against confidentiality and legal considerations) will be collected and used to improve the quality of human healthcare.

5. Policies to encourage healthy behaviours and lifestyles are likely to become increasingly important.

Insight - To reduce long-term health issues, government and company policies are increasingly likely to promote healthy behaviours and lifestyles, reducing long-term costs on industry and the state. Such strategies will be more cost-effective in the long term, addressing causes rather than treating symptoms. However, certain specific requirements, such as guaranteeing basic security and emergency responses to save lives, will remain key 'duties of care' that will need to be maintained.

Judgement - There is a 0.7 probability that policies to encourage healthy behaviours will increase over the next 30 years.

Judgement - There is a 0.95 probability that the duty of care of governments to maintain and protect the health and safety of their citizens will endure out to 2050.

Outliers

As well as weighting our top five findings, we've also collected some 'outliers'. These are the rare, very infrequently reported trends. When you bring them all together they can make for interesting reading - just think, if we'd done this exercise in 2009, where would the term 'healthcare metrics' have appeared?

9 'Outlier' trends for the future of Health

1. The next pandemic may not be flu.

2. Both Japan and the EU may suffer from a shortage of trained healthcare providers in the future.

3. Long term chronic illness (such as diabetes or forms of cancer) could represent significant healthcare issues in the future.

4. Hypertension could be an increasingly significant healthcare issue.

5. The rise of counterfeit medicines and synthetic narcotics could be of potential significance to the future of human health and the pharmaceutical industry.

6. The increased use and sophistication of biomarkers could be significant for addressing future health challenges.

7. Cognitive systems that sense, act, think, feel, communicate and evolve, could be increasingly important in how we understand and improve the healthcare solutions at our disposal.

8. ‘Localisation’ and the local environment could be increasingly significant for how healthcare options are delivered to the surrounding populace.

9. A revolution in farming and agriculture could improve or alter health dynamics anywhere around the world.

Any questions?  Get in touch, at info@simplexity.org.uk
