
Are we always ‘fighting the last disaster’?

Despite Donald Trump's desire to treat climate change as a matter of belief, the trend of increasing global temperatures and the greater frequency of extreme weather events suggest we are likely to see more climatic disasters.  Hurricane Harvey, Hurricane Irma and the flooding in South Asia are likely to be representative of the increasing frequency and scale of natural disasters we face.  So how will we prepare for them in the future?  Will we improve how we adapt and respond to such disasters?

Disaster politics.

When we try to deal with the aftermath of disasters, it's worth thinking about who responds and how.  Mostly, those that respond are those immediately affected by the disaster, organised by local governance systems.  Non-government organisations (NGOs) at both the local and international level also play a crucial role, but in most situations the most likely source of authority and organisation is the local government.  This has a bearing on how we prepare for future disasters.  Have a look at this great post by Gemma Sou that outlines how people respond to disasters and who is most likely to be affected in the long term.

When we think about how governments respond to disasters, it is worth reflecting on the perception that governments tend to be reactive, rather than proactively managing potential crises.  For example, when asked what is most likely to blow governments off course, the former UK Prime Minister Harold Macmillan famously said 'Events, dear boy, events'.  And this is how it was, and how it seems to be, for many states.  Disasters continue to happen, yet governments everywhere are still caught out and accused of not doing enough by the press and the people they serve.

Why is this so?  It pays to reflect on how governments and international organisations form policies.  Are the policies a state develops the best ones for the country they are designed for?  In spirit, yes - that is why governments form policies.  In reality, however, they are often a reflection of what a government believes the prevailing opinion of the country to be.  And where does a government get its sense of that opinion?  Aside from the immediate aftermath of an election, generally from the media.

So, at its most minimal, the state is a structure that exists to provide stability and security for its citizens.  At times, events (disasters especially) can prevent the state from achieving this. Paradoxically, in the eye of the storm - while the disaster is occurring and during the immediate recovery - the role of the state is pretty clear: help people survive, then help people rebuild their lives. However a government acts to fulfil these roles, it will come under scrutiny from the media.  This can create a situation from which the state is only likely to lose, forcing governments to adopt defensive postures to justify their management of disaster responses.  For a current example of this, look at some of the press speculation regarding the UK government's response to Hurricane Irma.

The Media as the arbiter of truth.

As an event occurs, so does the reporting of it.  'Traditional' media documents events as they occur in real time, with agencies delivering stories and facts on particular issues.  Non-traditional (social) media does a similar thing but accounts for events at the individual level, documenting first hand how the event or disaster unfolds, far more quickly and with a greater range than traditional media can achieve.  So effectively we now have two variables: the specific event, and the belief of what the event is and its impact.  Think of it like this:

[Image: collective opinion on events]

The speed with which both traditional and non-traditional media report on disasters is probably a good thing.  Such rapid information sharing helps emergency services find people and increases local intelligence about what is happening as it happens.  But what does it mean for building resilience after the event?  This is more challenging: after the disaster, the state is still required to provide security and stability for its citizens, yet the way it prioritises against future disasters could well be skewed by its experience of previous events and by the biases of the local media that hold the state to account.

Does present day opinion drive long term planning?

When it comes to preparing for the long term, could implicit bias formed from short-term events have an undue impact on government planning?  Are local or short-term, high-impact events always going to be more significant for the policy maker?  Is policy formation doomed to be reactive?  Is it inevitable that governments will always put their resources into preparing for the same kinds of disasters that they have experienced in the past? There is a real philosophical point here: are the disasters of the past likely to happen again in the future, or is it that they would carry a greater political cost if they happened again?  How does this affect preparation for other disasters?  Does it lead to us preparing for the last disaster?

[Image: disaster prediction] Image from http://www.hacienda.org/ho-network/ho-nw-2010-09-fire-department-makes-plans-disaster

Perhaps this isn't so bad.  One truth we could all do with accepting is that climate change will continue to increase global temperatures and the frequency of extreme weather events.  But how does this help drive a policy response?  Should governments work on more generalised forms of disaster response, forming larger, more complex strategies for global reforms, or should they stay with their traditional, reactive approaches?

Additionally, what role does the media have in how we prepare for and respond to these increasing natural disasters?  Should it become more aware, and more able to communicate, its own biases?  For example, at the global level, are natural disasters treated as more significant in Western countries purely because of the weight of media reporting? This is an ethical point, and a challenging one.  Does the news really carry greater currency when it immediately applies to you?  For further research on global media reporting patterns, have a look at this analysis that breaks down the reporting on the recent US hurricanes and the flooding in South Asia.

[Image: GDELT analysis of hurricane versus flooding coverage] Image from http://www.irinnews.org/maps-and-graphics/2017/08/31/hurricane-versus-monsoon

Context is key.

In disasters, an awareness of what's happening is obviously important.  Such awareness is invaluable in the eye of the storm, and traditional and non-traditional media are both remarkable in the efficiency and scope with which they transmit information, which can and will continue to save lives.

However, after the event, how well does the media form a narrative around future planning? How well does the government prepare for different issues, rather than simply trying to block and prevent the issues of the past?

In the future, could we form better plans by having a better awareness of the prevailing narratives of the day, or will these continue to settle and drive policy for the five years after the event (five years being the typical term of elected office in most democracies)?  Could thinking about how we research and model data allow us to understand our biases a little better, and to intervene before a narrative is set that will drive only one kind of response?  After a disaster or a crisis, can we unpick the assumptions we make about the future?  Having had one disaster, do we need to build bigger defences in the same place, or change how we think about building them in the first place?

With the technology at our disposal and an increased awareness of future threats and opportunities, can we help governments to be less reactive?  Can we review the mechanics of policy formation - is it necessary for the press to be the constant adversary of the state, and for governments constantly to defend their decisions?

Trying to address these questions could change how we think and act in the future.  It's not easy to change, but by at least being aware of our own assumptions and biases about the future, we have a good starting point.  To do that, projects like GDELT can help us track and monitor how and where differences in reporting occur, and hopefully help us better understand how we prioritise our future responses.  At the same time, accepting that the future challenges presented by climate change will be complex and far-ranging, and will require significant changes in our approaches if we're to address them, would probably be a good thing.  But sadly, that's a step too far for some people.

Image from https://www.theguardian.com/commentisfree/picture/2017/aug/29/ben-jennings-on-donald-trump-tropical-storm-harvey-climate-change-cartoon


Context brokering - understand the hype.

If you haven't heard the term 'context brokering', you will soon.  The 2016 Gartner 'Hype Cycle' highlighted it as a smart machine technology that will begin to mature over the next 10 years.

Gartner Hype Cycle 2016 - from http://www.gartner.com/newsroom/id/3412017

To understand this development a little more, it's probably worth defining what we think 'context brokering' could mean and its different levels of application, as at present it seems to cover a range of different problems and timescales, which can cause confusion.  So, let's look at the first part.

Context

At the time of writing (19/07/2017), if you google 'context' you get the following definition: 'the circumstances that form the setting for an event, statement, or idea'.

In life (and business generally) we regularly make plans to deliver a particular outcome or to get to a particular point.  To form such plans, we generally need some kind of understanding of what's going on.  This is where the context part comes in.

Having a picture of the 'context' you're currently facing (to use the Google definition, the circumstances that form the setting for an event, statement or idea), or are likely to face in the future, allows you to understand both the decisions you might need to make and any assumptions that could be implicit in them.  To understand this a little more, have a look at our post from January 2017.

Once you understand what context means, you start to appreciate the issues of 'timeliness' (when do you want to do something?) and 'relevance' (what sort of data is there to help you understand a context, and how regularly is it produced?) - and these two qualities relate to the 'brokerage' side of things.

Brokerage

A 'broker' is someone - or even a piece of computer code - that offers a particular service, generally finding or providing some kind of data to help you make a better decision.  In the application of context brokering, the broker is (in theory) a type of service that finds and returns data relevant to the particular context required to form a plan or make a decision.
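As a loose illustration (the names and structure here are ours, not a standard API), a context broker can be thought of as a service that takes a description of the decision at hand and queries its sources for relevant data:

```python
# A minimal, illustrative sketch of a 'context broker' - the names and
# sources here are hypothetical, not a standard API.
from dataclasses import dataclass


@dataclass
class ContextRequest:
    topic: str          # what decision or plan the context is for
    timeliness: str     # 'immediate' or 'strategic'


class ContextBroker:
    def __init__(self, sources):
        # 'sources' are callables that take a topic and return records
        self.sources = sources

    def gather(self, request: ContextRequest):
        """Query every source and return data relevant to the request."""
        results = []
        for source in self.sources:
            results.extend(source(request.topic))
        return results


# Usage: a single hard-coded 'source' standing in for a real data feed.
def demo_source(topic):
    return [{"topic": topic, "text": "example record", "origin": "demo"}]

broker = ContextBroker([demo_source])
print(broker.gather(ContextRequest("brand performance", "immediate")))
```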

However, the decisions that people need to make vary greatly, both in complexity and timeliness, and this is what's crucial to understand when considering how context brokering could work.  What's crucial about 'context' in the context of context brokering (I won't do that again, I promise) is that it currently has different meanings, and these generally relate to the timeliness of the data being used.  By exploring what 'context brokering' could mean, and how it could be applied, we've deduced that there are two key applications for how context can be useful in the real world:

  1. Immediate prediction of consumer behaviour
  2. Understanding of strategic insights

To understand these two different applications, we've unpacked them a little here.

1. Immediate prediction of consumer behaviour

The immediate predictive benefit of context brokering is probably in the form of brand and consumer insight generation.  With advances in big data, many organisations generate and/or have access to a large amount of data across a wide range of sources.  By managing and aggregating all these different data sources, organisations can start to generate particular contexts - perhaps for how a brand is performing, or how consumer spending happens.  For example, an online sales platform could use cookie data to track what associated web pages a shopper has viewed before and around a purchase.  This could yield interesting predictive data: do people look for certain products during certain climatic conditions - in a heat wave, do sales of hats and fans go up?  Would a person purchasing sun cream, mosquito repellent and beach towels be interested in new sunglasses?  And does such an understanding of unique contexts yield a commercial advantage?
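As a toy sketch of that last idea (invented data, not a real platform's API), you could count how often products co-occur in purchase histories and use those counts to suggest a likely next item:

```python
# Toy co-occurrence model for 'people who bought X also viewed Y'.
# The basket data is invented purely for illustration.
from collections import Counter
from itertools import combinations

baskets = [
    {"sun cream", "mosquito repellent", "beach towel", "sunglasses"},
    {"sun cream", "beach towel", "sunglasses"},
    {"sun cream", "hat"},
]

co_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def suggest(item, n=2):
    """Return the n products most often seen alongside 'item'."""
    scores = {b: c for (a, b), c in co_counts.items() if a == item}
    return sorted(scores, key=scores.get, reverse=True)[:n]

print(suggest("sun cream"))  # e.g. ['beach towel', 'sunglasses']
```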

To get full value from immediate context brokering, there are a number of research questions to consider.  For example, how does an organisation bring together and model data?  For a large organisation with an established infrastructure this isn't likely to be an issue - the sources and data collection mechanisms already exist - it's mostly a question of making sense of the data and being able to translate it into some form of action or insight.  It is perhaps a bigger challenge for an organisation trying to understand the utility and applicability of this type of service to its business: if you don't have access to (or need for) big data and you're not generating masses of data, what value will it have for your business?

Additionally, if context brokering is likely to add value to your business, how wide-ranging are your data sources?  How accurate a context can you produce using data from Facebook alone?  Can you get a more accurate context by looking across as wide a variety of social media channels as possible?  If so, what are the cost implications of drawing data from such a wide variety of sources?

Another question to consider is what technologies exist to make context brokering a reality. This is currently a hot area.  Predictive techniques are improving as people use more sophisticated statistical models.  Machine learning can be applied to train algorithms to detect patterns and find specific terms in larger and larger datasets - ditto for machine vision algorithms and visual data.  At the same time, database and data storage capacity continues to increase rapidly, as does processing power.  All these technology trends make the immediate predictive benefits of context brokering increasingly enticing.  However, it's still worth reflecting that, however pure the model and the maths, at some point context has to equate to action for it to be of value - and this is something that can often be forgotten.  Essentially, the final challenge is making sure the right kind of predictions are linked to the right kind of behaviours!
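To make the machine learning point concrete, here is a minimal, illustrative example (invented sentences and labels) of training a classifier to detect a pattern - in this case weather-related text - with scikit-learn:

```python
# Minimal text-pattern detection with scikit-learn; the training
# sentences and labels are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "heatwave pushes fan sales up", "storm warnings across the coast",
    "quarterly earnings beat forecasts", "new phone model launched today",
]
labels = [1, 1, 0, 0]  # 1 = weather-related, 0 = not

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["flood alerts issued after heavy rain"]))  # likely [1]
```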

2. Context brokering for strategic insights

Another application of context brokering relates to less immediate decision making, and more to research and development.  Timeliness is not such an issue in this application - it's the scale and breadth of data used to inform a decision that matters here.  At present, it's the kind of activity that kicks off many large projects, especially in research, policy and academia. The common element with immediate prediction is that such projects are undertaken to determine what we believe the 'truth' is around a particular issue, idea or event.

This form of 'strategic analysis' is less time-pressured than immediate prediction, but the sources and range of the data used to inform our actions, plans and decisions are still important.  In the past, organisations have generally done some kind of early activity like a literature review, or assigned an intern or student to summarise the research around a particular issue.  From this, an assessment is produced on why we want to do something or follow a particular course of action.  Context is really important here: we often base our first principles for a course of action on our belief about a certain event, yet paradoxically this is often the point where we do the least research, and it can be subject to a high degree of bias.  For example, if too small a starting dataset is used, our assumptions and lack of research can quickly be exposed as the work is shared more widely.  This is where context brokering can offer a decent alternative to the traditional techniques we see today in the form of literature searching and workshops.

Using context brokering for strategic insights improves how we gather, store and map knowledge, enabling us to have greater confidence in our initial assumptions or understanding of complex problems.  As a technique it also allows us to produce 'ontologies' for particular problems, enabling more specialist data gathering and improved understanding of research networks, which in turn allows us to learn and gather more data more efficiently.

However, as with immediate prediction, there are challenges.  Selection bias can have a large impact on such a technique: if a small dataset is used, or particular terms are favoured knowingly or unknowingly, the process can simply yield more things to confirm a particular view of the world.  Additionally, the issue of perfection versus relevance still applies: however good our model is, it still needs to be communicated to decision makers, who need to be able to understand and interact quickly with its main findings while trusting that the model relates to 'real', trusted data.

Additionally, strategic insight generation is probably based on a more limited format of data. Where there are considerable conversion issues for immediate prediction, strategic context brokering tends to rely on text-based analytics (perhaps reflecting the longer lead time in data used for research and development planning).  This makes it, in some ways, a simpler area of study, and one that can benefit greatly from further research into applications of machine learning for speeding up how the data is processed and used.  However, it's still worth coming back to the potential bias a human can introduce in such analysis - does the intelligence and insight that human input provides outweigh the downsides?  This is a key issue for further research, and one that data scientists and analysts continually grapple with.  How do you configure the optimum balance of machine learning to improve the efficiency and scale of human analysis?  What role does the human analyst have in the analysis process when, at least for the next 20 years, they are likely to remain the best predictor of context and its translation into specific insights, actions and implications?

With all these points on board, and to offer some kind of conclusion to this post, it’s probably worth defining and thinking about what 'context brokering' could mean in the future as we start to understand its applications a little more.

An updated definition of context brokering

Context brokering is a service that enables actions and insights to be generated from broad sources of data and information.  It can be applied with different levels of timeliness, from the immediate to the strategic.  Immediate context brokering applies advanced forms of computer science to provide actions and insights, either to another system or to a human.  Strategic context brokering applies the same principles to wide-ranging problems with a considerable published literature (often from a scientific or research base), to map the dataset and better inform the decisions and insights formed around it.

It is also worth reflecting on how context brokering works as a process, which, whatever the timeliness of the data, tends to follow the steps below (sketched in code after the list).

  1. Definition of problem and sources

  2. Data gathering

  3. Data storage

  4. Mapping

  5. Action/insight generation

  6. Feedback to 1 (as required)
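
As a skeleton, and under our own naming assumptions, that loop might look like this in code:

```python
# Skeleton of the context brokering loop described above.
# Function bodies are placeholders - the real steps depend on your
# data sources and tooling.

def define_problem():          # 1. Definition of problem and sources
    return {"topic": "example", "sources": ["https://example.org"]}

def gather(problem):           # 2. Data gathering
    return [{"source": s, "text": "..."} for s in problem["sources"]]

def store(records):            # 3. Data storage
    return list(records)       # stand-in for a real database

def map_data(stored):          # 4. Mapping (topics, networks, etc.)
    return {"themes": ["example theme"], "records": stored}

def generate_insights(maps):   # 5. Action/insight generation
    return [f"Investigate: {t}" for t in maps["themes"]]

problem = define_problem()
insights = generate_insights(map_data(store(gather(problem))))
print(insights)
# 6. Feedback: refine the problem definition and repeat as required.
```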

Final thoughts

'Context brokering' is a newly emerging area, and it's exciting to be in it.  Our own insights have come from smaller-scale applications of strategic context brokering, but what's interesting is how applicable many of the techniques are to different sources and timescales. However, it's still worth reflecting that certain principles of analysis hold true, and are perhaps more important than ever in the era of 'big data'.  As well as the issues of timeliness and relevance, trust is still key.  How much do you value and rely on your sources? It is your sources that will ultimately drive and sustain the validity and quality of whatever context you produce.

What do you think?  If you have any thoughts you’d like to share on context brokering, please either add them here or drop us a line at info@simplexity.org.uk!



Using context brokering to map the strategic consultancy industry

There are many applications and levels at which context brokering can be applied.  To provide a basic example, we've applied a simple network analysis to map and understand the market for strategic consultancy services (at this stage we've focused the mapping on the UK, though a large number of secondary, global sources were found in the search).

To do this, we designed an analysis process that took openly available data on strategic consultancies based in the UK, using a starting set of around 26 strategic consultancies taken from http://www.consultancy.uk.  We implemented a series of crawlers to return organisations and sources connected to these source organisations' websites, and produced the following maps to summarise the data we found.  At this stage we've used a fairly limited sample - we haven't taken the searches further, to specific organisations or government departments who clearly also care about strategy.  The purpose of these maps is to show the utility of mapping network data (particularly for subject matter experts) in the early stages of plan or strategy formation.  A simplified sketch of the crawling step follows.
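For a rough idea of what one of those crawlers might do (a simplified sketch, not the production code), the core step is fetching each source page and extracting the outbound links to other organisations:

```python
# Simplified link crawler - a sketch of the approach, not the
# production system used for the maps below.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def outbound_links(url):
    """Return external domains linked from a source organisation's page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    base = urlparse(url).netloc
    domains = set()
    for a in soup.find_all("a", href=True):
        domain = urlparse(urljoin(url, a["href"])).netloc
        if domain and domain != base:
            domains.add(domain)
    return domains

# Each (primary source -> secondary domain) pair becomes an edge
# in the network map.
print(outbound_links("http://www.consultancy.uk"))
```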

Map 1 - A network diagram illustrating consultancies and businesses related to 'strategy'.  'Primary' relates to source websites, 'Secondary' relates to further web sources uncovered in the crawls.

In addition to the sources for strategic consultancy, we were also able to harvest email addresses for contacts in the different organisations.  Map 2 below shows how the organisations broke down into specific email contacts.

Map 2 - Bubble map indicating which source organisations provided email contact details, gathered through crawls.

How does this context add value?

Such a network analysis of open data allows us to produce a context for who cares about strategy, and potentially highlights who could be interested in context brokering as a service. We've aimed this study specifically at the UK strategy industry to illustrate how context brokering can be applied to a sector that prizes strategic insights and that also produces a wide range of rich data on strategy.  By mapping out who we believe the 'players' are, we have a good starting point to work from.  We can add more sources and contact details as we find them, but also use these starting sources to gather further data and literature for deeper contextual analysis, such as topic modelling.

See what you think and if you have any questions about our dataset or analysis, get in touch at info@simplexity.org.uk!


Could the Silver Economy promote healthier, more sustainable ageing? A case study using data from the Netherlands.

Around the world, people are ageing.  The phenomenon is more marked in the developed world.  In Europe, and focusing on the Netherlands specifically, average life expectancy has gone from around 71 in 1960 to around 81 in 2014.  This trend has been seen in many countries around the world and has led to an increasing number of policy decisions on how to better support and utilise this increasingly significant population.  The 'silver economy' as a concept is more positive than traditional concepts of ageing, which tend to focus on the simple provision of care and support to the elderly; instead, it focuses on how economically significant the 65+ age group can be.  Could the significance of 'silvers' in the future lead to a change in how we treat and support people as they age?  Could policies and attitudes to ageing become more nuanced, leading to increasingly diverse ways in which this group can contribute to and support local and national economies?

To better understand the silver economy and the opportunities and challenges it could present in the future, we analysed some of the current research around the concept, specifically in the Netherlands.  This gave us a clearer idea of the trends and insights that currently surround the silver economy and ageing in general.  The analysis also allowed us to produce a 'topic map' that summarised these trends, as well as low-frequency 'outlier' trends and insights of general interest to the silver economy and ageing (for the method, please see The Silver Economy in Holland - an example of a data driven Horizon scan).

As the topic map above illustrates, despite the positive opportunity the silver economy represents, the data gathered in this analysis suggests that much of the current research focuses on the present constraints, costs and concerns around ageing in general.  To understand this further, we've broken each theme down into specific narratives based on the data collected (using the most frequently occurring keyword themes as a means of prioritising them).

  1. Care

Society will see continued demand for care for ageing populations.  With an ageing population there will be considerable demand around how and where care is provided.  What constitutes 'care' can be quite varied for an ageing society: social support and welfare provision will continue to be important for the 'early aged' (silvers in the 65-75 age range), becoming more chronic and concerned with the provision of long-term care and geriatric medicine for the 'older aged' (75+).

There will continue to be considerable speculation around how care is provided to ageing communities.  In addition to the type of care, there is considerable discussion around different processes of care delivery.  For example, across Europe there are very different models for how and where care for ageing people is delivered.  In many countries there are models of 'familism', in which individuals provide direct care for their ageing parents and relatives, often by living together in the same home (as is the case in countries like Spain and Italy). Other countries, like the UK and Holland, tend to base care on state-based models, with ageing individuals more likely to have to fund (with or without state support) their own care requirements, provided by state or state-subsidised care workers.  Across the developed world there are many variations between these two sources of funding for care - a continuum of care between the individual and the state.

Deductions for the silver economy.  Ageing is a complex process.  As a person ages, their care needs will change and diversify as they go from 'younger' old age to advanced ages.  At present many nation states, including Holland, use well-established social care and pension models to address these costs.  How resilient are these models for the future?  How could they be improved to reflect the increasing health and longevity of people past retirement age (65+)?  Could the silver economy represent a new employment sector for adults of traditional retirement age?  Could such communities be better incentivised and empowered to organise care systems more efficiently, and in more beneficial ways, than state-controlled systems that treat all members of the 65+ community with a dated, one-size-fits-all policy?

2.  Health

Health care needs will continue to diversify for an ageing economy.  As our knowledge of medicine and technical solutions to health care problems becomes increasingly sophisticated, the health care needs of ageing populations continue to diversify.  This trend does increase the health and well-being of the average person, as lifestyles become generally healthier and care continues to improve (as reflected in increasing life expectancies).  It also raises questions about how people can age more healthily: could such a trend enable people to grow old in ways that see all the different components of their health addressed?  As well as the clinical and functional needs of health, can the increasingly important issues of social care and mental health (especially loneliness and isolation in ageing communities) be more specifically addressed?  And how will issues such as dementia (projected to continue to dominate health care provision) and other chronic diseases be addressed over time to help promote healthier ageing?

There will be considerable demand to address the health care costs of ageing in the future.  As people continue to live longer lives, demand for health care will continue to grow, as more treatments become available and people require them for longer periods of time. Technology represents one potential response to some of these costs: for example, loneliness (a common concern for many ageing communities) can be addressed more rapidly, today and in the future, using community-based initiatives and increasingly accessible ICT.  Additionally, smarter, age-friendly homes can improve how people are supervised for care, potentially making support and care provision in later life easier and more cost-effective.  However, as scientific and technological knowledge advances, the need and desire for 'solutions' to the 'problems' of ageing will also increase.  Such demand is more likely to increase the overall cost of ageing, with insurance, individuals and the state often being the main sources of finance.

Deductions for the silver economy.  In the Netherlands, and many other European countries, health trends will continue to have a significant impact on ageing.  In one sense, people are likely to be healthier for longer and to lead more active, independent lives.  This could lead to significant empowerment of 'silvers', who could remain economically significant to greater and greater ages and, again, could represent a significant driver for the silver economy.  In rethinking how silvers contribute both their economic influence and the greater available time made possible by retirement schemes based around the 65+ age range, could the younger old represent an important sector for the care and organisation of elder care (75+ age ranges)?  Such considerations could be especially important for countries like Holland because, as health care demands and access become increasingly diverse and complex, the financial burden imposed on the state to provide current levels of care could become highly significant.

3. Service Provision

Do current services meet the needs of an ageing society?  Within the data there is a general reflection that the requirement to support an increasingly ageing society represents a future challenge to current infrastructure and services.  People are living longer, but social and health care models are not, generally, changing to reflect this.  At the global level, this is seen in a considerable discussion surrounding who should provide care - the state, the individual or their families?  At the national level (in countries like the Netherlands and the UK), debates often centre on how these services are provided, generally with the state at one end of a spectrum, private health insurance becoming increasingly significant at the other, and family care and volunteer services somewhere in between.  In such debates there are often long-held cultural assumptions that the state or the individual 'should' provide care.  Due to the polarity of such beliefs and a lack of clarity about who should be providing care, there can be significant gaps that older individuals fall through when the question of 'who should be providing care' is not addressed.  In some countries the state picks up the burden; in others, the vulnerable and aged who require the most support can sometimes be left with nothing.  Is this the best way?

How could models of service provision change?  Currently, a high level of care in many countries is provided either by cheap, unskilled labour (often fulfilled by migrant workers) or by volunteers and family members.  Family support as a model of social care could change in the future should traditions around shared generational housing (and the general cost of housing) change.  Additionally, as family sizes decrease (a generally accepted trend of development) and general costs of living and housing rise, will future generations be less disposed to the direct provision of family care?  As well as family, a considerable proportion of unskilled care provision is undertaken by migrant workers.  How does this affect future service provision if political isolationism (seen in policies such as Brexit, or current US policies on immigration) means that migrant workers are less supported in a developed country?  A significant proportion of the labour required to deliver care services to the silver economy could be lost.  Additionally, in some countries (especially those with poor national economies) there is currently a considerable shortage of skilled and unskilled paid healthcare providers, as they seek better employment opportunities abroad.

Deductions for the silver economy.  A more nuanced awareness of ageing, and of the benefits initiatives like the silver economy could provide, represents a significant opportunity for service provision, for both ageing individuals and the state.  At present it is often the informal, volunteer and charity sectors that address many of the gaps in welfare provision for the ageing society, perhaps reflecting the significant differences between state and individual care models.  Could the silver economy represent a way of organising the informal provision of care for the greater benefit of individuals and the local economy?  For example, could the contribution of the newly retired (who often contribute to the volunteer sector for elder care) represent an important demographic for the organisation, management and delivery of many aspects of care to the older aged - especially social support?

Image from http://www.silvereco.eu/

4. Ageing

People will continue to age, but perhaps more healthily.  As scientific advances continue to drive longevity and health improvements, and as society becomes more educated about healthy behaviours, it is likely that people will continue to 'age well'.  As a result, the average age of the population in the developed world is likely to continue to increase, and life expectancy is likely to continue to rise.  Male life expectancy is likely to improve, with men living slightly longer on average, although women are still likely to live longer in the future.  This is mostly driven by changes in behaviour and an increasing awareness of how to stay healthy.  However, as society progresses, the issues of ageing - such as dependency and frailty - will become increasingly important to address in order to keep people fully healthy for as long as possible. Additionally, the psychological impacts of ageing will become as important as the physiological ones, with issues such as loneliness and mental health increasingly important to address.

Deductions for the silver economy.  Addressing frailty and reducing dependency could be significant ways in which the silver economy could help address some of the current challenges and costs of the ageing process.  Addressing how frailty arises in older people could have a significant impact on the health and quality of later life, and potentially reduce the level of unnecessary hospitalisation and institutional care.  This, in turn, can help reduce dependency on the state for the continued provision of care and, more importantly, help improve quality of life as people move into advanced ages.  Could the resources and skills of the generations that constitute the silver economy enable a fresh look and a new approach to care provision - one that provides both more sustainable care models and a healthier ageing process?

5. Pensions

Retirement age is likely to increase in the future.  People are likely to live longer; as a result, most countries will probably need to increase the age of retirement.  How countries do this will vary considerably: many will increase the age of state pensions and retirement through a gradual process that reflects the gradual increase in the average age of the population.  However, change is not likely to occur at a pace that reflects the distribution of economically productive populations, or the continued perception that the young are working to pay for the retirement of their elders.  This is a challenging perception, often driven by demographics: in the Netherlands, for example, 'baby boomers' account for 28% of the national population, while middle-aged groups (those between 35 and 44) account for 12%.  As a result, it is likely that fewer people will be working to sustain those older than them through progressively longer periods of retirement.  Does this represent a significant argument for more nuanced plans and policies surrounding retirement?  How could this relate to pension schemes in the future?

Deductions for the silver economy.  Could the silver economy represent a new way of thinking about retirement and pension provision?  How many people currently retired devote a significant proportion of their time and resources to volunteer services helping people older than themselves?  Is it possible that the silver economy could represent a new form of employment for the newly retired and younger generations alike, in a combined generational effort to build better economies around the realities of care provision in an ageing society?

6. Data

There are considerable differences in how countries address ageing.  At present it is clear that there are considerable differences at the state level in how countries provide pensions and services for ageing populations.  To improve and provide more sustainable forms of care, a comparison of different national systems could illustrate how different models - from volunteer and family care through to fully state-based care - are provided.  Such research could allow a better understanding of how to adapt current policies to better, and more economically, reflect the needs of increasingly ageing populations.

What data exists on 'silvers'?  The notion of the silver economy relies on people living longer and healthier lives, and on the assumption that many of these people would either want to give up their retirement years to continue to work and/or continue to have significant economic influence.  Is this the case?  How real and influential is the silver demographic?

As a 70-year-old billionaire, could Donald Trump be the champion of the silver economy? Image from https://www.washingtonpost.com/blogs/right-turn/wp/2017/01/15/the-senate-intelligence-investigation-must-tell-us-if-trumps-team-conspired-with-russia/?utm_term=.e50e3ba74b34

Implications for the silver economy.  Presently, the silver economy is an idea.  It's potentially enticing on a lot of levels.  At a basic economic level, silvers represent a significant source of spending power and an increasingly significant market.  In more abstract policy terms, such economic potential could help address the increasing costs of ageing, while also providing greater employment opportunities for people longer into their lives.  But is this the case?  Do newly retired people want to continue working - do they want greater employment opportunities, or do they feel they have worked enough?  Additionally, how many of this demographic actually contribute to informal care and the volunteer sector?  Data on informal care is limited and often hard to collect.  To understand what the silver economy could be and how it could benefit society generally, more data is required to understand how, and if, it can be applied.

Final thoughts on the silver economy

The silver economy as a concept presents a variety of different opportunities and challenges. Its promise is enticing, and could reflect how 'silvers' have benefited from more consistent economic conditions than the younger generations that followed them.  Could the economic and political influence of silvers change how we think about social care in the future, leading to more nuanced ways of responding to the increasingly complex demands of ageing?

However, when thinking about the silver economy and how it could help drive more sustainable ageing, it's worth remembering the assumptions that have been made about how, and by whom, such care is delivered in current systems.  At present, care for ageing populations tends to be delivered through a range of providers - from informal family care (generally provided by women), to the state (often underpinned by migrant workers), to volunteers (often themselves of retirement age).  The silver economy could represent a useful policy initiative to help co-ordinate and better resource such informal and formal systems.  However, to avoid such a policy being overly aspirational and out of touch with the community it seeks to support, more data is required to understand how such a policy of empowerment could help people as they age.

A recent example of a similar policy is the 'Big Society', implemented in the UK without full research into how, and whom, it could benefit.  That policy was based on the assumption that people would volunteer to fulfil the need for a wide variety of service providers, without understanding the scale required to do this.  Could the silver economy suffer from similar assumptions?

As a concept, the silver economy is enticing.  But what is the appetite amongst the newly retired, and how would it be delivered so as to address and support current service providers and, most importantly, the elderly (and silvers) alike?

This research was delivered to inform an event in December 2016, organised by Future Consult for the Dutch Rijkswaterstaat, to help understand early warning signals for the silver economy in Holland.


The Silver Economy in Holland - an example of a data driven Horizon scan.

The 'silver economy' is a term used to describe how the increasingly healthy and demographically significant 65+ population could be of greater economic significance in the future.  Thinking about the silver economy could highlight considerable economic benefits, and many governments and businesses are considering how to better engage with this increasingly significant demographic.  Working with our associates at Future Consult, we carried out some analysis to help the Dutch Rijkswaterstaat better understand what the implications of an increasingly significant silver economy could be for Holland.

To do this we applied a form of topic modelling and expert mapping that is sometimes referred to today as 'context brokering'.  This post covers how the analysis was conducted; in a separate blog post we've detailed the key findings from the analysis and the subsequent discussions it was used to facilitate.

Using context brokering to understand strategic trends.

To understand the silver economy and the benefits it could bring, there is a considerable wealth of knowledge available around ageing generally.  Ageing and the silver economy relate to research in the fields of demographics, society, health and the economy, reaching as far as resources and infrastructure, so they are very complex, multi-disciplinary areas of study.  To conduct any analysis on this subject, we thought it best to reflect such complexity and design a basic data gathering method that brought in data from a wide variety of open sources.  We based the data gathering on the following process:

Doing this, we defined an initial search that returned 18 open reports detailing the silver economy, society and ageing, with the geographical range limited to Holland, or Europe more generally.  After gathering these reports, we applied machine reading techniques to extract the most significant keywords from the combined string set of all the documents, allowing us to sample the most frequently occurring key terms (a simplified sketch of this step follows).
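A minimal illustration of that keyword step (the real pipeline used bespoke machine reading; this toy version simply counts word frequencies with common stopwords removed):

```python
# Illustrative keyword-frequency sampling over a combined document set.
import re
from collections import Counter

STOPWORDS = {"the", "and", "of", "to", "in", "a", "for", "is", "are", "that"}

def top_terms(documents, n=20):
    """Return the n most frequent non-stopword terms across all documents."""
    words = []
    for text in documents:
        words += [w for w in re.findall(r"[a-z]+", text.lower())
                  if w not in STOPWORDS and len(w) > 2]
    return Counter(words).most_common(n)

docs = ["Ageing populations will increase demand for care and pensions.",
        "The silver economy could change how care is funded."]
print(top_terms(docs, 5))  # e.g. [('care', 2), ('ageing', 1), ...]
```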

This data was then analysed further to look at the interconnections between the key terms, and a further level of analysis was conducted to 'tag' additional terms, specific trends and ideas, with the intention of labelling and discovering any interesting 'outliers' or signals of new and novel trends.

This analysis allowed us to resolve the complex issue that the silver economy represents into a series of topics, from the most discussed to the least.  This kind of information analysis (albeit from a small dataset) enabled us to generate a simple 'topic map' to inform and guide further facilitated discussions with representatives from across Dutch government, academia and industry.  The approach provided a clear context, grounding discussions of initial assumptions in real data, and provides the earliest starting point for evidence-based decision making.  The full dataset for the analysis is available here, and the 'rich picture' produced using Gephi is available here.  For those wishing to engage with a dynamic data visualisation that illustrates trends and interconnections in the master dataset, this is all provided in the Gephi rich-picture visualisation; to access this data, please contact the team at info@simplexity.org.uk.  For those interested in the 'top level' strategic narrative around the data, please see the image below; the discussion of the specific trends and themes (and overall feedback on the technique) is available at the following blog post - The Silver Economy in Holland.


Context brokering - how do you apply it?

To better understand what context brokering is and how it can be applied in decision making, it’s worth considering the following hypothetical example:

A CEO of a large UK multinational specialising in mobile phones asks the business development manager what the international strategy for engagement in Africa is.  This happens in a board room and, as often happens, the BD manager knows nothing about Africa because he's worrying about Brexit and Donald Trump, like everyone else.  The CEO isn't happy about this, so she asks the BD manager to prepare a full briefing on the strategic options for improving the company's role and relationships in African markets.  The BD manager goes away and runs through a few options.

Option 1: Expert Literature Review

There is the 'tried and tested' option: commission an expert on Africa to produce a paper that sets out a range of strategic issues.  Once the paper has been delivered (probably at a considerable expense that relates directly to the urgency), someone in the BD manager's team will condense it into a PowerPoint presentation, perhaps with a detailed research report they can reference if challenged.  Is this good or bad?  Well, it's good in that it gives you answers that can be put back to the board (arguably in a linear, bulleted PowerPoint format).  But this traditional approach suffers from limitations, as it depends on the scale of the data and the process through which it has been assessed (often the biggest value has gone to the analyst who compiled the report and learned the associated knowledge in producing it).  Such reports can easily be biased, often drawing on a small range of sources limited to what the analyst can comfortably process in the time available.  And if it's based on a small number of people and papers, the assessment is at greater risk of being biased toward particular issues or outcomes.

Option 2: Produce a context map  

An alternative to commissioning a single expert is to produce a context map.  At present this represents a significant cultural change to how many organisations conduct their strategic planning.  Context brokering works on the principle that the best thing to do early in your planning is to define and gather as much data as possible, then summarise what you believe the insights and themes around an issue could be, with some kind of qualifier for how valid you think the data is (relating back to the source data to show where the insights came from).

So, going back to the future of Africa: using data visualisation and mapping tools, a context map (or topic model) can be produced.  Such a map summarises the data behind an issue and provides a starting point for strategy making.  The result is far more engaging and derived from a wider range of sources (there is theoretically no upper limit to the number of reports that can be analysed and mapped, although at present our own experiments at Simplexity Analysis run to around 1000 documents).  Such outputs are less static than bulleted lists and can be used in facilitated sessions with experts, who can then interact with the context and add their own insights to further enrich our understanding.  Have a look at the one below, produced to provide a context for future strategic issues surrounding Africa (and, after it, a toy sketch of how such a graph can be assembled).

[Image: context map of future strategic issues surrounding Africa]
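For a flavour of the mechanics (a toy sketch using networkx, with invented nodes and weights - the real maps are built from far larger datasets), a context graph can be assembled and exported for a tool like Gephi:

```python
# Toy example of building a topic graph and exporting it for Gephi.
# The nodes, edges and weights are invented for illustration.
import networkx as nx

g = nx.Graph()
g.add_edge("Africa strategy", "mobile markets", weight=3)
g.add_edge("Africa strategy", "infrastructure", weight=2)
g.add_edge("mobile markets", "payments", weight=1)

# GEXF is one of the formats Gephi reads directly.
nx.write_gexf(g, "context_map.gexf")
print(nx.degree(g))
```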

Mapping the data around an issue in this way can be daunting.  What was the domain of traditional research and literature reviews is now increasingly contested, with data scientists and analysts talking in numbers and code and arguing, in shades of technical purity, about whose mapping process is more accurate (is it a complete reflection of what's in the data?) or more quantifiable (if you take qualitative data, is it worse than hard numerical predictions?).  Perhaps this is why it's challenging for decision makers to interact with such new techniques: context brokering does represent a cultural change.  The best way of addressing this is probably to be open and honest about the data used to make the assessment, the assumptions behind it and the limitations in the development of the context.  In the past, that's what the weight of a large volume of research would convey.  Now, it's probably the scale of the data that has been analysed.

Which option works best?

Concluding again with the Africa example: what's better - a bulleted PowerPoint presentation of ideas, referenced with a weighty research tome (that, let's face it, few people are going to read)?  Or a map summarising a range of options that can be discussed and assessed by the board, or through associated activities, equating to actions for the board to sanction, with the associated data available for analysts to reference further as required?

For more information on the differences in approach for mapping and analysis, please see the following presentation, which outlines the differences between conventional analysis and data-driven approaches.


Context brokering. What is it and what does it mean for strategy?

Context brokering is a relatively new term that broadly relates to using data to provide a context around a particular issue (other terms, like topic modelling or 'concept testing', are sometimes used to describe a similar analysis process).  Context is particularly valuable where people, in business or government, need to derive insights and understanding around an issue that can be highly complex and involve a large range of data.  This is where context brokering has a strong link to strategy, where someone - generally a leader - has to take the data and do something with it.

Forming a strategy, or even a plan, generally requires an understanding of what's going on.  Having a good understanding of the context allows you to see both the decisions you might need to make and the assumptions you could be making.  This really isn't new.  As an analyst, the first thing everyone tells you to do is start by understanding the particular problem or issue.  To do this we generally start by gathering data.  There are many ways to do this, and the choice of method usually depends on the time and resources at our disposal.  But however we do it - from simple Google searches through to a detailed literature search - the aim is the same: to gather as much data as we can to inform our understanding of a particular issue or topic.

So, to form a context we need first to gather data, and then decide on our approach to delivering a particular outcome (our strategy).  In the modern, data-rich world this is often quite challenging: we now live in a time where the problem is not too little data, but too much.  We continually face questions about how reputable our data is, so understanding how to refine and interpret it is becoming increasingly important.  Traditionally, this was limited by how much information the human gathering the data could process - roughly, say, 10 reports of maybe 20 pages each, or perhaps 100 if you've got an in-house team and some analysis processes to help you triage and summarise increasingly complex research data.

Today, though, we are far less limited by human processing.  There are many options and dashboard solutions that enable people to gather much more data and make sense of what is being said.  Making sense of data is now increasingly important, and the range and scale of the data keep increasing.  So in some ways the data available to be understood is far greater, potentially less biased and not limited by the human processing bottleneck.  But this creates a new range of issues: how accurate are the processing algorithms applied to the data, and how and where should the human intervene to select the most important aspects of the data for context?

And this is the challenge we now face: balancing tools and techniques that allow us to gather and structure more data, whilst providing a useful, accurate and informed context that enables us to make better decisions and form policies and actions.  This is where context brokering can help, but it can be a complicated process that yields deceptively simple outcomes.  To understand how it's applied and how it differs from traditional techniques, it's worth considering an example - have a look at this post, which explains things a little more.


The 'scan of scans' - using machine learning to improve horizon scanning

Over the past three months we've been testing different machine learning and machine reading techniques to help better understand large volumes of data.  We've aimed this specifically at foresight, or 'horizon scanning', because we feel this is an area of analysis that could benefit greatly from more thorough, data-driven analysis.

To conduct a meta-analysis of foresight material, we designed a system that uses crawlers, machine reading, cloud-based databasing and data visualisation scripts.  Using these, we implemented a three-stage process: searching, analysis and mapping.

The scan of scans

Searching

Following this process, we developed a detailed search set of data.  This widened the initial list of 20 starting sources to a further iteration of around 150 sources.  Such a growth in starting data was made possible first by good engagement with experts in foresight, and then by the use of crawlers that brought back a large range of document sources, which were then data mined and assessed for viability and suitability.

Analysis

The search phase led to around 1,200 research reports (as either pdf files or html files converted into pdfs) being gathered. Of these, 1,050 were deemed relevant to foresight; the remaining 150 were either duplicates, write-protected pdfs or out of scope.

Upon selecting the relevant source literature, we split each individual research document into its constituent strings. This was done using bespoke machine reading processes that tag source documents with metadata (relating to the title, author and source organisation) and then split each document into its constituent strings, each assigned the appropriate metadata.
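As an illustration of this splitting step, here's a minimal Python sketch. The real pipeline used bespoke machine reading tools, so everything here (the TaggedString structure, the naive sentence-level split) is an assumption made for illustration, not the actual implementation.

```python
from dataclasses import dataclass

# Illustrative only: the real pipeline used bespoke machine reading tools.
@dataclass
class TaggedString:
    title: str          # document title metadata
    author: str         # document author metadata
    organisation: str   # source organisation metadata
    text: str           # one constituent string from the document

def split_document(raw_text, title, author, organisation):
    """Split a document into sentence-level strings, each tagged with metadata."""
    strings = []
    for sentence in raw_text.replace("\n", " ").split(". "):
        sentence = sentence.strip().rstrip(".")
        if sentence:
            strings.append(TaggedString(title, author, organisation, sentence))
    return strings

strings = split_document("Horizon scanning is growing. Data volumes keep rising.",
                         title="Example report", author="A. Analyst",
                         organisation="Example Org")
print(len(strings), strings[0].text)  # 2 'Horizon scanning is growing'
```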

Once split into strings, all the data was held in a bespoke Django database navigated through a graphical user interface (GUI). Using this GUI, an initial data visualisation was produced that illustrated the keyword frequencies contained in each source document.

Mapping

With all the data gathered and hosted on the main database, there were around 11,000,000 source strings that could then be analysed. We conducted two forms of mapping on this data and the associated metadata.

1. Topic modelling to determine the most frequently occurring themes and concepts in the full dataset.

2. Expert mapping to determine the key contributing authors for the data for future testing. 

Topic modelling yielded a rich picture of the data contained in the document set and gave some indication of the overall concepts and interconnections between the documents.  The data was mapped using Gephi, an openly available graphing platform.
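For readers who want a feel for the topic modelling step, here's a small sketch using scikit-learn's LDA implementation. The post doesn't specify which topic modelling algorithm we used, so treat this as an illustration of the general technique rather than a reproduction of our method.

```python
# Illustrative sketch only - the post does not name the algorithm used.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

strings = [
    "autonomous systems will reshape defence planning",
    "climate risk drives long term infrastructure policy",
    "machine learning improves horizon scanning analysis",
    "long term climate policy shapes infrastructure investment",
]  # in practice, the ~11,000,000 source strings

vec = CountVectorizer(stop_words="english")
counts = vec.fit_transform(strings)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# Print the top terms per topic; clusters like these feed the topic map.
terms = vec.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    print(f"topic {i}:", [terms[j] for j in topic.argsort()[-4:]])
```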

The detailed topic map was then used to produce a simplified map summarising the most frequently reported themes and concepts in the dataset.  This higher level 'topic' map combines the most frequently occurring terms in the data with low frequency 'emerging' terms of potential strategic relevance to foresight analysts and policy planners.

Although this is a stylised representation of the data collected in the first two stages, it is valuable as a tool for enabling structured foresight exercises and scenario development to be built around data that can be evidenced and accessed for further policy and decision making.

Having determined the themes in the data, the data collected in the search phase was then used to map the contributing network of experts and source organisations.  After determining these, it is then possible to contact the sources and ask them to comment on the findings (especially the high level map) to add their analysis and insights to the initial maps. A sample contributing expert map (high level) is below.

Final thoughts

Using data-driven searching processes for gathering and modelling topics and expert networks can improve current processes for thinking about the future. Following a structured, auditable process is likely to increase confidence and accountability in foresight analysis. Such processes also represent an important benchmark for making foresight more 'quantitative', as they allow metrics to be generated around the scale and range of data collected. Such metrics can form the basis of confidence and probabilistic assessments, which could increase the rigour of foresight analysis, moving the discipline away from current techniques that are often difficult to quantify and subject to considerable levels of bias and groupthink.

To take this work forward, a useful next exercise could be to assess current foresight processes and benchmark them to see what markers and metrics can be used to test predicted outcomes.  Processes that use machine reading and data visualisation yield a large amount of 'hard data'; it would be useful to better understand how this could be applied to improving long term prediction.


Future applications of blockchain technology

Blockchain is a relatively new technology, and there is currently significant research and interest in the refinement and development of specific aspects of how it works.  For example, the three topics below summarise some of the areas currently being researched.

  1. Proof of stake versus proof of work

  2. Using the blockchain to contain confidential information

  3. Blockchain ‘bloating’ and data storage

Proof of stake versus proof of work

Virtual currencies like Bitcoin use the concept of 'proof of work' to maintain the distributed ledger across the blockchain and to reward miners.  However, different means of securing value in the blockchain are being researched.  At present, an alternative known as 'proof of stake' is being developed, based on the perception that proof-of-work algorithms are subject to certain limitations, such as inflation (consensus algorithms and mining require constant expenditure of resources to work normally, something that bitcoin addresses by paying for this cost through the pre-agreed creation of coins, thereby driving inflation).  Additionally, mining requires a considerable amount of computational power to conduct at scale, which increases energy usage and also drives the centralisation of mining, as mining initiatives pool their efforts to consolidate costs.

Proof of stake has been developed to create an alternative to mining.  It works on a different principle, based on validation rather than rewarding work done.  Under proof of stake, 'validators' are given a stake in the network.  For example, an alternative system to bitcoin is Ethereum, in which 'ether' is used as a unit of value and validators are issued units to which they 'bond'.  'Bonding' a stake means that someone deposits money into the network and essentially uses it as collateral to vouch for a block.  Rather than proof of work, where chain validity is dictated by the work that has gone into generating a chain, proof of stake implies trust in the chain with the most collateral behind it.  As a process, proof of stake is thought to be greener, as it is less energy intensive: coins or tokens only need to be locked up to process transactions.  However, there are currently security considerations around the approach; if a small group of validators owns the majority of the coins, the network can be more vulnerable to double spend attacks.

Ethereum coin image from http://digitalmoneytimes.com/understanding-ethereum-casper-proof-stake/
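To make the stake-weighting idea concrete, here's a toy Python sketch of validator selection. This is not a real consensus protocol (real proof-of-stake designs add randomness sources, bonding periods and slashing penalties); it just shows that selection frequency tracks each validator's share of the bonded stake. The names and stake figures are hypothetical.

```python
import random
from collections import Counter

# Toy stake-weighted validator selection (NOT a real consensus protocol):
# each validator's chance of proposing the next block is proportional
# to the stake they have bonded. Stake figures are hypothetical.
stakes = {"alice": 40.0, "bob": 35.0, "carol": 25.0}

def pick_validator(stakes):
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return random.choices(validators, weights=weights, k=1)[0]

# Over many rounds, selection frequency approaches each stake share.
print(Counter(pick_validator(stakes) for _ in range(10_000)))
```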

Using blockchain technology to contain confidential information

Having seen the application of blockchain for virtual currencies, research is currently being conducted into how this principle can be applied in other contexts.  For example, in a paper from MIT, a system has been developed that uses the distributed hash table from blockchain to store shared secret information. The paper, entitled 'Enigma: Decentralized Computation Platform with Guaranteed Privacy', details a system that can enable the decentralised organisation of classified information without the need to be regulated and controlled by a central authority, which theoretically could both improve accountability and make secure systems more resilient.

Enigma uses blockchain technology to share data between different nodes, preventing a single party from having access to the data in its entirety; instead, the data is shared across all members of the network, who each hold a small, seemingly random piece of the entire dataset.  Doing so means that Enigma is able to overcome some of the issues of bloating and the scalability of data storage seen in bitcoin.  For example, bitcoin blockchains often 'bloat', storing large volumes of data relating to computations and transaction histories that are continually maintained in every node in the network.  Theoretically, the increased scalability and efficiency of Enigma enables other computations to be undertaken on the data it contains, allowing the data stored to be subjected to deeper levels of analysis and, in theory, allowing an attribute based access control (ABAC) style of control in which context-specific access policies are applied to the data.

Image from - http://www.ibtimes.co.uk/banks-looking-mits-enigma-bring-perfect-secrecy-blockchains-1525232
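The idea of every node holding a seemingly random piece of the data can be illustrated with toy additive secret sharing. To be clear, this is not Enigma's actual protocol (Enigma builds on secure multi-party computation); it's just a minimal sketch of how a secret can be split so that no individual share reveals anything on its own.

```python
import secrets

# Toy additive secret sharing over a prime field (NOT Enigma's protocol):
# the secret only becomes recoverable when ALL shares are combined.
PRIME = 2**61 - 1

def split_secret(secret, n_shares):
    shares = [secrets.randbelow(PRIME) for _ in range(n_shares - 1)]
    shares.append((secret - sum(shares)) % PRIME)  # final share balances the sum
    return shares

def recombine(shares):
    return sum(shares) % PRIME

shares = split_secret(42, n_shares=5)
print(shares[:2])        # individual shares look like random numbers
print(recombine(shares)) # 42 - but only with every share present
```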

Blockchain ‘bloating’ and data storage

As well as researching alternatives and new applications for blockchain, a large amount of research is also being conducted into making blockchain more efficient and into addressing issues such as 'bloating'.  Bloating occurs as more transactions are made, which means the blockchain has more data to record. Eventually, if the blockchain grows too large it can become difficult to share or store. To prevent this in bitcoin, blocks are presently limited in size and the maximum number of transactions per second is capped.  At present, such bloating is occurring at a slower pace than the improvement of the data storage and communication technologies required to store and transmit blockchain data.

At the moment, it's assumed that although bloating and inflation will increase the average size of blockchains, the associated infrastructure will continue to accommodate such growth.  Increasing the storage efficiency of the blockchain will nevertheless remain a compelling research requirement, possibly driving the development of alternative value stores such as proof of stake, which would reduce the impact of bloating.  Additionally, research into alternative ways of distributing and storing data around the blockchain network will enable data to be encrypted and accessed for applications beyond cryptocurrency.


Blockchain security

Now that blockchain is becoming an increasingly important technology for conveying and storing value through bitcoin and other, newer virtual currencies, how its security is maintained is becoming an increasingly active area of research.

Although highly encrypted and well designed for resilience, blockchain technology is not without vulnerabilities.  A variety of attacks could be undertaken on blockchain-based technologies (to understand some of the technical terms here - like mining, forking etc. - please see the blog on how blockchain works):

Grinding attack

In Bitcoin, the process miners follow to provide hashes to seal blocks is sometimes called 'grinding', as they move through block headers one by one, trying to seal them.  A grinding attack occurs when a hostile actor uses greater computational power than other miners to outperform legitimate miners and find kernels that allow them to outperform the main chain.  This is seen as a serious potential source of threats to blockchains based on 'proof-of-stake' coins.

Double spend attack

A double spend is an attack where a given set of coins is spent in more than one transaction. There are two main ways to perform a double spend attack:

  1. Pre-mine one transaction into a block, spend the same coins with a merchant, then release the pre-mined block to invalidate the merchant's transaction (this is called a Finney attack).

  2. Send two conflicting transactions in rapid succession into the Bitcoin network. (This is called a race attack).


Finney Attack

A Finney attack is a fraudulent double-spend attack that requires the participation of a miner once a block has been mined. The risk of a Finney attack cannot be eliminated regardless of the precautions taken by the merchant, but the participation of a miner is required and a specific sequence of events must occur. Such an attack is challenging to perform and only makes sense for the attacker when the gains from the attack are significant.

Race attack

In Bitcoin, someone who accepts payment immediately on seeing "0/unconfirmed" could be at risk of a double-spend.  A race attack occurs when a hostile actor communicates one transaction to the merchant while simultaneously broadcasting a different transaction, spending the same coins, that is the first to eventually make it into the blockchain. Bitcoin users can take precautions, such as only connecting to well connected, known nodes, to reduce the risk of a race attack, but the risk cannot be eliminated.

>50% Attack

A greater-than-50% attack (often known as a majority attack) occurs if a hostile actor controls more than half of the network hashrate.  In such an attack, the attacker submits a transaction which makes a payment while privately mining a blockchain fork in which a double-spending transaction is included. If the merchant sends the product after waiting for a certain number of confirmations (n), and the attacker has happened to find more than n blocks in that time, the attacker can release the fork and regain the coins.  Alternatively, if the attacker has been unable to find more blocks, they can continue extending their fork in the hope of eventually catching up with the network.

The likelihood of success for a >50% attack depends on the attacker's hashrate (as a proportion of the total network hashrate) and the number of confirmations the merchant waits for. For example, if the attacker controls 10% of the network hashrate and the merchant waits for 6 confirmations, the success probability is on the order of 0.1%.  However, if the attacker controls more than half of the network hashrate, the attack will eventually succeed with probability 100%: since the attacker can generate blocks faster than the rest of the network, they can simply persevere with their private fork until it becomes longer than the branch built by the honest network, from whatever disadvantage.
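For the curious, this catch-up probability can be computed directly using the calculation from the original Bitcoin whitepaper. Note that published estimates vary with the attack model assumed - the whitepaper's own figure for a 10% attacker and 6 confirmations is nearer 0.02% - so treat the exact numbers as indicative:

```python
import math

def attacker_success(q, z):
    """Probability an attacker with hashrate share q rewrites the chain
    after the merchant waits for z confirmations (whitepaper calculation)."""
    p = 1.0 - q
    if q >= p:
        return 1.0  # a majority attacker always succeeds eventually
    lam = z * (q / p)
    prob = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam**k / math.factorial(k)
        prob -= poisson * (1 - (q / p) ** (z - k))
    return prob

print(attacker_success(0.1, 6))  # ~0.0002 for a 10% attacker, 6 confirmations
print(attacker_success(0.6, 6))  # 1.0 - a majority attacker always wins
```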

Vector76 attack

A Vector76 attack, also known as a one-confirmation attack, is a combination of the race attack and the Finney attack, and occurs when a transaction with one confirmation is double-spent.  As with a race attack, such an attack can be mitigated by not allowing incoming connections and using explicit outgoing connections to well connected nodes.  Such attacks also cost the attacker, as they need to sacrifice a block (and not broadcast it) by relaying it only to the attacked node.

Brute force attack

A brute force attack is similar in nature to a >50% attack.  Such an attack occurs when a hostile agent submits a transaction which pays the merchant, while privately mining a blockchain fork in which a double-spending transaction is included instead. As with a >50% attack, the attacker waits for the merchant to see a certain number of confirmations and send the product, while continuing to mine with the intent of building a longer fork than the main network's, or catching it up.  If the attacker never manages to do this, the attack fails and the payment to the merchant goes through.  Such an attack requires a relatively high hashrate; the probability of success is a function of the attacker's hashrate (as a proportion of the total network hashrate) and the number of confirmations the merchant waits for.

All of these different types of attack have led to the development of new approaches and applications for blockchain technology (increasingly based on new and different theories) - these are explored in 'Future applications of blockchain technology'.


How blockchain works

A lot of data in the public domain details how blockchain and associated technologies (such as bitcoin) work and the concepts associated with them (such as mining, forking and wallets).  This is a complex area of study, combining computer science, hardcore maths and economic theory - not always the easiest blend of subjects.  But stick with it, because it is interesting and likely to be increasingly valued in the future.

'Blockchain' is a digital encryption process developed for the decentralised, virtual currency known as 'Bitcoin'.  The 'blockchain' itself is a database architecture based on transactions. Essentially, it is a chain of 'blocks' (or 'nodes') that run together in a system and, when combined, form the full blockchain.  A full list of all the blocks in a blockchain provides a full copy of a currency's history.  This 'ledger' can be very valuable as it provides a full list of every transaction a particular bitcoin has ever been involved in.

Such a system works on the following principles: within each 'block' there is something called a 'hash' made of the previous block.  A hash function is an algorithm that turns a large amount of data into a fixed-length hash code.  For bitcoin, a specific algorithm called SHA-256 is used; it generates effectively random-looking numbers that serve as unique codes for each hash.  This process (known as 'hashing') is applied every time there is a transaction, so each time a new transaction occurs a new block is made, and the previous block is 'hashed' and stored in the new block.  The very first block in the chain is known as the 'genesis' block, as it is the start point from which the next, new block is made.

One powerful aspect of blockchain coding is that each time a new block is made, it effectively remembers everything back to the initial start data (each hash retains the previous store of data, and it is this that makes the 'chain' in blockchain).  This means using blockchain for bitcoin transactions allows you to work back and determine how much value belonged to each address in the chain at any point in history.  This can be done because each block is guaranteed to come after the previous block chronologically; otherwise the previous block's hash would not be known and stored in the block. As the blockchain gets longer it also becomes harder to modify, as changing any block means every block after it would also have to be regenerated.
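A toy example makes the chaining property easy to see. The sketch below is not Bitcoin's actual block format - just a minimal chain of dictionaries hashed with SHA-256 - but it shows how tampering with any historical block breaks every link after it:

```python
import hashlib, json

# Toy chained ledger (not Bitcoin's real block format): each block stores
# the hash of the previous block, so editing history breaks the chain.
def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

genesis = {"prev_hash": "0" * 64, "transactions": ["alice->bob:5"]}
block_1 = {"prev_hash": block_hash(genesis), "transactions": ["bob->carol:2"]}
block_2 = {"prev_hash": block_hash(block_1), "transactions": ["carol->dan:1"]}

# Tamper with the genesis block: its hash changes, so block_1's stored
# 'prev_hash' no longer matches and the whole chain is visibly broken.
genesis["transactions"] = ["alice->bob:500"]
print(block_hash(genesis) == block_1["prev_hash"])  # False
```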

What happens when a blockchain forks?

Sometimes, something called a 'fork' can occur in the blockchain.  This happens when two blocks are formed at the same time.  In bitcoin, when this occurs, two separate chains start to form as new blocks are generated on top of each candidate block. These two chains share a genesis and are identical until the point the chain 'forked'; after that, the chains exist in parallel and two separate networks have been created.


In the case of bitcoin, if a blockchain does fork, it is possible that the coins owned by users on one side of the chain won't be recognised by those on the other side.  However, if someone owned coins before the split, they will find that they own the coins on both sides of the split, but they will need different wallets to access and spend each set.  Such forks tend to occur when part of the wider network is not fully compatible with the rest of the blockchain network and transactions have not been accepted by the other users.  Bugs and software compatibility issues can also lead to forks.  When forks do occur, they are generally resolved by users abandoning the weaker (shorter) side of the fork and re-engaging with the rest of the network.

What is a wallet?

A wallet holds the specific private key assigned to each user to access the blockchain. With bitcoin, a wallet is an encrypted private key that allows access to the balance and transactions made by that user.  The private key provided by the wallet is used to sign transactions, providing signatures (mathematical proofs) that they came from the owner of the wallet; signatures also prevent transactions being altered once they have been issued.  Once a transaction occurs, it is broadcast between users in the network and will then be confirmed by miners, who go on to maintain the integrity of the distributed ledger.

What is mining?


Mining is the process through which users of currencies like bitcoin keep track of transactions.  Using the blockchain as a distributed ledger, all transactions are stored and recorded across the network of users holding the currency.  However, it is important to confirm transactions and store them in a general ledger so they can be tracked and recorded, and this is done by miners.

Miners maintain the blockchain by maintaining the ledger of all the data it contains.  This is important as it can be used to explore any transaction made between any addresses using the blockchain.  Additionally, whenever a new block is created it is added to the increasingly long blockchain, and whenever it is added, a constantly updated copy of the block is given to everyone that participates in the network.

To ensure the integrity of the ledger can be trusted and the blockchain is not tampered with, miners ensure blocks are 'hashed' - a mathematical formula is applied and a code consisting of a mixture of letters and numbers is generated and stored with the block at that specific point in time.  Across the whole network, each hash is unique and generated to reflect the data within the block, so if one character in a block is changed, the whole hash will change.  Additionally, each hash itself builds on all the previous hash information in the chain, enabling the full transaction and encryption history of the block to be retained along the chain.  Such a process maintains the integrity of the chain because each hash is linked to the chain's content.  As a result, any tampering or modification generates a new hash, and all users along the network can see that the chain has been altered.

Miners work to provide the hashes required to 'seal' blocks.  To do this they compete in an open marketplace using specially built software that hashes blocks.  They are motivated to do this by rewards; for example, with bitcoin, every time a miner successfully creates a hash they receive a reward of 25 bitcoins and the blockchain is updated.  However, to avoid inflating the bitcoin supply and devaluing the process of hashing, bitcoin uses something called 'proof of work' to make the process of mining harder to achieve.

What does ‘Proof of work’ mean?

Bitcoin relies on something called 'proof of work' to ensure the hashes used to seal blocks are of sufficient quality and standard.  Hashes for bitcoin need to have a specific format (for example, a certain number of zeroes are required at the start of each hash).  Additionally, although miners need to change the hash of a block, they are not supposed to alter the data contained in the chain.  So instead they modify a piece of data called a cryptographic nonce (an arbitrary number that can only be used once), which is combined with the transaction data to create the hash.  Once created, the hash is assessed to see if it fits the required format; if it doesn't, the nonce is changed and the process is repeated until the hash is of the appropriate standard.  Such a process can take some time, and all miners in a network are simultaneously trying to find a nonce that works for that specific block.  They are all, essentially, competing for the same reward, and this is how they 'earn' their coins when they form a successful hash.
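Here's a minimal sketch of that nonce-search loop in Python. Bitcoin's real difficulty test compares the hash against a numeric target rather than counting leading zeroes, but the leading-zero version shows the idea:

```python
import hashlib

# Toy proof of work: keep changing the nonce until the hash of
# (data + nonce) starts with the required number of zeroes.
def mine(data, difficulty=4):
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest  # a valid 'seal' for the block
        nonce += 1

nonce, digest = mine("block of transactions")
print(nonce, digest)  # the winning nonce and its qualifying hash
```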

Understanding how blockchain works is very useful.  In the next post, we look at how the technology could be vulnerable.

Blockchain Security


What is blockchain?

If you haven’t heard of it before, ‘blockchain’ is a branch of computing that specifically concerns the software processes and algorithms that power virtual, or ‘crypto’ currencies, of which the most popular is Bitcoin.

There is currently a lot of research being conducted into Blockchain based technologies.   How the technology is applied could change many things, from how we think of monetary value through to how we store and manage encrypted data.  In the posts below, we’ve put together some details on trends in Blockchain research and some basic details on how it all works, as it can be pretty daunting to the uninitiated!

The three posts below provide a bit more detail on the technology, its security and how the technology could evolve.  

  1. How blockchain works

  2. Blockchain security

  3. How blockchain technology could evolve in the future.



Does all foresight start with google? 3 ways to improve your forecast with open source data.

'OK, what are we doing about the future?'

For most businesses, and probably most of us in general, being asked this question can be a daunting prospect.  Your particular need for thinking about the future probably depends on the organisation you work for. For example, government departments often have to plan for projects that last 5, 10, even 20 years, so they at least need to try to make assumptions about what the future could be like over long periods of time.  For businesses, this often varies - for fast moving sectors, like media or fin-tech, a year is a long time away and five years seems like a lifetime! However, in other sectors, like R&D, engineering, defence and insurance, companies have projects that can run out to the 2030 or even 2040 horizon.


So, at some point, most organisations are probably going to need to plan, or at least think about, the future.  When they do, they generally have two approaches, both usually based on a combination of open-source information gathering and established foresight practices.  And, when trying to work through their options, they probably start in the same place that most of us start when we're trying to find things out - google.

So, to think about this further, I've summarised two of the most common foresight related activities and looked at how they relate to google searching - or 'open source intelligence collection', if you want to be fancy.

Foresight activity 1 - Running a workshop.

For a lot of organisations, the best way to show that you are thinking about the future is to do something that proves you are doing something about thinking about the future!  At the moment, the most common way to do this is to convene a workshop in which a range of facilitated techniques (usually scenarios) are used to generate ideas around future trends and outcomes.

For such activities, groups of experts are assembled, asked certain questions and tested on certain outcomes.  Typically, google is applied heavily at the event design stage.  Some consultancies and larger organisations will have their own networks of course, but for emerging issues and new ideas, this network will need a lot of maintenance.  As a result, google is generally the first step in finding 'people' - the great and the good, those who know about stuff - and then getting them together to talk about what could happen.

I'm not going to go into whether group events are a good or bad thing.  I think the important thing is that they serve a purpose, and if used at the right time in a project they can really help gain knowledge and a deeper understanding around a range of trends and issues.  However, they're not the only option in futures analysis and they can be subject to a large level of selection bias and groupthink, so it's good to reflect on this.

Foresight activity 2 - Running a literature search

This is where google rides even higher than in event design.  To compile a view on the future, a common thing to do is to gather as much data as possible on a specific issue.  It's just so easy to do!  So, you have a research problem - 'What is the long term economic stability of Europe out to 2030?' - and the very first thing you'll probably do is google it.  This will lead to you spending the morning, the day, the week clicking through links, reading blogs and foreign policy articles discussing pertinent trends.  Pretty soon, you'll have a lot of data and a lot of different views, and then find yourself having to make a call on what the most significant 'future' is going to be.

As with using focus groups, compiling a literature review of open data is no guarantee of improving how you think about the future.  It again has its limitations - does the data you've gathered reflect your own personal bias? Are your sources reliable?  How do you make sense of the bewildering array of links and references out there?  At the time of writing (12/09/2016), if you google 'brexit' you'll get 140,000,000 search returns.

Understanding open source intelligence

So, back to the first question - does all foresight start with google? Well, yes, it probably does, but then again, do all forms of research and analysis today start with google? Like most other disciplines, foresight finds itself needing to adapt to the scale and abundance of open source intelligence.  If we assume it does, here are some things it could do to improve how open source data is used and accessed:

  1. More accountable predictions. Finding clear, distinct predictions about the future is hard.  Long term predictions are often deemed too complex and too difficult to make, or worse, too risky - 'What if I get it wrong?'  This, although understandable, probably only worsens the situation for trying to understand the future.  The lack of clear, measurable predictions means that a wide range of competing views are gathered that are very difficult to test, leading to a situation where everyone 'invents their own future' rather than contributing their own predictions to a combined assessment of what the most probable future could be.

  2. Understanding the role of data and the role of experts.  It pays to be clear on this: what do you want experts for, and why do you want to gather data?  Would it be better to spend some time assembling data that allows you to form a hypothesis about the future, which you can then test with experts, rather than just asking senior executives for the answer?  Additionally, really knowing what questions you expect to answer using scenarios or workshops will also help.

  3. Think about how you store and model data.  Maintaining data on what has been predicted and what other people have said about the future (in the past!) is actually really valuable.  It allows you to maintain your own archive and knowledge around what's happening and, crucially, what's most relevant to your business and sector.  This is a real challenge for most organisations - it costs money and time, both to resource the maintenance of the data and to protect it.  But it can, and will, pay dividends.  This is because it provides another store of data that can be contrasted against whatever open source data you get from further google searches, which can and will happen every time someone says to you, 'OK, what are we doing about the future?'

P.S - Other search engines are available!


Talking Carriages

Inside the mind of a typical commuter on the 0845 to Paddington.  The first two trains were cancelled and there is no room to sit.  Everyone is standing in the walkway in the quiet carriage.  Where everyone is bound by the same rule.  To be quiet.

Sometime, probably after the Blitz, it became somewhat un-British to talk to people you didn't know.

Especially on trains.

Look around you.  Discreetly, of course.  What are people doing?  They are, probably, cocooned within themselves.  Looking at screens, reading the Metro or encased within big, wraparound headphones.  Everyone owns their solitude, for fear of talking to each other.

There could be good reasons for this.  Terrorism, probably.  Is that why it's not good to look at other people, or talk, or smile - because perhaps you're threatening, perhaps you are the threat, because such behaviour is abnormal?

But is this so?  When did conversation and acknowledgement of other humans become a threat?

https://beyondtheflow.wordpress.com/2014/11/28/crime-in-the-quiet-carriage/

It could be the modern urban environment.  We all know that it's dangerous to engage.  You are just asking for trouble.  Don't, whatever you do, make eye contact - there could be beggars around.

Perhaps that's another reason.  Engaging with people is trouble, especially when there are a lot of people all crammed together in metal boxes all heading in the same direction to sit, opposite a computer somewhere and conduct most of their interactions through electronic words, squeezed into and around emails.  

“Kind regards and best wishes.”  Hollow expressions of informal acceptance that don't really mean much.

Perhaps this is what we settle for as interaction now - kind words in emails and posts.  Click if you 'like' me.  Like the playground when you were a kid - walking around in a chain: 'Join on if you want to play Action Force.'

But back then, it was real.  If you had to ask someone something, you had to physically ask them - not just 'drop them a text'.  With technology, it seems we are absent. It's just too real to talk to people.  Better to use the screen; you can always walk away from that without being too embarrassed in public. Leave twitter.  Switch off your phone, put it in a drawer and walk away.

Wait, but won't your friends worry?  Best to check again in an hour.  Wait, make it thirty minutes.  Wait, make it five.

So welcome back.  You can occupy yourself quite happily in your own little head, in your own little portable appliance, streaming the most popular music of the day while everyone else around you does the same.  All of us, all the time, contributing to some vast, amorphous mass of virtual content that just grows and grows and goes nowhere.

Unlike the people on the carriage, who carry on forward and wait for the doors to open so they can all get out and get on with things.  Not have to spend time doing nothing in the company of others.  Careful there - that's when the panic starts, if you're not occupied.  The thoughts race.

What if I did this, what if I did that?  What would people think if I did this?  All of us together, but all of us alone.

Shhh...calm it down.  Distract yourself. Revert to the traditional British pastime of quietly judging others.  Generally, those who are making more noise than you - which, because you're alone, is generally anyone who is talking to someone else.  They're probably friends, or worse, people with small children, or even groups of children, being loud.  All of these people are not adhering to the British value of being seen and not heard.  How dare they live and have life in the presence of people who just want silence...

But do we though?  Do we just want to be left alone?

More and more of us suffer from anxiety.  More and more are lonely.  This isn't something that we readily admit to.  Again, it takes us back to feeling judged.  If I admit to being alone, if I admit that I'm unhappy because I haven't actually spoken to any other person for two days, perhaps two weeks, then I face harder questions about myself and my situation.  It's just easier to sit and 'tut' at the others, get back to your screen, your book, put your headphones on.  Especially if you're in the quiet carriage, right?

You're not there to be disturbed.

But, along the way, sometimes - the strange times, when the train breaks down and you're all sat there waiting for the irregularity to pass - someone might just talk to you, might just notice you.  Often it's someone from out of town, a foreigner, the country mouse, who either doesn't respect the rules of the city or just plain doesn't care.  Such people, we say, are full of life and make their own rules...because they just talk to people.  Unfortunately, they are generally not British.  Social class, societal expectations, fear of looking stupid or of being fleeced - all mean it's better not to talk to people you don't know.

So, all of us here, crammed together in a metal tube and none of us talking.  Our only common feeling is one of quiet frustration and building resentment for the unseen others that have crammed us all together.   This isn't comfortable, not physically nor psychologically. Everyone wants to be away from everyone else and it's exhausting to keep up the facade that we are all ignoring each other.  Wait...don't look up, the guy in the beanie hat caught your eye.

But then, perhaps if being overly British is part of the problem, it's also part of the solution.  We like to compartmentalise, so maybe that's what we should do.  Put the quiet carriages at one end of the train, and talking carriages at the other.  A place where, if people are nervous about their days, about travelling, they can talk to each other without the fear of somehow breaking a silent taboo...or looking like a crazy person, or a terrorist, or just someone who isn't British.

Could a talking carriage work?  Hmm...I don't know, sounds a bit like lefty-pinko wool-gathering.  Besides, it would just invite problems - what about the beggars, the city, all these people, it isn't safe?  It won't be long before someone abuses the system, so best not to try it.

That's right - who actually wants to talk to other people on their way to work? What was I thinking?

Best to just sit here.  Look away.  Post a message on facebook, #feelingblessed, when I haven't actually laughed in three weeks, and then feel worse when no-one likes it.  Careful, that woman looks old and she might want you to give up your seat - best to look away.

And so it goes.


Understanding Loneliness - literature review method

A literature review was conducted to map and understand the research base surrounding loneliness.  Research was collected using the resources made available through the Campaign to End Loneliness website, and wider searching of free to access, open-data resources was also conducted. This led to 129 research articles being collected from peer-reviewed journals, government and non-government policy projects and online journalistic sources.  The full range of data collected has been stored in a google docs folder, which can be accessed at this link:

UNDERSTANDING LONELINESS LITERATURE ARCHIVE

Upon collection, the source documents were split into their constituent strings and combined in a central database.  This was done using bespoke 'Simplexity Analysis' processes that analysed the combined string set for the source data and sampled the most frequently occurring keywords (note - this sampling involved taking the top 25 most frequently occurring keywords (known as 'L1' keywords), and then taking the top 20 keywords associated with each of the L1 keywords (known as 'L2' keywords)).  Using this sample enabled a systems map of the most frequently occurring keywords (and the relationships between them) to be visualised.  This visualisation gives an overview of the main themes and concepts contained in the data.  The data was visualised using an open source software platform called 'Gephi'.
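As a rough illustration of the L1/L2 sampling described above, here's a minimal Python sketch. The real 'Simplexity Analysis' tooling is bespoke, so the tokenisation and co-occurrence rules here are assumptions made for illustration only:

```python
from collections import Counter

# Illustrative only: the real 'Simplexity Analysis' tooling is bespoke.
strings = [
    "loneliness affects mental health in older adults",
    "social isolation and mental health in urban communities",
    "older adults report isolation after bereavement",
]
tokens = [s.split() for s in strings]

# L1: the most frequent keywords across all strings (top 25 in the study).
l1 = [w for w, _ in Counter(w for t in tokens for w in t).most_common(25)]

# L2: for each L1 keyword, the words most often co-occurring with it in
# the same string (top 20 in the study); these pairs become map edges.
l2 = {k: Counter(w for t in tokens if k in t for w in t if w != k).most_common(20)
      for k in l1}
print(l1[:5], l2[l1[0]][:3])
```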

Using Gephi scripts and structured metadata for the source string data enabled us to produce a 'meta-analysis' of a wide range of literature research (based on a technique known as 'grounded theory').  This has enabled us to understand themes and ideas in the current research and highlight new and emerging ideas.  As a result we produced two maps.

Rich data map - this is a visualisation (displayed through Gephi) of the top 25 keywords and their associated top 20 keywords, with the relationships between strings charted. This is a powerful means of visualising a large range of data without human bias, as the diagram directly reflects the word frequency scores.  Additionally, using Gephi enables an analyst to move from the map through to the actual data (i.e. strings and sources).  This enables high level deductions to be tracked back to the source data, which can be useful for policy planning. The full rich data map is available at the following link:

UNDERSTANDING LONELINESS DETAILED DATA VISUALISATION

Main findings map - this is a visualisation of the main themes, loosely based on the frequency of the topics occurring.  Due to the complexity of the data contained in the rich data map, the main findings map is offered as a stylised representation of the data; its main purpose is to be an early visualisation of research themes and questions surrounding loneliness.  Such a map can be more subject to bias (it is an analyst's interpretation of the data in the rich data map), but we believe it's useful at the early stage of a project as it provides a simple reference for potential themes and topic areas for researchers seeking to better understand how to tackle the complex social phenomenon that is 'loneliness'.  This map also contains links (where appropriate) to the source data contained in the project archive, where relevant papers, blogs and newspaper articles highlighting interesting concepts and/or emerging research ideas are kept.  The full map is available at the following link:

UNDERSTANDING LONELINESS TOP LEVEL MAP  

A narrative, produced using the top level map and its top five themes to explain current ideas and topics in loneliness research, forms the basis of the 'Understanding Loneliness' blog, which is available here.

We hope providing our maps, the data and the method we used to produce this analysis gives a helpful way of understanding current loneliness research and adds to the open resources that help us better understand this complex, yet very simple, problem.


Understanding loneliness

Despite seeming a very simple issue, understanding what constitutes loneliness is surprisingly complex.  It can, and does, affect everyone, but it occurs in a range of situations and contexts as diverse as humans themselves.

To help understand current research around loneliness a little better, we conducted a literature review and collated open data, drawing on really useful, free to access resources like the Loneliness research hub maintained by the 'Campaign to End Loneliness'.  Using this research data we produced a model of our current understanding of loneliness to highlight some of the current thinking behind its causes and a range of solutions currently being investigated.  To learn more about how we did this, our detailed method is available here.

This meta-analysis of the data outlined some principal themes in loneliness research; these are contained in a detailed research map, found in our method post.  However, we've also produced a simple 'knowledge map' that summarises the top themes and ideas in the research.  Many of the research ideas link to the source material, where available.  For this top level map please visit the following link:

Understanding Loneliness Top Level Research Map

Using the summary map, we have also produced the following narrative, around the current themes in the loneliness research.

Breaking the Taboo of Loneliness

Similar to public perceptions of mental health, the general attitude toward loneliness is, hopefully, transitioning away from being a 'taboo' subject.  Such a gradual change should be seen as positive, especially in Western countries like Great Britain, where traditional notions like the 'stiff upper lip' still represent a significant behavioural norm, especially among older generations who have firmly established such stoical values throughout their lives.

But, as the research shows, there is a wealth of evidence on why treating loneliness as something that no-one talks about is a bad idea.  Loneliness, happiness and health are all linked, and an individual's mindset can be a key factor in driving social isolation.  A desire to suffer in silence, and not be a bother to anyone else, can unfortunately create a self-fulfilling prophecy, through which people become locked in their own private circle of suffering.  Hopefully though, by recognising how this belief can form, and with continuing efforts to address the social norms that drive such beliefs, things can improve.

The research also shows the clear links between mental health and loneliness.  Similar to the work being conducted to address loneliness, a lot of work is being done to raise awareness and address how society perceives and understands mental health.  For example, the Mental Health Foundation continues to campaign and promote awareness and understanding of mental health. Again, the aim is to address the norms that see many people not seeking help for mental health issues, which can lead to them becoming isolated and withdrawing from social contact.

The causes of Loneliness can be different, but the symptoms are the same.

Loneliness could be defined as 'isolation' - a sensation of being alone when someone doesn't want to be.  Isolated individuals can feel like they are the only person alive, going for days without talking to anyone else, often despite being surrounded by people, especially in urban environments.  The causes of such isolation can be varied.  Mental health is one factor; physical health is another.  Often people with chronic health care requirements have to adopt lifestyles that are more restricted and contained, leaving them less mobile.  This can lead to isolation both for those suffering from illness and for the people that care for them.

After health, there are aspects of identity and group membership that can also lead to isolation.  For example, people who belong to minority groups and communities can see their social and support networks change over time, particularly as they age.  This can lead to people in ethnic minority, immigrant or Lesbian, Gay, Bisexual and Transgender (LGBT) communities feeling like they are in smaller and smaller groups as they go through life.

Beyond this sense of things changing outside their control, further changes through life can also increase isolation.  Divorce, bereavement, children growing up - all these changes can leave people with a lot less social contact than they used to have, and a sense of frustration and a general lack of control that can further drive the stresses of their situation.

Current research suggests that this sense of isolation, and the mindset with which someone responds to it, can be crucial to how they live their lives.  For example, something known as 'dispositional optimism', defined as a 'generalized tendency to positive outcome expectancies', could be associated with well-being and successful ageing.  Similarly, mindfulness based techniques to address stress can help tackle the mental pathways and mindset involved, and so prevent loneliness becoming a 'self-fulfilling' prophecy.


Image from - https://gretchenrubin.com/happiness_project/2013/11/feeling-lonely-consider-trying-these-7-strategies/

Loneliness does impact on health.

Whatever its causes, loneliness and the isolation it drives have a negative impact on health.  Loneliness is shown to affect heart health, general mental well-being, cognitive function, dementia and eyesight.  This is why trying to tackle the societal values and constraints that can drive it is so important.

Understanding how loneliness and isolation impact on health also affects how we appreciate the issue.  For example, many people enjoy time alone.  However, it is important to recognise that time alone through your own choice and action is vastly different to being alone due to forces outside of your control.  It is likely that future research into mindfulness and mindset will highlight what the optimum level of 'aloneness' is for an individual; this is likely to depend greatly on lifestyle, community and work-life balance, all of which differ and change markedly as we go through life.  This could explain why the demographic most likely to feel the cost of isolation is elderly women - a group more likely to be lonely due to bereavement, declining support networks and poor health.

Loneliness and age

The research shows that loneliness in our later years is a significant issue, and a number of further research projects and policies have been implemented to start addressing it.  As we research loneliness and ageing more, we can help people more; at the same time, as our understanding increases, we can start to see how loneliness impacts different age groups, often for very different reasons.

At this stage there is not a huge amount of data, but there is increasing media speculation surrounding 'millennials' and loneliness.  The 'connected generation' are often seen to be empowered by sites like facebook, which give younger people unlimited potential to form friendships and join communities.  Unfortunately, it doesn't always seem to work that way.  Social networking is changing how we interact, often for the better.  However, as well as the cybersecurity issues they present, another potentially negative impact of social networks is the effect they have on individual happiness.  Social networks can drive sensations of loneliness as people feel pressure to share only positive messages and 'virtue signals'.  This can drive individual insecurities and a sense of exclusion, making people less happy and more isolated as they watch and monitor their online identities.

Such forces can lead to anxiety and a considerable sense of peer pressure surrounding a technology that is supposed to bring people together, albeit virtually.  A further challenge is that such virtual communities can detract from being 'present' in real life communities, as people are more comfortable interacting online.  This can lead to people being less comfortable in real life social settings and more comfortable in their virtual selves.  Is it the case that more and more people today experience anxiety in shared social settings; do people not talk to each other in public because they don't know how?


Image from [http://www.marketingmagazine.co.uk/article/1305632/why-social-media-constructing-reality-unworthy-anxiety]

Technology: the cause and the solution for loneliness

The research suggests there is a bit of a paradox around technology.  It would appear to be both a cause of, and a solution for, loneliness.  With older generations, unable to see or talk to someone, something as simple as a phone call can make a world of difference (see 'when I get off the phone, I feel like I've joined the human race').  Similarly, being able to use and understand the internet and social networking can improve cognitive function and counteract many of the symptoms of isolation that elders experience.

However, for younger generations, it is perhaps the primacy of technology and the significance it holds in their lives that can drive isolation.  Perhaps what we're seeing is that social media is growing up. Are we starting to question whether we are happier as people because the bulk of our interactions are virtual?  The symptom of being 'alone together' seems to be the next aspect of loneliness that we are coming to understand as we start to address this complex phenomenon of our times.

Final Thoughts

As our understanding of loneliness improves, we can see ways to address it, and technology does seem to play a key role, both in where and how we use it.  But there are other, simpler things we can do: group exercise programmes bring people together both socially and for health benefits.  Mindfulness strategies and behavioural programmes to address the stresses and mindsets that can cause isolation are also showing promise.

At the same time, when you look at how different societies treat and respond to loneliness, you start to understand how social policies, norms and traditions have a big part to play.  This is why campaigns to address deep seated thinking patterns and stigmas surrounding mental health are so important, and (as with so many things) it seems mindset is key.  Often, though, it's the collective mindset and the values and beliefs of wider society that need to be addressed, and although this is challenging to achieve, it can happen - generally one person at a time.

Chris Evett is a Futures Analyst, specialising in social trend analysis.  The full dataset and research for this article is available here


Monty Python - The Holy Grail of innovation

As well as being awesome, there is a lot to learn from studying the development of Monty Python's Flying Circus (it also gives you a great excuse to watch the movies again).

Looking back at it now, it would seem that the success of the group was inevitable, but reading Michael Palin's diaries will show you just how uncertain, and how organic, its development was.  Imagine writing and producing with a team of five young male Oxbridge graduates and an American animator, all of whom had different motivations, values and beliefs.  At times these differences made them pull together, and at other times fight each other for control, or for the expression of the 'right' vision.

Image from [https://en.wikipedia.org/wiki/Monty_Python#/media/File:Monty_python_foot.png]

Perhaps it was the fact that they were driven people, with different motivations, that caused them to move forward.  Certainly, one thing did unify them - their open mindedness when it came to innovation.  This seems to be a crucial aspect of their development, and it was perhaps most tested when they made movies - the 'Holy Grail' being the best example of just how tough they had it in the early phases and how hard they pushed to be innovative.

The Quest for the Holy Grail

Getting increasingly fed up with making a TV series (without John Cleese since season 2), the Python team looked for a new challenge for all six of them - Michael Palin, Terry Jones, Terry Gilliam, Eric Idle and Graham Chapman, with John Cleese returning to the team.  They decided to make a film, and after a bit of research and chatting together they went with one based on the British legends of King Arthur.

They'd previously made a sketch-based film, 'And Now for Something Completely Different', that had aired in the US, but were keen to make a 'proper' film.  They did this by forming their ideas, researching their material and drawing on their strengths in both writing and production.  They drew on the influences of cinema at the time to capture the reality and the grimness of the Middle Ages (see the mud eaters scene!), which provided the counterpoint to their extreme, generally anarchic humour.  But what's seen as a work of genius today, and one of the most referenced comedies of all time, faced many challenges in its production.  When you look at these challenges you can see how some of the constraints they presented actually made the film more innovative (well, I think so anyway).

Image from [http://tellyspotting.kera.org/2014/05/16/monty-python-infiltrates-game-of-thrones/]

Challenge 1 - Money

Money was a massive issue throughout the project and impacted a lot on the final story (especially as Terry Jones and Terry Gilliam spent most of their money on smoke machines).  But the limited resources did drive them, forced them even, to be innovative.  Consider the coconuts.

The fact that the knights don't have horses, and instead have their squires make horse noises with coconuts, was driven by the fact that this was both a) funny, and b) affordable - they couldn't afford real horses.

Image from [http://www.joeydevilla.com/2012/11/22/trotify-a-bike-attachment-that-uses-a-coconut-to-make-you-sound-as-if-youre-on-horseback/]

Challenge 2 - Reputation

A lot of what Python did is still seen as controversial today, so back in the 1970s this was even more keenly felt.  This meant a lot of established institutions didn't want to work with the team on the development of the film.  This perhaps isn't surprising, but it did provide another constraint.  Two weeks before production, and after Terry Gilliam and Terry Jones had scouted venues in Scotland, the National Trust banned filming in their historical sites because, according to Terry Gilliam, 'we wouldn't respect the dignity and the fabric of the buildings' - which, as he pointed out, were buildings where people historically were tortured, disembowelled, hanged, burned as witches and so on.

This led to everything being filmed at one main castle, Castle Stalker, which was privately owned.  And it set up another joke; before the 'Knights of the Round Table' song, Patsy says that Camelot is 'only a model'.  Which is true - it was a 12-foot high cardboard cutout.   (We can debate whether the decision to ban the team from filming in National Trust properties led to the development of 'Spamalot'!)

Challenge 3 - Organisation

This was the first story-based film the team had made together.  As a result, they needed to mesh their smaller scenes and sketches into an overall narrative.  This was difficult: they were all young, successful men, each with their own creative opinions.  It led to the two Terrys becoming joint directors (a challenge, because Terry Gilliam focused on visuals and Terry Jones focused on jokes).  There were also tensions over the lead role: John Cleese felt he should be Arthur, whilst the rest of the team felt it should be Graham Chapman because he was a better 'straight' actor.  This led to a recurring tension, as Chapman was struggling with alcoholism during much of the filming, further driving friction through the already hectic shooting schedule.  Did these tensions over roles make everyone work harder in everything they did, and so spur on better performances and funnier jokes?

The challenge with organisation was also seen in how the final film was cut.  The budget was so limited they couldn't afford to commission quality music to accompany the film.  This meant they initially used a very rough-and-ready, small-scale musical production, and this, combined with the running order of the early cuts, left those who watched the film with a sense of 'profound depression' - which is about as far off the mark as a comedy can get.

Using challenges for innovation

Thankfully, lots of hard work and faith, more editing (with lots of audience testing), the right sequencing and the use of library music led to a film that not only 'worked' as a comedy, but also went on to break box office records in the UK and US.

In its final form, the film is really very funny.  It is something of an oddity, but for me that's part of the appeal.  I've not seen another film that starts with a focus on 'medieval realism' and ends with contemporary police arresting King Arthur.  To some this could be a bit of a challenge - narrative rules are subverted considerably - but, given that's what Python was about, it works.

What's more interesting is that by experiencing all these challenges, by having conflicts and resolutions amongst themselves just to get the final project delivered, the team ended up closer and the film funnier.  The lessons they learned in developing 'Holy Grail' paved the way for them to make 'Life of Brian', which is probably their best film and perhaps one of the best comedies of all time - maybe down to the strength of the ending.

Personally, in all I strive to do, I take a lot of solace in how the Pythons' abilities developed as they faced their own challenges.  I feel there is a lot of debate about how best to innovate these days, but because some of the processes that drive it can be both uncomfortable and risky, few people actually 'do it'.  So, here's to the crazy few that try.


And remember, without the Holy Grail these words would never have been said.

'He's not the Messiah, he's a very naughty boy!'

Comment

'Life is not linear'. Using network diagrams to improve your narrative.

When I was an editor on Global Strategic Trends, we produced a network diagram of future trends that was so wide-ranging and colourful we called it 'the sneeze'.  It was a complex mixture of nodes and connections that was, frankly, daunting even to trained analysts - let alone the poor drafting team who had to somehow craft a narrative from it.

Life is not linear

So, how do you communicate data from a complex mixture of concepts and connections?  To answer this, it's worth thinking about how we currently communicate narratives. 

People like stories, and (in Western culture especially) they like their stories to be linear.  Stories tend to follow a very basic structure of beginning, middle and end.  The perils and pitfalls a protagonist experiences on their 'journey' are all part of a linear progression to an end point that illustrates the premise of the story.

Similarly, most communications in real life follow well-established linear structures.  For example, academic papers tend to consist of an introduction, a method, results and then conclusions.  Again, they work linearly towards a point, as do most policy papers and shareholder reports.

But when you're looking at a network diagram, understanding how to resolve the data into information and craft a central narrative is tricky.  The best thing you can do is be aware of what it is you're actually looking at.

Communicating networked information

The challenge with a network map is that you have to use it at the right point in your drafting.  You need it before you try to write a structured report.  Think about it - when you're studying, at what point do you make a mind-map?  You do it at the beginning of your project.

The reason for this is actually quite simple.  The point of the network map is to show you as full a picture as possible of a problem or idea.  By mapping concepts in a non-linear, network map you start to get an awareness of the full range of issues.

Additionally, if your map is weighted, you can get an understanding of the most frequent topics and interconnections.  For example, you might be able to identify the most frequently occurring, or most popular, idea.  You might also be able to see which particular nodes are the most connected.  What does this tell you?  Does it show that a particular individual is the most important expert in a particular field?
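If your map lives in software rather than on paper, this kind of weighting is easy to interrogate.  Here's a minimal sketch in Python using the networkx library - the concept names and weights are entirely hypothetical, just to show how node degree (connectedness) and edge weight (co-occurrence) surface the dominant topics:

```python
# A minimal sketch using networkx; the concepts and weights below are
# hypothetical stand-ins for whatever your research has surfaced.
import networkx as nx

G = nx.Graph()
# Each edge links two concepts; 'weight' records how often they co-occur.
G.add_edge("climate", "migration", weight=12)
G.add_edge("climate", "energy", weight=8)
G.add_edge("energy", "geopolitics", weight=5)
G.add_edge("migration", "urbanisation", weight=3)

# Most connected concepts (degree = number of distinct links).
by_degree = sorted(G.degree, key=lambda nd: nd[1], reverse=True)
print("Most connected:", by_degree[:3])

# Strongest relationships (heaviest edges).
by_weight = sorted(G.edges(data="weight"), key=lambda e: e[2], reverse=True)
print("Strongest links:", by_weight[:3])
```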

The real power of using a map to start your drafting is that you can then base any linear communication you want to produce on the structure you've seen in the data.  For example, using a network map as your start point you could produce a linear output with the following structure.

1.  Introduction - overview of the network map and how it was produced.

2.  Most significant nodes - what are the largest points on the map and why?

3.  Most significant connections - what do the connections between the nodes reveal about the problem?

4.  Outliers - what strange, low frequency, unconnected issues are there around the map?

If you approach your map in this fashion, you'll probably be able to focus down on the main issues that fall out of it, and these can be presented as the big, strategic deductions for your assessment.

Working with the data you have collected to generate your structure can feel a little strange at first, but it rewards you with a real reflection of the research you have gathered.  It also lets you make a bit more sense of something that looks bewilderingly organic and rather alarmingly like a sneeze!

Comment

Using short story competitions to predict the future.

The Economic and Social Research Council has just announced the winners of its 'World in 2065' writing competition, and very good they are too.  The winning entry, by James Fletcher, is called 'City Inc' and is a future vignette about the rise of the city state.  Other entries, from Josephine Go Jeffries and Gioia Barnbrook, explored themes relating to climate change and how it could impact on the future.

Looking through these great entries, it does make you think about all the other submissions.  By necessity, short story competitions have to have a winner - that's how these things go.  But if you think about it, the reason we commission creative exercises about the future is to generate ideas.  So, if you run a competition, what happens to all the ideas and insights contained in the entries that didn't make the shortlist?

Now, generally there will be a good reason a story didn't make the shortlist.  Perhaps it was difficult to read, implausible, or simply not entertaining enough.  The reasons entries don't make the cut are defined by the selection parameters of the judging panel - in this case a high-profile one (including Tash Reith-Banks from the Guardian) that really knows its stuff artistically.

But I can't help reflecting on how short story competitions could best be used to gather ideas about the future.  We know that they don't really provide a more accurate view of what could happen, so what value do they have?

The value of ideas from short stories.

The real value of short story competitions comes from the ideas and possibilities they raise.  This is where they are so valuable, and it's so important for them to be creatively unconstrained and to highlight as wide a range of ideas as possible.

Unfortunately, this is where the traditional means of assessing competition entries fall down a little, as only a small proportion of the entries is selected and taken forward for further exploration.  All the ideas contained in the entries that don't make the shortlist go unused.

As a futures analyst, I feel this is a shame, as it wastes ideas.  But I do understand that, for a competition, you need a winner, and choosing an entry on its artistic merit is a good way to go.  However, there is an alternative...

Data mining competition entries

In this data-led age it's now possible to rethink how we run competitions.  As well as choosing an overall winner, you can also mine the ideas contained in all the entries.  This gives futures analysts what they want - as broad a range of ideas about the future as possible.  Then, when you take these ideas and map them, you get a real understanding of the bulk of what people are thinking when they write their entries.  For example, take the following map.

Sample Map from the Scan of Scans.

This data visualisation is based on the content of around 300 foresight reports and summarises the most frequently occurring terms.

Now, if you take this approach with all of your competition entries, you have an additional source of ideas and material - which, for a futures analyst, is highly desirable, as any one of these ideas could be a potential lead on a future trend...
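Getting a first cut of this doesn't need anything exotic.  Here's a minimal sketch in Python that counts the most frequent meaningful terms across every entry, shortlisted or not - it assumes the entries have been gathered as plain-text files, and the folder name and stopword list are hypothetical:

```python
# A minimal sketch, assuming the entries sit as plain-text files in a
# folder called 'entries' (the folder and stopword list are hypothetical).
import re
from collections import Counter
from pathlib import Path

STOPWORDS = {"the", "and", "that", "this", "was", "with", "for", "are", "his", "her"}

counts = Counter()
for entry in Path("entries").glob("*.txt"):
    words = re.findall(r"[a-z']+", entry.read_text().lower())
    counts.update(w for w in words if len(w) > 3 and w not in STOPWORDS)

# The most common terms become candidate ideas to map and explore further.
for term, n in counts.most_common(20):
    print(f"{term}: {n}")
```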

Unfortunately, at present most competitions aren't designed to extract this added value from their entries.  Perhaps, as we become used to being more data-driven, this could change in the future.

And, for the sci-fi short story aficionados out there - the data-driven approach in journalism was actually predicted by Paolo Bacigalupi in his short story 'The Gambler', in which media agencies track the most popular stories using a live data visualisation called the 'Maelstrom' - a brilliant name for a living, evolving complex mess.  Such things can be a little daunting to work through, but for the first time we have the tools and the know-how.  It would be great to start using them to get more value from short story competitions...

Comment

Taking the 'P' out of STEEP - Forecasting Geopolitical data

Future prediction is a tricky business, especially when you don't have hard data to work with.  If you're a forecaster in a domain that uses lots of numbers, like meteorology or demography, then you can use statistics to help you make predictions with increasing degrees of accuracy and assurance.  However, for people in softer domains - subjects that don't lend themselves easily to 'hard' numbers - getting a consistent baseline to make predictions from is tricky.

This is especially the case for geopolitics, which is in itself a hard subject to get a handle on.  Put simply, it is the study of what motivates a particular nation state and how it intends to achieve its aims.  Now, this is something that's very difficult to agree on (even my rough definition here could be contentious!).  For example, how do you define what China wants right now?  You can't really; you just have to make some assumptions based on what you know about the state in question and use them to come up with a rough idea of what 'China' as a functional entity wants.


[Image from -  http://prn.fm/wp-content/uploads/2015/03/geopolitics.jpg]

So, when we're trying to think long-term about the future, we not only have to form a rough classification of what constitutes a country (and what its national interests are), but, having solved this (!), we can then start to think about what that state might do.

Using themes to understand the future

Using current futures analysis techniques to think about states can be a little tricky.  Firstly, we like to use scenarios to predict (very roughly) how a particular state will behave in a particular context.  This is useful for giving a broad range of outcomes that could happen, but it is highly subjective and speculative (great fun though!).  Secondly, futures analysis often deals in 'themes' - that is, trends (things that could happen in the future) tend to fit into categories.  For example, in intelligence analysis there is a technique called 'STEEP'.  This is a way of splitting future trends into rough categories, with each trend put into one of the following types.

Social - For example, there will be more people in the future.

Technological - New technologies will arise that drive social and economic change.

Economic - The pursuit of economic opportunity will remain a significant force for progress.

Environmental - People and states will continue to need resources - food, water and energy.

Political - State 'X' will be at 'Y' by 'Z'.

STEE(P)?

As you'll see from the very brief, hypothetical examples I've given above, most of the classifications discuss general driving forces [note - there are lots of alternatives and derivatives of STEEP that widen the system to cover other areas, such as military and law].  But when you take Politics (or Geopolitics), you're talking about something slightly different: not really a driving force, but actions, choices and desired outcomes.  And when you're doing this, how do you differentiate between the needs and the actions of a state?  Aren't they just hostage to all the other trends described in the social, technological, economic and environmental areas?

This is a real challenge for forecasters, chiefly, it seems, because when you get into politics you're starting to talk about behaviour.  When you deal with geopolitics you are thinking as much about the likely motivations and actions of a country as anything else.  Additionally, you are also thinking about how a state acts and responds to the trends you've already described.

When doing geopolitical analysis on any country, you'll see that most of the events and trends occurring both at present and in the future relate to the categories I listed above.  Being procedural about it, you could say that these are generic issues for any state, and that geopolitics is really a discussion of them.

This then becomes a real challenge for your forecast.  If you're not careful, you end up taking all the generic trends you've captured in the 'STEE' data collection and re-drafting them in your geopolitical analysis - effectively duplicating your thematic analysis, just in the context of individual nation states and regions.  The tell-tale sign is a forecast whose geopolitical analysis is as substantive as its thematic analysis, but broadly says the same thing you've already said, with a national focus!

So, what to do next? Is it time to take the 'P' out of STEEP?

P = Prioritization

What's the solution?  Well, it's a tricky one.  All you can really do is be aware of your data and manage it carefully.  Once you've collected all your trend data, you need to bring it together in a way that enables you to get a clearer understanding of which countries will be the most significant in the future.

To do this, you need to assign and prioritize trends and then, once all the thematic data has been collected, weight the trends and the potential contributors to them.  Doing this allows you to determine which countries are the most consistent contributors to trends in certain areas.

For example, if every trend you collect relating to rare minerals mentions or is attributed to China, then you have a key 'actor' for that trend.  Similarly, if you keep finding health trends showing that the UK is the most significant source of anti-cancer research, you have another marker to assign to that particular trend.

You then get to a point, either in your drafting schedule or your analysis, where you bring together your trends and assign meaning and ownership to them.  Doing this lets you start thinking about who is shaping and driving these trends, and if the same names keep coming up, you can make a pretty solid recommendation that certain actors are going to be significant in the future.  For example, if the US is consistently associated with information technology development, and you collect 300 trends highlighting the role of Silicon Valley in the global knowledge-based economy, then you can probably say, with a degree of confidence, that the US is likely to be significant in information technology for a reasonable amount of time (perhaps 5-30 years?).
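Mechanically, this weighting can start as simply as counting attributed actors per theme.  Here's a minimal sketch in Python - the trend records, themes and actor tags are all hypothetical, purely to show the shape of the prioritization step:

```python
# A minimal sketch of the prioritization step; the trend records, themes
# and actor tags below are hypothetical.
from collections import Counter

trends = [
    {"theme": "rare minerals", "actors": ["China"]},
    {"theme": "rare minerals", "actors": ["China", "Australia"]},
    {"theme": "health", "actors": ["UK"]},
    {"theme": "information technology", "actors": ["US"]},
    {"theme": "information technology", "actors": ["US", "China"]},
]

# Count actor mentions per theme to find the most consistent contributors.
by_theme = {}
for trend in trends:
    by_theme.setdefault(trend["theme"], Counter()).update(trend["actors"])

for theme, actors in by_theme.items():
    leader, n = actors.most_common(1)[0]
    print(f"{theme}: {leader} leads ({n} of {sum(actors.values())} mentions)")
```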

Geopolitics is important; it's just very challenging to predict, and current futures methods don't really deal with human behaviour very well (to be fair, what models do?).  Being clear about the data you are collecting, classifying it appropriately, and analysing it through a measured, considered and auditable process will enable you to at least quantify some aspects of what states are doing, by looking at the outputs they produce.  This, at least, gives you some kind of rough quantitative basis for discussing the future behaviour of nation states.

And if all else fails, you can always use scenarios.  Or even better, just get everyone to play 'RISK'!

Image from - [https://geopolinquiries.files.wordpress.com/2014/09/keep-calm-and-study-geopolitics-1.png]
Comment