
Laurent Haug quoted a point by Ethan Zuckerman that is really worth repeating in forecasting and futurist circles. Since I am, at least partly, drifting back to my original turf of technology forecasting, this is a good reminder of a rule of thumb that I actually used as an argument for the future success of the Internet in the mid-1990s.

"If you’re not getting porn in your system, it doesn’t work. Activism is a stronger test - if activists are using your tools, it’s a pretty good indication that your tools are useful and usable."


[From Ethan Zuckerman's post …My heart’s in Accra » The Cute Cat Theory Talk at ETech]


It is interesting to note that even people in the IT industry recognize the need to manage the complexity of the world. Chris Potts writes in his blog (Enterprise Architect = Scenario Planner | Advice and Opinion) about the need for the enterprise architect to perform scenario planning in order to embrace the uncertainty of the business he or she is supposed to support. (Found via Enterprise Architecture, Development and everything in between: Scenario Planning.)

I couldn't agree more. My concern is that this isn't understood in the IT community at all. But to try to change that, here is a tip for you CIOs and IT architects.

As a consulting scenario planner with a (too) long history in a big IT department, my basic method is to use scenario planning and add a simple cross-matrix analysis in order to understand the business effects of the IT choices you have to make.


In practice this means that you:

  1. Develop a set of future scenarios, each describing the future needs and structure of a possible business situation (or get already-done scenarios from the strategy department).
  2. Describe the different possible IT-related directions you can take.
  3. Identify the evaluation parameters that are used to measure the real business value of IT and scale them from 1 to 5.
  4. Workshop through all the quadrants, evaluating how well each particular IT direction will work out in each particular business scenario.
  5. Summarize, present and evaluate the result.

The trick here is to have good evaluation parameters, as well as involving both high-level IT people and high-level business people in the process. If you don't give the participants too much time to dig down into (irrelevant) details, you will get a (usually otherwise non-existent) strategic discussion about both IT and business at the same time.
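The steps above boil down to a small data structure plus a summary step. A minimal sketch in Python, where the scenario names, IT directions and workshop scores are all invented for illustration:

```python
# A hypothetical sketch of the cross-matrix evaluation described above.
# Scenario names, IT directions and scores are invented for illustration.
scenarios = ["Rapid growth", "Consolidation", "Regulated market"]

# Workshop scores on the 1-5 scale: scores[direction][i] is how well that
# IT direction works out in scenarios[i].
scores = {
    "Outsource IT":            [4, 3, 2],
    "Build in-house platform": [2, 4, 5],
    "Best-of-breed SaaS":      [5, 3, 3],
}

# Summarize: the average score indicates how robust each direction is
# across all the scenarios.
averages = {direction: sum(row) / len(row) for direction, row in scores.items()}
for direction, avg in sorted(averages.items(), key=lambda kv: -kv[1]):
    print(f"{direction}: {avg:.2f}")
```

The summary step is deliberately crude; in a real workshop the spread across scenarios (a direction that scores 5-5-1 versus one that scores 4-4-3) is at least as interesting as the average.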


A couple of weeks ago I noted an increasing pattern on Google Trends when it came to Canada and searches for the word "future". The trend seems to continue this year, peaking higher than before... And it seems to be unique to Canada!

I guess it is some kind of cultural phenomenon. Are there any Canadians out there who can explain this?


I got a mail yesterday promoting a new site called Future Scanner. Now and then people who find my blog want to use it to promote new things, and as readers of this blog will notice, I am a bit skeptical about running somebody else's errands. This time it took me only some minutes of testing and subscribing to some of the RSS feeds before I was convinced. Future Scanner is really a tool in the right direction.

Now I have a new place to go when I am researching a scenario project and want some input for thinking through the dynamics of the scenarios for a certain year in the future.

If you are interested in the future, click around for a while on Future Scanner.



Inspired by this post on Google Blogoscoped (found via 43 Folders) and its statement that people are predictable, I was reminded of the power of Google Trends, a service that lets anyone look into the gathered statistics from probably the single largest information hub in the world. When the service was announced I sat for hours drawing graphs for different words and pondering the underlying reasons for strange correlations.

Today I tried the word "future" and, inspired by the post mentioned above, the word "depression".

[Image: Google Trends graph for "future" vs "depression", worldwide]

Interestingly enough there is definitely a correlation between the curves. It becomes most evident at the end of the year, when searches for "future" go up and searches for "depression" go down. But look at the use of the word "future" in the lower graph, counting the occurrence of the words in the news flow: there seems to be a gradual and steady increase in the use of the word "future", while the use of the word "depression" stays stable and much lower.
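This kind of co-movement between two curves is just a correlation between two time series, which is easy to quantify. A minimal sketch with invented monthly index values (not real Google Trends data), shaped like the pattern described:

```python
import numpy as np

# Invented monthly search-volume indices (NOT real Google Trends data),
# shaped like the pattern described: "future" rising toward year-end
# while "depression" falls.
future = np.array([40, 42, 41, 43, 45, 44, 46, 47, 49, 52, 58, 70], dtype=float)
depression = np.array([30, 29, 30, 28, 29, 28, 27, 27, 26, 24, 21, 15], dtype=float)

# Pearson correlation coefficient; a value near -1 means the curves
# mirror each other, as described in the text.
r = np.corrcoef(future, depression)[0, 1]
print(f"r = {r:.2f}")
```

Of course, a correlation coefficient on a single year of data says nothing about why the curves move together, which is exactly the interpretation problem discussed below.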

It suddenly struck me that there could be a difference in the results between nations for these words as well. When I checked an English-speaking country close to Sweden, where I live (the UK), the result was slightly different.

[Image: Google Trends graph for "future" vs "depression", UK]
What struck me here was that the searches for both the word "future" and the word "depression" actually dipped close to the end of the year. The difference in volume between the words seemed slightly smaller than in the graph above, and both curves seem to point slowly downwards.
Searching other English-speaking countries led me to Canada.

[Image: Google Trends graph for "future" vs "depression", Canada]

In Canada the pattern is even more distinct. Searches for the word "future" peak dramatically just before the end of the year and seem to be increasing over time.

What about the US?

[Image: Google Trends graph for "future" vs "depression", US]

Whoops, what a difference! The search volumes hardly differ at all! Can we draw the conclusion that the US is in a more volatile psychological state than both the UK and Canada, or what?

To me these seem to be significant results, but what do they really show about the state of the world or the state of the nations? Comments, anyone? Is there any elaborate sociological research around Google Trends anywhere?

This was a brief search without any more elaborate analysis, but I think I will use Google Trends more as a thermometer once it has become more established.

And what about connecting GapMinder with Google Trends? That would be explosive!!


The other day a friend unexpectedly referred to Fermi problems (attributed to the famous physicist Enrico Fermi). I am the one who once (in ancient times) studied mathematics and physics, and he is a marketing/advertising guy, so I was a bit surprised. When I realized how he used the notion of Fermi problems, I suddenly saw its pedagogical significance for describing the value of forecasting in general and, more specifically, the value of developing scenarios.

In Wikipedia we can read this about Fermi problems:

The classic Fermi problem, generally attributed to Fermi, is How many piano tuners are there in Chicago? A typical solution to this problem would involve multiplying together a series of estimates that would yield the correct answer if the estimates were correct. For example, we might make the following assumptions:

1. There are approximately 5,000,000 people living in Chicago.
2. On average, there are two persons in each household in Chicago.
3. Roughly one household in twenty has a piano that is tuned regularly.
4. Pianos that are tuned regularly are tuned on average about once per year.
5. It takes a piano tuner about two hours to tune a piano, including travel time.
6. Each piano tuner works eight hours in a day, five days in a week, and 50 weeks in a year.

From these assumptions we can compute that the number of piano tunings in a single year in Chicago is

(5,000,000 persons in Chicago) / (2 persons/household) × (1 piano/20 households) × (1 piano tuning per piano per year) = 125,000 piano tunings per year in Chicago.

And we can similarly calculate that the average piano tuner performs

(50 weeks/year)×(5 days/week)×(8 hours/day)×(1 piano tuning per 2 hours per piano tuner) = 1000 piano tunings per year per piano tuner.

Dividing gives

(125,000 piano tunings per year in Chicago) / (1000 piano tunings per year per piano tuner) = 125 piano tuners in Chicago.

It is really about having a reasonably correct estimate (at least the right order of magnitude) based on what you already know, rather than having a correct answer. The point is that sometimes this is enough for your purpose, especially when that information is all you have at the moment.
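The quoted estimate is straightforward to spell out as arithmetic; here is the same chain of assumptions as a few lines of Python:

```python
# The piano-tuner Fermi estimate from the quoted example, step by step.
population = 5_000_000            # people in Chicago
persons_per_household = 2
pianos_per_household = 1 / 20     # one household in twenty has a piano
tunings_per_piano_per_year = 1

tunings_per_year = (population / persons_per_household
                    * pianos_per_household
                    * tunings_per_piano_per_year)   # 125,000

hours_per_tuning = 2              # including travel time
hours_per_year = 8 * 5 * 50       # 8 h/day, 5 days/week, 50 weeks/year
tunings_per_tuner_per_year = hours_per_year / hours_per_tuning  # 1,000

tuners = tunings_per_year / tunings_per_tuner_per_year
print(int(tuners))  # 125
```

Writing it out like this also makes it easy to vary the assumptions and see how sensitive the final number is to each one.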

When working with the future you never have enough information, and you have to use the information you do have in an intelligent way in order to understand what it means for the future. The point is not to predict the future in any deeper sense, but to use your brain and the available data to produce at least some intelligent conclusions about the future that will help us make decisions today. I would argue that these conclusions are similar to the Fermi estimates you make when you don't have enough information.

For many years I have been working in and with organizations that suffer from increasing complexity. The problem is usually not the complexity in itself but the inability of top management to acknowledge the current level and understand that this particular degree of complexity requires a different set of methods and skills. It is most likely meaningless to talk about management as one single area of expertise across all the levels of complexity, uncertainty and self-regulation we see in organizations today.

One of my problems has been to find a way to talk about different levels of complexity in organizations in a way that a management team understands. Sometime around 1996-97 I had the idea of mapping all mail conversations to provide us with a pretty good view of what was happening in the company at a certain time. I didn't succeed in convincing anyone that this was a good idea, and since I had a limited budget the idea was scrapped.
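The mail-mapping idea can be illustrated in a few lines: count sender-receiver pairs and treat the counts as edge weights in a sociogram. A hypothetical sketch, with invented addresses, of what such a first pass could look like:

```python
from collections import Counter

# Invented mail log: one (sender, receiver) pair per message.
mail_log = [
    ("anna@corp.example", "bo@corp.example"),
    ("anna@corp.example", "bo@corp.example"),
    ("bo@corp.example", "cecilia@corp.example"),
    ("anna@corp.example", "cecilia@corp.example"),
]

# Each distinct pair becomes an edge in the sociogram; the message count
# becomes the edge weight.
edges = Counter(mail_log)
for (sender, receiver), weight in edges.most_common():
    print(f"{sender} -> {receiver}: {weight}")
```

From the weighted edge list one could then draw the graph or compute simple structural measures, which is exactly the kind of picture-over-numbers argument made below.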

From a completely different area come images of the call structure of a Linux system compared with the call structure of a Windows system, used as an argument that Windows is much harder to secure than Linux:
» Why Windows is less secure than Linux | Threat Chaos | ZDNet.com

The Linux system looked like this:

[Image: system call map of Apache on Linux]

and the Windows system looked like this:

[Image: system call map of IIS on Windows]

Talk about images telling more than a thousand words! If you connect this concept to what is going on in the area of developing sociograms for different organizations, you could maybe get an understanding of the complexity of an organization in a different way than before. Of course you could make a mathematical complexity analysis and find out a lot more about the system, but what I feel is needed is a social science theory and a taxonomy that say something about which methods and approaches are valid for different levels of complexity. I am talking about pedagogy, like e.g. www.gapminder.org.

Another issue is of course tracking how complexity changes over time, how individuals manage to reduce the complexity of their immediate work situation, and how the system emerges.

From a futurist perspective it would be interesting to assess what state different organizational structures are in. Which are in fact able to formulate a direction and move in that direction, and which are not? Since I believe that horizontal organizing is step by step becoming more effective while traditional vertical organizations are losing their capability, this could be another way of looking at the development over time.

This is probably connected to Dr Ichak Adizes' ideas of organizational life cycles as well.

Wow!!

For me, being a complex and visual thinker, it was really amazing to find a site that collects so many visualization techniques on one page. I see it as a really valuable tool when you are working on a scenario planning project and looking for ways to present data that explain the relations behind the driving forces backing up your scenarios.

Look for yourself at: http://www.visual-literacy.org/periodic_table/periodic_table.html

Thank you, FutureHIT!


Google has released a fascinating and probably controversial new tool onto the world: Google Trends. By simply letting users search directly in the metadata, some of the higher-level patterns of what people search for come out of the box. If you e.g. search for "united states", you get a list of the regions the searches come from. In this case most of the searches for "united states" have by far been done from inside the US. Surprise!

It becomes more interesting when the answer surprises you. It took me just a couple of minutes to find results that made me wonder what Google Trends really shows. When searching for "sweden", Nigeria surprisingly comes up first on the list. Why do Nigerians search that much for Sweden? I tried countries like Norway, Denmark and Finland, and they themselves show up first on their lists of search origins.

Another peculiar thing came up when I searched for "system dynamics": the greatest inquiring region by far was Iran. Why is that? Do Iranians have some special relation to system dynamics, or are they just inclined to search the Internet about it?

The discussion about what search engine statistics really can tell you has now started, especially since a tool like this sounds like it could be extremely valuable for different kinds of business and military intelligence.

To me as a forecaster the value of a tool like this is huge. How searches for certain words develop can tell a lot, even if you have to be careful about how you interpret the results. Look for instance at the rise of interest in Wikipedia and compare it to the rise of the word "blog". It becomes quite obvious that since the middle of 2005 the word "wikipedia" has been catching on faster than the word "blog".


After reflecting a while on Jamais' idea about open source scenario planning, and after reading Art Hutchinson's comments about some of its difficulties, I thought I should add some reflections of my own about problems an open source approach could face.

After having taught scenario planning for several years, as well as being involved in several scenario projects at different levels, I have had trouble accounting for the different levels of quality in the results. It is no secret that scenario planning is an art and not a method, which of course has something to do with it. What I see almost every day is that, overall, young students have problems grasping and formulating driving forces, uncertainties and possible chains of events of any quality. What comes out are scattered sentences that don't seem to connect to each other. The ability to reason at that level of abstraction, time-span or uncertainty doesn't seem to be present.

After having delved into Elliott Jaques' theories of time-span capacity and how people are able to manage uncertainty and levels of abstraction, I have found a tool for analysing this (there are many, many references, but the article "Are you big enough for your job? Is your job big enough for you?" by Judith McMorland is pretty new).

What I find is that people with a certain level of capacity (level IV and above) are in general capable of talking about abstract and uncertain chains of events in an intelligible way. At lower levels people can in general understand and appreciate the result of a scenario process, but are seldom able to construct scenarios with any consistency or quality.

Why does this theory apply to this approach and not to, say, open source software development? Probably because when constructing something "mechanical" you can test things and see if they work. Different individuals can provide the project with their own code, but the test comes when the code runs: either it works better than before or it doesn't. What we don't see in open source processes is which individuals are ignored, or all the code that doesn't meet the quality bar and thus gets replaced by somebody else's. Working code is a brutal fitness test that effectively filters out the crap.

In an open source scenario process such a filter is not possible, and most of the crap will quickly hide all the goodies.