Future of content – Conclusions

This is the conclusion to the distributed Future of Content article series:

In Part 1, I argued, on economic and quality grounds, that content would move towards being free and widely available.

In Part 2, Ray Corrigan argued that online content in fact gives new types of power and control to large institutions, which would have the weight of the law behind them to prevent the open access I outlined.

In Part 3, Patrick McAndrew focused on education, and the OpenLearn project specifically, examining the motivations for universities to give away content and what this means for the university business model. He also warned that while free, open content may generally be thought of as a desirable aim, it might mean we lose some things we currently cherish.

In Part 4, Will Woods took a technical perspective and argued that nine current trends will drive a move towards more, but not necessarily all, content being free.

(There is also my response to Ray’s post, Ray’s response to me, my response to Patrick, and Ray’s response to Patrick – phew!)

In conclusion then, the changes wrought by the digitisation of content and its distribution via the net are the biggest challenge now facing those who work in content industries. Essentially, they face two choices:

i) Find ways of maintaining the publisher model, by managing the rights and use of content through a combination of technological and legal controls.

ii) Find new business models that give away content but build and sell services around it.

I would suggest that the struggle between these two modes of operating will define content industries over the next five years.

Let us take broadcast content as an example. Making high-quality content is neither easy nor cheap, so although the web 2.0 world sees everyone become a broadcaster and the mass become the media, the type of content that can be produced is limited to an extent. Wikipedia has demonstrated that a massively distributed process can work for content as it has for software, but it is difficult to imagine how a high-quality television series (think of a historical drama such as Rome) could be produced for free by distributed individuals. It might therefore be argued that high-quality content will become more valuable, not less. If the first option is chosen, then a few television providers hold the quality content, which is paid for on a subscription basis.

However, although web 2.0 content cannot compete on quality, it can compete on diversity and quantity. This sets up a competition for user/viewer time and attention. The viewer has a choice: do I spend the next half an hour watching Rome, or do I listen to a podcast on a subject I am really interested in, watch a couple of inventive YouTube clips, and read a blog posting by Stephen Fry (actually, the last one will take all of the thirty minutes)? If watching Rome costs me money, then the second option becomes more attractive. So, if the second option is chosen, the broadcasters make their content freely available (now you can watch it on YouTube while reading the Stephen Fry post) and make their money through advertising, sponsorship, or providing the broadband you are using to watch it.

It seems that there are competing pressures in society currently, pressures that Stewart Brand identified all those years ago:

"On the one hand information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other."

In content businesses we now see this conflict: increasing control is being developed through the legal and technical frameworks that Ray has set out, while at the same time there is a massive movement towards widely distributed, freely available content. It may be that this struggle is never finally resolved, but instead revisited regularly.


  1. John Connell says:

    Great series of cross-postings, Martin. I think it’s safe to say the experiment has worked, and you may see a few emulators come along in your wake.
    The richness of the debate means that I am still digesting all that you and your co-conspirators have offered us.

  2. Steven Verjans says:

    Interesting discussion, but I find myself thinking: it would be nice if all the texts were in one single location. Goes to show how Web2.0 we really are…