This week ‘trolling’ seems to have taken the leap from societal irritant to public safety issue.
It’s interesting when an ongoing, underlying problem that’s been around for a decade suddenly leaps onto the front pages. It’s almost as if an entire community of journalists is looking for a consistent, predictably occurring issue to report on that requires no research other than a quick glance at their Twitter streams…and, oh yes, it’s August…
What I find interesting about this debate is that it seems to be another staging post in the decline of ‘internet exceptionalism’ – the idea that the online world is somehow different to the real world, and brings with it its own rules and culture.
One part of this is an ongoing technical debate about ‘what Twitter is’ – is it a conversation in the pub, a scrawl on the toilet door or a newspaper you read to pass the time – and what is the legal status of all of these elements. I remember an impassioned debate in the office around this in the age of ‘Giggs-gate’, and the conversation doesn’t seem to have been satisfactorily resolved (though Lord McAlpine has done a good job of embarrassing a few people who deserve to be embarrassed.)
However, to me the interesting element to this new trolling debate is the undercurrent that ‘something should be done about it’. By its nature, this statement really means ‘someone else should do something about it, with little input from me other than righteous anger.’ And ‘someone’ tends to be an amorphous blend of the Government, large corporations and the police.
This to me is a sign that the Internet is now clearly finishing its first phase, its ideological younger years. The whole concept of networked information was predicated on the idea that there is no ‘someone else’ – that all users should take responsibility for the processing power, data, cost and evolution of the shared network.
This personal responsibility survived beyond the initial days of computer scientists in California and Washington and predominated in the early days of the consumer Internet too. Trolling of one form or another has been around a long time, but the assumption was that it was everyone’s job to ridicule, report and moderate abusive and irritating behaviour on the internet, much in the way that you would (hopefully) pick up litter in the park. The whole point was that for the first time we had a space that wasn’t edited, or curated, but was a common playground for us to freely share views.
This ideology burns strongly in some of the smaller, healthier, more culturally homogenous communities online, and of course the continuing wave of ‘information martyrs’ that are reinventing the global security debate. But the sense of collective responsibility online seems not to have survived the transition to large, mass usage, highly commercialized community spaces.
This may be inevitable – it may simply reflect the fact that vibrant cultures struggle to survive under the weight of mass participation and IPOs built on personal data.
However, before we heckle our own freedoms of speech out of existence, I think it’s worth stopping to think about what we might be giving up. And also to retain the sense of perspective that the bank of headlines about the evils of the Internet is being carried by newspapers…which is slightly analogous to shaping your feelings about Christmas based on the opinions of turkeys.
Consciously or unconsciously, when classic editorial media outlets are writing any story about digital media businesses, they are writing about their commercial competitors, and any good Marxist would warn you of the dangers there. Though of course the irony is that there is nothing better you can do to drive time spent on Twitter than to ignite a conversation about Twitter.
There is no doubt that if these communities are to remain self-sustaining communities rather than editorial media, then we as users need more control. This means better tools – to moderate our communities, and to manage our data. But to paraphrase…I totally condemn your decision to be an idiotic, abusive, irritating troll, but I will fight at least fairly hard to defend your right to do it.
After more than 2 years working in the US, I will be returning to the UK in 48 hours. During those 2 years I have been asked pretty much once a week: “what is the difference between working in the US vs the UK?”
It’s a complicated question, and my experience isn’t broad enough to really put it in a neat summary, but here are five brief points to capture some thoughts on what is most different.
1. Working: Hard vs Always
‘So, does everyone work really hard over there?’ Well, I work in advertising, and I do a lot of pitching, and I think it’s fair to say that anyone who is doing that works pretty hard everywhere in the world – and the US and the UK are probably the two most competitive markets, so it gets pretty extreme in both.
If I were to define the difference, it’s that I have yet to experience in the US quite the level of intensity and speed of collaboration that I experienced in the UK in a comparable scenario. Sometimes it feels like American teams across agency and client take the long road – long meetings, complex processes, many disciplines creating a lot of work – rather than rolling up their sleeves and committing to collectively take as many shortcuts as possible (which is one definition of working hard.)
The big difference is that apart from a few entrepreneurs and manic CEOs, I haven’t found many Londoners who view their weekends, holidays and nights’ sleep as fair game for working time. On one project I worked for 4 consecutive weekends, and regular 16 hour days – and this wasn’t viewed as a titanic individual effort, but just keeping up to speed.
In fact, my penchant for weekends free of email and 2-week holidays once a year can come across as pretty eccentric in the US.
2. Money: Embarrassment vs Riches
Londoners might stereotype Americans as pretty brash individuals who can never stop talking about money. Americans might stereotype Brits as charmingly over-apologetic Hugh Grant-a-likes. And there’s actually a lot to this comparison.
I’ve come to the conclusion that it’s actually the Brits who are a little strange in this respect – I don’t think American businesspeople have any particular compulsion to talk about money (when doing business, discussion of money probably comes with the territory.) On the flipside, many Brits seem to have a pathological aversion to even mentioning money, whether it’s budgets, salary, investment or taxation.
This is particularly a challenge in the innovation space – some Brits seem to think that the brilliance of an idea should be enough to carry it, and that it shouldn’t be sullied with commercial considerations. Whereas American innovators are much happier flitting between funding rounds and engineering scrambles. Which is one reason why the ideas:businesses ratio tends to be much healthier on the western coast of the Atlantic.
At this point, a blast of the US intro to the Apprentice as a money-flavoured sorbet.
3. Innovation: Thinking vs Doing
Within my own field of media thinking, there is no doubt that the UK leads the world. Pretty much every agency I know in New York has a British Head of Planning, Strategy or Innovation. The Brits just seem a lot more comfortable with random digression and with highly conceptual or theoretical thinking in the workplace, even in fields like advertising where it is clearly a highly valuable commodity everywhere.
The downside of this approach is that it is easy to feel that concepts are being sullied by their contact with the real world and with the action plan. This is accompanied by a fear of failure that doesn’t exist in the same way in the USA. There, it’s fine to try something that might be a bit wrong, learn from it, try again, fail again, and eventually strike it lucky. In the UK, a four-time failure who wins in the end would be perceived as a comical story of the plucky underdog. In the USA, that’s just a success story.
So innovating through thinking vs innovating by doing – it’s fair to say that a bit of both is probably the best place to be. Which, to anticipate my conclusion, is probably why teams that mix the two cultures tend to be so effective.
For a shorthand, a UK strategist is likely to say ‘I just want to think about interesting problems’ – their US equivalent is more likely to say ‘I just want to do cool shit.’
4. Job titles: Descriptions vs Definitions
Nothing is more amusing to the Brit abroad than having a meeting with a VP, an SVP, an EVP, a Managing Director and a President. In our minds, this is a shorthand for obsession with hierarchy, and whilst the UK is hardly Sweden for egalitarianism it is fair to say that the sight of a young graduate employee with their arms around the CEO is far more likely at a British Christmas party than its US equivalent (if there even is one.)
However there is a broader truth behind this – I don’t know how much of it is a function of scale, and how much is a function of culture, but US job roles tend to be a lot more specific, whether by discipline, department or hierarchy. There is also much more of an expectation that you will keep to your role, and that your work will be defined by it.
This is great for accountability, but it can be pretty inflexible when it comes to working collectively. This is why after much deliberation Jumptank elected the inscrutable, hierarchy-busting title of Partner. This came in useful many times in driving collaboration without borders.
5. Opening a conversation: Apology vs Storytelling
I have a somewhat hackneyed presentation opening/ice breaker that goes: “I have to start with a quick apology. Because I’m English, and that’s what we do.” Look, I even did it then – I totally undermined myself before I even said what I wanted to say. Now I’m going to have to think of something else, but it served its time for a few cheap laughs.
I have had UK colleagues emerge shocked from meetings at their US counterparts’ enthusiasm for talking about themselves, and had US clients totally misunderstand my stream of caveats and seek to reassure me that the work was really good and I should stop apologising. I have also interviewed candidates in New York who have spent 20 minutes telling me their life story and their many successes before I have even asked a question.
Neither of these is optimal, clearly, but one thing that has astounded me (mainly in the bar rather than the office, but it stretches there too) is the average US working male’s incredible capacity for storytelling. For a Brit, a story told in public tends to last 3 minutes and end in an ironic observation about life. Here, I have been told stories that were an hour long, stretched over many years of experience and extracted at least three peaks of hysterical laughter.
This is a skill worth learning, in business as in life.
US vs UK: who wins?
If the answer isn’t totally obvious, both cultures are wonderfully mad, unique, dynamic and frustrating. If there is one thing I have learnt it is that the two in combination are a force to be reckoned with, and I am proud to have had great experiences in both countries and hopefully learnt from them too.
After a decade that has worshipped the ‘crowd’, and the infinite capacities of the collective intelligence, the tide seems to be rolling back as people once more begin to suspect the fallibility of ‘Groupthink’ and once again idolize the tyrannical creative genius.
'People don't know what they want until you show it to them'
This underlying dialogue affects everything – our attitudes to politics, the way we structure our organisations, even the art we admire as we fall in and out of love with auteurism.
On a more banal level, it impacts the way that we go about effecting tasks, and how we construct the goldfish bowl of collective actions – from conferences, to workshops, to committees and boards. (For the practical-minded, I will end this post with some suggestions on how to apply this thinking to structuring working groups; if you are really busy, you can go straight there.)
The basic truth I am exploring is that different unit sizes of people are better at different things.
Without doubt technology has changed those dynamics, taking the theoretical concept of the ‘general will’ and hardwiring it, making group action possible at much greater scale and speed than ever before.
One of many beautiful visualizations of the creative community of the internet.
But humans remain substantially the same in their cognitive capabilities and emotional instincts. Which means that nothing foolproof can be created by lone geniuses, and nothing truly beautiful can be created by wise crowds.
So how do you know how many people you should get to do what?
1) Individual Brilliance
I spent a lot of the last year or more tirelessly advocating an open-minded and open-hearted sense of collaboration, and an openness to letting ideas or projects be strengthened and improved by exposure to as much external stimulus as possible. 99% of which I stand by.
But then I rediscovered Rousseau’s Confessions…and read for the first time Steinbeck’s incredible East of Eden:
“And this I believe: that the free, exploring mind of the individual human is the most valuable thing in the world. And this I would fight for: the freedom of the mind to take any direction it wishes, undirected. And this I must fight against: any idea, religion, or government which limits or destroys the individual. This is what I am and what I am about.”
This is an incredible book. Disturbing but compelling. And best read alone.
These both reminded me that any really original, well-developed and imaginative idea, a synthesis that has integrity and power, is developed in a period of solitude and meditation. The same even applies to blog posts.
2) Pairs Solve Problems
Budding philosophy students are sometimes puzzled to flip open Plato and discover not well-reasoned essays, but these rather strange little two-character plays, often featuring Socrates and some randomly selected, often very confused 4th Century BC Greek called Phaedo or Phaedrus or Parmenides.
The pairs are rarely evenly matched in intellect or moral clarity, but the dialectic celebrates the fact that two heads are better than one. Generally one of the partnership has the driving thesis, but by dealing with the confusion, agitation and indignation of their interlocutor, the thesis will gradually become better and better and rise to the level of genuine synthesis.
Dialectic - so powerful that even Watson can do it.
The same works for being a detective, or indeed for comedy. And as anyone who has ever spent the late hours of the evening doing a wall review of an important presentation with someone who simply refused to understand it can attest, it works there too. Ultimately, pairs are great at solving problems, and if you want strategic solutions, putting two heads together is generally the best way to do it.
On the other hand, this is what happens when one smart person tries to do dialectic on their own, or with a hostile partner:
3) Triumvirates Create Friction
Ancient Rome at the close of the Republic was a fanatically warlike culture. As a result, it is deeply unsurprising that they opted twice for the triumvirate form of government, first with Caesar, Crassus and Pompey, and then much more finally with Octavian, Lepidus and Mark Antony. Because nothing is more certain than that a triumvirate will end up fighting.
Three heads are fightier than one.
The competitive urge created by triumvirates, both for dominance and for each other’s exclusive attention, can briefly create a highly creative and expressive dynamic. If you are looking to open up a question, find some disruptive ideas, or get beyond obvious solutions, a trio left alone to fight for an hour might get you there. If you are looking to create a stable form of government, look elsewhere.
4) Fours Execute Missions
Four is my desert island group size, being both the size of my current team and the size of all the greatest bands in the history of music. It is also apparently the size of Navy SEAL units.
A group of four is however generally considered highly unstable – which is not my experience at Jumptank, and hopefully is not true of Navy SEALs, but obviously and spectacularly is true of most of the greatest bands in the history of music, who very often after a decade of intense brotherhood and success shift into a couple of subsequent decades not speaking to each other.
The tricky thing about fours is that they create a lot of opportunities for pairing off into cabals. But if this can be avoided, a four can be highly effective. There are two key rules to getting the most out of a four:
a) Make sure they have a clear, emotional shared mission. For the Seals, I guess this is easy – for bands it is harder because the mission is so often success…which means that when you achieve it, the band breaks up.
The Stone Roses - firm friends up until the moment they made a single penny of money
b) Make sure the group has a diverse group of skills and personalities. This is the whole strength of fours – it is probably the largest group where a diversity of skills can be effectively hammered together. When everyone has a unique contribution to make, the power of the whole can be incredible. When everyone is stepping on each other’s toes, it can spiral quickly. The Beatles worked great when there was a funny drummer, a thoughtful guitarist, a melodic bass player and an angry rhythm player. When they all wanted to be everything, their time was up. Whereas the Stones, where Charlie Watts is happy to be ‘the bed I lie on’ (Keith Richards) are still just about bumbling along fifty years in.
5) Democracy is Odd, not Even
Anything over 4 is basically hopeless for actually creating anything, but if you are making decisions, there is definitely some wisdom in numbers. Groups are good at criticizing and judging (as long as something is pretty well thought through already.)
One secret quirk of decision-making groups is that it can actually be pretty important to think about whether the size of a group is odd or even, if there is any chance that they might formally or informally need to vote on something. An even-numbered group can deadlock in a tie; an odd number guarantees a majority. Beyond that, the group will want to act effectively as a group – which might lead to some consensus voting that could lead you in the wrong direction:
Never ask a band to make a 'majority rules' decision
6) 12 Apostles, not 12 Angry Men
Once you get to double figures, the group begins to lose the ability to even make decisions effectively. I mentioned this to someone recently, and they raised the test case of jury sizes – if a group of say 12 (also an even number) is so bad at making decisions, why has it been a key historical unit for judging?
My answer is this:
The point of having 12 people on a jury is not to make quick, effective decisions – but to maximize doubt. If that is your goal, by all means have a 12-person decision-making panel. If not, I would think 12 apostles – great for spreading the word, for going and doing likewise, but not a nimble and accurate decision-making group.
7) Dunbar’s Number: 100-230
The final unit size I’d like to consider is the ideal maximum size for a team, or a company. The key element to remember here is that you aren’t just herding cattle – you are trying to create highly effective and sustainable networks. The ideal number here is ‘Dunbar’s number’ – that a group shouldn’t exceed roughly 150, a number limited by the size of a human’s neocortex.
If it feels to you like a healthy community ought to be able to be bigger than this, that’s because you’re thinking about the wrong number. The question isn’t how many people are in the group, but how many relationships exist within the group, and therefore how well integrated it is. This grows quadratically – the number of pairwise relationships in a group of n people is n(n-1)/2 – so 150 people actually means 11,175 effective working relationships to maintain. Ow.
The number is actually supposed to vary a bit with intelligence levels, meaning you can stretch the effective group to about 230 in academic circles for example. Though I personally am yet to see a cohesive unit of 230 academics.
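The pairwise-relationship arithmetic behind Dunbar’s number is easy to sketch. A minimal Python illustration (the function name is mine, purely for illustration):

```python
def pairwise_relationships(n: int) -> int:
    """Count the distinct one-to-one relationships in a group of n people.

    Each of the n people can relate to (n - 1) others; dividing by 2
    avoids counting each relationship twice.
    """
    return n * (n - 1) // 2

# From a band of four up to the stretched 'academic' group of 230:
for group_size in (4, 12, 150, 230):
    print(group_size, pairwise_relationships(group_size))
```

At 150 people this gives the 11,175 relationships mentioned above; at 230 it balloons to 26,335 – which may be why a cohesive unit of 230 academics is so hard to find.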
8) The Wisdom of Crowds
Many people have already written brilliantly on this subject, both on ‘community crowds’ (eg very large groups of programmers working on improvements to open source software) and the intuitive capabilities of ‘real crowds’ (ie randomly selected, much better at Who Wants to be a Millionaire than your best friend, but occasionally spectacularly wrong.)
My only contribution would be that real crowds are good for research; and community crowds, whilst connected to a larger effort by software platforms, tend to work in practice in ones, or twos, or sometimes, very briefly, in triumvirates.
GROUP DYNAMICS IN PRACTICE – WORKSHOP EXAMPLE
1. Give well-stimulated individuals the scope to create and craft ideas.
2. Encourage dialectic pairs to improve and substantiate those ideas.
3. Allow threes to get a couple of disruptive new ideas on the table.
4. Give a well-balanced four the mission of bringing the ideas to fruition.
5. Get a group of 7 or 9 to sift and choose the best ideas.
6. Use the broader group (12, 14, 18…?) to go forth and spread the word.
7. Don’t get a team of any larger than 150 to execute anything, ever.
8. Test your solution against the crowd – don’t expect them to design it.
They appear in some of the greatest works of art in the history of the world – because they had a) money and b) taste.
For centuries, the only way to get your art made was to make it for the Church, or in some cases for the Court. One of the key elements that spurred the great artistic reawakening was the pure economics of the situation – a new wave of more worldly masters, mainly merchants and venture capitalists. Of whom the undoubted rockstars were the Medici.
Out of love of glory bound together with love of art, they empowered artists, writers and poets to broaden their horizons, to explore the secular, and to express themselves on the grand stage.
Closer to the present day, we have this:
Now Richard Branson is no Medici (thus far.) But he is a representative of something we are all familiar with, whether as professionals or as irritated filmgoers – that as the old commercial models of the video industry crumble, we see increasing roles for businesses in funding (and appearing in) entertainment content.
It’s easy, and perhaps right, to be pretty suspicious of this force, particularly as it migrates to a more subliminal level, and to the screen in your living room rather than the public forum of the cinema.
They really could have leveraged the space on his forehead more effectively to improve brand recall...
But the fact is that the economics of the world of TV are changing. This will definitely mean a more integral role within the world of TV content for brands and companies. I would argue that there is also a potential role for individuals, and that the day of the great patrons will return. Why?
In the past, there were 4 things that gave you power in the world of TV, and that made the TV channels and networks powerful (loosely quoting Andy Lippmann, of the MIT Media Lab.)
1. Access to a mass form of distribution
2. Money for and capability at promoting content
3. Venture capital to invest in creative projects
4. The taste and expertise to act as a curator and editor
What is notable is that we sit at an interesting turning point in the world of TV, where the first two are changing beyond all recognition.
What an Apple TV almost certainly won't look like
Distribution is through the internet or wireless connection – and YOU own and pay for that. And promotion…well, increasingly, you do that too, because the recommendations you and your friends make to each other are much more powerful than any advertising campaign.
So that leaves two aspects: venture capital for creative projects, and the taste and expertise to choose the right content and support the right talent.
Which sounds an awful lot like artistic patronage. Add this to an app-led, tablet-style interface and the chances of watching the Bill Gates or the Madonna channel seem more and more likely. Especially as Twitter, Facebook and Google Plus increasingly condition us to expect a Web of content organised around people, rather than topics or channels.
This wouldn’t be anything brand new – CurrentTV feels near as dammit like the Al Gore Channel.
Current TV. Good, but ahead of its time.
And it won’t necessarily be individuals…it could be collectives (like the Coppola/Bogdanovich/Friedkin ‘Directors Company’ of the 1970s.)
The Directors Company - good, but ahead of their time (and generally mad and power-crazed)
But in the next phase of TV, artistic patronage could come back in a big way.
Innovation means doing things differently to do them better. That means breaking rules and conventions. And not all rules are of the ‘wear a tie to work’ variety.
When breaking rules, always be sure to make the typefaces off-horizontal
In order to make things work better, society as a whole needs people to test and break the ‘big’ rules too. These people are criminals.
99% of criminality is of course not socially productive. But, either by intent, or by the response it provokes, it does form an essential function in helping society to innovate and improve. Here are some unlikely examples:
1. Exposing flaws in outdated systems.
When you think of a criminal, one of the stereotypes that first leaps to mind is one that was formed in the Midwest of the US in the early 1930s. Dillinger, Bonnie & Clyde, Machine Gun Kelly, the Barker Gang – a flurry of iconic figures that all derived from a very specific time and place.
This spectacular outburst of criminality derived from two seminal technological innovations – the creation of cost-accessible machine guns and cars.
But what it exposed very rapidly was the Achilles heel of the American police system – the disunity between states. As huge as the United States is, the addition of the car enabled criminals to travel between 9 different Midwestern states, and thus 9 different legal systems, within a couple of hours. Once you got over the state line, you were fine.
Have car, will travel. 6 states in 2 years. 6 legal systems.
This had some negative consequences of course – primarily the militarisation of the FBI and subsequently the police. But the real innovation that they drove, and which J Edgar Hoover used their notoriety to champion, was a centralisation of shared criminal justice data. Which was the first great data project that began to make the United States a more unified society under law.
So think of criminals as the ultimate QA testers of a civilised society.
2. Pioneering in the grey areas of the economy.
Sometimes, the rules simply aren’t very clear, because the dynamics of society, information and the economy are changing far quicker than the ability of legal processes to catch up. Particularly in areas relating to media and technology.
For example, from the moment of the invention of the internet, the entire principle of music ownership and copyright was effectively a Wild West…which the massed ranks of copyright lawyers, record labels and music publishers had little or no interest in exploring. Then came Napster (invented, of course, by Justin Timberlake).
Whatever you think, or thought, of Napster (and I am sure a few Metallica fans out there thought it was a bad thing), there is little doubt that it rapidly accelerated a wonderful burst of innovation in the music and content business. And without the Napsters, or Pirate Bays, or other people who by simply not caring about the law define themselves as criminals, it is difficult to make progress.
This need for legal flexibility in the area of content, media and technology continues to flummox us. These questions need to be answered at some point, but rigid initiatives like SOPA are not the answer. And to even see where the lines should be drawn, you have to allow pioneers to colour in the grey areas.
3. Destroying ‘holy cow’ legacies of the past.
Lest we forget, the legal standards of the present have not always been reflected throughout history – the law is evolutionary, not absolute. Nor is it always ‘progressive’ – it frequently oscillates between extremes and loops back on itself. This can lead to some surprising results, and even very trivial crimes can have significant effects.
For a spectacular example, we can look at one of the most frequently asked historian’s questions: why did the French Revolution occur? Everyone agrees that literature played an important role. Traditionally, people thought the seminal text was this:
It turns out that the flood of cheap political pornography in 1780s Paris, depicting Marie Antoinette, Fersen and Louis XVI in unflattering sexual poses, may have contributed somewhat more to encouraging people to overcome their religious scruples towards the position of the monarchy, and get behind regime change.
Absurdly, given the neo-censorship of the web, this is the closest I can get to finding a political cartoon that was considered only mildly risque at the turn of the 19th century
Fairly trivial crimes, driven by disrespect and entertainment, can have highly constructive outcomes – when they challenge unjustifiable authorities that are holding society back.
This seems like a fertile area. Any other thoughts on the relationship between crime and innovation?
Elsewhere in the history of illicit entertainment, see the development of the internet and the explosion of personal photography/video equipment…and if you don’t believe me, check out how they sold the first Polaroids:
Advertising may get on your nerves in the here and now, but it can be a fascinating lens on how the world has changed. These ads from the 1930s-1950s show how advertising, and the world, have moved on. Sometimes, but not always, for the better. Spot the differences.
All images here are courtesy of the superb New York Transit Museum in downtown Brooklyn.
1. People used to assume that advertising was supposed to be useful.
This reminds me that in simpler times, advertising WAS useful.
2. And in fact, advertisers used to see it as their duty to fund stuff that people loved (and used advertising to remind them of the fact they’d done it.)
That is a good reason to change my brand of bread.
3. The design of many ads used to be truly BEAUTIFUL.
I am guessing Sunkist ads don't look this nice now.
4. Advertisers of the past were not afraid of wading into the gender war. (It’s a bit more subtle now, though no less pervasive.)
Is this powerfully political, or outrageously patronising? Hard to tell from these 8 words.
5. Health claims were not rigorously examined.
As a lifelong eczema sufferer, I can assure you that Cadium did not change the face of dermatology
6. But you were at least allowed to acknowledge that salt tastes nice.
I have no idea what this product tasted like. It doesn't sound great...
People have always defined themselves by what they aren’t as much as what they are. It gets passed on from group to group: the Brits defined (and continue to define) themselves by difference to the French, and in turn Americans by difference to the English.
This person simply will not pay duties to the hated English without due representation.
Otherness is a fundamental source of inspiration and aggression. Men and women, we are told, are from different planets. National pride and xenophobia continue their uneasy see-saw relationship, despite living in an age of global media, global business and global pandemics. As soon as the psychological power of the boundary between Lancashire and Yorkshire begins to lose its power, it is replaced by the power of the psychological boundary between environmentalists and Clarksonites. Everyone needs a Them.
Them has been one of the great subjects of the rise of the popular media. In particular, from the satirical pornography of the French Revolution to the neon Schwarzenegger epics of the 1980s, via rather a lot of good protest and faux-protest music in the 1960s, raw and paranoid rage against the secret machinations of those in power has been one of the driving forces of the development of what we know as popular culture.
Apparently, the presence of this hot material in Paris in the 1790s is what made revolutionaries feel it was OK to remove this woman's head
This format has been endlessly rehearsed, and forms the plot of roughly 25% of serious TV drama and 50% of Hollywood’s entire output. For me, it reached its nadir in the execrable Adjustment Bureau, on which I expended 2 hours of a recent flight that I could have better spent waiting in the endless, awkward queue for the world’s smallest bathrooms. The Adjustment Bureau was full of a ‘They’ that was all-powerful and yet at the same time totally bloody hopeless.
If you haven’t seen the Adjustment Bureau, it is a longer and less good version of this trailer:
Now I realise that power continues to be distributed unevenly and unjustly, and that governments like to get away with doing things they shouldn’t be doing (or at least haven’t asked permission for, like removing the tyrannical leaders of countries.) But as I watched the Adjustment Bureau, I couldn’t help but think of the material I had read from Wikileaks, and how unbelievably banal and even mostly well-intentioned it was.
The ability of the internet to scale discoveries at lightning speed and the ubiquity of the rolling news camera have largely laid bare the secrets of government. Those that remain buried are, unfortunately, buried deeper than ever. But, for the most part, we have seen all the rough edges and hidden errors of our leaders. And frankly, they mainly seem like people we would generally sooner pity than fear. FDR was able to act as President of the USA for 12 years, and only two pictures were known to have been taken that showed him in a wheelchair. Whereas with Dubya, we got a pretty constant serving of this:
Now was I thrilled that this guy had his finger on the nuclear button? No. But I knew he wasn’t really a monster, because I do stuff like that all the time.
But that merely moves us from the theatre of fear to the theatre of pantomime. It takes a real work of art to remind us that we aren’t in a theatre at all, and that if we are it certainly isn’t a paranoid retro-futurist conspiracy thriller. This is I think why the Wire is such great drama – not just because of its entertainment value, but because it achieves the almost unique attribute of making every single character rounded and worthy of empathy if not always sympathy. Everyone is trying, everyone is flawed, and everyone is connected.
The Wire’s creator allows homicidal drug dealers to die worrying about their hair:
And he allows cynical politicians to express their intentions…which are, at their root, generally positive, no matter how the system may pervert them:
This is a recognition of the reality of the world that we live in – that it is not the puppet show of shadowy cabals or machiavellian geniuses, but a totally interconnected system in which everyone, actively or passively, plays their part and affects every other part of the system.
And as the world we live in begins to run on a track of behavioural data and social connection networks, the reality is that the world is more and more becoming a great big human system, in which every one of us is a working part. And the structure of that system is as fragile as it has ever been. It is harder than ever to keep any part of the system secret or separate. And the system itself is much more fragile than it looks.
It has never been so easy to round up a posse
This is a positive thing and a negative thing. It is a danger and a responsibility. It enables some people to break long-standing taboos in the name of humanitarian action, and it enables others to create a pattern of chaos and looting without a cause. But ultimately it is all a reminder that the human system is there, and that everyone is connected.
In this context, it feels like we have taken ourselves down a blind alley and created a worldview that is far too centred on the idea of ‘Them’. You can’t get rid of a sense of otherness, nor should we try. But sometimes it feels like because of Watergate, and Hackgate, and Expensegate, and because of a moviegoer-level understanding of Orwell and Dick, and because of Grand Theft Auto and gang violence and all these things that seem not to translate across generations, we are really prone to devolving to a facile game of Us and Them – which is a dismissal of both hope and responsibility.
We don’t live in the Matrix. Rupert Murdoch isn’t a Bond villain. The TV screen is not the telescreen of 1984. We live in the most empowered and interconnected period in history. We all have the ability to reach out to other parts of society at any moment, or at least to hear their point of view. We have unlimited outlets for self-expression, and a legal system that fights bitterly for our right to use them.
It took mankind a lot to get us to this point, where we have so many tools at our disposal to feed our empathy and impact on our governance. And to watch the modern world slip by us like a bad sci-fi movie or a cartoon Lord of the Flies is a terrible waste of all that effort.
This week saw the closure of an outdated and unprofitable Sunday tabloid newspaper. It also saw a feeding frenzy on the unholy relationship between broadcast media players and politicians. One of these stories is very important.
It is so easy to use pantomime villains like Murdoch and Brooks, or even Cameron and Blair, as the focal point of our righteous indignation. To do this is to ignore something much more fundamental at work. What we are seeing is an assassination attempt on the now long-standing axis of News International and the British Government.
There are some angry people out there.
The white blood cells of the Guardian, celebrities and the massed ranks of the Twitterati are in full onslaught against every chink in the Murdoch armour. They are determined to use this moment, in which News International should have been celebrating their impending ascendancy as masters of the convergent media battlefield, to bring their ambitions crashing to earth.
This is no mean feat. After all, this is one of the most potent power relationships in the UK – to an extent many certainly do not realise.
This is also in the context of an entire past century in which political power and media broadcasting have been inseparable. In fact the political history of the 20th Century can be seen just as clearly through the lens of media change as through the lens of political wings. After all, it was the era of mass newspaper distribution, of radio fireside chats, of movie newsreels, and live televised debates.
When you think of the icons of the political nineteenth century, you might think of them through their portraits. Or perhaps through their speeches, or their nicknames. When you think of the icons of the political 20th Century, you almost immediately think of them through their media appearances. To be a political superstar in the 20th Century, you had to be a master of the media.
Perhaps the greatest of all was Churchill – just one of a list (FDR being another great example) of true gurus of the radio broadcast.
JFK is of course one of the most iconic American politicians of all time – despite a decidedly patchy administrative and moral record. But he was great on TV. Nixon (at this point a hugely respected figure of great integrity) was not.
And to take things to their most logical and ridiculous extreme, let’s not forget that this man is now pretty much the most respected President of the 20th Century.
And played out to its worst extremes, of course, the 20th Century brought us the tide of fascism and of Communist dictatorship – usually established on a bedrock of broadcast-driven cultural brainwashing.
Even in the succeeding and supposedly more cynical age, the power of the broadcast media continued. In particular the press, by which politicians remain absolutely entranced – it being the only medium that is truly interested in them, and which enables them to keep score. Particularly the tabloids, which they perceive as being able to connect with ‘ordinary people’ in a way that they themselves have forgotten. And of course to many of them, it is still the Sun wot won it (or lost it).
Some say Kinnock could have lost it without them
Blair and his ‘spin doctors’ were described as a new generation of super-cynical, media-obsessed politicians. In reality, they were the end of the old era – the last generation of effective media managers. They could still, just about, manage public opinion through 3 or 4 really big media relationships, with Murdoch as the centrepiece. But the mere fact that the world of spin is one of the first things we think of in relation to a government that brought peace to Northern Ireland and war to Iraq is testament to the unravelling failure of that form of message management.
And now, we see Cameron, the apparent heir to Blair, the PR man in Number 10, playing out the next stage of this decaying power structure. Suddenly his power base looks fragile, and his big bet on Murdoch and Coulson looks rash and destructive. Not only because of ethical questions – but because when it really comes to the crunch, even Murdoch’s legions represent a pretty small part of the spectrum of opinion, and a tiny fragment of the playing field of active participation in political discourse.
The relationship with the media isn’t going away as a crucial success factor for politicians. It can only become more extreme as media itself becomes a bigger part of life. But the axis of politicians with ‘The Media’ – ie a small circle of powerful but venal owners and editors – is no longer a sustainable power model. It is more transparent than ever, and there is more of the political discourse outside of their control. It is a more fragile base than ever on which to build control.
Nor are the traditional skills of message management going to retain the same power as before. The idea of owning the ‘news cycle’, practiced so successfully by Blair and Campbell in their honeymoon period, simply does not work if your ‘workings’ can be scaled to the population at any moment, without the need for a broadcaster to drive the distribution. Which is why this kind of approach from Ed Miliband simply will not work any more.
A new generation of politicians will no doubt find a new way to bend the media to their ends, but retaining the kind of control they are used to won’t be possible in the future. We see politicians dabbling in listening exercises and ‘Twitter Town Halls’ as they dip their feet in the future. But it is fair to say we haven’t got it worked out yet (I will consider this in a future post.)
One thing that is clear is that as with entertainment and marketing, a distribution model on its own will not be enough. Ultimately content – transparent and compelling actions – will be more powerful than ever.
Media may be making kids get older younger. It is almost certainly making adults stay younger for longer. If they meet in the middle, isn’t this a good thing?
As a doting and anxious parent, I am not always a big fan of the toy industry. Behaviours and attitudes formed early in life shape people’s long-term outlook, no doubt, and some of the materialism and sexualization of children’s play that is inherent in some toy lines really grates with me.
Not in my house
BUT this week I saw Mattel say something brave that had the unmistakeable ring of truth – that the widely accepted phenomenon of ‘KGOY’ (Kids Getting Older Younger) may be drastically over-stated. Kids may be dressing a bit more like us, but for the most part, kids are still kids, and their emotional and psychological development actually proceeds much as it ever did. http://bit.ly/jQATu7
This is part of a much broader myth with which we are all familiar – the ‘Myth of Decline.’ This takes many forms – the belief that society is getting more violent, less compassionate, more dangerous, totally apathetic, more materialistic – and is almost never borne out by rigorous statistical analysis. Nonetheless it is a powerful instinct, and one in which many media outlets and businesses have a vested interest. So it is fair to say it won’t be going away (and it will probably get worse. Bah.)
The scriptwriters for the Myth of Decline. Everything is getting worse. Apart from the Royal Family.
At the same time, I have been witnessing multiple examples of technology entering the world of kids. My daughter is regularly receiving ’emails’ from family and friends on her mini toy computer. My nephew recently participated in a live treasure hunt in a park in which he was enabled and tracked through GPS. And pretty much everyone I know who has young kids regularly witnesses them trying to control the TV through touchscreen…because the iPad just seems so normal to them.
Media convergence is not stopping at sensible, grown-up things. It is sweeping through the world of kids. My instinctive, Myth of Decline driven reaction is to think this is a bad thing. But why? Maybe it is a good thing?
I think there is little doubt that changes in media, in particular ‘social’ media, are bringing kids into ever closer contact with the adult world. But is separation from the adult world such a good thing? And does it ever really exist in practice anyway? The most popular TV show with kids in the UK, right from its very inception, has been Coronation Street. All of life is there. And as introductions to the world of adults go, maybe it’s not such a bad one.
And at the same time, adults are more and more happy acting like kids. This manifests itself in obvious ways – adults increasingly buy toys and gadgets, and spend time gaming, just like kids do. Could you imagine adults spending so much time on Facebook in a more ‘adult’ age? And in fact childish behaviours are increasingly seen as important elements in the serious business of learning to make the world a better place – as explored by the Lifelong Kindergarten at the MIT Media Lab.
In fact you can probably conclude that for the next generation, childlike ‘Play’ will have the same kind of impact on technological progress as the much more adult ‘War’ did in the last century. And the fact that we are sharing these media and play spaces with children significantly increases the likelihood of benign outcomes.
The other thing that is interesting here is that we are viewing ‘media’ as the dangerous catalyst that is throwing adults and children together – whereas in fact, from the village to the multi-family urban homes, shared adult/child lives have been the norm throughout most of history. It is really only in the ‘golden age’ of mass media that we are just leaving now that the whole concept of a separate ‘youth culture’ has been commonplace.
It has been a fertile counter-cultural melting pot at times – from Teddy Boys, to Punks, to Ravers and beyond. Alienated youth have been able to use media technology to alienate adults from their secrets, from the distortion of electric guitar, to the intensity of 200 BPM, to pirate radio.
But just as often we have seen some fairly mean exploitation of youth culture and youth media, from the Beatles to Bieber. Where some saw in the proliferation of media a chance for the young to question authority, it is fair to say more often the mass media has been used by adults to subjugate the young – and that ultimately in the war against the mass-media counter-culture, the suits won.
Maybe, through social platforms and location-aware services, through converged gaming platforms, kids and adults are increasingly going to start living in the same world. There have been many teething problems, and there are bound to be more. But maybe, just maybe, it will be a good thing.
So we may be at the point of finding some good answers to the problem of the last generation. Great. We are thus far not much closer to solving the problems of our own age – of reconciling with an older generation that is often alienated by technology, but whose size and needs are growing all the time. A problem for another time.
A very inspiring English Literature teacher once told me that all poems were about poetry, and all plays were about plays. Increasingly my newsreaders seem to be telling me that all news is about news.
Now I happen to think that poetry and plays should be about other things too, but what my teacher said stuck in my mind, because for literature by and large it is true.
The artistic process is inherently a meta-process, because any medium we engage, from blank page to blank canvas to blinking cursor, acts as a mirror to ourselves. Ultimately, when we engage with the world of imagination, we only have ourselves as material to work with.
Great artistic works, like Proust or Hamlet, are often acutely meta-textual, to an extent that they feel almost like organisms becoming aware of themselves. And in fact we almost define the trajectory of artistic progression as a medium’s path to supreme self-consciousness.
This is not a pipe. But it is a fantastic student poster.
But the news is different. Because the news, by and large, is one of the most essential tools we have for creating social cohesion and empathy, and for helping people to understand real events in the world around them.
But it seems that finding out the news is just getting harder and harder. Because all anyone wants to tell me now is the Meta-News.
What do I mean by this exactly? Well, after reading this, watch pretty much any TV news apart from the BBC World Service, or read pretty much any newspaper apart from the Financial Times, and you are likely to notice very quickly that about 50% of the airtime is devoted to coverage of the reaction to the news, or the process through which the news was obtained, or the difficulties in filming the news – with astonishingly little detail on what has ACTUALLY HAPPENED.
The modern news studio - a monument to Meta
Nowhere is this worse than in the world of 24 hour live news, in which the irregular flow of real news poses a significant threat to the much more regular flow of actual minutes and seconds. One thing that remains constant, however, is the speed at which people speak, film and report the news. That makes it a godsend to the rolling news editor.
This whole phenomenon went way beyond satire some time ago, though it has fed some of the very best, from Brass Eye to Charlie Brooker.
But that doesn’t make it any less disturbing to try to discover the details and impact of the Osama Bin Laden story, and to have to weed out a couple of actual facts from amongst the debris of people’s emails, footage of strange macabre people dancing in Times Square and a randomised selection of tweets.
This last area is particularly painful. News knows that email is important, and that there is every chance Twitter might be even more important. What is the response of TV news? Use it as filler. The ultimate, infinite time-filler of opinion. What’s more, a bottomless pool of opinion – which means you can easily find opinion to back the agenda of the broadcaster. Perfect.
The result – a relentless flow of jabber, which makes the angry angrier, the old-fashioned ever more befuddled, and which to the vaguely tech literate looks like an old-fashioned headmaster putting on sunglasses and trying to do some tricks on a skateboard.
All of which is silly and infuriating. But worst of all, it represents a collective shrug by the news broadcasting industry: a dismissal of the creative potential inherent in the most connected age of mankind – the potential to get people to understand and empathize with news in ways never achieved before – in favour of the news equivalent of the music you hear in lifts.
It’s not all bad. Anderson Cooper on CNN is immense, and his fluency in the multi-screen world is awe-inspiring – including the seamless integration of international coverage and inside accounts from Twitter and YouTube into his reports. This is what the new golden age of newscasting could be all about.
In the meantime, the great majority of news coverage is still rather self-excited, and lost in a tedious and iniquitous spiral of Meta-News. Let’s hope it emerges soon.