
Tech (For Good) Can Damage Your Health

James Richards of Chromatrope ponders what the world would be like if, as a society, we treated tech in the same way we now treat alcohol.

I work in London a lot, and this year more than ever, I’ve noticed that the festive season has brought a lot of very, very drunk people onto the streets. People want to have fun, and for lots of Brits, having fun equals getting as pissed as possible at the Christmas work party. We’ve all (probably) done it and I’ve certainly done it myself.

Some of us are trying to be a bit more mindful about how we approach both booze and ‘technology’. As a society we’ve seen plenty of critiques of binge drinking culture in the UK, and the media is forever pondering why we can’t all enjoy booze like the famously and possibly erroneously moderate French or Italians. The same is now the case for technology. Is it bad for us? Are we addicted to tech? Have we sleep-walked into a surveillance-capitalist-consumertariat dystopia from which there is no escape and no return?

As we all know, from the banning of alcohol in 1920s America to the contemporary ‘war on drugs’, prohibition doesn’t ever really work.

You can’t ban an idea, and if the idea is to drink, smoke, or drink un-taxed cups of tea, then people will find a way. Prohibition drives the market underground. Robber barons muscle in to make a killing from the human desire to party hearty, and the quality of the product degrades, becoming less reliable and less safe.

The consumption of alcohol now happens within complex legal, social and healthcare frameworks that emphasise safe patterns of consumption and the rights of ‘users’ to protect their own health and the health of others. Society once thought it was acceptable for children to drink alcohol, or for adults to drink and drive, but we now understand that isn’t safe or acceptable, and clear legislation protects us all from the harm that alcohol can cause.

As with changing attitudes to alcohol, the best way of protecting people from ‘bad’ technology is to change the culture of use.

Openness about our habits, and more human-shaped patterns of consumption that emphasise quality over quantity and the socially beneficial aspects of moderate drinking, certainly map onto tech.

Many of us have become critical, not of technology itself, but of the way in which its appeal to us as upright mammals has been honed and concentrated to a point where some of us are seemingly ‘addicted’ to our devices. The big difference in this equation is that we don’t have to drink alcohol, but we do (increasingly) have to use technology.

Blaming technology for such a vast and wide-ranging spectrum of problems is a bit like launching some Cultural Revolution-style war against the sea because every year people drown there. In any case, we can’t ban ‘bad tech’, because tech is both good and bad. Rather than banning the sea or fencing off beaches, we instead teach people to swim and provide buoyancy aids. By enabling people to make safe choices for themselves we protect them, and still get to benefit from all the good applications of tech.

There’s currently a great deal of hand wringing and debate about the dangers of tech, and how we can all collectively ‘do tech better’. The fact of the matter is that for many people, ‘technology’ as a thing doesn’t really exist.

Technology is buried within the phones, computers and applications that we all use. The tech products that now scaffold and enable our interactions, jobs, entertainments and lives are offered to us on a binary basis: the ones that we get for free but which mine and exploit our data, and the expensive ones that apparently still do the same.

When it comes to technology, maybe we need to stop worrying about whether tech is ‘good’ or ‘bad’, or whether it should be banned, and instead imagine what the meaningful, lasting and measurable steps towards harm reduction might be. As with the corporate booze pushers, we should be taxing the vast revenues of the tech companies and investing a proportion of that tax in measures that help the most digitally dispossessed and vulnerable.

We need to be clear about the places and times where we feel it’s ok to consume technology. For many people a glass of wine with dinner is fine; cider for breakfast probably isn’t. The same applies to our dependency on and relationship with technology. We have to be more honest about where the lines between ‘good’ and ‘bad’ tech lie, be explicit in our conversations about safe levels of use and harm reduction, and apply palpable penalties to companies that obfuscate or lie about the relative ‘safety’ of their products.

What is tech for good? The short answer is that it’s the opposite of ‘tech for bad’, and ‘tech for bad’ represents different things to different people.

For some, the problem lies with AI taking away our jobs. For others it’s AI’s role as an amplifier of bias and inequality within society. Other people are concerned about the existential threat that surveillance capitalism poses to our individual and collective privacies and freedoms. ‘Tech for bad’ has shaken democracy and skewed election and referendum results. Tech allows and enables the mass surveillance and control of huge populations, and in China has subjected an entire generation to levels of dystopian social control that would have seemed unimaginable a few years ago. We’re told that tech is isolating and damaging and leads to bullying and blame. Some critics state that tech has brought us to a culture of snowflakes swirling in a never-ending social media maelstrom of anger and offence.

Tech is also marvellous. Like a hammer that can be used either to build an ark or to bludgeon squirrels, technology is only as good or as bad as the hands that wield it. Raise a glass with me to toast the future of humane and mindful technology as we all stagger towards the new year. Cheers!

Public Sector Co-Pilots

In navigating towards a ‘public service internet’ it can be tempting to let the vision of a better future be obscured by a dystopian miasma of gloom. The collision of data and AI certainly presents us all with some hefty challenges, but we’re still in the game, and there is a great opportunity for public sector bodies to act as our co-pilots en route to the final destination of a functional public service internet for all.

Organisations like the BBC have a key role in helping consumers to safely navigate the airspace of their own personal data. This activity should be centred clearly around establishing the individual as lead pilot of their own destiny: as creator, owner, and protector of their own data. We’ve been really pleased to have been doing some thinking and doing with the BBC and others to help make this future more certain, but there’s always more that can be done.

Everybody needs a little help
Publicly funded agencies, acting in support of the common good, would be trusted by many to perform the critical role of ‘data buddy’. Designed in the right way, and with the correct, compelling route to market, the personas of the ‘data buddy’ might even shape themselves to respond dynamically to audiences. One-stop ‘drop-in’ products can sit alongside imaginatively realised content formats, video and products designed for longitudinal use. The key message for us all as users and generators of data is that the world has changed and is certain to continue to change. Users can look to public bodies for the reassurance and support they need for the long haul.

Empowerment and opportunity
The invitation from public organisations to better understand and engage with our personal data is best framed in terms of empowerment and opportunity rather than catastrophe and doom. The issues for many of us around the ‘affordances of data’ are in fact the issues that affect us and the lives of the people that we care about personally. Understanding (more) and acting (effectively) to respond relies on a baseline of personal data literacy, and to have a lasting impact and reach we need to frame the stories and activities in terms that engage most people. This engagement is best framed within the language and mechanisms that already speak to us as consumers, using all of the tools, platforms and techniques of traditional and new media.

Data and identity matter
If data is the blood in our new digital bodies, then identity is the brain of the organism. The verisimilitude of our ‘real’ identities, formed from our ‘unreal’ virtual, digital selves, is now less certain. Whereas we might once have supposed that our digital identities were a less complete representation of our ‘self’, we now must face the fact that our digital selves may be a more truthful representation of who we really are. Access to and ownership of these identities is critical, and public bodies can and should act as advocates and protectors of that ownership being maintained in the hands of the individuals themselves. A programme of well-realised data literacy initiatives can have a significant impact on the digital health and well-being of the UK population. Establishing the right approach now to capitalising upon our collective data is as important to the long-term happiness and survival of citizens as climate change. It’s also just as important to the long-term survival of the BBC and other public sector organisations.
#publicserviceinternet
#techforgood


Towards a Public Service Internet

With the revelations of Cambridge Analytica and Facebook’s role as a malign hacker of human emotion and democracy, it can come as no surprise that there’s a scramble to look beyond the worryingly dystopian internet present towards some preferable futures.

The BBC has historically fallen under the glamour of the larger, shinier and more commercial competition, who are in fact, given the unique way in which the BBC is funded, not really the competition at all. Rather than asking how the BBC can be more like Google and Facebook in theory and method, the BBC could and should be making a crystal clear statement of difference. The corporation should be asking how it can make its offer diametrically opposite, even if that means an initial drop-off in audience numbers. It should feel confident that those audiences, when looking back to survey the smoking wreckage of the data wars, will know that the BBC and other public institutions were on their side.

In recent years there has been a troubling move within the corporation to dodge awkward questions about its own spending (and that of its shell corporation, Media Applications Technologies) with Facebook, Twitter and Google. Freedom of information requests have been denied on the grounds that the information is held for the purposes of ‘journalism, art or literature’, and is therefore exempt from the act.

Try to sign up for BBC iPlayer and you’ll be asked for not only your date of birth and postcode but, extraordinarily, your gender. The BBC insists that this information is protected and designed to improve personalisation, but it indicates a hand-over-fist data grab that mimics the worst practices of the surveillance capitalism behemoths. Similarly, the BBC has toyed with ‘addictive by design’ principles in other digital products. The autoplay function, hated by anyone with kids, is enabled by default on the iPlayer Kids app and impossible to disable in settings when viewing in a browser.

The intersection of data and AI represents a range of amazing opportunities and risks for us all. The impact of big data on all our lives is arguably as fundamental an issue for us personally, societally and globally as climate change. Like climate change, the topic can be elusive and hard to visualise. There are few people who can claim to have a clear or accurate mental model of how data and AI are combining to create our futures. Consumers should be in a position to trust public bodies to make the ways of accessing, understanding and making the most of their data safe, accurate and clear. Consumers need help in understanding the complexities of their own personal data; many just don’t know it yet.

There isn’t one solution or provocation that can serve both to alert consumers to the existential threats to their own privacy and then engage them with action to mitigate those risks. The issues that we’re all facing are larger than us, and need to be addressed on a level that is outside the experience and influence of many of us. There are, however, significant signs that the challenges and opportunities of combined data and AI are beginning to concern the population. This is partly because of growing media coverage of the problems inherent in constant surveillance, the loss of privacy, and the mass abdication of responsibility and ownership of our own data. It is also partly due to a simple passing of time, and an accretion of the larger and smaller signs of how we really are into more rationally explicable statements of being and truth. The BBC has a significant role to play as an interpreter of these ways of being and truths, on behalf of and alongside audiences and citizens.

The Inbetweeners – Liminal Spaces and Transitions in VR/360

The Echo Chamber – Notes 05

One of the most interesting formal challenges of creating media for VR/360 appears to be how to make the most of the transitions between shots or scenes.

As a filmmaker in the old media linear world, you have a toolkit of tricks at your disposal when cutting between shots. You can fade between shots (to black or white) or simply cut between them. There are plenty of subtleties within the fade or cut, including, of course, how long the fade lasts, and where in the action you place the cut to create the desired effects of pace, atmosphere and energy.


A viewer watching TV or video is essentially watching a moving two-dimensional image, with any transition between shots being an absence of image that we all now read and understand as indicating the end of one thing and the start of another.

Hal C. Kern, Supervising Editor of Gone with the Wind, 1939

The fade from shot to black to new shot is probably the device that we’re most familiar and arguably most comfortable with, as it most closely represents the transition that we all experience with our own eyes as we blink (blink and see what you think). Filmmakers have other tricks up their sleeves as well, including the wipe, where one shot is wiped away by an incoming shot, and the iris transition, where the action narrows to black via a shrinking (usually round) window onto the shot.

In a VR/360 experience the viewer, rather than watching the transition, is ‘within’ the transition. They have some presence ‘inside the cut’. This means that the rarely considered fade has to become something a bit more ‘immersive’, and can become a useful narrative and experiential device.

Expanding this thought a bit and indeed the transition itself, you can begin to imagine how the transition could become an important liminal space within the film. Liminality is the state of ‘in-betweenness’ where one is underway within a ritual or experience, but not yet complete – literally ‘on the threshold’.

We’re still thinking hard about how we make the most of these liminal spaces within our own film. What we’re certain of, is that the visual transition will need to work in concert with the sound design. We’re developing a series of explicit and hidden audio cues that will prepare (and confound) the viewer to expect certain things to happen.

We’ve been discussing what it would be like if the fade dropped down onto us from above like a black blanket, or if the fade mixed up from our feet to our heads. We spent a lovely afternoon this week with our friends at Aardman Animations considering some other approaches as well, including borrowing from the visual iconography of video games – more on that in another post.
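To make that feet-to-head idea a little more concrete, here’s a minimal, purely illustrative sketch (not from our actual production pipeline – the function name and parameters are our own assumptions) of how such a directional fade might be driven: a black overlay whose opacity at any given height depends on how far the transition has progressed, so the darkness appears to climb the viewer’s body.

```typescript
// Illustrative sketch of a "feet-to-head" directional fade for a VR scene.
// The black overlay's opacity at a given height increases as the transition
// progresses, so the fade appears to sweep upwards from the floor.

/**
 * progress:  transition progress in [0, 1]
 * worldY:    height of the point being shaded, in metres (0 = floor)
 * eyeHeight, softness: assumed values; tune per experience
 */
function directionalFadeOpacity(
  progress: number,
  worldY: number,
  eyeHeight = 1.7,
  softness = 0.3
): number {
  // The fade "front" sweeps from the floor past eye height as progress goes 0 -> 1.
  const front = progress * (eyeHeight + softness);
  // Soften the edge around the front so it reads as a wash, not a hard line.
  const edge = (front - worldY) / softness;
  return Math.min(1, Math.max(0, edge));
}

// Halfway through the transition the floor is already fully black,
// while content at eye level is still visible.
console.log(directionalFadeOpacity(0.5, 0.0)); // 1
console.log(directionalFadeOpacity(0.5, 1.7)); // 0
```

Flipping the direction of the sweep would give the ‘black blanket dropping from above’ variant; in practice the same value could drive a shader or a fading sphere around the camera.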

What is certain is that the moments of transition between the shots, these liminal spaces, may end up being as important as the shots themselves – they may in effect become micro-shots – which then brings us back to the question of what we put between them.

Oiling the Empathy Machine

The Echo Chamber – Notes 04

We’re well underway with making our VR drama Echo Chamber for the BBC. One of the words that has been coming up a lot during our research and discovery phase and in meetings is EMPATHY. VR has famously been described by Chris Milk in his TED talk as an empathy machine. It’s a nice, pithy, high-concept phrase that has in turn been repeated all over the place by journalists and commentators. A more empathetic world is a better world, right? I mean, we all want to understand more, to empathise more, because that will mean that we care more. If we care more, we can’t fail to do more – at least that’s how the received wisdom goes. That means that VR is going to make the world a better place. We’re going to be immersed in worlds and introduced to people whose stories we must empathise with.

Chris Milk’s film has reached around 1.25 million views via TED, and ‘VR = Empathy’ is a compellingly neat narrative – but with all respect to Chris and all the other creators (myself included) who want you to watch and engage with our media – we would say that, wouldn’t we..?

L’Arrivée d’un train en gare de La Ciotat

There’s a famous and almost certainly apocryphal story of how cinema audiences in 1895 fled in terror from the Lumière brothers’ 50-second film of a train (silently) thundering towards them. Whether or not the story is true, it demonstrates our belief and hope in the visceral power of media to create impact, to make us feel and care, and in turn to act.

It will be entertainment, spectacle and sensation (depressingly, probably porn too) that power VR to a popular mass market. In our newly drawn media world the power of VR to act as an empathy machine has to be up-sold. Otherwise nobody will see your film. Nobody will empathise. Nobody will care.

Nick Fraser, the commissioning editor of Storyville, has spoken about ‘Why Documentaries Matter‘: in a talk to the Frontline Club he outlined his thoughts, and he has written comprehensively in a report for the Reuters Institute on the challenges of keeping documentary vital and relevant. Nick once told me that people want to believe that documentary can change the world, but that it doesn’t. People change the world. In the introduction to his paper for the Reuters Institute he quotes Simone Weil.

The most important modern philosophical problem is attention.

Simone Weil

Perhaps if the real problem we’re facing is with attention, then maybe VR has an opportunity, however fleeting, to be the thundering train, the talkies, CinemaScope, the Percepto of the moment? We’re beginning to think of VR as more of an attention machine than an empathy machine. If that leads audiences to empathy, laughs, tears, fear and all the rest, then great.

So… Sure – VR is an empathy machine. Clap on those goggles, oil the empathy machine and get ready to pay attention, empathise, care – and act!


VR Play Day at the Pervasive Media Studio

The Echo Chamber – Notes 03

Sally recently spent an afternoon at the Pervasive Media Studio in Bristol attending their ‘Virtual Reality Graffiti Jam and Play Day’. The event provided a fantastic opportunity for us to try out a variety of VR platforms and tech and meet some interesting practitioners in the field.

Producing a VR film is a new adventure for Chromatrope and so the event was well timed to fit into the discovery stage of our production. There were some really helpful takeaways from the afternoon which will feed into the early stage of our project planning and have already helped direct us towards deciding on a suitable platform and story. More on that to follow in future posts…


One of the lessons from the day is that when a story is truly engaging, the immersive experience can be surprisingly convincing and emotionally powerful. Even with a fair level of background noise and the knowledge that other people are in close proximity, a good story melts away the self-conscious feeling that you might look like a bit of an idiot to the other people in the room.

An engaging story can also be more important than the type of headset you watch it on. An Oculus headset looks set to be priced at over £400 when it’s released later this year, whereas a Google Cardboard viewer can be bought for around £10. With a decent set of headphones placed over whatever headset you’re wearing, the success of the experience is all down to the story and the sound, not the platform. We want our film to be seen by as wide an audience as possible, so for now a low entry point is of great importance to us.

Richards Crandon, director of On Par Productions, was at the event with a couple of Samsung Gear setups running some of his company’s recent productions. The Little Arrow and Conductor 360 films were both very engaging. What became clear from both films is that sound is very, very important to the 360 experience. Firstly, it provides vital indicators for the viewer to turn, orientate and notice the thing that is happening (or about to happen) behind the current viewpoint. Secondly, it doesn’t pay to be too subtle with these indicators – if you miss the cue, the action moves on and you’ve missed the vital part of the story that just happened behind your back. Background noise can also be a distraction to the viewer, and the subtleties of a ‘binaural’ experience can be easily lost below a general hubbub.
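As a purely illustrative aside (this is not how The Little Arrow or Conductor 360 were built, and the file name below is made up): in a browser-based 360 player, a directional cue like the ones described above could be positioned behind the listener with the standard Web Audio API, so that a headphone-wearing viewer instinctively turns before the off-screen action begins.

```typescript
// Illustrative sketch: position a short audio cue behind the listener
// with the Web Audio API so it reads as "something is happening behind you".

async function playRearCue(url: string): Promise<void> {
  const ctx = new AudioContext();
  const response = await fetch(url);
  const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

  // HRTF panning gives a convincing sense of direction over headphones.
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF",
    positionX: 0,
    positionY: 0,
    positionZ: 2, // the default listener faces -Z, so +Z is behind the viewer
  });

  const source = new AudioBufferSourceNode(ctx, { buffer });
  source.connect(panner).connect(ctx.destination);
  source.start();
}

// Hypothetical usage: playRearCue("cues/door-creak.ogg");
```

And as the play day made clear, a cue like this shouldn’t be too subtle – it needs to cut through real-world background noise or the viewer will miss the turn entirely.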

As well as helping to clarify some thinking about VR, the afternoon also raised some questions for discussion with the team as we move towards making our film. Where should the viewpoint go and how will it change? Can the viewpoint change within a scene (and if so how do you do it so the viewer won’t feel sick)? Will the user have control over motion and movement? Will there be interaction with the actors or is it purely about observation of events?

We hope to start answering some of these questions soon, but expect also to find lots more questions along the way. By blogging about the project we want to share our experiences with others and also keep a record of the process for ourselves so we’ll be ready for the next VR project when it comes along! Please join in the conversation via Twitter on #echochamberdrama.