How technology and innovation are changing creative processes

Technological progress has long affected the creative process and many advancements have made the industry a much more diverse and interesting place – as well as giving birth to completely new markets and categories. But could technology replace certain creative functions altogether?

The machines are coming

Each new era’s major technological development has brought fundamental change from the era before – and a raft of predictions with it. There was a constant, though: while the media changed, most key elements of the creative process remained the same.

And they always involved humans.

Writers, designers, musicians, broadcasters, models, actors, videographers, animators, developers – you name it, there was a human element to every part of the journey. Now, with artificial intelligence (AI), could the creative process survive without human input?

Skynet and scaremongering

As soon as anyone talks about AI and machine learning, it’s not long before the Terminator franchise gets mentioned. After all, it’s been giving sentient machines a bad name for over 35 years. The assumption is that the moment computers or machines can do things on their own, they will take over and destroy humanity.

The reality is the Roomba.

It’s similar to the Terminator in that it’s unstoppable and can move around anything you place in its path – except it does the vacuuming for you. AI is easily accepted when it’s introduced seamlessly or has an instant benefit to the user. In the Roomba’s case, it completes a menial task and gives people more time to do more worthwhile things.

When this philosophy is applied to the creative process, it’s no wonder AI systems have already been used to augment or speed up creative work for decades. For example, image searching even 10 years ago was a laborious task. Ten years before that, every creative studio had stacks of stock photography books that interns had to trawl through to find images for digital projects or ad campaigns. Now, algorithms with image recognition technology let you search via keywords or filters such as ‘similar to’, ‘containing colour’ and ‘related images’, offering a range of options in a fraction of a second. AI not only helps you find images, it suggests alternatives too, which has led to significant time and cost savings.
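
To make the idea concrete, here’s a minimal sketch of how a ‘similar to’ search can work – it’s not any particular stock library’s API, and the embeddings are random placeholders. Each image is reduced to a feature vector by an image-recognition model, and the library is simply ranked by how close those vectors sit to the query’s.

```python
import numpy as np

# Toy illustration of a 'similar to' image search: each image is represented
# by a feature vector (an embedding) produced elsewhere by an image-recognition
# model. Random vectors stand in for real embeddings so the script runs on its own.
rng = np.random.default_rng(0)
library = {f"image_{i:03d}.jpg": rng.normal(size=128) for i in range(1000)}

def most_similar(query_vec, library, top_k=5):
    """Rank the library by cosine similarity to the query embedding."""
    names = list(library)
    matrix = np.stack([library[n] for n in names])
    # Normalise so the dot product becomes cosine similarity.
    matrix = matrix / np.linalg.norm(matrix, axis=1, keepdims=True)
    query = query_vec / np.linalg.norm(query_vec)
    scores = matrix @ query
    best = np.argsort(scores)[::-1][:top_k]
    return [(names[i], float(scores[i])) for i in best]

query = rng.normal(size=128)  # embedding of the image you start from
for name, score in most_similar(query, library):
    print(f"{name}  similarity={score:.3f}")
```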

Predicting consumer behaviour

AI’s biggest influence in the creative realm has probably been within predictive and data-driven marketing. By taking the guesswork – and the human element – out of where, when and to whom online ads are served, it has increased open rates, click-throughs and sales. Of course, these customer behaviour patterns are tracked and predicted by complex data engines and continuously refined as a result. It’s this trackable data that has added significant value and insight to every stage of the predictive marketing process. After all, if you can say with confidence what the ROI will be, it’s much easier to get buy-in and sign-off from senior management to start a project or campaign.
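
For a rough sense of what those data engines do under the hood, here’s a toy sketch – with entirely made-up features and data, nothing like a production system – that predicts the probability a user will click an ad, which is the number that ultimately decides where, when and to whom it’s served.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative only: predict the chance a user clicks an ad from a handful of
# made-up behavioural features (hour of day, pages viewed, past clicks).
rng = np.random.default_rng(1)
n = 5000
hour = rng.integers(0, 24, n)
pages_viewed = rng.poisson(4, n)
past_clicks = rng.poisson(1, n)

# Synthetic 'ground truth': engaged evening browsers click more often.
logit = -3 + 0.08 * (hour > 17) * hour + 0.15 * pages_viewed + 0.6 * past_clicks
clicked = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([hour, pages_viewed, past_clicks])
model = LogisticRegression(max_iter=1000).fit(X, clicked)

# Score a new impression: this probability is what decides whether (and where)
# the ad gets served.
new_user = [[20, 6, 2]]  # 8pm, 6 pages viewed, 2 previous clicks
print(f"Predicted click probability: {model.predict_proba(new_user)[0, 1]:.2f}")
```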

So how long is it before completely automated design happens?

AI can already suggest basic ad templates; IBM’s Watson has shown how sentiment can be measured from headlines or email subject lines and plotted against open rates; and Google has experimented with letting its systems caption images with impressive accuracy. In theory, AI can already create the bedrock of any advertising or creative campaign.
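
As a flavour of the subject-line idea – and nothing like Watson’s actual tooling – here’s a toy scorer that rates subject lines against a tiny hand-made word list and correlates the scores with historical open rates. The campaigns and numbers are invented.

```python
import numpy as np

# Toy sentiment scorer: a tiny hand-made word list stands in for a real
# sentiment model (Watson and similar services are far more sophisticated).
POSITIVE = {"free", "exclusive", "love", "new", "win", "save"}
NEGATIVE = {"last", "final", "miss", "expires", "warning"}

def sentiment(subject_line):
    words = subject_line.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Made-up campaign history: (subject line, observed open rate).
campaigns = [
    ("Exclusive offer just for you", 0.31),
    ("Last chance before the sale expires", 0.19),
    ("We love our new collection", 0.27),
    ("Final warning about your account", 0.12),
    ("Win a free weekend away", 0.34),
]

scores = np.array([sentiment(s) for s, _ in campaigns], dtype=float)
open_rates = np.array([r for _, r in campaigns])

# The correlation between tone and open rate is the kind of relationship you
# could plot and use to steer future subject lines.
print(f"Correlation: {np.corrcoef(scores, open_rates)[0, 1]:.2f}")
```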

What can the machines really do?

Well, the answer is: pretty much anything we can do.

‘Sunspring’ is a sci-fi short film written entirely by an artificial intelligence bot using neural networks. The bot was trained to ‘write’ the script by processing dozens of sci-fi screenplays from the 1980s and 90s, and although the result runs to only nine minutes, it showed how, with relatively little input, a screenplay of sorts could emerge that hadn’t been created by human hand.
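
Sunspring’s bot used a recurrent neural network trained on whole screenplays; the sketch below uses a far cruder word-level Markov chain, but the underlying idea is the same – learn which words tend to follow which in existing scripts, then sample new lines from those patterns. The ‘corpus’ here is a few invented lines, not the real training data.

```python
import random
from collections import defaultdict

# Crude screenplay generator: a word-level Markov chain learns which word tends
# to follow which in the training text, then samples new lines from that table.
# (Sunspring's bot used a neural network; the corpus below is invented.)
corpus = """
INT. SPACESHIP - NIGHT
He looks at the stars and the stars look back at him.
She says the future is not what it was.
He says the future is a door and the door is open.
"""

# Build a table of which word follows which.
transitions = defaultdict(list)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    transitions[current].append(nxt)

def generate(start, length=12, seed=3):
    """Sample a short line of 'dialogue' starting from a given word."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        options = transitions.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("the"))
```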

What about music?

Drum machines have long since replaced bulkier, costlier and moodier alternatives, and there have been many one-off songs with either vocals or music created by AI. A record called I AM AI was the first full-length album to be entirely composed and produced by artificial intelligence. Released last year, it was made in collaboration with a human artist, who provided inputs that AMPER – the artificially intelligent music composer, producer and performer – used as composing parameters to create whatever it saw fit.

The key element here is collaboration. Humans are guiding the machines. It’s less about replacement and more about helping to augment what’s being created. So what if it happens the other way around?

Machines create and humans refine

Phrasee is a UK start-up that uses artificial intelligence, natural language generation and deep learning technology to produce creative marketing copy for brands including Domino’s, Virgin and Superdry. At the moment, it’s mainly used to generate copy for email subject lines, Instagram ads and push notifications on smartphones, but it’s easy to see how this can be expanded and adapted in the future to include almost any piece of collateral that requires copywriting. As this writer quietly weeps into his keyboard, the automation of copywriting – especially for sales-focused and automated channels – is here to stay, and the results seem very positive so far. As usual, there’s always the next stage, and that’s when humans take a step back from the whole process.

Machines create and machines refine

Generative Adversarial Networks (GANs) are a class of AI algorithms used in unsupervised machine learning: one network generates options, while the other evaluates them. Imagine you need to create a fake token to get into a theme park. Your only aim is to get in; the gatekeeper’s only aim is to keep people without tokens out. So you need a fake convincing enough to pass. The only problem is, you’ve never seen what a real token looks like. Luckily, you have a friend who’s willing to try to get past security with your first attempt at a fake.

Attempt one and, needless to say, your friend gets sent back – but they come back with more information about what the token could look like. The person at the gate laughed and said, “That’s the worst fake token I’ve ever seen. It’s not even gold.” Aha! So now you create a token that’s gold. The next time: “It’s not even hexagon-shaped.” Bingo! So now you create a token that’s gold and hexagon-shaped. This back-and-forth carries on over many trips until you not only have the right colour and the right shape, but enough of the correct detail on the token to convince the gatekeeper that it’s real. This is essentially how GANs work: one network refines its output again and again until the other network stops rejecting it. Repeat the process enough times and you end up with complete outputs of whatever you choose to create.
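
Here’s what that forger-versus-gatekeeper loop looks like as a minimal sketch in PyTorch. The ‘real tokens’ are just numbers clustered around 4.0 and the network sizes are arbitrary – it’s an illustration of the idea, not a production GAN.

```python
import torch
import torch.nn as nn

# Minimal GAN: the generator ('forger') tries to produce samples that look like
# the real data (numbers around 4.0), while the discriminator ('gatekeeper')
# tries to tell real from fake. Every rejection gives the forger a gradient,
# i.e. a hint about what it got wrong.
torch.manual_seed(0)

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 4.0   # 'genuine tokens'
    fake = generator(torch.randn(64, 8))     # 'forged tokens'

    # Train the gatekeeper: accept the real samples, reject the fakes.
    d_opt.zero_grad()
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1))
              + loss_fn(discriminator(fake.detach()), torch.zeros(64, 1)))
    d_loss.backward()
    d_opt.step()

    # Train the forger: produce fakes that the gatekeeper accepts.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_loss.backward()
    g_opt.step()

# After training, the forger's output should sit close to the real distribution.
print("Mean of generated samples:", generator(torch.randn(1000, 8)).mean().item())
```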

The creative process changes when almost anything is possible

Simulating the impossible

The Cannes Lions winners in 2018 were a solid representation of how creatives are using technology to find new ways of communicating with audiences. One winning team used AI and Speech Synthesis Markup Language (SSML) to simulate JFK delivering the speech he was due to give on the day he was killed in Dallas in 1963. This involved analysing over 800 speeches previously given by Kennedy to build a database, then using a mixture of AI and SSML to deliver, in ‘his’ own voice, the words he never got to say.
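
SSML itself is just XML-style markup that tells a speech synthesiser how to say the words – pacing, pauses, emphasis. The tags in the snippet below are standard SSML, but the sentence and the prosody values are purely illustrative and aren’t taken from the actual project.

```python
# SSML tells a text-to-speech engine *how* to deliver the words. The tags here
# (<speak>, <prosody>, <break>, <emphasis>) are standard SSML; the sentence and
# the prosody values are made up for illustration.
ssml = """
<speak>
  <prosody rate="95%" pitch="-2st">
    My fellow citizens, <break time="400ms"/>
    there is much work <emphasis level="strong">still to be done</emphasis>.
  </prosody>
</speak>
""".strip()

print(ssml)  # this string would then be handed to a speech-synthesis engine
```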

Using the internet via a landline number

With most people in the world still unable to access smartphones or online services for financial or logistical reasons, the Colombian government set up a landline number, 9000973 (Gooogle), that worked in conjunction with the Google Assistant to give access to online information. People could call this local number from their landline, ask what they needed to know, and the Google Assistant would speak the answer back to them. A novel way to mix old and new.

A new way to promote retail

Vue.ai is an end-to-end ‘intelligent retail automation’ platform, and among its range of services it will soon offer AI-generated models that you can tailor for your market, including the ability to change skin tones and body shapes. The software will even allow you to map whole clothing inventories onto an AI model’s body. Naturally, this changes the way you would think about casting or photoshoots: could you concentrate on creating product and lifestyle videos, now that the heavy lifting of a full product-range shoot can be done virtually and be market-ready in less time and for potentially less cost than a standard shoot?

 

In theory, if the more mundane aspects of creative work can be ‘outsourced’ to AI, creative teams are left to concentrate on more complex matters. Automating the most basic or simplistic outputs frees people up to be more creative and deliver more compelling work.

What's next?

Anticipatory predictions and outputs will form one of the next steps in the evolution of the creative process. By merging historical and real-time data, and then trusting technology to analyse, predict and decide the outputs, will we allow machines to make key creative decisions? At the moment, human and machine collaboration is still essential to creating coherent content and solutions, but in 10 years’ time who knows where developments in AI, voice or robotics will take us? Or indeed ‘them’.

If you’re thinking about getting our human team to work on your next project, get in touch – we have a range of creative processes to ensure we help you meet, and exceed, your goals.

Matt

UX Lead

Let’s build something amazing together

Let's talk