Hang onto your headsets, folks: the virtual reality revolution is almost upon us

Ken Perlin, who gave us special effects we take for granted today, looks at what will be ‘normal’ next

Visitors try out a virtual reality suit at the event and entertainment area at the 2014 Gamescom gaming trade fair in Cologne, Germany. Virtual reality looks set to be the next big-concept leap in computing. Photograph: Sascha Steinbach/Getty Images

You might not have heard of Perlin Noise, but you’ve certainly seen it.

The technique for adding textures to computer images, making them appear three-dimensional and highly realistic, has been at the heart of the computer-generated special effects and animations used in film and television since the 1980s.

For that contribution, which has given us properly marbled surfaces, strange alien skins, burning explosions, weird wavering force fields and looming planets, Ken Perlin was honoured with an Academy Award for Technical Achievement in 1997.

“Needless to say, my Mom was very happy,” he notes on his web page about the process, which also features demos to play with (http://iti.ms/1wbqYUa).


In person, Ken Perlin is not all that noisy, though he is very funny in an understated way, and carries an encyclopaedic knowledge of everything from 1960s cartoon shows and goofy television series to computer history.

And – as he did at the Science Gallery recently – he can keep an audience riveted for an hour in a wide-ranging talk (computer code to music to philosophy) with on-the-fly animations.

Now a computer science professor at New York University, the founding director of its Media Research Lab, and director of the Games for Learning Institute, he says: "I realised at an early stage that I'd have a better time of it if I just did what I liked."

Maths interested him, but only when he began to get some really good teachers in school, and when it involved the visual rather than the theoretical side of the subject.

He was always interested in art and the creative process, especially animation (Disney's feature film Fantasia captivated him as a child) and by the time he was a university student, he was intrigued by the emerging area of computer animation.

At the time, he tells his audience, art and programming were distinct areas attracting very different types of students. The arty types were cool and had tattoos; the programmers were nerdy and had pocket protectors.

“I wanted to have the tattoos and the pocket protector,” he says to laughter.

And he did indeed help pioneer an area which would epitomise geek cool – using computers to shape a new virtual world, appearing on the big and small screens in special effects, animation, television ads and computer games.

Perlin's achievement goes back to the cult classic Tron, the first major film that used what were, on its release in 1982, the very latest in computer special effects. In an interview prior to his talk, he notes, "I'm even in the credits!"

Which is cool, but not as cool as an Academy Award. His work on Tron was the critical driver of his coding innovations that led to the award – but not in the way you might think.

"The whole thing was a reaction against everything about working on Tron," he says. "The entire aesthetic of Tron came out of the limitations of the software."

Software of the time was developed to work with the hardware of the 1980s.

“Computers were very, very, very slow 30-something years ago. A lot of computer graphics was trying to figure out” what you could do within those constraints.

Perlin decided to think about what might be done if such limitations weren’t there. He worked out a way to create animations controlled pixel by pixel, which would have been demanding on computer resources at the time, but gradually became a trivial issue as computer memory and speed grew.

And he figured out a way to add realistic, visual "noise" to an object – similar to the slightly chaotic effect given by an artistic tool like a paintbrush, with random-length bristles but a mostly predictable result. The opposite of the too-smooth artificiality of Tron, Perlin Noise became a standard method in animation.
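Perlin's original algorithm works in two and three dimensions; as a rough illustration of the underlying idea – smoothly blending random gradients fixed to a grid – here is a minimal one-dimensional sketch in Python. The names and structure are this writer's illustration, not Perlin's own code:

```python
import math
import random

# A fixed table of pseudo-random gradients (slopes), one per integer
# lattice point; the seed just makes the sketch reproducible.
random.seed(42)
GRADIENTS = [random.uniform(-1.0, 1.0) for _ in range(256)]

def fade(t):
    # Perlin's smoothing curve 6t^5 - 15t^4 + 10t^3: its first and
    # second derivatives are zero at t = 0 and t = 1, so adjacent
    # cells join without visible seams.
    return t * t * t * (t * (t * 6 - 15) + 10)

def noise1d(x):
    # Locate the lattice cell containing x.
    x0 = math.floor(x)
    t = x - x0                       # position within the cell, in [0, 1)
    g0 = GRADIENTS[x0 % 256]         # gradient at the cell's left edge
    g1 = GRADIENTS[(x0 + 1) % 256]   # gradient at the cell's right edge
    # Each gradient contributes a linear ramp passing through its own
    # lattice point; blending the two ramps with the fade curve gives
    # a smooth value that is exactly 0 at every lattice point.
    v0 = g0 * t
    v1 = g1 * (t - 1.0)
    u = fade(t)
    return v0 + (v1 - v0) * u
```

Summing several copies of such a function at doubling frequencies and halving amplitudes ("octaves") is what yields the marbled, turbulent textures the article describes.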

All “sudden” technological breakthroughs, like his own, or say, touchscreens on the iPhone and iPad, are the result of ideas that are visualised before they are commercially possible or have a mainstream application, Perlin observes.

"Those things really have a long gestation," he says, with many people adding on to early ideas in what Microsoft designer and researcher Bill Buxton has called "the long nose of innovation".

"What's interesting is when it goes from one single mind to the collective mind", appealing to a mass market and turning into a computing norm, says Perlin. He gives the example of former Xerox Parc and Apple designer and researcher Alan Kay, who came up with the idea of overlapping computer windows – a simple and obvious management solution for a computer display, but one that must have seemed strange at the time.

Having an entrepreneur, such as a Steve Jobs, come in, see a market and commercialise the innovation is generally a critical element for success too, he adds.

What will be our next significant computer interface? Virtual reality.

As an aside, he hastens to add that what we conceive of as “real” is actually highly artificial anyway – the virtual reality of another era – whether it be the walls of a house, our clothes, or a paper coffee cup. “A paper cup is a highly engineered miracle that would have amazed people hundreds of years ago.”

Today’s children already navigate a touchscreen world as natives, he notes. But devices à la Apple and other companies are a technological “dead end”. The next step will be ubiquitous virtual reality, where people will be able to see what was formerly on a screen, displayed in the air, and can manipulate and interact with those VR objects.

The arrival of commercial virtual reality (VR) via headsets like the Oculus Rift is bringing immersive VR worlds into the home. And it's not just a gaming frivolity. It's a step along the way to that Minority Report-style immersive VR.

That in-the-air-around-us interface is a few decades off, concedes Perlin, but clean and fully realistic VR for the consumer is perhaps just two years away.

He's seen it demonstrated already in Seattle at gaming company Valve, in a costly, state-of-the-art VR room.

"It's eerie, how good it is. In terms of what it feels like and sounds like, it's the [Star Trek] holodeck. And it's only about two years till companies like Oculus Rift figure out how to do it commercially."

It's no surprise Facebook bought Oculus, because chief executive Mark Zuckerberg knows Facebook is all about providing a place where people meet, and VR is where they'll be meeting in the near future, Perlin says.

That whole process of a computer interface going from startling concept to everyday norm holds a particular fascination for him, and informs much of his own research and creative work.

“What will it look like when we forget that there are computers? When it works so well that we don’t even think of it?” he asks. “Everything I’m doing is just trying to figure out what’s the next step of normal.”