This is the fourteenth article in the Where We Came From series.
During a trip to the coast, the Harbour Master clan spent a couple of days roaming Camber Sands beach near Rye. While we were down there, I spotted an amusement arcade perched on the edge of one of the beach car parks.
The last arcade I’d wandered through was probably on Brighton Pier about five years ago, but whilst the arcade roar kindled feelings of nostalgia, the coin-ops of old had largely been replaced with gambling machines and dancing games. I still hoped to come across some arcade that retained working 80s favourites like Phoenix, Defender or Battlezone.
The beach arcade was closed but I kept an eye on it, waiting for the chance to nose around whatever machines were on offer. I have such strong arcade memories from my single digit years: blazing batteries of screens in dark, enclosed places where ten pence pieces went to die.
Then I saw the sign.
After the 70s, powerful computing technology moved into the home, then invaded the space in our hands. The draw of the arcade waned. Pubs now make do with dreary quiz machines and one-armed bandits. No one wants to see the old coin-op any more. It is done.
Culture and technology move forward with each successive generation and, as a necessary part of the process, the young turf out the cultural icons of the previous generational incumbent. Arcades, the Atari 2600 and the Sony Walkman are now historical artefacts.
Today, a whole website industry dedicated to retro fetishism exists, as well as a battalion of grown-up 80s gamers who aggressively protect their gaming past. At the other extreme, there are those who ridicule the primitive beeps and blocks, considering the 80s an episode of gaming history better off buried in a giant unmarked landfill site, like the one that swallowed all those Atari 2600 E.T. cartridges.
Did we lose something when we left the 80s behind? Or are we just prone to losing ourselves in nostalgia?
Consider The Individual
I first heard about Bill Williams’ death, as covered in Stanley Kubrick Is Gone, around a decade ago. I’d entertained the idea that all these brilliant programmers that fleshed out my childhood had become even better developers, still out there fighting the good video game fight. While it was sad to hear that there would be no more games from Williams, it was even more tragic to discover his last projects were “Monopoly” and “Bart’s Nightmare”. From Necromancer to Monopoly? What was wrong with the world?
Williams’ plight was emblematic of the bedroom programmer generation’s problem. They were unable to reconcile their love of digital artistry with the demands of the new corporate paymasters. Paymasters that typically eschewed experimentation in favour of genres and mechanics already established through the previous decade’s efforts. This new industry did not seem to foster the same sense of creativity yet thrived on death march projects. Many just changed careers rather than graft themselves onto these meat factories; those who stayed were chewed up by the emerging “EA Spouse” culture.
But I can no longer mourn this rout of individualism – Steve Hunt made it clear in his interview, The Magic Is Back, that anyone who wants to be an independent developer can go out and do it. If we go back to Brian Moriarty’s interview Radio With Pictures, Moriarty also spoke positively about “talented people finding a voice they might not have been able to realize before” despite his disappointment at the evolution of computer games as an artistic medium. Things are different today.
So I characterise each computer gaming decade in the following way:
- 1970s, Genesis – The first computer game businesses appear with Atari dominating
- 1980s, Democratisation – Home computers enable ordinary computer users to make their own games
- 1990s, Corporatisation – Profitable games require the polish and resources that only big business can afford
- 2000s, Revolution – Game development becomes cheaper and games break out of narrow genres in a big way
Digital distribution, better hardware and the ubiquity of tools have put power back into the hands of the individual. It is interesting to recall David Fox’s suggestion, from his interview Changing Lives, that the mobile app ecosystem appears to be repeating the violent lurch from a community of pioneers to big business seen in gaming during the late 1980s. 
Consider The Technology
But the decade that bridged the democracy and the revolution was not without virtue. Games became far more refined as “products”, with a greater focus on design and technological finesse. Software and hardware have both become better at play.
What about the original hardware? In a world where decent emulators exist, is it important to have access to the source technology? Professor Steve Furnell of Plymouth University told me: “I think it’s relevant to know where we have come from. Some of this stuff is just 20-25 years old and that’s clearly, looking forward, within the career lifetime of the students we are now teaching.” Still, emulators will serve in a pinch; everything I revisited for this series was via emulator.
There’s also an ongoing fascination with developing under the antiquated constraints the original technology imposed. Ian Bogost and Nick Montfort made a great argument in Racing The Beam that art arises from constraint and the minimalist environment of the Atari 2600 gave rise to incredible innovation. So now you can find Ed Fries developing Halo 2600 and he’s not the only one throwing himself into the lost 8-bit dimension. 
Consider The Mechanics
A desire for emotional resonance and an aspiration to art have always been with us, but games have had a difficult history, with shoot-in-the-head sims obscuring more subtle and introspective titles. But the “art game” movement has gained significant momentum in recent years and there are real efforts to build games that can mean more than just the satisfaction of the kill, of the level up.
But did we lose certain modes of play? Did genres vanish over the years?
No. A great counterexample is that most dead of genres, interactive fiction (IF). It didn’t die, it merely transformed from business to community, as Brian Moriarty implied in his interview. Also consider the example of Andrew Plotkin, now working on commercial IF mobile apps, proving that IF is not dead and a market – no matter how niche – still exists.
What of X-Com? The new reboot XCOM shares little with its ancestor: you can always count on big business to capitalise on nostalgia but only a fool would rely on big business to respect it. But while there are still those who love X-Com, someone, somewhere will make the real successor to it – look at the upcoming remake Xenonauts. Successful modes of play will not just disappear.
At the start of this series, I put up a video of Shiina Ringo’s wonderful Tsumi to Batsu released in 2000. In 2008, she performed a number of her songs with orchestral support; it’s ostensibly the same music, but the feel is completely different. With Tsumi to Batsu, she transformed her original raw, wailing lamentation into something fragile and desperate:
I see similar patterns in game development. The past continues to be felt in contemporary titles in ways both subtle and obvious. At the start of the year I wrote that Shatter overhauls Breakout and removes what was frustrating about the original game design.
Shatter is not Breakout. But somehow, it is.
Consider The Losses
So has anything truly been lost?
Maybe the innocence? It is some irony that Activision, created by disgruntled Atari programmers who were denied credit for their work, has grown into an industry leviathan resembling its estranged parent, the Atari of the 1980s. Electronic Arts made its name with forward-thinking titles such as M.U.L.E., Seven Cities of Gold and Archon; it, too, now suffers from being a juggernaut desperate to make mega-sales with each new release. Even the shell that is the modern “Atari” is now harassing Atari fansites that have existed for years, a move that seems breathtakingly grotesque in its disrespect.
Maybe the old computers themselves? As there is little evidence that the legion of archiving projects are collaborating, problems remain with the preservation of bygone computing technology. Where are the skills when the machines begin to die? When the memory chips fade and the processors stop? Magnetic media does not last forever and I recently transferred my entire Atari floppy disk collection to PC to preserve it. But it was already too late for many of the disks.
Maybe the mystery? Drew Davidson, writing for Tap-Repeatedly, explored the idea of our modern post-secret world where spoilers are part of the cultural machine. David Fox also touched on this, convinced that it would not be possible today to keep the notorious Rescue on Fractalus! alien a secret for any significant length of time. We’re lodged in an electronic grid full of voices – active 24 hours a day, 7 days a week, 52 weeks a year.
Maybe patience? I was struck by how long some old games took to play. Personally important titles such as Mercenary and Alternate Reality provided expansive worlds which were slow to navigate not just due to design but also performance issues. Frequent disk swaps in the latter proved to be the most serious complaint. Fed on broadband, today we’re all “click-and-gimme”.
Maybe the pioneering atmosphere? There was certainly something different about the frontier of the 80s, when computers in homes were almost a fashion accessory. If you belonged to the future, you needed one of these machines. But technology continues to reinvent itself, consoles warp and shift from boxes into Wii controllers, games leap into social networking and mobile devices. There’s always a new challenging frontier. Today’s young generation exploring art and thought through games are not the mod makers of old aping the big budget titles; they are producing bright, new concepts. This is an exciting time to be a game artist.
So I say this as the conclusion to Where We Came From: we live in great times.
Consider The Cliffhanger
I hope you enjoyed all that, because it was a pack of lies.
Something is gone. Something so powerful and fundamental, it destroys every positive point I’ve made in this essay. It gives life to every web page dedicated to retro gaming, regardless of the decade in question. It informed every word I committed to the series.
EA: The Human Story by ea_spouse – on the destructive nature of game development culture as fostered by big business.
For a television parable on the theft of individual achievement by corporations, see the violent, foul-mouthed and captivating HBO series Deadwood.
Sadly, this also included the blighted period where every game had to feature 3D graphics otherwise it was considered antiquated.
Historical note here. As I alluded to in Peter Liepa’s interview The Creative Urge with my comment “it could be argued game design was elitist in the 80s”, a form of programmer eugenics did erupt at that time. “Real” programmers who had invested much of their time studying the black arts of machine language optimisation decried anything that appeared to hand the power of game development to the unwashed masses. So I’m not particularly in the mood to celebrate a new race of super-cool retro machine coders.