This is the final part of the Learning Curve trilogy. In the first part, Learning to Walk, I learnt to program and make games on the Atari 8-bit home computer. In the second part, Learning to Run, I wrote a game in machine language which was sold commercially.

Return to Citadel

I think it was 1995.

It was the difficult second year of my PhD at Reading University, where far too many of my water wave simulations exploded into colourful infinity and I wanted to shoot my research in the head five times. We had a series of presentations from industry types looking to charm the latest batch of Dr. Mathematicians being squeezed out of the academic womb. I was impressed by one guy from a company called Geoquest that made oil field mapping software, and followed up by email. I asked Geoquest Dude how I might make myself more valuable after my research was complete. I told him about my grand game-making plans, hoping this would sell me as an unstoppable code hero who could work machine language like Jimi Hendrix worked the guitar.

His response? He told me to stop.   

Let’s rewind a little. A pathetic pattern of unfinished projects and half-baked prototypes had been broken through discipline. I had set my sights low and dedicated myself, no matter the cost, to finishing one project, The Citadel (Joel Goodwin, 1993). Now I was vindicated. I really could walk the talk. I was a warlock of the 8-bit Atari home computer system. I knew of color clocks and processor interrupts, I knew of secret 6502 opcodes and I knew better than to rely on the operating system when writing a game. What next?

I took The Citadel code and spun a Sokoban clone out of it called Orson (Joel Goodwin, 1995) which became a bonus game for New Atari User disk subscribers. A few months later, New Atari User also published The Citadel as a disk bonus, since the game was no longer being sold. Orson was snappier than The Citadel but the latter was more interesting to play. Somewhere along the line I decided I would make a Citadel II.

Citadel II – Pseudocode
Citadel II – GUI
Citadel II – Level design

But where did that decision occur? Having reviewed the documentation of Citadel II closely, I’ve discovered my memory might have been in error. It’s hard to see here, but on the GUI sketches, I’ve scribbled in “(c) 1994 Joel Goodwin”. Which means I was designing Citadel II in the year before Orson was published. Which means that my retelling of the past, long believed, that Orson led to Citadel II… might be wrong.

My nasty little secret was that discipline had fallen out of fashion. Ambition was back in town! I kept scribbling out ideas which were too unwieldy, too dangerous. For example, I’d become obsessed with developing the storyline of Orson – sentient robot risks life in exchange for false promise of freedom and blunders into interstellar politics – into a six-part series which aped the mechanics of Flashback (Delphine Software, 1992). I daydreamed each plot twist, every emotional moment: betrayals, protests, war and – of course! – the inevitable destruction of civilisation. I had everything. Everything except actual game design.

It didn’t stop there. Rage, a side-scrolling shooter. A procedurally-generated space exploration game called DeepStar. A game-centric operating system replacement DAVE (“Damnfine Audio-Visual Engine”). My own take on the Alternate Reality series (DataSoft, 1985) called Nightspire. A fully three-dimensional racing game called Chronotide. A version of the classic pipes game called Ooze. Runemagic, a trilogy of games based on C64 game The Staff of Karnath (Ultimate Play The Game, 1985). An interactive fiction about domestic violence provisionally titled Suicide Dancing. I shouldn’t omit my Sonic the Hedgehog (Sonic Team, 1991) clone as well. I’m sure this kind of daydream paralysis is familiar to a lot of fresh-faced game developers.


Fortunately, discipline wasn’t entirely lost. I tried to make up for all those “lost” years in which I hadn’t produced much of note by machine-gunning articles and games at New Atari User, by then one of the last surviving Atari magazines. Although I was having no luck with the big projects, I was at least able to churn out magazine-worthy efforts I was proud of. Before New Atari User went out of business in 1998, they published everything I sent them except for Ed, the only sprite editor I knew of which supported the editing and animation of multi-coloured sprites.

Unreleased sprite editor Ed, showing an animation of the protagonist of Rage

Now you know where I was when I talked to Geoquest Dude. What I was really doing was fishing for compliments because that’s what you do if your head is packed with dreams and you lack the willpower to make something out of them. He let me down gently, explaining it would be best if I concentrated on my PhD research and, if I was going to work on anything, I should learn the C++ programming language. I took the C++ advice but tried to ignore the rest.

Eventually, though, it got through. This poison to my dreams. Wads of design notes and dot-matrix-printed documentation of my tools and macros lost their lustre and purpose. I was fighting to make my Atari education feel relevant, but the truth was obvious to all those people around me. The Atari scene was kaput. Sure, there were still active enthusiasts on the comp.sys.atari.8bit newsgroup, like Bill Kendrick, and others building custom hardware, like Nick Kennedy and Steven Tucker, who bridged the gap between PCs and Atari computers. Something else happened, though. My desire to make games died.

Within 8-bit culture, aside from a few works such as the bar-raising efforts of Lucasfilm Games, everything looked achievable to me. I was amongst peers. I could mentally take apart every Atari game and figure out how it was made. I saw no mysteries, only challenges. During the nineties, I was only dimly aware of a PC gaming scene, so if I wanted to play new games, the action seemed to be in consoles. Between the luscious visual detail of Sonic the Hedgehog and Jesper Kyd’s pumping title theme of Red Zone (Zyrinx, 1994), I concluded that the dream of a hobbyist game programmer making it through solitude, grit and determination was anachronistic, just another relic of the Thatcher era. Games of the future would come from professional teams.

The straw that demolished the camel’s back was noticing my obsession with writing stories for games. Game design as an excuse for fleshing out a fictional world? Why didn’t I just write a bloody book instead of shoe-horning stories into crappy game clones that I wasn’t going to make anyway?

Fifteen years of learning was over. Fifteen years of honing my Atari programming spellcraft was over. Fifteen years ending with a final full stop. No ellipsis, no promise of another season, no hint of a plucky comeback.

But when this penny finally dropped it was covered in barbs. All those refined programming skills were useless, the boy ahead of his time became the man out of his time. I hadn’t paid attention to what people were actually doing with modern computers and missed all the changes in software development practices. I was self-taught and, although some discipline came along for the ride, I had adopted a hack-and-slash style of coding warfare. I didn’t know anything about version control, object-oriented programming or even humble C. I had attached myself to dead systems, a technology necromancer, whilst others had embraced a brave new world. The thought of getting a modern programming job seemed terrifying. The child programmer genius with the magic typing fingers… was unemployable.

It wasn’t all DOOM II and gloom. As my PhD came to an end, industry was crying out for programmers and companies were flush with training programmes to bring fresh young faces up to speed. In 1997, I joined a company that produced software for financial institutions, and learnt the ropes along with every other graduate. Just another statistic, another number on the office roster. I lived with the knowledge that I was just this guy, you know? The prodigy was dead. I was now average and adequate, on the same treadmill as everyone else.

Years passed. I let my creativity lapse.

But a strange thing happened on 29 February, 2004. I received the following email.

Dear Mr Goodwin. Me and my friend enjoyed your atari game so much that decided to release the PC remake of your great game. I will be happy to get feedback from you, the author of the original game, it is really important for us and I also hope you will find our version of the game interesting. It has low requirements (dx7, 2d video 32 mb).

Wuh-what? WHAT?

Ten years after The Citadel’s release, ten years after just 26 copies were sold, someone had created a remake! The game was called The Return to Citadel (Byxon Games, 2004). I checked their website to discover, with a shock, that I was even credited on their web page (attribution since removed). The developer sent me a free copy of the game and I dabbled with it.

The Return To Citadel, titles

I had mixed emotions. I was definitely flattered. The mechanics of The Citadel were not unique; if someone copied my Frankenstein-blend of mechanics it didn’t matter. But Byxon had reproduced all of my levels, levels forged one summer in my girlfriend’s house. I felt cloned! But there was nothing to be done about it. I chose to be happy that someone had been so moved by a game I designed they remade it a decade later.

Still, I did not like the game. Everything was shiny and pulsating, alive and distracting, smothered with irritating, bland music. It lacked the abstract simplicity of the original design and bore the hallmark of a casual game, overdressed to appear expensive and polished. It was light years from the aborted Citadel II. To think this game was selling at $30 a copy and I… what if… what if…


My game development spidey-sense was tingling, but I couldn’t convince myself to leap across the knowledge gap from where I was to being a new millennium PC game programmer. Learning to make a game with PC technology seemed like so much hard work. I had neither the time nor the inclination. It didn’t feel the same.

My generation was raised on the home computer and many of us learnt to program and make stuff. Much of our output was derivative but we were figuring out videogames for the first time. This was an era where Atari released a nuclear reactor simulation as part of its official home entertainment line up. We could do anything. We could make anything. We could send it to a magazine or just distribute it amongst our friends – a friend and I exchanged cassette tapes carrying messages coded up in BASIC. It was a democracy. But as the nineties dragged on, it seemed like this home computer revolution had been switched for a fascist corporate state. Games were expensive to make and needed time, money and people. Reports of programmers being used as grist for the game industrial mill began to leak out, and I realised this was no life for me. When I made the video “Eulogy for an Atari Childhood” it wasn’t just a eulogy for a phase of gaming but for a phase of game development democracy: I buried more than just games in its final scene. Whither the joy of game development?

But fast forward with a big whooooshing noise to present day. New tools like Flixel, Stencyl, Unity, Twine and GameMaker have tamed game development and the internet has filled in where the home computer magazine and its type-in listings left off. Democracy is back in style.

Today’s fantastic explosion of hobbyist game development fascinates me because I find it so nostalgic. It might sound counter-intuitive, but this is precisely why game-making advocacy book Rise of the Videogame Zinesters (Anna Anthropy, 2012) never struck a chord with me. The book fails to acknowledge that we’ve been through one cycle of discovery, the 80s home computer boom, during which one generation figured out its voice and direction through computing technology. These days are different; these days are the same. I don’t doubt that we are having to fight the difficult legacy of previous videogame decades but today’s cutting edge is built on the cutting edge of before, just as tomorrow’s will be built on today’s. The future is built on the past.

Those of us who grew up with this crude, silicon magic were changed by it. Fortunately, we’re not the last of our kind.

(There is a short addendum to this series.)

Notes

  • Programming as magic is an old metaphor but it is particularly apt in explaining how programming feels. Zinesters has a whole chapter called “The History of Magic”, although Anthropy has a different take on the magic angle: assembly language is an arcane art which keeps out the ordinary Joe.
  • There’s no mention of early high-level languages in Zinesters, so it might appear that it was impossible to create games without being an ace programmer. Mastering assembly language was a badge of honour, but many, many awesome games were written in Atari BASIC. Performance problems associated with Atari BASIC could be mitigated by calling machine language subroutines, which were widely available in magazines and public domain libraries, for moving memory, drawing sprites and so on (see the short listing after these notes). These were shared in the same spirit in which modern frameworks such as Flixel try to open up game development. Strong enthusiast communities emerged through local user groups and… fanzines. New Atari User started out in 1982 as a fanzine called Page 6. With regard to making money, games written in BASIC were harder to turn into commercial properties because of both language fascism and the limitations of the language – I recall Turboflex (Llamasoft, 1982) and Chris Crawford’s Scram (Atari, 1981) – but you could still make a little from magazines during the good years. Better language options such as Action! (OSS, 1983) and the incredible Turbo-Basic XL (Frank Ostrowski, 1985) became available later, but Atari BASIC was the de facto language for amateurs, a position Atari consolidated when they later decided to ship it with every computer by default. What is vitally different today is that performance and language are no longer concerns.
  • Zinesters also references the 1970s to suggest game-making has been closed for decades: “But in 1975, there was no way to make a game on a computer without understanding a computer inside and out.” The home computer boom did not begin until the end of the 1970s and videogaming didn’t really take off until 1978 when Space Invaders (Taito, 1978) hit the scene. Pioneer games like Pong (Atari, 1972) and Tank (Kee Games, 1974) staked out early populist territory and Atari got its console out in 1977. Let’s not forget mainframe games like Empire (Peter Langston, 1972) and Colossal Cave (Crowther & Woods, 1977). Videogaming was still being created, so arguing that gaming was a closed field is technically true but puts a bit of spin on reality. Hardware was the problem: it was expensive and lacked standards, so it was virtually impossible for anyone to play games. The personal computers of the mid-to-late 70s, such as the Commodore PET and TRS-80, were cranky, unfriendly things aimed at people with large wallets. Once open platforms were created – home computers – events moved pretty fast after that. And, let me be absolutely crystal clear here, it was gaming that drove the home computer boom.
  • If there’s one problem with the history I’ve presented here, it’s that my claustrophobic marriage to Atari computing meant I was ignorant of the PC programming world. Although I felt the 90s were corporate and game-making was out of the reach of amateur hobbyists, stuff was still happening and tools like Klik & Play and ZZT emerged. I should probably spend some time trying to dig into the decade, to see if there was a vibrant amateur scene that resembled what we had in the 80s.
  • Let’s take my historical guess and spin it into a prophecy. I believed consoles pushed out the need for computers, the device aimed at pure media consumption displacing more expensive multi-purpose devices. The average consumer didn’t need all that claptrap like Atari BASIC and a keyboard; a controller and a cartridge slot were all they needed. The home computer market collapsed and only business machines remained. PCs were not sexy and not cheap. You bought one because you had to. And then the internet arrived. And if you wanted to be on the internet, you needed a PC. So we had a new “multi-purpose device boom” and everyone had to have a computer again. Here’s the concerning prophecy I’m leading up to. The demand for PCs meant the cost of access to PC technology, access to game-making equipment, fell. We’ve been seeing attempts to steal consumers away from PCs, though. Tablets and smartphones satisfy humble internet needs, the Xbox One wants to be your all-purpose media device too and, look, here comes SteamOS. If this urge to displace multi-purpose devices (tools of creation) with media consumption devices succeeds again, PCs and laptops will become more expensive as demand falls, becoming tools of “serious creators” and business only. So we might yet see a re-corporatization of the gaming space. (I don’t think tablets are an adequate tool of creation when it comes to game-making.)
  • I began studying C++ and object-oriented programming (OOP) as requested and wrote a three-part series called Objet D’Art in New Atari User magazine to explain it, using assembly language macros to simulate OOP techniques. I was probably the only person who ever used these macros.
  • The story of Orson grew into a monster. I became attached to a new title, Metal Century, but as threads of the story began to extend into millennia in both directions, I had to dump that too. The new title is Pangaea. It remains on a pile called “multi-novel epics I will write one day” and, purely from a writer’s perspective, frighteningly complex.
  • Sure, I’d like to return to game making. I’m not the developer I used to be, either. These days I’m more likely to design a game about a threeway that destroys time. You might think I’m joking but I have a folder on my hard drive fleshing out its design.
  • The Return To Citadel failed to reproduce Room 21 of the original properly; all of the destructible walls are replaced by blocks, which means the level can be finished in seconds.
  • In 2006, I was bored with finance work and flirted with 3D graphics. I bought a ton of OpenGL books and even started an anonymous OpenGL blog – but I didn’t get very far.
  • Learning Curve is based on an article I wrote in 2003 on another defunct website of mine, Electron Drift, which was meant to be the spiritual sister site of Electron Dance.
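
To give a flavour of the BASIC-plus-machine-language pattern mentioned in the notes above, here is a minimal sketch of my own (an illustration, not taken from any real magazine listing): the BASIC program READs the bytes of a tiny machine-code routine from DATA statements, POKEs them into page six – the free block of memory at address 1536 from which the Page 6 fanzine took its name – and calls it with USR. A published routine would do something useful such as moving memory or drawing a sprite; this one merely pulls the argument count off the stack and returns.

5 REM ILLUSTRATIVE SKETCH ONLY: MINIMAL USR ROUTINE
10 REM POKE A TWO-BYTE MACHINE CODE ROUTINE INTO PAGE SIX
20 FOR I=0 TO 1:READ B:POKE 1536+I,B:NEXT I
30 DATA 104,96
40 REM 104=PLA (DISCARD ARGUMENT COUNT), 96=RTS (BACK TO BASIC)
50 X=USR(1536)
60 PRINT "RETURNED FROM MACHINE CODE"

Real listings worked the same way, just with pages of DATA statements instead of two bytes, and a routine could hand a result back to BASIC because USR reads its return value from locations 212 and 213.
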
Orson series arc prior to Metal Century phase


9 thoughts on “Learning to Stop”

  1. This all sounds so familiar, not least the discounting of years of programming experience* because it’s ‘just games’, but thanks for sharing your background, it stirred a lot of memories.

    When reading I was surprised you stuck with Atari 8-bit for so long, and missed the 16-bit bridge to the world of PCs. Atari’s own ST was an excellent development machine, with even a lovely Assembly Language on the 68000.

    Surely we’ve always had game development toolkits for the masses too? I’m thinking Pinball Construction Set, The Quill, SEUCK, STOS/AMOS, Blitz BASIC etc. (Pretty much) All home computers came with BASIC and shoved it in your face as soon as you turned them on. I assume Anthropy has a US bias, where they seem to have been blighted by consoles.

    *’Experience’ being more useful than ‘Knowledge’ in this case.

  2. Wow, what an article. It left me feeling both sad and hopeful, which I suppose was the point.

    I’m probably so affected by it because I was (and still am, really) the sort of person you described yourself as: makes something small, thinks he’s omnipotent, tries to make something big and just fails. I’ve now learned that you don’t *destroy* your limits when you produce good work: at best, you learn where they are and, perhaps, expand them a *little*.

    If you want a scholarly article on the sorcery of programming, you might try Wendy Chun’s “On Sourcery and Source Codes”. It talks about that powerful feeling when you type or edit code, that your word is law, because it is – you can change the (digital) word with a single line. But it’s also very sceptical about this, arguing that code distances us from what is actually going on in the machine, and this sense of magic is misleading.

    For my part, I think code is exhilarating but this exhilaration can get out of hand. It’s very easy to think “Oh my God I can do anything!” and adopt a chaotic approach to programming and game design.

    And I was also shackled to one language and scene for years. For almost ten years I was stuck coding in a language that could only compile for Windows and started to break down after the release of DirectX 10. It’s only in the last 18 months that I finally found the motivation to switch to other languages – I learned ActionScript and Unity – but it really feels like I “got out” of my rut at the last possible moment.

    I remember watching a game dev video from World of Love where the speaker said “I’ve been programming in the same language for a decade and there’s no reason to change. If you can make what you want to make with what you have, then use that. Resist the urge to always use the most up-to-date tech.” That’s a valid message but it’s also critical to recognise when your tech needs to change.

    One more thing: you are *really good* at game titles.

  3. Truth is I am now sad this series of articles is over. Really loved it. A lot. A ton, actually.

    Also, in regard to the Zinesters thing, I have to add that I firmly believe that making something worthy, beautiful, smart and/or interesting takes effort and talent. Trying to do away with the effort part can only lead to disaster.

  4. Fascinating stuff once more, Joel. And you were cloned! I didn’t see that coming.

    I tinkered with QBASIC in DOS a little during the early to mid 90s, and after Win95 (or was it Win98?) came out I tinkered with a commercial application that allowed users to assemble very basic games fairly simply (I’m unsure, but I suspect it may have been the first version of today’s GameMaker). I never really took it to a serious level, though: like a lot of teens I was full of dreams and short on drive.

    @CdrJameson: Glad to hear memories were stirred! I stuck with the 8-bit for two reasons: we’d built up a lot of education and software on the 8-bit and felt no urge to migrate to new technology, and… probably money. I would have loved to play Time Bandit and Dungeon Master and Mercenary II…

    We did have a lot of game-making tools on the 8-bit but I felt they weren’t as “mainstream hobbyist” as Atari BASIC was, which was everywhere and not difficult to pick up. (Level editors with games like Boulder Dash were also very popular.) I’m thinking about making a video with a selection of BASIC games to show what people got up to. As for the US bias in Zinesters, I suppose I can’t really comment, but Atari was bigger in the US than it was in the UK. They had the awesome ANALOG and Antic magazines, had huge public domain libraries and there was also APX during the early years opening up polished hobbyist efforts to a wider audience, many of which were written in BASIC. I’m pretty sure a strong enthusiast scene existed in Atari’s homeland.

    @James Patton: Thanks! I tend to feel the same about the trilogy, with this third part making me relive the slow “death” of the scene all over again. It’s a death of self, too, an admission that I was no longer that young programming whizkid. You’re probably right about limits: ambition needs walls, otherwise it keeps expanding to fill up the available room. That doesn’t mean super-ambitious projects cannot succeed, but if I hear yet another bloody developer commentary in alpha stage where they talk about every character and creature having their own lives and behaviours… *cough*No Man’s Sky*cough*cough*

    Thanks for the tip about “On Sourcery and Source Codes”, which I’d not heard of before. I think abstractions are pretty dangerous. On the programming front, I like to turn to Joel Spolsky’s “Law of Leaky Abstractions”, and Martin Thompson’s work on “Mechanical Sympathy” aggravates a lot of developers as it suggests we’ve got too comfortable with wasteful hardware abstractions. On the game design front, I did write a furrowed-brow piece about abstractions way back in 2010.

    There are more complicated problems around switching languages and technology, the constant race to survive obsolescence, which were beyond the scope of Learning Curve. Industrial programming can be like that too. I’ve ended up in an Excel VBA rut; it’s a pretty good rut to be in if you work in the financial industry. But it’s still a rut, so I thrash around in C# and Java any chance I get.

    @gnome: Thanks gnome. Of course, you’re right. Nothing beautiful will come without effort. I’m not sure the “Zinester perspective” is trying to do away with effort, but rather remove the unnecessary heavy lifting around game making: unless you’re working on cutting-edge 3D engines, for example, you probably don’t need to know any assembly language.

    @Shaun: Yeah, it’s a lovely little twist at the end. In fact, this series was meant to be an exploration of what it feels like to be cloned; I thought that was sufficient justification to talk about my past… except the story of an Atari programming childhood took over and that seemed to be what I “was supposed to write about”. The words made me do it. (The Zinester bit came to me in the middle of Learning to Run; I realised this was my time to address a thorn in my side.)

    Tinkering is fun (I have tinkered with Flixel and Unity) but moving on to the next stage requires a little bit of extra perseverance. That discipline to say YES I AM GOING TO DO THIS. I have a feeling that if you get over that hump, it gets easier. Well, that’s all programming really. It’s almost the fear itself which is the problem. If I wasn’t writing Electron Dance now, however, I would probably be working on some games. My research and writing over the last few years has convinced me I should make something again. Of course, there’s my Twine next month…

  6. Thanks for a great series.
    The bit about game development not being accessible, in the notes… well, I haven’t read the chapter in question, and mea culpa on that one, but I was making very basic text adventures as a kid during the early 80s in BASIC. It’s true we were early adopters of the computer thing though.

  7. Joel,

    Some scattered thoughts before I hit the sack.

    I have not read the Zinesters book, but judging by the title – and the general attitudes of many self-professed indie developers – I assume it advances the idea that the indie movement is new, unexpected, and fighting the evils of The Man.

    I’ve maintained for years that the increase in the number of independent developers is a repetition of history. At least on its surface. After all, developers today face different economic and technological realities than those of the 70s and 80s. But I think there’s a common motivation at the root of both eras: creating expressive artifacts for other people to play. And at a conceptual level, the mid-level SDKs and engines you mentioned remind me a lot of the BASIC days (especially Twine).

    I was one of those kids that dabbled with Tandy CoCo BASIC, but never bloomed into a programmer. But I understood program flow, and that paved the way towards VB and C++ in my teens. I even learned TI-BASIC so I could build a Legend of the Red Dragon clone on my TI-85. (I got about 10% through it before I ran out of memory).

    And so on. Computers are this wonderful playground for expressing feeling in a very structured way. I’ve never lost that drive, even after a 15 year detour into academia. Your series reminded me that it will always be a part of my life, and now more than ever. I’m going to reply to that personal part on my own blog. Will post a link when the article is done 🙂

    Thanks so much for all of this,
    Chris

  8. Amanda– thanks. The chapter in question draws some examples from the 70s era, including assembly language programming, and jumps to modern day. There is a short detour through the 90s as well, though. But the 80s seems to have been lumped in with the consoles.

    Chris– As you know I’ve been over to your site and left a comment. The site and career that you have now abandoned to go make games!

    I picked up the Zinesters book as it was all the rage in my feeds when it came out. I picked it up as background material; I didn’t need converting or anything, of course, I’m an ex-game developer fully onboard with all the new work being put out there. The book wants to convince the reader (assuming a non-programmer, of course) that he or she can make games. Its basic premise is that people have been excluded from making games, and tools and the internet have changed that. I wouldn’t dispute that. But it omitted that we already had a trial run at this thing: maybe it was just all white male folks in the main, but as lots of white male folks were involved in the conception of videogaming (because it was the 70s, and although we’d had some feminism, it was still pretty much a man’s world: I’d like to tell you some time about how I broke my own cultural conditioning and it wasn’t through peer pressure), I guess it’s what you would expect. As I mention, we’re still living out the troubled legacy of that (tired genres, character stereotyping, etc.) but we’re still standing on their shoulders, whether some of us like it or not. This is a second, wider run of “Zinester culture”. Although I don’t particularly like that term. Hobbyist is okay, although I don’t like that much either!

    Here’s a short sidestory that’s missing from Learning Curve. I wrote a text adventure called AMCS. AMCS was a robot that had gone inexplicably crazy on a spaceship en route to a colony planet (this plot is so fresh). I planned out a huge map and started typing the descriptions of each location into BASIC. Lots of DATA statements. I think I was around 80% through the descriptions when I got ERROR- 2. That’s Atari BASIC’s message for “out of memory”. And that was the end of AMCS. That’s how much discipline I had in the beginning! A problem! Time for something else!

    James, again– Thanks for the comment about the game titles! I just said on Twitter “perhaps a side-effect of daydreaming game ideas instead of making them.” I really wanted to make Suicide Dancing; it was the one idea I had that felt completely unique. I’d realised no one had made an “adult game” other than porn. I don’t know what was going down in the wider IF field at the time, of course, but an IF about domestic violence felt like it could make some real waves. You were going to play the female victim and had to escape the relationship. Suicide (also a suicide pact with another man) and murder would be options, but neither would be the correct way out. Still, what good is an idea without execution?
