Tunnels of Doom!

Æons ago, that is, in September 1982, I got my first true computer, the Texas Instruments TI-99/4A. A truly incredible machine, especially considering the era. Its CPU, the TMS 9900, was a true 16-bit processor running at 3MHz. Even the popular TRS-80 Color Computer, Radio Shack’s 6809-based computer, ran at a mere 0.895MHz, or, if the gods were good to you, you could tweak the clock divisor and crank it up to 1.79MHz.

In addition to being my first programmable computer, the TI-99/4A sported truly nice graphics and a relatively advanced sound chip—for the era—and a number of games were developed for it: Hunt the Wumpus, Parsec, and others. But the one game that really got me interested in programming is Kevin Kenney’s Tunnels of Doom, one of the very first graphical dungeon crawlers. While I am not nostalgic by nature, I returned to that game about a year ago by an unusual route. I was looking into procedural content generation, especially map generation, in relation to data compression, and I remembered that the old game’s dungeons were generated randomly, yet were very well structured. I decided to take a closer look and see if I could generate maps like his.

The original game is a classic dungeon crawler: you kill monsters, get gold and XP, get better weapons, and kill more monsters, until you reach the bottom of the dungeon, where you find the King and his Orb of Power (whatever that was). I can’t remember whether you won the moment you found the two or whether you had to crawl back up to the surface. Probably the latter. Combat is round-based: parties roll for initiative, turn order is decided, and blows are exchanged in turns as you encounter monsters and slay them—or try very hard to. A combat would look something like this:


Tunnels of Doom Combat Screen. Source: Wikipedia

Of course, what intrigued me the most was how the maps were generated. They never had disconnected rooms, they were always well spread across the space, and they never looked too random. A typical map would look something like this:

Typical Tunnels of Doom Level. Source: Ed Burn's page (link at the bottom of the post)

where the boxes are rooms (filled with monsters), the round rooms are fountains, where you could drink and get your HP restored, and the arrow-like rooms are stairwells.

So during last year’s Christmas vacation, I tried to replicate the look of the maps. After a few tries, without getting very nice-looking maps, I decided to ask Kevin Kenney how he did it, back in the ’80s. Kenney was kind enough to answer my mail:

The trick was to not put in too many horizontal corridors – balancing that took awhile.

What I can remember about the dungeon-digging algorithm:

  1. Place stairwells from the previous floor.
  2. Place rooms & other features (stairs down, fountains, etc.) randomly.
  3. Draw vertical corridors between features. (Rooms are also features.)
  4. Tough part: Draw horizontal corridors starting from a feature to another feature or corridor, if another corridor isn’t already above (or below) it. (Don’t connect corridor to corridor.)
  5. Search for unconnected rooms, and connect them, where possible. If a room lies entirely outside of the others, and can’t be connected, mark it as empty. I randomized stairwells less widely than other features, so they wouldn’t have this problem.
  6. Stock the dungeon.
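Put as code, the digging order might look something like the following rough sketch. The grid size, cell codes, and corridor details are my own guesses for illustration, not Kenney's original code, and his extra balancing rules (no corridor directly above or below another, not too many horizontal corridors) are left out:

```cpp
#include <array>
#include <cstdint>
#include <random>
#include <utility>
#include <vector>

constexpr int H = 16, W = 24;
enum Cell : uint8_t { Empty, Room, Corridor };

struct Map {
    std::array<std::array<Cell, W>, H> g{};
    std::vector<std::pair<int, int>> features;  // rooms, stairs, fountains...
};

Map dig(uint32_t seed, int n_features = 8) {
    std::mt19937 rng(seed);
    std::uniform_int_distribution<int> ry(1, H - 2), rx(1, W - 2);
    Map m;
    // Steps 1-2: place stairwells and other features at random cells.
    for (int i = 0; i < n_features; ++i) {
        int y = ry(rng), x = rx(rng);
        m.g[y][x] = Room;
        m.features.push_back({y, x});
    }
    // Step 3: vertical corridors between features sharing a column.
    for (auto [y1, x1] : m.features)
        for (auto [y2, x2] : m.features)
            if (x1 == x2 && y1 < y2)
                for (int y = y1 + 1; y < y2; ++y)
                    if (m.g[y][x1] == Empty) m.g[y][x1] = Corridor;
    // Step 4: horizontal corridors from a feature to the next feature or
    // corridor on the same row. Starting from features only means we never
    // connect corridor to corridor, as Kenney prescribes.
    for (auto [y, x] : m.features) {
        int xx = x + 1;
        while (xx < W && m.g[y][xx] == Empty) ++xx;
        if (xx < W)  // hit a feature or a corridor: commit the dig
            for (int k = x + 1; k < xx; ++k) m.g[y][k] = Corridor;
    }
    // Step 5 (connecting leftover rooms, or marking them empty) is omitted.
    return m;
}
```

Note that the whole map is a deterministic function of the seed, a point I come back to below.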

Following his hints, I set to recode my map generator. I finally got somewhat decent looking maps:

Tunnels Generated by My Heuristic, In Glorious ASCII Art.

But I did not quite implement his heuristic. I kept playing with the rules to understand what makes for pleasant map generation. The maps cannot be merely random; the player must feel that the world he’s navigating is thought-out rather than arbitrary, even though it really is random. (Hmm… reminds me of something.)

Another Generated Map

So the basic data structure is a flat 16×24 array of bytes (actually, each square needs only 5 bits), in which rooms and corridors are dug. The specific encoding is not that interesting, but I’ll come back to map representations in another entry. The corridor-digging function links any two rooms, and knows how to follow an existing tunnel to avoid digging extra tunnels; I keep track of which rooms (and clusters of rooms) are connected using the disjoint-set data structure, which is surprisingly simple and fast—and certainly one of the few data structures every programmer should know.
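The disjoint set is short enough to show in full. This is my own minimal version (path halving plus union by size), not the code from my generator:

```cpp
#include <numeric>
#include <utility>
#include <vector>

// Disjoint set (union-find): tracks which clusters of rooms are already
// connected, so the digger can skip corridors that would be redundant.
struct DisjointSet {
    std::vector<int> parent, size;
    explicit DisjointSet(int n) : parent(n), size(n, 1) {
        std::iota(parent.begin(), parent.end(), 0);  // each room is its own set
    }
    int find(int x) {
        while (parent[x] != x)
            x = parent[x] = parent[parent[x]];  // path halving
        return x;
    }
    bool unite(int a, int b) {  // returns false if already connected
        a = find(a); b = find(b);
        if (a == b) return false;
        if (size[a] < size[b]) std::swap(a, b);
        parent[b] = a;
        size[a] += size[b];
        return true;
    }
};
```

Usage is one call per corridor: before digging between rooms i and j, call `unite(i, j)`; if it returns false, the rooms were already connected through some other path and the corridor can be skipped.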

In fact, one can ask whether the map is not so much generated as decompressed from a highly compressed core of data: the dungeon number, basically a random seed. From that seed, everything that happens further on is completely deterministic. So to “store” the dungeon, I only need to store the seed that generated it, and, possibly, other events such as the state of the monsters and the treasures found therein.
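A minimal sketch of the idea, with a stand-in generator (the real one differs, of course): since the same seed always expands to the same cells, storing the dungeon amounts to storing four bytes.

```cpp
#include <cstdint>
#include <random>
#include <vector>

// Toy illustration of "dungeon as compressed data": the level is a pure,
// deterministic function of a 32-bit seed, so the seed IS the save file.
std::vector<uint8_t> generate_level(uint32_t seed) {
    std::mt19937 rng(seed);               // fixed algorithm: same seed,
    std::vector<uint8_t> cells(16 * 24);  // same stream, same level
    for (auto &c : cells)
        c = rng() % 32;                   // 5 bits per square
    return cells;
}
```

To restore a saved dungeon, you simply call `generate_level` again with the stored seed; only mutable state (monsters slain, treasures taken) needs to be saved separately.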

The relation between procedural content generation and data compression is a complex one. In many ways, procedural generation works like a data compression algorithm where very few bits are expanded into a great many, the decompressor being an algorithm that iterates on the seed, the equivalent of the compressed data, to generate textures, maps, or what have you. But there’s a catch. In data compression, you start with expanded data, say a texture, and compress it, transforming it into a compact representation. The decompression algorithm takes this compact representation and restores the original data (or at least a good approximation to it, in the case of lossy compression). In procedural generation, it is as if we could only decompress, with no original to start from; we can only try different compressed data until the decompressed data (the generated content) suits us. It’s like fishing in an infinite sea for the right fish.
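The fishing process itself can be sketched as a seed search. Here both the generator and the acceptance test are toys of my own invention, standing in for a real map generator and an aesthetic judgment:

```cpp
#include <cstdint>
#include <random>
#include <vector>

// Toy generator: a "level" is just eight random room sizes.
std::vector<int> generate(uint32_t seed) {
    std::mt19937 rng(seed);
    std::vector<int> rooms(8);
    for (auto &r : rooms) r = rng() % 100;
    return rooms;
}

// Try seeds (compressed data) until the expanded content suits us.
// There is no way to go the other direction, from content to seed.
uint32_t fish_for_seed(bool (*suits)(const std::vector<int> &)) {
    for (uint32_t seed = 0;; ++seed)
        if (suits(generate(seed)))
            return seed;
}
```

Whatever the predicate accepts is, by construction, what the returned seed regenerates.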

At the time, on the TI-99/4A, it took literally forever to generate the ten levels of the dungeon—in fact, something like 10 minutes. Let’s take this 10 minutes figure, although it might have been only 8, or 6, or 12; I never timed it, and I have no means of timing it now. On my Pentium IV Vaio laptop, running at 3.0GHz—a mere 1000 times faster in clock rate than the TMS9900—map generation takes under 10 microseconds per level, or about 100 microseconds for the complete dungeon (as reported by the gettimeofday function). The speedup from the architecture alone should be in the range of 1000× to 10 000×, as not only have clock rates gotten a lot higher, but microprocessor architecture has gotten a lot better in the meantime: the Pentium IV has a very deep instruction pipeline and a relatively large number of execution units, enabling it to yield several instructions per clock. Compilers got a lot better too. But what I get—600 seconds against 100 microseconds—is a speedup in the order of 6 000 000:1! I cannot really explain the ~1000× difference with what I expected. Probably the fact that I use a modern compiler plays an important role. I do not know whether the original game was programmed in BASIC, compiled from another language, or even written in assembly language.
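For the record, the measurement itself is straightforward with gettimeofday: microsecond resolution, which is adequate for a ~100 microsecond figure if you time the whole ten-level dungeon rather than a single level. A sketch of the harness, with the worker function as a placeholder:

```cpp
#include <sys/time.h>

// Returns the wall-clock time spent in work(), in microseconds, using
// gettimeofday (POSIX, microsecond resolution).
double elapsed_us(void (*work)()) {
    timeval t0, t1;
    gettimeofday(&t0, nullptr);
    work();
    gettimeofday(&t1, nullptr);
    return (t1.tv_sec - t0.tv_sec) * 1e6 + (t1.tv_usec - t0.tv_usec);
}
```

For measurements this short, averaging over many runs (generate the dungeon a few thousand times and divide) gives a more trustworthy figure than a single call.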

The fact that it takes 10 microseconds to expand a dungeon from its seed encourages me to think that this algorithm would be entirely amenable to an implementation of Tunnels of Doom on a very slow, low-power microprocessor, making it possible to play Tunnels of Doom on a device that lives on a lithium battery for seven years. A Tunnels of Doom key-chain (Key-chain of Doom?). That’d be cool.

You can read more on the TI-99/4A here.

You can browse Ed Burn’s tribute page to Tunnels of Doom here.

You can read an interview with Kevin Kenney here.

13 Responses to Tunnels of Doom!

  1. I’m a long-time player and fan of ToD. Would you be willing to share your map generation code?

  2. Steve,

    I too am a long-time ToD fan and web programmer. I am intrigued by auto map generation, especially the way ToD was written. Would you still be able to share your code (even if it’s pseudocode)? If not, great article! Thanks!

    • Steven Pigeon says:

      It’s SteveN.

      Sure, I can share the code. It’s for Linux, though, but the Linux-specific parts are really tiny. As it’s 1000-ish LoC, I’ll send it as an attachment to your e-mail address.

      I suppose I could GPL the code, but, you know, whatever. Use it, and just keep the attribution.

  3. Sincere thanks for sharing the code. I will definitely respect the attribution if I were to reference/use it. I was more interested in how things of that nature worked, especially after being a kid playing a game like that for hours (and sometimes days on end) and now developing websites and working with various scripting languages. I made a loose attempt a few years back (flowchart stage), nothing compared to what you seem to have achieved. Thanks again!

  4. Steven Pigeon says:

    You’re welcome.

    I think that it is small things like that that can change a lot in one’s life. I was already interested in computers at age 12, but I think playing a game like Kenney’s ToD really made me want to understand things at a deeper level. That it took me another 26 years to return to ToD and write a prototype just shows how fondly I remember that time of perpetual and intense discovery.

    • David Franklin says:

      You are totally right. I started at 11 with that game and after seeing Tron (the Matrix 0.1), that was it! I’m trying to build a ToD type game in Flash with my kids (they are fans too). Not a remake (that’s been done). Just an easier, less complicated RPG that you can pick up and play whenever. I’ll send you a link to the WIP after we really get it off the ground.

      Thanks again!

  5. […] a while ago, I discussed map generation in the classic 80s-era game Tunnels of Doom. After a short […]

  6. Jonathan says:

    Okay, this a little more on the side topic of the TI and ToD performance, than the main topic of your coding project… but I gotta say it, being from the Olde School of Technology as well…(my first computer too).

    I think the speed-up factor discrepancy you’re seeing is because of a flaw in your thinking. You’ve got it partially – maybe it’s not calculating anything really much, just jumping to an offset in a huge data file, which took forever to load. Tunnels of Doom was a hybrid product – it was a cartridge with a data tape, and you’d use a blank for your saves, I think. It also came on floppy disk, but a disk drive for that thing was way too expensive. It simply took a LOOONG time to load the data from the cassette tape – NOT related to processing anything. The processing is done relatively quickly – as instantly as anything ever was from that era. Once the tape was finished reading, it was ready.

    I’ve got an emulator that has ToD and other games built-in. (classic99, google it) Everything is loaded to RAM. Anyway, my point is, removing I/O speed concerns by virtualizing everything in memory makes it clear that we’re not bound by processing speed here either. When choosing the option to load from disk It loads everything in a split second and the game is ready pretty much when you hit enter, all while running the emulation at normal speed. When I choose the option for load from cassette it says “this will take 200 seconds” but of course it will just sit there doing nothing because the cassette drive is not actually implemented. But if it were a real TI, with a real cassette drive, you’d go make a sandwich or something while that sucker loaded.

    • Yes, I remember that it took a while to load the data from the tape. But that delay was quite distinct from the generation of the dungeon that occurred after, and at the start of each new game (which fortunately did not require reloading everything from tape).

    • I d/l and tried the classic99 emulator. It does work, but, well, I am clearly not nostalgic enough to endure the awesome slowness of the ti99/4a ;) Still, thanks for the link.

  7. Another ToD fan here. To emphasize how enraptured my friends and I were in this game, we would gather around to play together. Each person got to hit the key to attack and we made our decisions by consensus. If this happened once or twice, that would be one thing, but we did this all the time.
    I also remember how mind-numbing it was to wait for the data to load via audio tape. I’m glad we’re beyond that now!
