Æons ago, that is, in September 1982, I got my first true computer, the Texas Instruments TI-99/4A. A truly incredible machine, especially considering the era. Its CPU, the TMS9900, a true 16-bit processor, ran at 3MHz. Even the popular TRS-80 Color Computer, Radio Shack’s 6809-based computer, ran at a mere 0.895MHz, or, if the gods were good to you, you could tweak the clock divisor and crank it up to 1.79MHz.
In addition to being my first programmable computer, the TI-99/4A sported truly nice graphics and a relatively advanced sound chip—for the era—and a number of games were developed for it: Hunt the Wumpus, Parsec, and others. But the one game that really got me interested in programming is Kevin Kenney’s Tunnels of Doom, one of the very first graphical dungeon crawlers. While I am not nostalgic by nature, I returned to that game about a year ago by an unusual route. I was looking into procedural content generation, especially map generation, in relation to data compression, and I remembered that the old game’s dungeons were generated randomly, yet were very well structured. I decided to take a look into it and see if I could generate maps like his.
The original game is a classic dungeon crawler: you kill monsters, get gold and XP, get better weapons, kill more monsters, until you reach the bottom of the dungeon, where you find the King and his Orb of Power (whatever that was). I can’t remember whether you won the moment you found the two or whether you had to crawl back up to the surface. Probably the latter. Combat is round-based: parties roll for initiative, who goes first is decided, and blows are exchanged in turns as you encounter monsters and slay them—or try very hard to. A combat would look something like this:
Of course, what intrigued me the most was how the maps were generated. They never had disconnected rooms, they were always well spread across the space, and they never looked too random. A typical map would look something like this:
where the boxes are rooms (filled with monsters), round rooms are fountains, where you could drink and get your HP restored, and the arrow-like rooms are stairwells.
So during last year’s Christmas vacations, I tried to replicate the look of the maps. After a few tries, not getting very nice-looking maps, I decided to ask Kevin Kenney how he did it, back in the ’80s. Kenney was kind enough to answer my mail:
The trick was to not put in too many horizontal corridors – balancing that took awhile.
What I can remember about the dungeon-digging algorithm:
- Place stairwells from the previous floor.
- Place rooms & other features (stairs down, fountains, etc.) randomly.
- Draw vertical corridors between features. (Rooms are also features.)
- Tough part: Draw horizontal corridors starting from a feature to another feature or corridor, if another corridor isn’t already above (or below) it. (Don’t connect corridor to corridor.)
- Search for unconnected rooms, and connect them, where possible. If a room lies entirely outside of the others, and can’t be connected, mark it as empty. I randomized stairwells less widely than other features, so they wouldn’t have this problem.
- Stock the dungeon.
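The steps above can be sketched in a few lines of Python. This is my own, much-simplified reconstruction, not Kenney’s code: rooms are single squares, every room is treated as a feature, and I simply chain each room to the next with a vertical-then-horizontal corridor, ignoring the subtler corridor-balancing rules from his list.

```python
import random
from collections import deque

H, W = 16, 24                         # a 16x24 grid, as in the original
EMPTY, ROOM, CORRIDOR = ".", "R", "#"

def dig(grid, r0, c0, r1, c1):
    """Dig an L-shaped corridor: vertical leg first, then horizontal,
    reusing squares that are already dug."""
    r, c = r0, c0
    while r != r1:
        r += 1 if r1 > r else -1
        if grid[r][c] == EMPTY:
            grid[r][c] = CORRIDOR
    while c != c1:
        c += 1 if c1 > c else -1
        if grid[r][c] == EMPTY:
            grid[r][c] = CORRIDOR

def generate(seed, n_rooms=8):
    rng = random.Random(seed)         # everything depends only on the seed
    grid = [[EMPTY] * W for _ in range(H)]
    rooms = set()
    while len(rooms) < n_rooms:       # place features at random squares
        rooms.add((rng.randrange(H), rng.randrange(W)))
    for r, c in rooms:
        grid[r][c] = ROOM
    rooms = sorted(rooms)             # chain the rooms together
    for (r0, c0), (r1, c1) in zip(rooms, rooms[1:]):
        dig(grid, r0, c0, r1, c1)
    return grid, rooms

def all_connected(grid, rooms):
    """Flood-fill from the first room to check every room is reachable."""
    seen, todo = set(), deque([rooms[0]])
    while todo:
        r, c = todo.popleft()
        if (r, c) in seen or grid[r][c] == EMPTY:
            continue
        seen.add((r, c))
        todo.extend((r + dr, c + dc)
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= r + dr < H and 0 <= c + dc < W)
    return all(room in seen for room in rooms)
```

Because each room is dug to the next, the sketch never produces disconnected rooms—which is precisely the property the harder heuristic above has to work for.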
Following his hints, I set out to recode my map generator. I finally got somewhat decent-looking maps:
But I did not quite implement his heuristic. I kept playing with the rules to understand the process of pleasant map generation. The maps cannot be merely random; the player must feel that the world he navigates is not arbitrary, but thought out, even though it really is random. (Hmm… reminds me of something.)
So the basic data structure is a flat array of 16×24 bytes (actually, it could be stored in 5 bits per square), in which rooms and corridors are generated. The specific encoding is not that interesting, but I’ll come back to map representations in another entry. The corridor-digging function links any two rooms and knows how to follow an existing tunnel to avoid digging extra tunnels; I keep track of which rooms (and clusters of rooms) are connected using the disjoint-set data structure, which is surprisingly simple and fast—and certainly one of the few algorithms every programmer should know.
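The disjoint-set structure deserves a quick illustration. A minimal version (my own sketch, with path halving and union by size) fits in a dozen lines:

```python
class DisjointSet:
    """Union-find with path halving and union by size."""
    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            # path halving: point x at its grandparent as we walk up
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False              # already in the same cluster
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra           # graft the smaller tree under the larger
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        return True
```

When digging corridors, `union(a, b)` returning `False` tells you rooms `a` and `b` are already in the same cluster, so no new tunnel is needed.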
In fact, one can ask whether the map is not so much generated as decompressed from a highly compressed core of data: the dungeon number, basically a random seed. From the seed, everything that happens further on is completely deterministic. So to “store” the dungeon, I only need to store the seed that generated it and, possibly, other events such as the state of the monsters and the treasures found therein.
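A toy illustration of the idea, with a stand-in generator (the real one would expand the seed into the full 16×24 map): the seed *is* the dungeon.

```python
import random

def dungeon_squares(seed, n=16 * 24):
    """Stand-in generator: expand a seed into a deterministic sequence
    of map squares (5 bits each, hence values 0..31)."""
    rng = random.Random(seed)
    return [rng.randrange(32) for _ in range(n)]

# "Storing" the dungeon amounts to storing its seed:
saved_seed = 1982
level = dungeon_squares(saved_seed)          # generate, play, discard...
assert dungeon_squares(saved_seed) == level  # ...and regenerate at will
```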
The relation between procedural content generation and data compression is a complex one. In many ways, procedural generation works like a data compression algorithm where very few bits are expanded into a great many, the decompressor being an algorithm that iterates on the seed, the equivalent of the compressed data, to generate textures, maps, or what have you. But there’s a catch. In data compression, you start with expanded data, say a texture, and compress it, transforming it into a compact representation. The data decompression algorithm takes this compact representation and restores the original data (or at least a good approximation to it, in the case of lossy data compression). In procedural generation, it is as if we can only decompress data, but from an unknown original; we can only try different compressed data until the decompressed data (the generated content) suits us. It’s like fishing in an infinite sea for the right fish.
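In code, the fishing goes something like this—again with a stand-in expander; for real maps, the acceptance test might check connectivity or how well the rooms are spread out:

```python
import random

def expand(seed):
    """Stand-in decompressor: seed in, 'content' out."""
    rng = random.Random(seed)
    return [rng.randrange(10) for _ in range(8)]

def find_seed(acceptable, start=0):
    """Try successive seeds (compressed data) until the expanded
    content passes our acceptance test."""
    seed = start
    while not acceptable(expand(seed)):
        seed += 1
    return seed

# fish for a seed whose content sums to more than 50
seed = find_seed(lambda content: sum(content) > 50)
```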
At the time, on the TI-99/4A, it took literally forever to generate the ten levels of the dungeon—in fact, something like 10 minutes. Let’s take this 10 minutes figure, although it might have been only 8, or 6, or 12; I never timed it, and I have no means of timing it now. On my Pentium IV Vaio laptop, running at 3.0GHz—a clock a mere 1000 times faster than the TMS9900’s—map generation takes under 10 microseconds per level, or about 100 microseconds for the complete dungeon (as reported by the gettimeofday function). The speedup from the architecture alone should be in the range of 1000× to 10 000×, as not only did clock rates get a lot higher, but the microprocessors’ architecture also got a lot better in the meantime. The Pentium IV is a processor with a very deep instruction pipeline and a relatively large number of execution units, enabling it to yield several instructions per clock. The compilers got a lot better too. But what I get is a speedup on the order of 6 000 000:1—600 seconds against 100 microseconds! I cannot really explain the ~1000× difference with what I expected. Probably the fact that I use a modern compiler plays an important role. I do not know whether the original game was programmed in BASIC, compiled from another language, or even written in assembly language.
The fact that it takes 10 microseconds to expand a dungeon from its seed encourages me to think that this algorithm would be entirely amenable to an implementation of Tunnels of Doom on a very slow, low-power microprocessor, making it possible to play Tunnels of Doom on a device that lives on a lithium battery for seven years. A Tunnels of Doom key-chain (Key-chain of Doom?). That’d be cool.
You can read more on the TI-99/4A here.
You can browse Ed Burn’s tribute page to Tunnels of Doom here.
You can read an interview with Kevin Kenney here.