Update: The solid, unassuming Dig Dog is kind of an ideal “Spelunky on the go” type of video game, and we enjoyed happening upon it first during the 2017 Fantastic Arcade in Austin, Texas. This weekend, Dig Dog gets a proper re-release on the Nintendo Switch, so we’re taking an opportunity to resurface this article that’s ultimately less review and more origin story. After all, Dig Dog isn’t just a solid, cheap option to enjoy with Switch’s portable mode—it’s also a remarkable example of a game coded and drawn as a hands-free experiment. This piece originally published on February 9, 2018, and it appears unchanged below.
Dig Dog is a pretty fun little video game. Call it “Spelunky for kids”—and don’t think of that as a backhanded compliment, either. Dig Dog, which launched Thursday on iOS, Xbox, Windows, and Mac, shaves away some of the genre’s complications, controls smoothly, and has depth. It’s as if the modern wave of randomly generated, dig-for-surprises adventures had existed in early ’80s arcades. (And all for only $3!)
I liked Dig Dog enough when I stumbled upon it at last year’s Fantastic Arcade event in Austin, Texas. But my interest in the game spiked when its creator reached out ahead of this week’s launch to confirm something I’m not sure any other video game creator has done: coding an entire game by himself… without using his hands.
Longtime developer and Austin resident Rusty Moyher was diagnosed with repetitive strain injury (RSI) roughly five years ago—while in the middle of a time-crunched game-design project, no less—and found that his only real physical relief came when he took complete breaks from typing and mousing. That wouldn’t cut it for him, he admitted. “I still want to make games,” Moyher told Ars. “It’s hard to imagine any career or job that doesn’t involve computers.”
Moyher wanted to prove that his dream—of making legitimate video games without using his hands—was possible. For him, the only true answer was to make and launch a good, working game—and to tell the world how he did it so that others might follow suit.
Battling RSI with a Dragon
Nobody’s RSI diagnosis is ever convenient, but Moyher’s wrist and hand pains reached an unsustainable peak while he was in the middle of possibly the least-compatible project imaginable. He and two longtime game-design collaborators had just met a $60,000 Kickstarter goal for a “six games in six months” project. Just reading the Retro Game Crunch pitch these days still makes Moyher’s wrists hurt: the team would take suggestions from paying backers, then turn a prototype around in 72 hours. A fine-tuning process would follow, with each slap-dash game completed by the end of the month and the next game’s development beginning immediately after.
Thanks to Moyher’s RSI, the three-man team didn’t meet those deadlines. Nevertheless, he plugged away on those Kickstarter games while experimenting with changes to his office setup: ergonomic keyboards, desk changes for the sake of posture, mouse swaps. Nothing worked, aside from good old-fashioned time away from a keyboard and mouse (along with injections of medicine directly into his skin).
“The silver bullet” came when Moyher found a 2013 video presentation by developer and coder Travis Rudd, which appeared online shortly after Moyher’s diagnosis and took viewers step-by-step through Rudd’s own RSI experience. The 28-minute video shows Rudd breaking down exactly how he customized Dragon NaturallySpeaking, a voice-recognition software suite, to write Python code using nothing other than his voice. That ran counter to everything Moyher had read in forums about RSI and coding, which held that Dragon’s usefulness for programming was limited. “Don’t do it, it’s impossible,” was the common wisdom, Moyher said.
But Rudd swore by hacks he’d applied to the Dragon ecosystem, details Moyher eventually coaxed out of him over email. The two toolsets he learned about, NatLink and Dragonfly, were appealing because they could be customized to map key phrases to anything from basic text entry to variable names to macros. “The commands you come up with, basically in a made-up language, are all built to be quickly and easily recognized by Dragon,” Moyher said, and he recommended “short, tight words or phrases that can be executed quickly.”
The word “slap” hits the return key once; “two slap” hits it twice. Say “camel” before saying a phrase like “this is variable” out loud, and it’ll be parsed like so: “thisIsVariable.” Angle brackets can be typed by saying “lack” (for <) and “rack” (for >). For a sample of exactly how this works, Moyher was kind enough to provide video of an average coding session, embedded below:
Some of these terms were already baked into the tools that Moyher downloaded and attached to his install of Dragon NaturallySpeaking. But he admits that for the most part, he had to invent and train the system to accept new ones.
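To give a feel for how this kind of phrase-to-text mapping behaves, here’s a minimal, self-contained Python sketch. It is an illustration only, not Moyher’s actual grammar: a real setup registers these rules with Dragon NaturallySpeaking via NatLink and Dragonfly, whereas this toy simply translates spoken phrases (as plain strings) into the text they would type.

```python
# Illustrative only: a toy simulation of voice-command rules like those
# described above ("camel", "slap", "lack"/"rack"). A real Dragonfly
# grammar hooks into Dragon NaturallySpeaking; this just maps strings.

def to_camel(words):
    """['this', 'is', 'variable'] -> 'thisIsVariable'"""
    head, *tail = words
    return head + "".join(w.capitalize() for w in tail)

def emit(phrase):
    """Translate one spoken phrase into the text it would type."""
    words = phrase.split()
    if words[0] == "camel":                  # "camel this is variable"
        return to_camel(words[1:])
    if words[-1] == "slap":                  # "slap" / "two slap" press return
        count = {"two": 2}.get(words[0], 1) if len(words) > 1 else 1
        return "\n" * count
    symbols = {"lack": "<", "rack": ">"}     # angle brackets
    if words[0] in symbols:
        return symbols[words[0]]
    return phrase                            # plain dictation passes through

print(emit("camel this is variable"))     # thisIsVariable
print(emit("lack") + "T" + emit("rack"))  # <T>
```

The design point this illustrates is the one Moyher describes: each command word is short, phonetically distinct, and deterministic, so the recognizer rarely confuses one rule for another.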
The ol’ Texas two slap
“I needed to build a vocabulary that was suited for what I was doing that I was familiar with,” Moyher said. “The process of coding by voice is, I have to do programming tasks, like normal, and come up with commands and modify the system. On top of that, all at once, I also have to remember these [shortcut terms]. It can be really slow going. I had to slowly build up a library of commands I was familiar with, working with my voice, that I could basically remember.”
Part of his need for a custom vocabulary came from his choice of programming environments: Visual Studio and Xcode were better suited to game development than Emacs, around which Rudd and his peers had built their voice commands. Those programs also required significantly more cursor movement, he reckoned, which meant he needed something Rudd and others had not suggested: a true hands-free mouse replacement.
Rather than jury-rig some sort of infrared head or sight tracker, Moyher went for what appeared to be the best option at the time: a $500 SmartNav 4 from NaturalPoint, a company that makes assistive computing hardware. One sensor sits in front of his line of sight, and he attaches a small reflector to whatever hat he wants to wear. After some sensitivity adjustments, Moyher got this working with pretty minimal head movement, roughly “5-10 degrees” left-to-right and even less up-and-down. The final piece of the hardware puzzle was a foot pedal, which he uses for mouse clicks.
As Dig Dog‘s sole coder, designer, and artist, Moyher found this head-tracking solution imperative for drawing the game’s elements and animation frames, which are admittedly simple. The game’s giant-pixel look fills out at roughly 220p with a maximum six-color palette. “It’s not as precise or quick as a mouse,” Moyher admitted. “I designed the style of the game to be more achievable with this system.”
Moyher’s original mission was to make a game that was not only built without hands but also testable without them, though he didn’t quite manage that last part. Dig Dog began life with a design priority that went beyond merely being achievable; he wanted to create a good iOS platformer. His first prototypes were more atmospheric and low-key, simply featuring a pixelated dog wandering across a giant desert. “Then I stumbled upon digging and realized, oh, this is obviously about digging now.” (Indeed, the basic action of tapping to dig and moving around by hitting the screen’s edges works quite well within the constraints of a smartphone screen, though it’s also fun with a standard gamepad.)
But as sole designer and coder, Moyher had to endure wrist pain for one exception: testing the prototype. “I wanted to playtest the game with my tools, but all the playtesting was with my hands,” Moyher said. “There was no other way to get the game feel and fine-tune the mechanics without doing that.”
Tossing the community a bone
Moyher’s full-game mission was, in part, a way to give himself breathing room to figure out a speech-to-text coding vocabulary. Unlike with that Kickstarter, nobody else was depending on his speed or competency to get work done. He now believes he has laid the groundwork to move forward as a speech-first coder, whether working alone or on a larger project. (“I’ve gotten close to one-to-one speed,” he said, comparing his spoken pace with normal typing speed.) And he’s already looking into better tools, particularly eye-tracking technology, to see if he can improve on the speed and fidelity of head-tracked mouse movement.
But just as Moyher looked to the community for inspiration when he first succumbed to RSI, so he hopes that other game makers with similar diagnoses or disabilities look at his accomplishment and plow forward on their own dream games.
“My thought (from the beginning) was, if I could make a game interesting and fun to play, it’d be a fun way to promote [voice coding] as an idea,” Moyher said. “The tools right now are pretty brittle. They can break relatively easily, in terms of having to install software, get it working in Windows, get various add-ons and hardware. Just to promote the idea that there’s this other way to work, and that these things exist? Maybe it’d improve voice coding in general!”
Moyher sent a few links along after our interview for anybody interested, as many of his tweaks and efforts were based on publicly available information. Start with this guide to Dragonfly as a coding-specific addition to NaturallySpeaking, along with an installation guide for relevant add-ons. Dive further at handsfreecoding.org.