

Can We Teach Debugging?

Filed under: Arduino,Hacking — jet @ 12:01

[thx to Tom Igoe for making me write all this down]

Question: Can we teach people how to debug? We often talk about “the art of debugging”, but is it really an art? Is it a science? A mixture of both?

I’ve never given thought to explicitly teaching a unit on debugging; it’s always something that gets covered along the way while teaching coding and physical computing. That isn’t surprising: I don’t remember anyone teaching any sort of debugging in any of my CS classes, or anyone ever treating it as an academic subject. It was mostly treated as a practical skill we’d pick up after we got work. You get your first few coding jobs, you have bugs to fix, and someone senior (usually) comes by and teaches you what they know about finding bugs in software.

However, I think that where I really learned how to debug wasn’t while writing software; it was while working on cars and motorcycles. When you’re faced with a motor that won’t start, a strange noise, or an intermittent failure somewhere in the electrics, you have a problem space much bigger than you can see all at once. You have to break the problem down, rule out areas that obviously can’t be at fault, and focus on the areas where the problem might lie. There’s an iterative process of selecting smaller and smaller domains until you get to the point where you can start asking questions simple enough to be easily answered. “Why won’t the car start?” can’t be answered until you’ve answered all the little questions, like “is the battery fully charged?”, “is the fuel pump sending fuel to the carbs?”, or “are the spark plugs firing?”. More often than not, answering those questions simply leads to even more big questions that have to be broken down again: “ok, why isn’t power getting from the battery to the spark plugs?”

Another issue is that in the physical world, you don’t have x-ray vision (debuggers) or the ability to replicate the problem over and over without causing physical damage (“let’s add some printf’s and run it again”). You have to learn to break the problem down into tests that won’t damage the engine or make the problem worse. You also can’t magically make most of the car go away with #ifdef’s in order to isolate the problem; you have to physically connect and disconnect various things, both for safety reasons and so that you can perform proper tests. (Always remove the ground strap from the negative pole of the battery. No, really. Trust me on this.)

When unit testing started gaining popularity in the software dev world, it made perfect sense to me. This is the way I fix a vehicle — break the problem down into increasingly smaller chunks, isolating unrelated systems from one another, to the point that I can ask “Has the flayrod gone askew on treadle, yes or no?” instead of being stuck at “there’s a problem at the mill”. Granted, we use these tests to prevent bugs in code under development, but adding them to an existing code base often reveals all sorts of “oh, so that’s why it makes that odd noise on Tuesdays after lunch” bugs that were never worth tracking down. The problem with unit testing, however, is that you’re usually adding it from the bottom up while you’re actively focused on the task of software development. You’re almost ruling out the need for debugging later by doing extensive testing now, or you’re adding it to an existing code base to “increase stability,” as they say. (In other words, there are all sorts of mildly annoying bugs we can’t find easily, so let’s just add unit tests and hope that fixes things.)
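To make that concrete, here’s a minimal sketch in Python — the `divide` function and its tests are made-up examples, not from any real code base. The point is that each test pins down one small question that has a yes-or-no answer:

```python
import unittest

def divide(total, count):
    """Hypothetical helper: average cost per item."""
    return total / count

class DivideTest(unittest.TestCase):
    def test_normal_case(self):
        # One small question: does the happy path give the right answer?
        self.assertEqual(divide(10, 4), 2.5)

    def test_zero_count(self):
        # The "odd noise on Tuesdays" kind of question nobody thought to ask:
        # what happens when count is zero?
        with self.assertRaises(ZeroDivisionError):
            divide(10, 0)

if __name__ == "__main__":
    unittest.main()
```

Each test is the software equivalent of “is the fuel pump sending fuel to the carbs?” — a question small enough that the answer is unambiguous.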

My gut feeling is that there’s a pattern involved in debugging both hardware and software problems, and that the pattern is similar to other patterns we use in hacking and design. I believe “my car won’t start” and “my LED won’t light” can be solved using similar methodologies to generate domain-specific questions. Each domain requires its own set of knowledge and specialized tools, but I think the process for fixing each problem is similar.

So how do we teach debugging? I think we start by teaching the good ol’ scientific method, translated to computational space:

  1. Have you solved this problem before? If so, repeat your previous investigation to see if it’s happened again.
  2. Define questions that you can answer with simple tests.
  3. Perform tests and collect data.
  4. Analyze results.
  5. Repeat 1–4 until the problem is solved.
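The steps above can be sketched as a loop in code. This is just an illustration in Python — `known_fixes`, the test questions, and the `debug` function are invented placeholders, not a real tool:

```python
def debug(problem, known_fixes, tests):
    """Sketch of the five-step loop above.

    known_fixes: problems we've solved before, mapped to their fix (step 1).
    tests: small yes/no questions, each mapped to a callable that
           performs the test and returns True if that part checks out
           (steps 2 and 3).
    """
    # Step 1: have we solved this problem before?
    if problem in known_fixes:
        return known_fixes[problem]
    # Steps 2-4: perform each simple test and analyze the result.
    for question, run_test in tests.items():
        if not run_test():  # this sub-question failed: the fault lies here
            return "investigate: " + question
    # Step 5 (repeat) would mean breaking the problem into smaller questions.
    return "all tests passed; break the problem down further"

# Example: two of the "little questions" from the car that won't start.
tests = {
    "is the battery fully charged?": lambda: True,
    "are the spark plugs firing?": lambda: False,
}
print(debug("car won't start", {}, tests))
# prints: investigate: are the spark plugs firing?
```

The loop itself is trivial; the hard part, as the rest of this post argues, is having the domain knowledge to write the entries in `tests`.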

Ok, granted, that seems pretty obvious, but it took us humans a really long time to sort that out and write it down. I suspect most of us couldn’t recite the scientific method off the top of our heads, especially if we didn’t study science in high school or college.

Using these steps also requires a certain amount of domain-specific knowledge and the ability to choose and use the appropriate tools. If I don’t know ground from positive, or that current can only flow through an LED in one direction, I’m not going to get very far figuring out why my LED won’t turn on, because I’ll never be able to ask the correct questions.

In theory, what we’re teaching when we teach computing is the necessary domain-specific knowledge and tools, so the student should be able to debug simple problems in their own work. What I’m suggesting is that we also hammer on the above methodology any time a student shows up saying, “my LED won’t turn on”. Make the student break the problem down and generate the questions, then show the student how to answer those questions if they’re unable to do so on their own. If the student can’t ask the right questions, then there’s a bigger problem: the student doesn’t have the domain-specific knowledge to turn the LED on in the first place. If the student can ask, “is there power to the LED?”, then you can show them how to use a multimeter to answer that question. But if they’re still unclear on polarity, then they’ve got a bigger problem.

I think we can teach debugging, even though we weren’t taught debugging when we were learning. Not only can we, I think we do students a disservice if we don’t teach debugging as a core competency.

Footnotes: “Kit-building isn’t learning” and “It’s not my fault that Windows sucks”.

There are at least two exceptions to the above.

1) Building a kit isn’t the same as studying and learning a subject. Knowing only how to solder and read English, one can put together kits ranging from a simple Adafruit Arduino kit to a complex synthesizer from PAiA. But unless the builder does research into why the kit is put together the way it is, they haven’t learned much, if anything, about what they’ve built. As a result, if there is a problem — whether in the instructions, the components, or the assembly — they’re going to find it very difficult to ask the right questions that will let them debug it. (I’ve assembled kits from both outfits, and they both provide excellent kit-building instructions.)

2) Windows sucks. Ok, all computers suck, but Microsoft has spent decades innovating new and wonderful ways for Windows to suck, so I’m picking on them. There comes a point in debugging a hardware or software problem where the problem is out of operational scope. Maybe “my LED won’t turn on” because there’s a subtle bug in the USB driver between the IDE and the breadboard. The student will probably never have the domain-specific knowledge needed to start asking questions about the USB driver, and odds are the instructor won’t either. Sadly, even the person who developed the driver or tool or OS often can’t formulate the right questions; this is why the final step in many troubleshooting guides is “reinstall the drivers and then the operating system”.

Tags: debugging, pedagogy


  1. There are a number of methods that can be taught; debugging is not really an art, it’s more of a craft, really. You become skilled at it, but you do so by following various relatively mechanical techniques.

    As an example, the first step in debugging is to *isolate the problem*. The way you typically isolate a problem is via a form of binary search. You can do this in various ways: breakpoints, probes (even printf), etc. This is common across almost ALL domains involving complicated systems, not just software.
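    That binary-search step might be sketched like this (Python; `versions` and `has_bug` are hypothetical stand-ins for a real change history and a real test, in the spirit of `git bisect`):

```python
def find_first_bad(versions, has_bug):
    """Binary-search a history for the first version where a bug appears.

    Assumes versions before the bug are all good and all versions from
    the bug onward are bad, like `git bisect`. has_bug is the probe you
    run on each version (a breakpoint check, a printf, a test script).
    """
    lo, hi = 0, len(versions) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if has_bug(versions[mid]):
            hi = mid       # bug already present: look earlier
        else:
            lo = mid + 1   # still good: bug appeared later
    return versions[lo]

# Toy example: the bug was introduced in version 6.
print(find_first_bad(list(range(10)), lambda v: v >= 6))
# prints: 6
```

    Ten versions take four probes instead of ten; the same halving logic applies whether the "versions" are commits, circuit stages, or sections of fuel line.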

    A common next step is simply to *reduce the problem* into a reproducible and hopefully simpler case. Again true across many domains.

    You really nail the actual meat of the method with points 3 and 4 in your list above, though. The only valid way to really debug is to collect metrics, analyze them, modify the code, and check results. It’s kind of a shorthand for the scientific method. At this point professionals also build new test cases, based on the metrics collected, that check this condition so that it can be caught by regression testing in the future.

    Too often you see some wanker try to stare down a debugging problem as if he can force the bug away by sheer will of thought. It’s a lot easier to behave like a scientist rather than a magician :) This is actually a mark of inexperience and usually plagues mid-level engineers for some reason. I certainly was guilty of it. It’s like you have a little experience under your belt, so you feel like you can figure out potentially complicated issues by working through them in your head. Maybe you can, but it is a lot easier combined with the above techniques.

    Comment by Howard Berkey — 2009/08/08 @ 21:39

  2. Debugging is actually two related tasks: finding the bug and fixing the bug. Fixing the bug is usually regarded as the easier of the two; if one has the knowledge to create a system with a bug, one probably knows how to remove it, or at least replace it with something that works better.

    Finding the bug is another issue, and it does require knowledge of the design and underlying technologies. But the process can be taught. “Divide and conquer” works very well, and pretty much describes the method used by everyone except the truly gifted or the divinely lucky. And even the truly gifted usually use that method. They only make it look like they got it right on the first try because their experience and knowledge let them come up with a mental list of potential causes and the relative probabilities of each, then they test the most likely one first. Ask an experienced mechanic why your car won’t start. In a heartbeat he’ll think of a dozen possible reasons, then ask you when was the last time you put gas in the tank. (It’s people like this that encourage Howard’s wankers to try to stare down a problem. They see the battle-scarred veterans do it, but they don’t realize that each scar represents the hard-won experience that makes it possible. We all have to pay our dues one way or another.)

    One extension of debugging I wish more people understood is Root Cause Analysis. Lots of people, due to laziness, schedule pressure, or lack of knowledge, will fix a symptom and never fix the underlying problem. The unofficial mantra of Root Cause Analysis is “ask ‘why?’ five times.” Each answer leads to the next “why?”, and may even lead to causes (and suggest fixes) outside the system in question. And by answering each “why?” you learn a lot more about the system.

    Comment by Hinermad — 2009/08/09 @ 09:46

  3. I think another thing to throw in along with “isolate the problem” is “is there a problem at all?” Many so-called problems are just misunderstandings of what the proper behavior is supposed to be.

    Somewhat related is problem definition. “It doesn’t work” isn’t sufficient, of course. Iteratively defining The Problem in deeper and deeper detail tends to reveal the underlying “why?” This tends to involve some research and self-education – a step in the process where some get paralyzed. This was the hang-up VCR owners had years ago: programming a VCR was A Problem To Solve, and it required learning something new. That is where people gave up, stopped, and decided that anyone who could program a VCR was a genius.

    The effort to learn new things is too much of a bother for some. The rest of us love the idea.

    Comment by Jeffrey — 2009/08/09 @ 10:51

  4. I loved the analogy to diagnosing a car. I still vividly remember the day I was trying to fix my car and couldn’t figure out what was wrong. My dad, who happened to be an electronics engineer, came along and started breaking down the problem of “my car won’t start” into appropriate subcategories, in this case air, fuel, and fire. There was no obstruction of the air intake, and the valve timing was fine. Pulling a spark plug, I cranked the engine and observed a spark. Finally he pulled the hose off the fuel pump output and we saw that there was no fuel pressure when the engine cranked over. As this was one of the last times I saw my father, I still remember this incident every time I feel stumped trying to debug something.
    My father taught me how to think things out by his example. The science is in knowing what experiments to try; the art is in knowing how best to do them; but the real key is knowing how to think. Sadly, that usually gets lost in the maze of facts and figures. What’s really needed is a teacher, a guru, who has realized the difference between cause and effect. This can be anybody anywhere: a parent, a teacher, a co-worker, or even a timely stranger. We just have to be willing to get past our fear of ignorance and earnestly seek the answers we need. Once you reach that point, the guru will appear, though you might not recognize them as such. One of the greatest things about on-line forums is the way total strangers can share problem-solving strategies and fill in missing knowledge about various subsystems.
    Does anyone remember back in the ’80s when there was a huge amount of hype about expert systems? People hoped then that expert-level skills could be boiled down into a set of rules which a program could simply run through in order to solve a problem, but none of those programs knew how to ask a new question.
    Yes, debugging can be taught, but it is empirical knowledge gained through hands-on experience, not simply a memorized procedure. You have to learn how to ask the appropriate questions in order to get the appropriate answers, and that is not a simple thing.

    Comment by Mark — 2009/08/09 @ 18:06

  5. while i of course agree that kit building is not the same as studying from a book, it does provide a structured way of seeing -patterns- of electronics, e.g. “there always seems to be a diode between a 7805 and DC jack” or “LEDs have a 1K series resistor”. much like following a recipe does not mean you’re a cook, but eventually many basics sink in.
    the real learning comes when the user modifies the kit!

    Comment by limor — 2009/08/09 @ 21:03

  6. Limor, your instructions are a rare exception in the kit-building world. You not only explain what to do, but why to do it, so kudos to you for your efforts.

    Comment by jet — 2009/08/10 @ 00:06

  7. Great “tutorial” on how to solve bugs. The key point, in my opinion, is to follow a structured approach. Creation is usually messy, but the approach to debugging must be structured.
    After several frustrating attempts when I was a kid, I stopped assembling those projects from the electronics magazines. Later I understood that if you don’t know what is “happening”, you shouldn’t build it.
    Like you, I learned a lot from mechanics. After graduation I worked for a lift (elevator) company; we designed the controllers and worked mostly with induction motor drives and hydraulic lifts. Now I solve problems in my bike :-) When you are faced with problems in complex systems requiring the interaction of many parts across different domains (mechanics, lubrication, hardware, and software), you need to approach the problem analytically: isolate a section; check if it works as it should; if so, check the next section; if not, break this section down again. If you don’t proceed methodically you’ll be lost in the jungle and you’ll never leave.

    Comment by JCS — 2009/08/13 @ 02:29
