 

Debugging is a bad smell - how to persuade them?

Tags:

tdd

debugging

I've been working on a project that can't be described as 'small' anymore (40+ months), with a team that can't be defined as 'small' anymore (~30 people). We've been using Agile/Scrum (1) practices all along, and a healthy dose of TDD.

I'm not sure if I picked this up from Agile or TDD, more likely a combination of the two, but I'm now clearly in the camp of people who look at debugging as a bad smell. By 'debugging' I'm not referring to the more abstract concept of figuring out what might be wrong with the system, but to the specific activity of running the system in Debug mode and stepping through the code to figure out details that are otherwise inscrutable.

Since I'm fairly convinced, this question is not about whether debugging is a bad smell or not. Rather, I'd like to know how I can persuade my team-mates about this.

People who believe debug mode is the 'standard' mode tend to write code that can only be understood by debugging through it, which wastes a lot of time: every time you work on an item on top of code developed by someone else, you first get to spend a considerable amount of time debugging it (and, since there's no bug involved, the term is becoming increasingly ridiculous) - and then silos happen.

So I'd love to convince a few of my team-mates that avoiding debug mode is a Good Thing (2). Since they are used to living in Debug mode, however, they don't seem to see the problem; to them, spending hours debugging someone else's code before they even start doing anything related to their new item is the norm, and they don't see anything wrong with it. Plus, as they spend time 'figuring it out', they know the developer who worked on that area will eventually become available and the item will be passed on to them (leading to yet another silo).

Help me come up with a plan to turn them from the Dark Side!

Thanks in advance.

(1) Also referred to as SCRUM (all caps). Capitalization arguments aside, I think an asterisk after the term must be used since - unsurprisingly - our organization 'tweaked' the Agile and Scrum process to fit the perceived needs of all stakeholders involved. So, in all honesty, I won't pretend this has been 100% according to theory, but that's beside the point of my question.

(2) Yes, there will always be times when we'll have to go into debug mode. I'm not trying to avoid it absolutely, just trying to minimize the number of times we have to dive into it.

asked Nov 01 '08 by FOR

2 Answers

If you want to persuade your coworkers that your programming practices are better, first demonstrate by your productiveness that you are more effective than they are, at least for some tasks. Then they'll believe you when you explain how you get so much done.

It's also sometimes easier to focus on something concrete. Do your coworkers even talk in terms of "code smell"? Perhaps you could focus on specifics like "When the ABC module fails, it takes forever to debug it; it's much faster to use technique XYZ. Here, let me demonstrate." Then afterwards you can mention your underlying principle: yes, the debugger is a useful tool, but there are usually other, more useful ones.

answered Sep 23 '22 by lacker

This is a cross-post, because the first time around it was more of an aside on someone else's answer to a different question. To this question it's a direct answer.

Debugging degrades the quality of the code we produce because it allows us to get away with a lower level of preparation and less mental discipline. I learnt this from an accidental controlled experiment in early 2000, which I now relate:

I took on a contract as a Delphi coder, and the first task assigned was to write a template engine conceptually similar to a reporting engine - using Java, a language with which I was unfamiliar.

Bizarrely, the employer was quite happy to pay me contract rates to spend months becoming proficient with a new language, but wouldn't pay for books or debuggers. I was told to download the compiler and learn using online resources (Java Trails were pretty good).

The golden rule of arts and sciences is that whoever has the gold makes the rules, so I proceeded as instructed. I got my editor macros rigged up so I could launch the Java compiler on the current edit buffer with a single keystroke, I found syntax-colouring definitions for my editor and I used regexes to parse the compiler output and put my cursor on the reported location of compile errors. When the dust settled, I had a little IDE with everything but a debugger.

To trace my code I used the good old fashioned technique of inserting writes to the console that logged position in the code and the state of any variables I cared to inspect. It was crude, it was time-consuming, it had to be pulled out once the code worked and it sometimes had confusing side-effects (eg forcing initialisation earlier than it might otherwise have occurred resulting in code that only works while the trace is present).
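By way of a sketch (not the actual code from that contract - the class, method and placeholder syntax here are made up for illustration), this is the shape those trace writes took:

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch of the console-trace technique described above:
    // plain writes that record where execution is and what the interesting
    // variables hold, pulled out again once the code works.
    public class TemplateEngine {

        public String render(String template, Map<String, String> ctx) {
            System.err.println("render: enter, template=\"" + template + "\", ctx=" + ctx); // trace
            StringBuilder out = new StringBuilder();
            int i = 0;
            while (i < template.length()) {
                int open = template.indexOf("${", i);
                if (open < 0) {
                    out.append(template.substring(i));
                    break;
                }
                int close = template.indexOf('}', open); // no handling of unterminated placeholders; it's a sketch
                System.err.println("render: placeholder found at " + open + ".." + close); // trace
                out.append(template, i, open);
                out.append(ctx.getOrDefault(template.substring(open + 2, close), ""));
                i = close + 1;
            }
            System.err.println("render: exit, result=\"" + out + "\""); // trace
            return out.toString();
        }

        public static void main(String[] args) {
            Map<String, String> ctx = new HashMap<>();
            ctx.put("name", "world");
            System.out.println(new TemplateEngine().render("hello ${name}", ctx)); // prints "hello world"
        }
    }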

Under these conditions my class methods got shorter and more and more sharply defined, until typically they did exactly one very well defined operation. They also tended to be specifically designed for easy testing, with simple and completely deterministic output so I could test them independently.
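For instance, a made-up example of the kind of method that style pushes you towards (again illustrative, not code from the project) might look like this:

    // One operation, completely deterministic output, checkable on its own
    // without launching (or debugging) the whole application.
    public final class PlaceholderParser {

        /** Extracts the key from a ${...} placeholder, e.g. "${name}" -> "name". */
        public static String keyOf(String placeholder) {
            if (!placeholder.startsWith("${") || !placeholder.endsWith("}")) {
                throw new IllegalArgumentException("not a placeholder: " + placeholder);
            }
            return placeholder.substring(2, placeholder.length() - 1);
        }

        // Same input, same answer, every time - trivial to test independently.
        public static void main(String[] args) {
            System.out.println("keyOf(\"${name}\") = " + keyOf("${name}"));
        }
    }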

The long and the short of it is that when debugging is more painful than designing, the path of least resistance is better design.

What turned this from an observation to a certainty was the success of the project. Suddenly there was budget and I had a "proper" IDE with an integrated debugger. Over the course of the next two weeks I noticed a reversion to prior habits, with "sketch" code made to work by iterative refinement in the debugger.

Having noticed this I recreated some earlier work using a debugger in place of thoughtful design. Interestingly, taking away the debugger slowed development only slightly, and the finished code was vastly better quality particularly from a maintenance perspective.

Don't get me wrong: there is a place for debuggers. Personally, I think that place is in the hands of the team leader, to be brought out in times of dire need to figure out a mystery, and then taken away again before people lose their discipline.

People won't want to ask for it because that would be an admission of weakness in front of their peers, and the act of explaining the need and the surrounding context may well induce peer insights that solve the problem - or even better designs free from the problem.

So, FOR, I not only agree with your position, I have real data from a controlled experiment to support it. It is, however, a rather small sample. More elaborate tests are required before my conclusions are supportable.

Why don't you take what I've said to your team and suggest trials? You have more data than they do (I just gave it to you), and in order to have a credible basis for disagreeing with you they basically have to test the idea, and the only way to do that is to give your idea a go.

You should be ready for it to all fall apart, though, because the whole thing is predicated on the assumption that the developers have the talent and experience to rise to the challenge of stronger design in the absence of step-through debugging.

Step-through debugging was created to make debugging easier. The direct effect of lowering the bar is that people with less talent can participate - if you build a tool that even jackasses can use, you will get jackasses using it -- a lot of them, if the newly accessible activity is well-remunerated.

This causes an exodus of people with talent because they generally use that talent to do rare and precious things in order to be well paid without working too hard, and the market doesn't want to pay for excellence because it cannot distinguish talent well enough to know when paying for it is justified.


Another thought: more recent work with problems on production servers, where it was impossible to install a debugger, has shown the importance of having a codebase for which maintenance doesn't depend on the availability of a debugger. Code that's grown in the absence of debuggers is much less hassle. Choose not to use them when you can change your mind, and then when you can't change your mind it won't be so awful.

answered Sep 21 '22 by Peter Wone