 

Why aren't students taught to use a debugger? [closed]

Tags:

debugging

While being able to quickly find errors in code without a debugger is often a good skill to have, it seems less productive to spend a lot of time looking for issues when a debugger would easily find little mistakes like typos. Is it possible to manage a complex project without a debugger?


I don't think the problem is teaching. Using a modern graphical debugger is not rocket science (at least not for most user-mode programs running on a single computer). The problem is with the attitudes of some people. In order to use a debugger effectively, you should:

  • Admit it's your fault and select isn't broken.
  • Have the perseverance to spend a couple of nights debugging, without forgetting the previous point.
  • Accept that there is no specific algorithm to follow: you have to make educated guesses and reason carefully from what you see.

Not many non-programmers have these attitudes. At college I have seen many friends give up after a relatively short time, bring me some code, and tell me the computer is doing something wrong. I usually tell them I trust their computer more than I trust them (and that hurts some feelings, but that's the way it is).
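
To make the "not rocket science" point concrete, here is a minimal sketch (the program, file name, and bug are invented for illustration, not taken from any particular course): a tiny C program with an off-by-one error, and the handful of gdb commands that expose it.

    /* sum.c -- adds up an array, but the loop runs one element too far */
    #include <stdio.h>

    int main(void) {
        int values[4] = {1, 2, 3, 4};
        int total = 0;
        for (int i = 0; i <= 4; i++)   /* bug: should be i < 4 */
            total += values[i];
        printf("total = %d\n", total);
        return 0;
    }

    $ gcc -g -o sum sum.c            # -g keeps the debug symbols gdb needs
    $ gdb ./sum
    (gdb) break 8 if i == 4          # stop only on the out-of-bounds iteration
    (gdb) run
    (gdb) print i                    # i is 4: one past the end of values
    (gdb) print values[i]            # whatever garbage sits just past the array

None of this is hard to learn; the hard part is the attitude above -- accepting that the garbage value is your own doing.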


In my high school and university, most of the people in the classes didn't really care about programming at all.


If by students you mean Computer Science students, I think the answer is fairly obvious. The subject matter of the courses is generally theory, with the programming language / framework / library there as an aid. The professor can't go very far in depth on a particular tool, since that would take time away from teaching networking or systems or whatever. Maybe if there were a course called "Real World Programming" or something like that, they'd cover debuggers, but in general I don't see too much wrong with expecting students to read the language / tool documentation in order to accomplish the coursework.


Debuggers were introduced in my second-year Intro to C course, if I recall correctly. Of course, the problem most students were struggling with at that point was getting their work to compile, which a debugger will not help with. And once their ten-line command-line program compiles and then crashes, well, they already have some printfs right there; fighting to master GDB is overkill.
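
For scale, this is roughly the kind of program in question, with the sort of throwaway printf a student drops in (the program is made up for illustration): the crash is an integer division by zero, and one extra printf pins it down faster than learning GDB would.

    /* avg.c -- averages the numbers on stdin; crashes when there are none */
    #include <stdio.h>

    int main(void) {
        int n, sum = 0, count = 0;
        while (scanf("%d", &n) == 1) {
            sum += n;
            count++;
        }
        printf("count=%d sum=%d\n", count, sum);  /* the throwaway printf */
        printf("average = %d\n", sum / count);    /* divides by zero when count == 0 */
        return 0;
    }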

In my experience, the code bases you deal with in most Comp. Sci. curricula are rarely large enough to make more than a cursory familiarity with a debugger worth the time investment. The programs are small, and the problems you face are more along the lines of figuring out the time-space complexity of your algorithm.

Debuggers become much more valuable on real world projects, where you have a lot of code written by different people at different times to trace through to figure out what keeps frotzing foo before the call to bar().
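
On a code base like that, the payoff usually comes from a watchpoint rather than from stepping line by line. A rough sketch, reusing the placeholder names foo and bar from above and assuming a binary built with -g (the binary name is also a placeholder):

    $ gdb ./bigapp
    (gdb) watch foo                 # stop every time the global foo is written
    (gdb) run
    (gdb) backtrace                 # whose code wrote it, and how execution got there
    (gdb) continue                  # repeat until the write that frotzes foo shows up

Breaking on bar() and printing foo there is the cruder version of the same idea, and if foo isn't a global, the same watch command works once execution has entered the scope where foo lives.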


This is a good question to ask the faculty at your school.

At my university, they gave a very brief example of debugging, then pointed us to the "help" files and the books.

Perhaps they don't teach it because there is sooo much stuff to cover and so little time for the lecturers. The professors aren't going to hold everybody's hand.


Not entirely related, but people need to use debuggers not just for debugging but to understand working code.
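
As a sketch of that second use (the program name is a placeholder; any binary built with -g will do), gdb lets you walk working code as it actually executes:

    $ gdb ./someprog
    (gdb) start                     # temporary breakpoint at main, then run to it
    (gdb) next                      # execute one line, stepping over function calls
    (gdb) step                      # step into the next call to see how it works
    (gdb) info locals               # every local variable in the current frame
    (gdb) backtrace                 # the call path that led execution here
    (gdb) finish                    # run to the end of this function and show its return value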


I'll put in a cautionary note on the other side. I learned to program with Visual Basic and Visual C (mid 80s), and the debuggers were built in and easy to use. Too easy, in fact... I generally didn't think about how to solve a problem; I just ran it in the debugger and adjusted the behavior. Oh, that variable is one too high... I must have to subtract one here!

It wasn't until I switched to Linux, with the not-quite-as-easy gcc/gdb combo, that I began to appreciate designing and thinking about the code first.

I'll admit, I probably go too far the other way now: I use a debugger to analyze stack traces and that's about it. There should be a middle ground between analyzing the problem and stepping through it in a debugger. Certainly, people should be shown all the tools available to them.
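
For what it's worth, that stack-trace workflow amounts to roughly the following on Linux (program name and paths are placeholders, and how core dumps get collected varies by distribution):

    $ ulimit -c unlimited           # allow core dumps in this shell
    $ ./myprog                      # crashes and, with luck, leaves a core file
    $ gdb ./myprog core             # load the binary together with the dump
    (gdb) backtrace full            # full trace, with the locals of every frame
    (gdb) frame 3                   # jump to whichever frame looks suspicious
    (gdb) info locals               # its variables at the moment of the crash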