This is the first post in a series about using NDepend for code analysis. Below are links to all posts in the series (the list will be updated as more posts are published):
- It NDepends! Part 1: Motivation and overview
- It NDepends! Part 2: Metrics and rules
Let me start with a fictional dialog between a software engineer and a product/project manager:
SE: Our code base is getting too complicated and needs refactoring. The technical debt is really big, but we never get enough time in our sprints to fix it.
PM: Technical what?
SE: Technical debt. You know, too many classes inheriting from each other, too complex conditional logic in methods, lots of…
PM: *sighs* Alright, alright… So how big is this debt exactly?
SE: Uhm… Well, I don’t know exactly, but it’s big. It’s huge.
PM: How much time roughly do you need?
SE: No idea. But, as I said, it’s huuuuge…
If you have worked in any kind of software development company, you have probably heard a conversation like this, or even taken part in one. If you are a lucky software engineer and your company trusts you, maybe your PM will just say “go for it”. But even so, your PM will need to explain the situation to the business owners, who will inevitably frown upon the idea of spending an unknown amount of time paying off some strange, unmeasured “technical debt”.
Wouldn’t it be nice if you, as a software engineer, could present a detailed breakdown of the technical debt and give a realistic estimate for eliminating it? And what if you, as a PM, could tell the stakeholders exactly how much time the developers are asking for?
Technical debt and code analysis tools
According to Wikipedia, “Technical debt is a concept in software development that reflects the implied cost of additional rework caused by choosing an easy solution now instead of using a better approach that would take longer.” It’s that piece of code you copy-pasted to five other places, because “there was no time” to extract it into a base class. It’s that application with no IoC container set up, because you thought it was quicker to just `new` up all the types. It’s that thousand-lines-of-code-single-method console tool you wrote, because it felt easier that way. Hopefully, you know what I’m talking about and recognize the dangers of ignoring code design issues for too long.
Anyway, this post is not so much about what technical debt is, but rather about how to deal with it and related issues. There are plenty of tools for performing static code analysis in the .NET space, both free and commercial. A pretty popular free one is SonarQube, which I’m a bit familiar with and will sometimes refer to for comparison. However, I always like to have options, so I wanted to also check out one of the well-known commercial code analysis tools for .NET - NDepend.
What is NDepend?
At its heart, NDepend is a static code analysis application with a strong focus on visualizing code architecture and quality. It has tons of built-in analyzers, which allow you to generate a comprehensive technical debt overview in a matter of seconds. This is invaluable for Continuous Integration scenarios in a collaborative setup. In this area, NDepend’s closest alternative, in my opinion, is SonarQube, which, however, doesn’t provide such rich visualizations. Also, as the documentation states, the two analyzers complement each other rather than compete or duplicate:
Both NDepend and SonarQube are static analyzers that offer a rule-based system to detect problems in C# and VB.NET code. However the NDepend default Rules-Set has very few overlap with the SonarQube rules and also with the Roslyn analyzers.
These visualization features of NDepend are pretty unique, I think, and they allow you to explore the codebase in an intuitive and insightful way. On top of that, NDepend exposes its internal code querying mechanism, CQLinq, for further extension and customization of the default ruleset. Yes, that’s LINQ-to-code!
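To give a taste of CQLinq, here is a small query that flags overly long methods. It is only a sketch: `Application.Methods` and the `NbLinesOfCode` metric come from NDepend’s standard code model, and the threshold of 30 lines is an arbitrary choice for illustration.

```csharp
// CQLinq: find methods longer than 30 lines of code,
// longest first, so the worst offenders surface at the top
from m in Application.Methods
where m.NbLinesOfCode > 30
orderby m.NbLinesOfCode descending
select new { m, m.NbLinesOfCode }
```

Queries like this can be promoted to rules by prefixing them with `warnif count > 0`, which, as far as I can tell, is how the built-in ruleset is defined. More on that in the next post.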
NDepend comes in several flavours:
- Visual NDepend, a standalone UI application for performing code analysis
- Visual Studio extension for running the same analysis against a solution in VS
- NDepend.Console.exe, a console version of the code analysis runner for automation scenarios
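For the automation scenario, a CI build step could invoke the console runner with the path to the NDepend project file. This is only a sketch: the install location and the `.ndproj` file name below are made up for illustration.

```bat
REM Run the analysis for a given NDepend project file (hypothetical paths)
"C:\Tools\NDepend\NDepend.Console.exe" "C:\Projects\MySolution\MySolution.ndproj"
```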
To give it a quick test drive, I ran the analysis on one of the medium-sized solutions (29 projects) that I have been working on. It finished very quickly (no more than 30 seconds, for this codebase and on my machine) and produced a nice HTML report at the end:
The report shows some high-level metrics (like this overall B rating and magically calculated 91 days of technical debt), screenshots of several diagrams (including a dependency graph and this colorful and scary-looking “Treemap Metric View”), and also the detailed lists of violated code quality rules. The nice thing about this report is that it is self-contained and can be easily shared with anyone or published on an internal website.
But, of course, the real value is in analyzing this report interactively in Visual NDepend, since all these diagrams and metrics are actually query-based and can be inspected and modified on the fly. In the next post I will look deeper into the built-in code quality rules of NDepend and how you can define your own. Stay tuned!