A friend of mine brought up a common struggle for many software developers, particularly independent consultants:
How do you influence your clients to demand their apps have effective unit and integration test coverage? Quality is a tough sell because the customer doesn’t directly see it.
This struggle between quality and expediency is a common thread throughout any developer’s career. It’s a big reason why I was so excited to publish Architecting Applications for the Real World on Pluralsight. Once we learn best practices, it’s natural to want to apply them everywhere. Yet once pressure is applied, developers tend to revert to the mode they feel lets them move fastest. That commonly means temporarily ignoring clean coding practices and reducing or altogether eliminating automated testing. Yes, many feel TDD helps them move faster in the long run, yet a recent study by Microsoft found TDD added around 15-35% to the initial development timeline. Thus, like many architectural best practices, TDD is an investment up front for a potential payoff later, in both improved design and enhanced agility down the road.
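To make that up-front investment concrete, here’s a minimal test-first sketch using Node’s built-in test runner. The discount rule and the `calculateDiscount` function are hypothetical, invented purely for illustration; in TDD you’d write the failing test first, then implement just enough to make it pass.

```ts
// Hypothetical example: in a test-first flow, these tests exist (and fail)
// before calculateDiscount is implemented.
import { test } from "node:test";
import assert from "node:assert/strict";

// Assumed domain rule for illustration: orders of $100 or more get $10 off.
function calculateDiscount(orderTotal: number): number {
  return orderTotal >= 100 ? 10 : 0;
}

test("orders of $100 or more receive a $10 discount", () => {
  assert.equal(calculateDiscount(150), 10);
});

test("orders under $100 receive no discount", () => {
  assert.equal(calculateDiscount(40), 0);
});
```

The specific test runner doesn’t matter; the point is that each small behavior gets pinned down by a test before (or at least alongside) the code that implements it, which is where that 15-35% up-front cost comes from.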
So how do we ensure there’s adequate time to build sufficient quality into the application? I see two approaches to consider:
- Developer dictates quality. Code at a level of quality that makes you feel comfortable, fulfilled, and professional. Don’t even broach the subject with the client. There’s no requirement that we expose decisions on such technical details to the customer, right?
- Customer dictates quality. Engage the customer in conversations about the current constraints and their impact on code quality. Attempt to sell them on the need for these best practices, the impacts of technical debt, and the cost/benefit ratio. Be flexible and ultimately let the customer make the call on the level of quality they desire.
While neither approach is universally applicable, I tend to choose #1. Why? They’re paying us to be professionals. And as professionals, we should analyze the client’s situation and flex the quality of the implementation based on an assessment of their current and future needs. If I feel their timeline is unrealistic, I discuss the need to flex the feature set or the timeline. I only hack something in to hit the date when I believe that option is truly in the client’s best interest given their current situation.
Think about your job as a service layer. Yes, they’re all the rage these days because they provide a coarse-grained, friendly, and reusable API. As developers, we provide a human service layer to our clients. Yes, we’re aware of the fine-grained API and plethora of options we’re working with behind the scenes. But that doesn’t mean we should expose all these choices to the client. They’d likely be both overwhelmed and annoyed. They’re paying us for our judgment.
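If you want the metaphor in code, here’s a rough, hypothetical sketch (the `OrderService` and its rules are invented for illustration, not taken from a real project). The client-facing method is coarse-grained and intention-revealing; the fine-grained decisions stay private behind it.

```ts
// Hypothetical sketch of a coarse-grained service layer.
// Callers see one friendly operation; the fine-grained details
// (validation, pricing, persistence) stay behind it.
interface OrderRequest {
  customerId: string;
  items: { sku: string; quantity: number }[];
}

class OrderService {
  // The coarse-grained, client-facing API.
  placeOrder(request: OrderRequest): string {
    this.validate(request);
    const total = this.price(request);
    return this.save(request, total);
  }

  // Fine-grained choices the caller never needs to see.
  private validate(request: OrderRequest): void {
    if (request.items.length === 0) {
      throw new Error("An order must contain at least one item.");
    }
  }

  private price(request: OrderRequest): number {
    // Flat $10 per item, purely for illustration.
    return request.items.reduce((sum, item) => sum + item.quantity * 10, 0);
  }

  private save(request: OrderRequest, total: number): string {
    // Stand-in for persistence; returns a fake confirmation id.
    return `order-${request.customerId}-${total}`;
  }
}
```

Our relationship with clients works the same way: they call placeOrder, so to speak; they don’t want to review every private method.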
If you consider other fields, there’s indeed a precedent for hiding information from clients. An architect won’t mention all the potential materials that could be utilized for a structure. She’ll consider the situation and recommend a short list that makes the most sense in that context. A doctor won’t enumerate every potential drug or surgical option to a patient. He’ll instead recommend a specific course of action based on his expertise. We know this to be true because patients often seek second opinions when they don’t like what they’re hearing from the doctor. We assume the doctor is hiding some options he feels aren’t advisable.
We are software professionals, so we know that quality is not an all-or-nothing decision. We should consider the client’s needs and select an approach that balances quality and expedience. And yes, this means the answer likely won’t be 100% or 0% test coverage. But make your decision based on context, and if the timeline doesn’t afford you the time to deliver at the quality you believe the project merits, it’s not time to start silently reducing quality to hit the deadline. Instead, it’s time to talk about flexing features and deadlines. Reducing quality below your professional recommendation in order to hit a deadline is malpractice. An architect won’t risk public safety, the company’s future, or her reputation for the sake of a deadline. Neither should we.
How do you balance quality, cost, and timelines? Do you discuss options for quality such as automated testing with your clients? Chime in via the comments below or on Hacker News or Reddit.
I got two words.. IT DEPENDS…
If you are developing a solution for NASA or any medical system, you surely need close to 100% code coverage from your unit tests, because there are usually enough resources allocated for such projects.
BUT, if you are like most of us, where you design the database, code the backend, design and code the frontend, and execute the functional tests, all while trying to get the work out as fast as you can so you can start on the new work waiting for you, then you surely can understand that unit testing is a “myth”, and that there is life outside such things as “TDD”.
There is one thing that people like you always forget: IT consulting firms thrive on fixing things. Maintenance and software updates are where the big money comes from. You can even ask Microsoft or Oracle; it is all about not giving the client “everything” in the software, keeping some of the stuff back so that you can sell it to them as an upgrade.
“There is one thing that people like you always forget…”
Thanks for the comment Takaz. Can you clarify what you mean by “people like you…?”
I do agree that the level of code coverage that makes sense is dependent on context. And I have seen many projects where software quality fell short, thus necessitating weeks of bug fix work after launch. But the idea that some companies are deliberately sandbagging on quality to assure ongoing work is troubling indeed. There’s enough good honest work out there that I would hope such tactics are rare.
Then again, back to your question: in an ideal world, both programmer and client would work together to improve code quality. But the world is never ideal; we just have to make compromises, compromises that depend on various factors.
Totally agree. And on nearly all projects I’ve been involved in I’ve worked with the client to help determine the appropriate level of quality. But in this post I’m merely arguing that developers should ultimately have a baseline quality that they require based on context. Reducing quality to the point that public safety or your own reputation is at risk shouldn’t be on the table, regardless of the timeline or the size of the paycheck.
Authors judge quality, clients judge compliance with requirements. Easy.
Quality and reliability are things that clients will either have or find out later they don’t have (to paraphrase someone much smarter than I). It’s cheaper to pay for quality earlier in the process than later, but as other commenters have stated, it depends. Taking on technical debt is sometimes the right call to meet a tough deadline. But it should be measured and paid back as soon as possible. It’s hard to build equity when you’re slowly eroding the foundation of your house.
I did some support work for an established project not long ago. The original developer had left the company and there were additional features that needed to be added to the project. As I looked through the repo I immediately knew that I was looking at high quality code because it had conceptual clarity. Modules, variables, and functions were well-named and performed discrete roles. State was well represented, finite, and encapsulated to prevent unwanted side effects. Flow control made sense, and read like a series of well-articulated instructions.
Code is a symbolic representation of human thought processes within a context of specific rules (the language, platform, libraries, and problem domain). Conceptual clarity and mastery of those rules make for high quality code.
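To put a hypothetical face on that (the names here are invented, not from the project I mentioned), this is the flavor of code I mean: state is private and finite, names describe intent, and the public methods read like plain instructions.

```ts
// Hypothetical sketch of conceptual clarity: encapsulated state,
// descriptive names, and flow control that reads like instructions.
class ShoppingCart {
  private readonly quantitiesBySku = new Map<string, number>();

  addItem(sku: string, quantity: number): void {
    if (quantity <= 0) {
      throw new Error("Quantity must be a positive number.");
    }
    const currentQuantity = this.quantitiesBySku.get(sku) ?? 0;
    this.quantitiesBySku.set(sku, currentQuantity + quantity);
  }

  totalItemCount(): number {
    let total = 0;
    for (const quantity of this.quantitiesBySku.values()) {
      total += quantity;
    }
    return total;
  }
}
```

Nothing clever, which is exactly the point.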
Woah, THE Nicholas Could read my blog?! Thanks for the comment man. I couldn’t agree more with your assessment. I believe there’s no logical need to ignore clarity in order to meet a timeline. The harder questions revolve around what constitutes appropriate architecture and sufficient test coverage in a given context.
I don’t know who Nicholas *Could* is, but *I* like to keep up with the awesome things you have to say here. 🙂
I’ve only just found this post. It’s a hugely important topic. In my experience, all users are similar in most ways and highly similar to consumers in general.
Everybody wants high quality software at the best possible price and delivered as soon as possible. Trade-offs are an inconvenient truth that few people want to deal with.
Ideally an organization will have at least one product owner available to make the very tough choices involved with product delivery.
The importance of a high-quality product owner is enormously underrated. In Scrum, much more is written about the Scrum Master, but if the development team is strong, it can work well without much involvement from the Scrum Master.
The product owner, on the other hand, is faced with many tough decisions that could make or break the organization.
If the organization does not have a product owner available to make these decisions, then the best position for the development team to present to the client(s) is “We are flexible on absolutely everything to meet your needs, except for one thing. We won’t compromise on quality.”
Since everyone wants high quality, I have never seen a client argue against this.
Thanks for the comment Kevin! I’ve had to compromise on quality many times in my career. But the key to doing it well is to convey when you feel it makes sense and why. Sometimes I bring this up in conversation so that the business can choose whether to flex quality, but only when I feel I can still comfortably deliver something that won’t keep me up at night.