I 100% accept that what I am about to say in this post will find dissent in the programming community. That is a good thing as any difference of opinion opens dialog which leads to new information and ideas, and ultimately growth. I encourage people to disagree with me at NML, although as any of the developers will tell you, you need to be prepared to defend your position.
Here is the crux of it:
As a developer, if you do not have unit tests to validate your work, you have failed at your job, even if the resulting implementation works flawlessly.
My position on this is simple. If a developer tells me that they are done with their implementation, and the answer to "Do you have unit tests?" is "No", then that developer has failed to perform the function that we have hired him/her for.
Unit testing is not just a good idea when appropriate. Delivering a feature without unit testing is delivering less than half of the feature.
I continually have to drive this point home with our developers (and project managers) and, truth be told, I am not sure why. I find it obvious that if I am implementing a required behavior, knowing whether what I implemented works, and letting others independently confirm that it does, is well worth the effort.
A constant objection to unit testing is that it takes too much time and that there is too much pressure to get a feature out: writing good unit tests can take as much time as writing the feature itself, if not more, so when the pressure is on it should be acceptable to skip them.
It is a nonsense objection because it entirely ignores the development life cycle of the feature that follows. There are two reasons why.
My own experience on this is that if I spend the time implementing tests that prove the expected behavior of my code, I rarely have to go back to it. When I do not spend time writing unit tests, I invariably end up having to return to the implementation over several cycles.
Considering that any returned work breaks the flow of whatever you are busy with, and takes away time from other people on the team, like testers, skipping unit tests arguably slows down development of any feature.
It can only be true that writing unit tests takes too much time if you view them as a "nice to have", or something that is a good idea, but not critical to the feature. If that is the developers' mindset, they will incorrectly assess the scope and complexity of the work required, and end up under-estimating the time it will take to implement the feature.
On the other hand, if, as a company and development team, you accept that unit tests are part of the feature, absolutely integral to delivery, then you must include them in your assessment of the scope of work.
I outright forbid development teams at NML from adding a unit test task to their boards. Putting a unit test task on a sprint board tells me that as a developer you do not see unit testing as part of your job, but that you will do it if you have time, since it is "a good idea". Absolutely not acceptable. It is in every way you can imagine part of your job, and you should never have to put a placeholder somewhere to remember to do it.
Another objection often cited is that clients do not care about unit tests and should not have to pay extra for them.
Again, complete nonsense. Clients do not care about unit tests for their software in the same way that we do not care about cows for our ice cream. We expect the ice cream to be good and tasty, and when it is not we don't blame the cows that produced the milk; we just have a bad experience and never buy from that vendor again. Clients want quality software, and when they do not get what they expect, they do not blame the lack of unit tests, they blame the vendor.
If we are honest about what it really takes to implement quality software, of which unit tests form but one important aspect, then we can go a long way towards eliminating bad experiences.
As a software vendor, unit testing should not be an item in a catalog that the client can choose to have or not. The cost to both you as the vendor and to the client is too high to even consider that. It is certainly higher than it would be if unit testing were simply par for the course of development, and considered upfront.
As a vendor, if you have a client that willingly chooses fragile, unreliable software because it costs less up front and will be delivered faster, then you absolutely need to refer that client to another vendor. What they're really saying is that they want to pay less upfront, and will then, later on, insist that you did not deliver what was agreed on, as they do not have properly working software.
As a client, if you have a vendor giving you the option to skip unit tests, go somewhere else for your software. What they are really saying is that you might get your software fast and cheap, but then after you are committed, you will have to spend at least three times more than you budgeted for and wait twice as long to get the software you want.
Writing code is not an easy endeavor and developers, even the smartest ones, will make mistakes while they develop. The best way to reliably minimize mistakes is to write simple concise code that verifies the intended behavior repeatably, aka unit tests. Unit tests are not additional work or additional cost, they're integral to the development effort. Unit tests should be invisible as an item of work since they are part of the work.
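To make "simple concise code that verifies the intended behavior repeatably" concrete, here is a minimal sketch using Python's standard `unittest` module. The function and its name are purely hypothetical, invented for illustration; the point is that each test pins down one piece of intended behavior, so a failure points directly at the expectation that was broken.

```python
import unittest

# Hypothetical feature code: a small function whose intended
# behavior we want verified before calling the work "done".
def apply_discount(price, percent):
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTests(unittest.TestCase):
    # One assertion of intended behavior per test, so the suite
    # doubles as an executable description of the feature.
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(49.99, 0), 49.99)

    def test_full_discount_is_free(self):
        self.assertEqual(apply_discount(10.0, 100), 0.0)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(10.0, 150)

if __name__ == "__main__":
    unittest.main()
```

Tests like these run in milliseconds on every change, which is exactly what makes the verification repeatable rather than a one-off manual check.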
That means that writing unit tests is your job in every way that writing code for a feature is.