Category Archives: .Net

Failing the Pipeline When NuGet Packages are Out of Date

Letting your dependencies get out of date is an all too common problem. On the face of it, you might think it’s not such a big issue – “our code works, why do we need to worry?”.

However, keeping on top of your updates on a regular basis has several advantages:

  • It is one of the easiest ways to minimise the risk of security vulnerabilities. Vulnerabilities are regularly discovered in popular (and not so popular) packages, and new releases are published to fix them.
  • You reap the benefits of any bug fixes or optimisations that have been introduced.
  • Frequent, small changes are easier to manage and carry less risk than allowing changes to back up and become big changes.

With a little preparation, keeping on top of this is pretty easy. The dotnet command has inbuilt support for checking for outdated packages. Simply execute the following while in your solution or project directory:

dotnet list package --outdated

This will check the package versions requested in your project(s) and output a list of any package for which a later version is available.
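When outdated packages are found, the output looks something like this (the project, package and version details here are invented purely for illustration):

Project `MyApp` has the following updates to its packages
   [net6.0]:
   Top-level Package      Requested   Resolved   Latest
   > Newtonsoft.Json      12.0.3      12.0.3     13.0.3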

One thing worth noting is that you need to have run a restore on the project(s) before executing this check, because it requires the project.assets.json file to have been generated and to be up to date.

There are a few ways that the behaviour of this check can be modified – for example, restricting the check to minor versions. You can find full details in the Microsoft dotnet list package docs.
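One of the options covered there, for example, limits the check to the latest minor versions available within the major version you already reference:

dotnet list package --outdated --highest-minor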

So far so good, but what if we want to enforce updates in the CI/CD pipelines? We can achieve this by wrapping the command in a Bash script that will return a non-zero exit code if outdated dependencies are detected. I’ve recently used this in Azure DevOps pipelines, but the same technique will apply to other pipeline tools.

The script I used is here:

#!/bin/bash

# Capture any lines indicating available updates. The output of
# 'dotnet list package --outdated' includes the phrase
# 'has the following updates' for each project with outdated packages.
updates=$(dotnet list package --outdated | grep -e 'has the following updates')

if [[ -n "$updates" ]]; then
    # Re-run the command so that the full report appears in the pipeline log.
    dotnet list package --outdated
    echo "##[error]Outdated NuGet packages were detected"
    exit 1
else
    echo "No outdated NuGet packages were detected"
    exit 0
fi

It simply looks for the text “has the following updates” in the output of dotnet list package --outdated and, if that phrase is present, exits with a non-zero return code. The ##[error] tag is an Azure DevOps Pipelines feature that causes the echoed message to be formatted as an error (i.e. appear in red) in the pipeline log details.
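In an Azure DevOps pipeline, the script can then be run as its own step. A minimal sketch – the script path and display name are placeholders for wherever you keep the script in your repository:

steps:
  # Run this step after the restore/build steps so that project.assets.json exists.
  - task: Bash@3
    displayName: Check for outdated NuGet packages
    inputs:
      targetType: filePath
      filePath: build/check-outdated-packages.sh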

Validating Enums in .Net WebAPI

Recently, I needed to implement some validation rules on a .Net WebAPI. The API in question accepted a payload that, amongst other things, included a currency code (e.g. “GBP”, “EUR”, “USD”). On the face of it, this sounds pretty simple but there are a few things to watch out for in order to do this well.

We might start with something like the following code:

using System.Text.Json;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[ApiController]
[Route("[controller]")]
public class ExampleController : ControllerBase
{
    private readonly ILogger<ExampleController> _logger;

    public ExampleController(ILogger<ExampleController> logger)
    {
        _logger = logger;
    }

    [HttpPost("payments")]
    public IActionResult Payments(Payment payment)
    {
        _logger.LogInformation(JsonSerializer.Serialize(payment));
        return this.Ok();
    }
}

/// <summary>
/// This is the type we accept as the payload into our API.
/// </summary>
public class Payment
{
    public Currency Currency { get; set; }

    public decimal Amount { get; set; }
}

/// <summary>
/// This is an example enumeration.
/// We want our API consumers to use currencies as strings.
/// </summary>
public enum Currency
{
    GBP,
    EUR,
    USD
}

The first problem we have here is that the Currency value in our JSON payload has to be a number. Out of the box, the API won’t accept a string such as “USD” – we’ll just get a standard “400 Bad Request” validation failure response.
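To make this concrete, with the code as it stands this payload is accepted (2 maps to USD):

{ "currency": 2, "amount": 100.00 }

...while this payload is rejected with a 400 Bad Request:

{ "currency": "USD", "amount": 100.00 }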

Accepting Enum String Values

We can easily fix this issue by using the JsonStringEnumConverter class that is part of System.Text.Json (it lives in the System.Text.Json.Serialization namespace). To do this, we decorate the enum declaration with the JsonConverter attribute. Optionally, we could have added the attribute to the individual property instead, but if we decorate the enum declaration, strings will be acceptable wherever it is used.

using System.Text.Json.Serialization;

/// <summary>
/// This is an example enumeration.
/// We want our API consumers to use currencies as strings.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum Currency
{
    GBP,
    EUR,
    USD,
}

We can now pass in Currency as a string value. Great! If we pass an invalid string, we’ll get a 400 Bad Request. Even better!

Limiting to Valid Enum Values

What we have now is a situation where, if we provide a string value, the validation insists that it is a valid member of the Currency enum. However, because this is an enum, we can still pass in a number – and that number does not get validated, so the payment object reaching our controller can contain an invalid value. We can limit the field to valid values by adding an attribute to the property, like so:

    // EnumDataType lives in the System.ComponentModel.DataAnnotations namespace
    [EnumDataType(typeof(Currency))]
    public Currency Currency { get; set; }

Making Enum Mandatory

What if we want this to be a mandatory field that must be supplied by our caller? We might assume that this is easy – simply add the Required attribute to our Currency property. Just like this:

    [Required, EnumDataType(typeof(Currency))]
    public Currency Currency { get; set; }

Unfortunately, that doesn’t give us the behaviour we want. The field will be marked as mandatory in Swagger/OpenAPI if we are using that, but if we omit the field entirely from our payload, the POST is accepted.

Why would this be? The reason is that the underlying type of every enum is numeric – int, unless specified otherwise. That makes enums value types, so an uninitialised enum property defaults to 0. As far as the .Net validation process is concerned, a value is always present, and so the Required attribute doesn’t have the effect we need.
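A tiny snippet makes the problem obvious (using the types defined above):

var payment = new Payment(); // as if the caller omitted "currency" entirely
// The property already holds a value – the member whose underlying value is 0:
Console.WriteLine(payment.Currency); // prints "GBP"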

Addressing this is also a simple modification, provided that you can treat 0 as an invalid value. Simply set the first enum value to a non-zero value:

/// <summary>
/// This is an example enumeration.
/// We want our API consumers to use currencies as strings.
/// We also set the first value to 1 so that 0 is invalid.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum Currency
{
    GBP = 1,
    EUR,
    USD,
}

We now have an enum property that:

  • can be specified as a string
  • is mandatory so must be present in the payload
  • must be a valid enum value
  • is documented appropriately in Swagger/OpenAPI
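To illustrate with some example payloads against the final definitions above, the behaviour now works out as follows:

{ "currency": "EUR", "amount": 100.00 }   accepted
{ "currency": "XYZ", "amount": 100.00 }   400 Bad Request – not a valid member
{ "currency": 0, "amount": 100.00 }       400 Bad Request – 0 is no longer a defined value
{ "amount": 100.00 }                      400 Bad Request – Currency defaults to 0, which fails validation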

Seven Steps to Quality Code – Step 7

Automated Tests

This is step 7, the final step in a series of articles, starting with Seven Steps To Quality Code – Introduction.

The biggest change in my development practices over the last few years has been the introduction of Test Driven Development (TDD). I remember the first project in which I used it and the feeling it gave me. A feeling of confidence. Confidence that the code I had written worked, and confidence that I could make changes without worrying about unknowingly breaking something else.

I used to be an assembly language programmer. In that world, there was no helpful debugger – nothing beyond the ability to dump the contents of the registers to the screen or maybe post a “got here” message. When I moved to Visual Basic, I was blown away by the fact that I could step through code and inspect the values of variables. This was simply amazing and, of course, it changed the way I developed software.

Discovering Test Driven Development felt just the same.  A revolution in how I would code.

So now, a decade after those first steps, I am a big fan of TDD.  It suits the way I work (mostly) and really helps to lock in quality.

However, this step in the “seven steps” series is not really about TDD. The series is about improving what you already have, and while TDD is great for a new project, it cannot be applied retrospectively to code that already exists.

I’m going to suggest a few very simple actions to take that will allow you to get some of the benefits of TDD.  If you haven’t tried TDD or any automated testing before, it will allow you to get a feel for it in a comfortable environment (i.e. your existing code) without the pressure of going all out on a drive to introduce TDD.

Perhaps at this point an overview of some terminology will be useful. The list below explains a few terms.

  • Automated Testing – A broad term that covers any form of testing where test cases are automatically run against the code and verified for correctness.
  • Unit Testing – Traditionally, unit testing meant a developer performing their own manual tests on a module/component or other "unit" of software, whether formally or informally. In more modern parlance, it is often used interchangeably with "Automated Testing" above, but strictly speaking the two are different. At one extreme, unit testing means automated testing of a single method – perhaps passing in a series of inputs and verifying that the outputs or the behaviour are correct, often through the use of "mocks" (see below) to isolate the unit from any dependencies. Purists will insist that this is the only true definition. Alternatively, unit testing may mean automated testing of a single class and its behaviour, or perhaps of a complete software component, though the latter is more likely to be minimal testing of a third-party product.
  • Mocks – Martin Fowler has written an interesting article called "Mocks Aren't Stubs" which describes, far better than I ever could, what a mock is. For our purposes here, we will consider a mock to be some sort of software 'thing' used in place of some other software 'thing' in order to isolate the unit being tested and, optionally, verify behaviour.
  • Test Driven Development (TDD) – This is where tests are written before the code, and the code is written purely to make a test pass.
  • Red-Green-Refactor – A phrase that describes the process of TDD. 'Red' refers to the usual indicator showing that a test has failed (a red cross perhaps), 'Green' refers to the usual indicator showing that a test has passed (a green tick perhaps), and 'Refactor' refers to the task performed after a test passes: revisiting the code and improving it without changing its external behaviour. So, TDD is a repeating cycle of write a failing test, make it pass, improve the code.
  • Integration Test – A test that checks that several "units" work together. Sometimes referred to as an "end to end" test because of the desire to verify that the system under test functions as a whole, from the beginning of a process to the end.
  • Test Prompted Development – A phrase I use to describe a dangerous situation that can often occur in Test Driven Development. I will blog about this in the near future.

What To Do

The suggestions I have are very simple and should start to show benefits very quickly.

Test Bugs

Whenever a bug is discovered in code that does not have associated tests, write a test (or tests) that reproduces that bug and fix the bug by making the test pass.  Leave the test in the project permanently so that it is possible to verify whenever necessary that the bug has been fixed and has not returned.

This is really no different to how a developer often fixes bugs without automated testing. The really powerful part of it is retaining the test and incorporating it into the project.
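As a sketch of the pattern (the calculator, the bug and the choice of xUnit as the test framework are all invented for illustration):

using Xunit;

public static class DiscountCalculator
{
    // Fixed implementation: a negative order value previously produced a
    // negative discount; the guard clause below is the bug fix.
    public static decimal DiscountFor(decimal orderValue) =>
        orderValue <= 0m ? 0m : orderValue / 10m;
}

public class DiscountCalculatorRegressionTests
{
    // This test reproduced the reported bug, failed until the fix was made,
    // and stays in the project to prove the bug never returns.
    [Fact]
    public void DiscountFor_NegativeOrderValue_ReturnsZeroDiscount()
    {
        Assert.Equal(0m, DiscountCalculator.DiscountFor(-50m));
    }
}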

Test New Helper Functions

The easiest kind of code to test is often the little helper functions.  If you are writing a new function to do something like string handling (perhaps concatenating first name, middle name and surname or something like that), write some tests.  Verify that the result is correct when the input values are null or empty, when boundaries are reached or breached.  For many functions of the kind I am talking about, you can produce a really solid set of tests that will give you absolute confidence that the code is correct.
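For instance, tests for a full-name helper of the kind just described might look like this (a sketch using xUnit; the helper itself is invented for illustration):

using System.Linq;
using Xunit;

public static class NameHelper
{
    // Joins the supplied name parts with single spaces, skipping any
    // that are null, empty or whitespace.
    public static string FullName(string first, string middle, string surname) =>
        string.Join(" ", new[] { first, middle, surname }
            .Where(part => !string.IsNullOrWhiteSpace(part)));
}

public class NameHelperTests
{
    [Theory]
    [InlineData("Jane", "Ann", "Smith", "Jane Ann Smith")]
    [InlineData("Jane", null, "Smith", "Jane Smith")]   // null middle name
    [InlineData("Jane", "", "Smith", "Jane Smith")]     // empty middle name
    [InlineData(null, null, null, "")]                  // nothing supplied at all
    public void FullName_HandlesMissingParts(
        string first, string middle, string surname, string expected)
    {
        Assert.Equal(expected, NameHelper.FullName(first, middle, surname));
    }
}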

Again, make sure the tests are retained as part of the project.

Incorporate Tests into Your Build Process

If you have any form of automated build, include your tests in that process.  All build automation systems will support this in an easily implemented way.  Ensure that test failures result in build failures and ensure that the failures are addressed as a priority.
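In a modern .Net project, for example, a single command does exactly this – dotnet test returns a non-zero exit code when any test fails, which in turn fails the build step that runs it:

dotnet test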

Conclusion

As I said at the start of this article, automated testing provides amazing benefits and allows you to be really confident that your code performs as you expect. A full TDD approach is pretty hard, but if you start by dipping your toe in the water as I have suggested, you can begin with little risk and learn the techniques gradually.

Well, that is the seven steps finished.  If you are working in a development environment that needs this information then I hope that it has been useful.  Please let me know your experiences of putting this stuff into practice.  If you need any assistance, get in touch – I’ll be happy to help.

Seven Steps to Quality Code – Step 5

Code Analysis – Further Rules

This is step 5 in a series of articles, starting with Seven Steps To Quality Code – Introduction.

This step is simply a case of turning the ratchet a little more and locking in further quality gains.

Previously, my suggestion for existing projects has been to set the Code Analysis rule set to “Minimum recommended rules” (or in Visual Studio versions after 2010, “Managed Recommended Rules”) in order to keep the number of rule violations to a minimum.  Now I am going to suggest that the rules are tightened by using a rule set or combination of rule sets that check for further violations.

Ultimately, our goal is to have the “All Rules” rule set enabled on all code, but in practice this may not be achievable for legacy code.  What we can do is work towards it, so that we catch the more important issues in our code.  For example, the effort of implementing the globalisation rules in legacy code is not going to give you much bang for buck (unless, of course, globalisation has become a required feature!).

A great feature of Code Analysis is that we can progressively add further rules in order to increase the range of issues that are checked.  The rule sets that are available, however, do not give us a sequential order that we can progressively move through because particular rule sets focus on particular issues.  We can achieve the same effect though by using the option of progressively applying multiple rule sets.  You can do this by selecting the “Choose multiple rule sets” option on the Code Analysis tab of a project’s properties screen, as shown in the screenshot below:

[Screenshot: Selecting the Multiple Rule Sets Option on the Code Analysis tab]
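Behind the scenes, choosing multiple rule sets simply creates a new .ruleset file that includes the others. A rough sketch of what that file looks like (the Name and Description here are arbitrary; the included file names follow the built-in rule set files that ship with Visual Studio):

<?xml version="1.0" encoding="utf-8"?>
<RuleSet Name="Combined Rules" Description="Correctness plus design guideline rules" ToolsVersion="10.0">
  <Include Path="basiccorrectnessrules.ruleset" Action="Default" />
  <Include Path="basicdesignguidelinerules.ruleset" Action="Default" />
</RuleSet>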

A comprehensive list of the rule sets available can be found on the Microsoft site:

Visual Studio 2010 Code Analysis Rule Sets
Visual Studio 2013 Code Analysis Rule Sets

To avoid getting swamped by too many violations, and to allow a “small bite at a time” approach, I suggest considering one project (i.e. one csproj file) at a time and performing the following procedure:

  1. Run the “All Rules” rule set on the project.  If the number of violations looks manageable, stick with “All Rules” – this is the perfect situation to be in.  Ignore the further steps in this list and proceed to fix the violations.
  2. If “All Rules” is looking like a step too far, progressively add in the following rule sets one at a time, fixing the violations and checking in as you go (Visual Studio 2010 name first, with the Visual Studio 2013 equivalent in brackets):
    • Microsoft Basic Correctness Rules (Basic Correctness Rules rule set for managed code)
    • Microsoft Basic Design Guideline Rules (Basic Design Guideline Rules rule set for managed code)
    • Microsoft Extended Correctness Rules (Extended Correctness Rules rule set for managed code)
    • Microsoft Extended Design Guideline Rules (Extended Design Guidelines Rules rule set for managed code)
  3. At this point, you may be in a position to apply the “All Rules” rule set and have a manageable number of violations.  Alternatively, if that still produces too many violations, you may have a particular need for one of the remaining focused rule sets (security or globalization rules) and wish to apply one of those. (Actually, by the time you get to this stage the likelihood is that if you have a lot of violations under “All Rules” they will all be globalization related).

Conclusion

Once this step is followed for a solution, you will be in a great place for continuing development in a high quality environment with automatic guards in place to keep it there.

In step 6 we’ll look at peer reviews – something that everyone knows about, but which is all too often the easiest thing to discard from a development process.