Tag Archives: .Net

Failing the Pipeline When NuGet Packages are Out of Date

Letting your dependencies get out of date is an all too common problem. On the face of it, you might think it’s not such a big issue – “our code works, why do we need to worry?”.

However, keeping on top of your updates on a regular basis has several advantages:

  • It is one of the easiest ways to minimise the risk of security vulnerabilities. Vulnerabilities are often discovered in popular (or not so popular) packages. New releases are produced to fix these vulnerabilities.
  • You reap the benefits of any bug fixes or optimisations that have been introduced.
  • Frequent, small changes are easier to manage and carry less risk than allowing changes to back up and become big changes.

With a little preparation, keeping on top of this is pretty easy. The dotnet command has inbuilt support for checking for outdated packages. Simply execute the following while in your solution or project directory:

dotnet list package --outdated

This will check the package versions requested in your project(s) and output a list of any package for which a later version is available.

One thing worth noting is that you need to have done a 'restore' on the project(s) before running this check, because it requires the project.assets.json file to have been generated and be up to date.
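In practice this just means running a restore before the check, for example:

dotnet restore
dotnet list package --outdated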

There are a few ways that the behaviour of this check can be modified, such as restricting the check to minor versions. You can find full details in the Microsoft dotnet list package docs.
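For example, the following variants restrict the suggested updates to the highest available minor or patch version respectively (see the docs for the full set of options):

dotnet list package --outdated --highest-minor
dotnet list package --outdated --highest-patch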

So far so good, but what if we want to enforce updates in the CI/CD pipelines? We can achieve this by wrapping the command in a Bash script that will return a non-zero exit code if outdated dependencies are detected. I’ve recently used this in Azure DevOps pipelines, but the same technique will apply to other pipeline tools.

The script I used is here:

#!/bin/bash

# Look for the phrase that dotnet prints when a project has outdated packages
updates=$(dotnet list package --outdated | grep -e 'has the following updates')

if [[ -n "$updates" ]]; then
    # Re-run the check so the full report appears in the pipeline log, then fail the step
    dotnet list package --outdated
    echo "##[error]Outdated NuGet packages were detected"
    exit 1
else
    echo "No outdated NuGet packages were detected"
    exit 0
fi

It simply looks for the text “has the following updates” in the output from calling dotnet list package --outdated and, if that phrase is present, exits with a non-zero return code. The ##[error] tag is an Azure Pipelines feature which means the echoed message will be formatted as an error (i.e. appear in red) in the pipeline log details.
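To show how this fits into a pipeline definition, the YAML steps might look something like the following (the script name and location are just examples, and note the restore that the check depends on):

steps:
  - script: dotnet restore
    displayName: Restore NuGet packages

  - bash: ./check-outdated-packages.sh
    displayName: Check for outdated NuGet packages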

Validating Enums in .Net WebAPI

Recently, I needed to implement some validation rules on a .Net WebAPI. The API in question accepted a payload that, amongst other things, included a currency code (e.g. “GBP”, “EUR”, “USD”). On the face of it, this sounds pretty simple but there are a few things to watch out for in order to do this well.

We might start with something like the following code:

using System.Text.Json;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

[ApiController]
[Route("[controller]")]
public class ExampleController : ControllerBase
{
    private readonly ILogger<ExampleController> _logger;

    public ExampleController(ILogger<ExampleController> logger)
    {
        _logger = logger;
    }

    [HttpPost("payments")]
    public IActionResult Payments(Payment payment)
    {
        _logger.LogInformation(JsonSerializer.Serialize(payment));
        return this.Ok();
    }
}

/// <summary>
/// This is the type we accept as the payload into our API.
/// </summary>
public class Payment
{
    public Currency Currency { get; set; }

    public decimal Amount { get; set; }
}

/// <summary>
/// This is an example enumeration.
/// We want our API consumers to use currencies as strings.
/// </summary>
public enum Currency
{
    GBP,
    EUR,
    USD
}

The first problem we have here is that the Currency value in our Json payload has to be a number. Out of the box, the API won’t accept a string such as “USD” and we’ll get a standard “400 Bad Request” validation failure response.

Accepting Enum String Values

We can easily fix this issue by using the JsonStringEnumConverter class from the System.Text.Json.Serialization namespace. To do this we decorate the enum declaration with the JsonConverter attribute. Optionally, we could have added the attribute to the individual property, but if we decorate the enum declaration, strings will be accepted wherever the enum is used.

using System.Text.Json.Serialization;

/// <summary>
/// This is an example enumeration.
/// We want our API consumers to use currencies as strings.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum Currency
{
    GBP,
    EUR,
    USD,
}

We can now pass in Currency as a string value. Great! If we pass an invalid string, we’ll get a 400 Bad Request. Even better!

Limiting to Valid Enum Values

What we have now is a situation where, if we provide a string value, the validation insists that it is a valid member of the Currency enum. However, because this is an enum, we can also pass in a number. This number does not get validated, so in our controller our payment object contains an invalid value. We can limit the field to valid values by adding the EnumDataType attribute (from System.ComponentModel.DataAnnotations) to the property, like so:

    [EnumDataType(typeof(Currency))]
    public Currency Currency { get; set; }

Making Enum Mandatory

What if we want this to be a mandatory field that must be supplied by our caller? We might assume that this is easy – simply add the Required attribute to our Currency property. Just like this:

    [Required, EnumDataType(typeof(Currency))]
    public Currency Currency { get; set; }

Unfortunately, that doesn’t give us the behaviour we want. The field will be marked as mandatory in Swagger/OpenAPI if we are using that, but if we omit the field entirely from our payload, the POST is accepted.

Why would this be? The reason is that the underlying type for all enums is a numeric type, which is int unless specified otherwise. That makes enums value types, so an enum property that is never set simply defaults to 0. As far as the .Net validation process is concerned, a value is always present, so the Required attribute doesn’t have the effect we need.
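A quick way to convince yourself of this, using the original Currency declaration (where GBP is implicitly 0):

var payment = new Payment { Amount = 10m }; // Currency never assigned

// The property still holds a value: the enum default, which is (Currency)0, i.e. GBP
Console.WriteLine(payment.Currency);        // prints "GBP"
Console.WriteLine((int)default(Currency));  // prints "0"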

Addressing this is also a simple modification, provided that you can treat 0 as an invalid value. Simply set the first enum value to a non-zero value:

/// <summary>
/// This is an example enumeration.
/// We want our API consumers to use currencies as strings.
/// We also set the first value to 1 so that 0 is invalid.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum Currency
{
    GBP = 1,
    EUR,
    USD,
}

We now have an enum property that

  • can be specified as a string
  • is mandatory so must be present in the payload
  • must be a valid enum value
  • is documented appropriately in Swagger/OpenAPI
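Putting it all together, the finished model is just the snippets above combined; for reference, it ends up looking like this:

using System.ComponentModel.DataAnnotations;
using System.Text.Json.Serialization;

/// <summary>
/// This is the type we accept as the payload into our API.
/// </summary>
public class Payment
{
    [Required, EnumDataType(typeof(Currency))]
    public Currency Currency { get; set; }

    public decimal Amount { get; set; }
}

/// <summary>
/// Currencies are accepted as strings, and the first value is 1 so that 0 is invalid.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum Currency
{
    GBP = 1,
    EUR,
    USD,
}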

Seven Steps to Quality Code – Step 5

Code Analysis – Further Rules

This is step 5 in a series of articles, starting with Seven Steps To Quality Code – Introduction.

This step is simply a case of turning the ratchet a little more and locking in further quality gains.

Previously, my suggestion for existing projects has been to set the Code Analysis rule set to “Minimum recommended rules” (or in Visual Studio versions after 2010, “Managed Recommended Rules”) in order to keep the number of rule violations to a minimum.  Now I am going to suggest that the rules are tightened by using a rule set or combination of rule sets that check for further violations.

Ultimately, our goal is to have the “All Rules” rule set enabled on all code, but in practice this may not be achievable for legacy code.  What we can do is work towards this so that we can catch the more important issues in our code.  For example, the effort of implementing the globalisation rules in legacy code is not going to give you much bang for buck (unless of course globalisation has become a required feature!).

A great feature of Code Analysis is that we can progressively add further rules in order to increase the range of issues that are checked.  The rule sets that are available, however, do not give us a sequential order that we can progressively move through because particular rule sets focus on particular issues.  We can achieve the same effect though by using the option of progressively applying multiple rule sets.  You can do this by selecting the “Choose multiple rule sets” option on the Code Analysis tab of a project’s properties screen, as shown in the screenshot below:

[Screenshot: Selecting the Multiple Rule Sets Option]
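Selecting multiple rule sets in this way simply results in a custom .ruleset file that includes the chosen sets. As a rough sketch (the exact file names and ToolsVersion vary between Visual Studio versions), the combined file looks something like this:

<?xml version="1.0" encoding="utf-8"?>
<RuleSet Name="Combined Rules" Description="Recommended rules plus basic correctness rules" ToolsVersion="10.0">
  <!-- Each Include pulls in one of the built-in rule sets shipped with Visual Studio -->
  <Include Path="MinimumRecommendedRules.ruleset" Action="Default" />
  <Include Path="BasicCorrectnessRules.ruleset" Action="Default" />
</RuleSet>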

A comprehensive list of the rule sets available can be found on the Microsoft site:

Visual Studio 2010 Code Analysis Rule Sets
Visual Studio 2013 Code Analysis Rule Sets

To avoid getting swamped by too many violations and to allow a “small bite at a time” approach, I suggest considering 1 project (i.e. csproj file) at a time and performing the following procedure:

  1. Run the “All Rules” rule set on the project.  If you consider the number of violations to be manageable, use the “All Rules” rule set.  This is the perfect situation to be in: ignore the further steps in this list and proceed to fix the violations.
  2. If “All Rules” is looking like a step too far, progressively add in the following rule sets one at a time, fixing the violations and checking in as you go (shown as the Visual Studio 2010 name / the equivalent Visual Studio 2013 rule set):
    • Microsoft Basic Correctness Rules / Basic Correctness Rules rule set for managed code
    • Microsoft Basic Design Guideline Rules / Basic Design Guideline Rules rule set for managed code
    • Microsoft Extended Correctness Rules / Extended Correctness Rules rule set for managed code
    • Microsoft Extended Design Guideline Rules / Extended Design Guidelines Rules rule set for managed code
  3. At this point, you may be in a position to apply the “All Rules” rule set and have a manageable number of violations.  Alternatively, if that still produces too many violations, you may have a particular need for one of the remaining focused rule sets (security or globalization rules) and wish to apply one of those. (Actually, by the time you get to this stage the likelihood is that if you have a lot of violations under “All Rules” they will all be globalization related).

Conclusion

Once this step is followed for a solution, you will be in a great place for continuing development in a high quality environment with automatic guards in place to keep it there.

In step 6 we’ll look at peer reviews, something that everyone knows about but which is all too often the easiest thing to discard in a development process.

Seven Steps to Quality Code – Step 4

StyleCop

This is step 4 in a series of articles, starting with Seven Steps To Quality Code – Introduction.

How often have we heard things like “Steve wrote that – I’m not touching it!” or “That came from the offshore team – it’s a mess”?  Often, this is down to a question of personal style.  The code in question may be perfectly readable and maintainable to the developer who wrote it, but not to anybody else.

If we want robust and maintainable (i.e. quality) code then a standard style, consistent across the whole team or organisation, is what is required.  If the code is always written in the same style, then developers can focus on functionality and correctness rather than being distracted by inconsistencies in coding styles.

As with the other steps, this isn’t a new idea.  It has long been recognised that some sort of corporate “Coding Standard” is useful.  This would perhaps specify common standards for all code – things like layout, formatting, casing, naming conventions etc.  Traditionally, if an organisation has a view on these things, lots of effort is spent in devising the standard and creating a “coding standards” document that is then hidden away on a server somewhere.  New developers typically have this document on a list of “documents to read” as part of their induction. In my experience, that’s usually as far as it goes.  The practicalities mean that enforcing the corporate standard is simply too difficult, and so the code rarely, if ever, actually conforms to the standard.  Each developer uses their own view of “standard practice” and these views often differ widely.

In this fourth step, we are going to address these issues.  The approach is to discard the Coding Standards document and instead introduce our next tool – StyleCop.  This tool was originally developed to enforce standards on internally created code in Microsoft but was then released for public use.  In 2010, it became open source and is now available on the StyleCop CodePlex page.   In a similar way to Code Analysis/FxCop introduced in step 3, it analyses code against a set of predetermined rules.  The difference between the two tools is that Code Analysis looks at the compiled assemblies, whereas StyleCop looks at the contents of your source files.  There are some overlapping areas, but StyleCop will focus on what your code looks like, rather than how it behaves.

A sample of the things that StyleCop will report are as follows:

  • Incorrect casing of variables, method names, property names etc.
  • Incorrect naming conventions (e.g. use of underscores)
  • Missing XML comments
  • Non-standard XML comments (for example, read/write properties must start with “Gets or sets…”)
  • Incorrect spacing around operators, brackets, commas etc.
  • Missing curly brackets
  • Missing and superfluous blank lines

Like Code Analysis, rule violations are reported in the Visual Studio Error window, as shown below:

[Screenshot: StyleCop warnings in the Visual Studio Error window]

Also like Code Analysis, detailed explanations can be found by right-clicking and selecting “Show Error Help”. Unlike Code Analysis, individual rule violations cannot be suppressed.  This is not important with StyleCop as the rules are more objective and not prone to exceptions like Code Analysis rules.

It is possible to switch off a rule entirely, but I would strongly recommend not doing this.  My approach is to leave the rules in their default state.  This leads to a much simpler situation for ongoing configuration management as there is no configuration required per project, solution, developer or machine.  Additionally, there are rules that complement each other and work together “as a set”.  For example, the common practice of prefixing class-level fields with an underscore is outlawed under StyleCop, but this is complemented by a rule that insists that these fields are accessed via the “this.” prefix.

Some of the rules may seem a little strange at first.  The aforementioned rule about not prefixing fields with underscores is a good example if you have been used to doing this previously; using statements having to sit inside the namespace is another.  However, it takes a surprisingly short period of time before the standard StyleCop format becomes “the norm” and non-compliant code looks jarringly wrong.
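To give a flavour of what the default rules push you towards, here is a small illustrative class (the names are invented for this example) written in the standard StyleCop style, with the using directive inside the namespace, no underscore prefixes, fields accessed via “this.” and the standard XML comment wording:

namespace Example.Payments
{
    using System;

    /// <summary>
    /// Represents a single payment.
    /// </summary>
    public class Payment
    {
        /// <summary>
        /// The payment amount.
        /// </summary>
        private decimal amount;

        /// <summary>
        /// Gets or sets the payment amount.
        /// </summary>
        public decimal Amount
        {
            get { return this.amount; }
            set { this.amount = value; }
        }

        /// <summary>
        /// Gets or sets the date and time of the payment.
        /// </summary>
        public DateTime Timestamp { get; set; }
    }
}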

The real beauty of using StyleCop is that it quickly becomes possible to see past what the code looks like (because it’s always the same) and see what it is doing.  Errors in logic or function jump out of the page in a way that is simply not possible when everyone codes to their own style.

Rules/Guidelines for Using StyleCop

These are similar to the guidelines for Code Analysis.

  • Stick with the default rule settings.
  • Incorporate StyleCop into your process. Configure it to run each time the solution is built. If you have an automated build, incorporate it there.
  • Always have the Visual Studio Error window visible.  You get immediate feedback on violations following a build (as well as build errors, compiler warnings and Code Analysis violations).
  • Fix violations frequently.  Fix them as they occur.  Don’t leave this as a task to be performed just before check in.
  • Always use the error help if the rule isn’t understood. There is often interesting commentary on the rationale behind the rule.  In the early days, it’s worth looking at the help even when you think you understand the rule.
  • Use GhostDoc.  StyleCop is hard work without it, so GhostDoc quickly becomes absolutely essential.
  • Persevere.  It can be daunting to see a single class with 600 violations, but this means there is massive scope for improvement in such a class and so the benefits and the feeling of accomplishment once it’s done will be worth it.

Once the use of StyleCop is embedded, you will be well placed to move on to step 5...