.NET Core 1.1 building with Docker and Cake (part 2)

This is a follow-up to my original post, .NET Core 1.1 building with Docker and Cake, which has a bit more detail than this one.

Essentially, over time, the build became too slow: pulling a build container (with Mono) on Circle CI for every build added too much overhead.

I’ve moved away from a build container but still publish and create a Docker image.

Build Process Overview

  1. Dependencies:
    • Install dotnet SDK
    • dotnet restore via Cake
  2. Compile:
    • dotnet build via Cake
  3. Test:
    • dotnet test via Cake
  4. Deployment:
    • dotnet publish
    • Use Dockerfile to create image
    • Push built image to AWS ECS

Circle CI configuration

- docker

- sudo sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/dotnet-release/ trusty main" > /etc/apt/sources.list.d/dotnetdev.list'
- sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 417A0893
- sudo apt-get update
- sudo apt-get install dotnet-dev-1.0.4
- ./build.sh build.cake --target=restore
- ~/.nuget

- ./build.sh build.cake --target=build

- ./build.sh build.cake --target=test

branch: [master, dev]
- mkdir publish/
- dotnet publish src/Api.Server -f netcoreapp1.1 -c Release -o ../../publish --version-suffix ${CIRCLE_BRANCH}-${CIRCLE_BUILD_NUM}
- docker build -f Dockerfile -t server-api:latest .
- docker tag server-api:latest $AWS_ACCOUNT_ID.dkr.ecr.eu-west-1.amazonaws.com/server-api:$CIRCLE_BUILD_NUM-$CIRCLE_BRANCH
- ./push.sh

New: Build Phases

I still hang everything together with a Cake script but call the stages individually to better match the stages on Circle CI. It seems most build services work this way.

New: dotnet SDK installation

This is just a copy/paste from the https://dot.net site for Ubuntu. The current version of the SDK is now 1.0.4.

New: Caching NuGet dependencies

Circle CI and other services have a notion of caching. This was easy on Circle: I just tell it to save my .nuget directory and NuGet restores are much faster. I should figure out something better for caching the SDK itself, but that probably means a base Docker image. Maybe that's better suited to Circle CI 2.0, which is all Docker-based.
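In CircleCI 1.0 syntax, that cache directive is just a couple of lines (a sketch; exact placement depends on the rest of your circle.yml):

```yaml
dependencies:
  cache_directories:
    - "~/.nuget"
```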

New: Branch tagging

Circle exposes the build number as well as the branch as environment variables. Using these to tag images is nicer for me as well.

Cake file

The cake file has changed since the last post too. Cake now has better support for the dotnet commands. I still have to manually glob for test projects to run, though.

var target = Argument("target", "Default");
var tag = Argument("tag", "cake");

// task names are inferred from the --target values in circle.yml;
// the Restore/Build bodies are a sketch using the Cake dotnet aliases
Task("Restore")
    .Does(() =>
{
    DotNetCoreRestore();
});

Task("Build")
    .Does(() =>
{
    DotNetCoreBuild("./");
});

Task("Test")
    .Does(() =>
{
    var files = GetFiles("test/**/*.csproj");
    foreach(var file in files)
    {
        DotNetCoreTest(file.ToString());
    }
});

Task("Publish")
    .Does(() =>
{
    var settings = new DotNetCorePublishSettings
    {
        Framework = "netcoreapp1.1",
        Configuration = "Release",
        OutputDirectory = "../../publish",
        VersionSuffix = tag
    };

    DotNetCorePublish("src/Api.Server", settings);
});

RunTarget(target);

The deployment Dockerfile

FROM microsoft/dotnet:1.1.2-runtime

COPY ./publish/api /app
WORKDIR /app

ENTRYPOINT ["dotnet", "Api.Server.dll"]

I no longer hardcode the ASPNETCORE_ENVIRONMENT variable in the Dockerfile and instead put it in my ECS config using Terraform. That’s another subject, though.

Publishing to AWS ECR – push.sh

I could probably fold this into the circle.yml, but I like having it separate.

I added a git push for tagging to my Github repo

#!/usr/bin/env bash

# more bash-friendly output for jq
JQ="jq --raw-output --exit-status"

aws --version
aws configure set default.region eu-west-1
aws configure set default.output json

eval $(aws ecr get-login --region eu-west-1)
docker push $AWS_ACCOUNT_ID.dkr.ecr.eu-west-1.amazonaws.com/server-api:$CIRCLE_BUILD_NUM-$CIRCLE_BRANCH

git tag -a $CIRCLE_BUILD_NUM-$CIRCLE_BRANCH -m "Circle CI Build Tag"
git push origin --tags

SharpCompress 0.16.0 Released

Another release with some good changes. I’m still deciding where to take the project. I’m also still leaving SharpCompress without a 1.0 release, as I don’t yet feel confident enough to commit to not breaking the API when changes come up.

I have started a dotnet tool branch for fun as well as consume my own API again to get a better sense of how things feel. Take a look at the branch: dotnet tool

SharpCompress 0.16.0 on NuGet


As always, more fixes and help are welcome!

For me to remember: .NET Core and JWT

This is a Memory Store ™ post for me to remember later. This isn’t an intro to JWT or JWT with .NET Core. Here are some better links for that:

Check the official docs for more about JWT Bearer auth or ASP.NET Core identity in general.

This post is more “this is how I did it because it still felt unclear after reading things.”

I did this while creating the Realworld Sample for ASP.NET Core

Essentially, the JWT Bearer library that is provided for ASP.NET Core handles all of the validation of a JWT token when it arrives on the Authorization header as a Bearer token. Which is great. You just need to hook it up:

(JwtIssuerOptions is covered later)

public static void AddJwt(this IServiceCollection services)
{
    //using options with JwtIssuerOptions

    //store this key somewhere else!
    var signingKey = new SymmetricSecurityKey(Encoding.ASCII.GetBytes("somethinglongerforthisdumbalgorithmisrequired"));
    services.Configure<JwtIssuerOptions>(options =>
    {
        //change this value!
        options.Issuer = "issuer";
        //change this value!
        options.Audience = "Audience";
        options.SigningCredentials = new SigningCredentials(signingKey, SecurityAlgorithms.HmacSha256);
    });
}

public static void UseJwt(this IApplicationBuilder app)
{
    var options = app.ApplicationServices.GetRequiredService<IOptions<JwtIssuerOptions>>();

    var tokenValidationParameters = new TokenValidationParameters
    {
        // The signing key must match!
        ValidateIssuerSigningKey = true,
        IssuerSigningKey = options.Value.SigningCredentials.Key,
        // Validate the JWT Issuer (iss) claim
        ValidateIssuer = true,
        ValidIssuer = options.Value.Issuer,
        // Validate the JWT Audience (aud) claim
        ValidateAudience = true,
        ValidAudience = options.Value.Audience,
        // Validate the token expiry
        ValidateLifetime = true,
        // If you want to allow a certain amount of clock drift, set that here:
        ClockSkew = TimeSpan.Zero
    };

    app.UseJwtBearerAuthentication(new JwtBearerOptions
    {
        AutomaticAuthenticate = true,
        AutomaticChallenge = true,
        TokenValidationParameters = tokenValidationParameters,
        AuthenticationScheme = JwtIssuerOptions.Scheme
    });
}

This hooks JWT into your Startup. Easy peasy. What’s not easy peasy was understanding how a person logs in with JWT and manages claims with ASP.NET Core Identity.

The options used for ASP.NET Core JWT need to be used for generating the JWT tokens, too. This was lifted from one of the above links and it works well.

public class JwtIssuerOptions
{
    public const string Scheme = "Token";

    /// <summary>
    /// "iss" (Issuer) Claim
    /// </summary>
    /// <remarks>The "iss" (issuer) claim identifies the principal that issued the
    ///   JWT.  The processing of this claim is generally application specific.
    ///   The "iss" value is a case-sensitive string containing a StringOrURI
    ///   value.  Use of this claim is OPTIONAL.</remarks>
    public string Issuer { get; set; }

    /// <summary>
    /// "sub" (Subject) Claim
    /// </summary>
    /// <remarks> The "sub" (subject) claim identifies the principal that is the
    ///   subject of the JWT.  The claims in a JWT are normally statements
    ///   about the subject.  The subject value MUST either be scoped to be
    ///   locally unique in the context of the issuer or be globally unique.
    ///   The processing of this claim is generally application specific.  The
    ///   "sub" value is a case-sensitive string containing a StringOrURI
    ///   value.  Use of this claim is OPTIONAL.</remarks>
    public string Subject { get; set; }

    /// <summary>
    /// "aud" (Audience) Claim
    /// </summary>
    /// <remarks>The "aud" (audience) claim identifies the recipients that the JWT is
    ///   intended for.  Each principal intended to process the JWT MUST
    ///   identify itself with a value in the audience claim.  If the principal
    ///   processing the claim does not identify itself with a value in the
    ///   "aud" claim when this claim is present, then the JWT MUST be
    ///   rejected.  In the general case, the "aud" value is an array of case-
    ///   sensitive strings, each containing a StringOrURI value.  In the
    ///   special case when the JWT has one audience, the "aud" value MAY be a
    ///   single case-sensitive string containing a StringOrURI value.  The
    ///   interpretation of audience values is generally application specific.
    ///   Use of this claim is OPTIONAL.</remarks>
    public string Audience { get; set; }

    /// <summary>
    /// "nbf" (Not Before) Claim (default is UTC NOW)
    /// </summary>
    /// <remarks>The "nbf" (not before) claim identifies the time before which the JWT
    ///   MUST NOT be accepted for processing.  The processing of the "nbf"
    ///   claim requires that the current date/time MUST be after or equal to
    ///   the not-before date/time listed in the "nbf" claim.  Implementers MAY
    ///   provide for some small leeway, usually no more than a few minutes, to
    ///   account for clock skew.  Its value MUST be a number containing a
    ///   NumericDate value.  Use of this claim is OPTIONAL.</remarks>
    public DateTime NotBefore => DateTime.UtcNow;

    /// <summary>
    /// "iat" (Issued At) Claim (default is UTC NOW)
    /// </summary>
    /// <remarks>The "iat" (issued at) claim identifies the time at which the JWT was
    ///   issued.  This claim can be used to determine the age of the JWT.  Its
    ///   value MUST be a number containing a NumericDate value.  Use of this
    ///   claim is OPTIONAL.</remarks>
    public DateTime IssuedAt => DateTime.UtcNow;

    /// <summary>
    /// Set the timespan the token will be valid for (default is 5 min/300 seconds)
    /// </summary>
    public TimeSpan ValidFor { get; set; } = TimeSpan.FromMinutes(5);

    /// <summary>
    /// "exp" (Expiration Time) Claim (returns IssuedAt + ValidFor)
    /// </summary>
    /// <remarks>The "exp" (expiration time) claim identifies the expiration time on
    ///   or after which the JWT MUST NOT be accepted for processing.  The
    ///   processing of the "exp" claim requires that the current date/time
    ///   MUST be before the expiration date/time listed in the "exp" claim.
    ///   Implementers MAY provide for some small leeway, usually no more than
    ///   a few minutes, to account for clock skew.  Its value MUST be a number
    ///   containing a NumericDate value.  Use of this claim is OPTIONAL.</remarks>
    public DateTime Expiration => IssuedAt.Add(ValidFor);

    /// <summary>
    /// "jti" (JWT ID) Claim (default ID is a GUID)
    /// </summary>
    /// <remarks>The "jti" (JWT ID) claim provides a unique identifier for the JWT.
    ///   The identifier value MUST be assigned in a manner that ensures that
    ///   there is a negligible probability that the same value will be
    ///   accidentally assigned to a different data object; if the application
    ///   uses multiple issuers, collisions MUST be prevented among values
    ///   produced by different issuers as well.  The "jti" claim can be used
    ///   to prevent the JWT from being replayed.  The "jti" value is a case-
    ///   sensitive string.  Use of this claim is OPTIONAL.</remarks>
    public Func<Task<string>> JtiGenerator =>() => Task.FromResult(Guid.NewGuid().ToString());

    /// <summary>
    /// The signing key to use when generating tokens.
    /// </summary>
    public SigningCredentials SigningCredentials { get; set; }
}

Now you use the above options to generate tokens! When do you generate tokens? On login! Use JwtTokenGenerator to create tokens for users.

JWT Bearer Authentication leaves it up to the caller on managing the token (as opposed to using browser cookies) so the token can just be put in the response of the login.

public class JwtTokenGenerator : IJwtTokenGenerator
{
    private readonly JwtIssuerOptions _jwtOptions;

    public JwtTokenGenerator(IOptions<JwtIssuerOptions> jwtOptions)
    {
        _jwtOptions = jwtOptions.Value;
    }

    //TODO: use something not custom
    private static long ToUnixEpochDate(DateTime date)
        => (long) Math.Round((date.ToUniversalTime() - new DateTimeOffset(1970, 1, 1, 0, 0, 0, TimeSpan.Zero)).TotalSeconds);

    public async Task<string> CreateToken(string username)
    {
        var claims = new[]
        {
            new Claim(JwtRegisteredClaimNames.Sub, username),
            new Claim(JwtRegisteredClaimNames.Jti, await _jwtOptions.JtiGenerator()),
            // issued-at as seconds since the unix epoch
            new Claim(JwtRegisteredClaimNames.Iat, ToUnixEpochDate(_jwtOptions.IssuedAt).ToString(), ClaimValueTypes.Integer64)
        };

        var jwt = new JwtSecurityToken(
            issuer: _jwtOptions.Issuer,
            audience: _jwtOptions.Audience,
            claims: claims,
            notBefore: _jwtOptions.NotBefore,
            expires: _jwtOptions.Expiration,
            signingCredentials: _jwtOptions.SigningCredentials);

        var encodedJwt = new JwtSecurityTokenHandler().WriteToken(jwt);
        return encodedJwt;
    }
}
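For intuition, here is what WriteToken ultimately produces: a compact, three-part, dot-separated string of base64url-encoded JSON, signed with HMAC-SHA256. A quick Python sketch (the claim values here are made up for illustration, not from the post):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # base64url without padding, as the JWT compact format requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, key: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    # sign "header.payload" with the shared symmetric key (HmacSha256 above)
    signature = b64url(hmac.new(key, f"{header}.{payload}".encode(), hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

token = make_jwt({"sub": "alice", "iss": "issuer", "aud": "Audience"}, b"secret")
print(token.count("."))  # 2: header, payload, signature
```

The server only needs the same key and options to validate what comes back in the Authorization header.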

It seems putting the username in the JwtRegisteredClaimNames.Sub claim surfaces it as the ClaimTypes.NameIdentifier claim. So the code below will return the authenticated username after the request has been validated:

public string GetCurrentUsername()
{
    return _httpContextAccessor.HttpContext.User?.Claims?.FirstOrDefault(x => x.Type == ClaimTypes.NameIdentifier)?.Value;
}

So now we can validate JWT tokens and get the user name of the token once validated. Other claims can be added to the token as well and accessed later. Let the fun begin!

Just annotate your controllers or controller methods with standard ASP.NET Core Identity attributes and it works:

[Authorize(ActiveAuthenticationSchemes = JwtIssuerOptions.Scheme)]

Generating URL slugs in .NET Core

Updated: 5/5/17

  • Better handling of diacritics in sample

I’ve just discovered what a Slug is:

Some systems define a slug as the part of a URL that identifies a page in human-readable keywords.

It is usually the end part of the URL, which can be interpreted as the name of the resource, similar to the basename in a filename or the title of a page. The name is based on the use of the word slug in the news media to indicate a short name given to an article for internal use.

I needed to know this as I’m participating in the Realworld example projects and I’m doing a back end for ASP.NET Core.

The API spec kept saying slug, and I had a moment of “ohhh, that’s what that is.” Anyway, I needed to be able to generate one. Stack Overflow to the rescue: https://stackoverflow.com/questions/2920744/url-slugify-algorithm-in-c

Also, removing accents and other characters from a lot of languages isn’t straightforward, so I used one of the best-effort implementations from the linked SO page: https://stackoverflow.com/questions/249087/how-do-i-remove-diacritics-accents-from-a-string-in-net

Now, here’s my Slug generator:

public static class Slug
{
    public static string GenerateSlug(this string phrase)
    {
        string str = phrase.RemoveDiacritics().ToLower();
        // remove invalid chars
        str = Regex.Replace(str, @"[^a-z0-9\s-]", "");
        // convert multiple spaces into one space
        str = Regex.Replace(str, @"\s+", " ").Trim();
        // cut and trim to at most 45 characters
        str = str.Substring(0, str.Length <= 45 ? str.Length : 45).Trim();
        // replace spaces with hyphens
        str = Regex.Replace(str, @"\s", "-");
        return str;
    }

    public static string RemoveDiacritics(this string text)
    {
        var s = new string(text.Normalize(NormalizationForm.FormD)
            .Where(c => CharUnicodeInfo.GetUnicodeCategory(c) != UnicodeCategory.NonSpacingMark)
            .ToArray());

        return s.Normalize(NormalizationForm.FormC);
    }
}
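The same algorithm ports easily to other languages. Here is a Python sketch of it (same steps: strip diacritics via Unicode decomposition, drop invalid characters, collapse whitespace, truncate, hyphenate):

```python
import re
import unicodedata

def generate_slug(phrase: str, max_len: int = 45) -> str:
    # decompose and drop combining marks, mirroring RemoveDiacritics above
    s = unicodedata.normalize("NFD", phrase)
    s = "".join(c for c in s if unicodedata.category(c) != "Mn")
    s = unicodedata.normalize("NFC", s).lower()
    s = re.sub(r"[^a-z0-9\s-]", "", s)   # remove invalid chars
    s = re.sub(r"\s+", " ", s).strip()   # collapse whitespace
    s = s[:max_len].strip()              # cut and trim
    return re.sub(r"\s", "-", s)         # spaces to hyphens

print(generate_slug("Générating URL Slugs in .NET Core"))  # generating-url-slugs-in-net-core
```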

TimeZoneInfo is Cross Platform?

Short one!

Apparently, TimeZoneInfo is different on Windows and Linux as cataloged here on Github.

There doesn’t seem to be a solution coming in the .NET Core 2.0 timeframe.

Fortunately, a developer on the above issue made a workaround: TimeZoneConverter

Implement this and you’re golden:

using System.Runtime.InteropServices;
using TimeZoneConverter;

public static TimeZoneInfo GetTimeZoneInfo(string windowsOrIanaTimeZoneId)
{
    try
    {
        // Try a direct approach first
        return TimeZoneInfo.FindSystemTimeZoneById(windowsOrIanaTimeZoneId);
    }
    catch
    {
        // We have to convert to the opposite platform
        var tzid = RuntimeInformation.IsOSPlatform(OSPlatform.Windows)
            ? TZConvert.IanaToWindows(windowsOrIanaTimeZoneId)
            : TZConvert.WindowsToIana(windowsOrIanaTimeZoneId);

        // Try with the converted ID
        return TimeZoneInfo.FindSystemTimeZoneById(tzid);
    }
}
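The pattern is "try the ID directly, then fall back to converting it". A Python sketch of the same idea (the one-entry mapping dict is a made-up sample; TimeZoneConverter ships the full Windows-to-IANA table):

```python
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError

# tiny sample of the Windows -> IANA mapping, for illustration only
WINDOWS_TO_IANA = {"GMT Standard Time": "Europe/London"}

def get_time_zone(zone_id: str) -> ZoneInfo:
    try:
        # try the id directly first, as in GetTimeZoneInfo above
        return ZoneInfo(zone_id)
    except ZoneInfoNotFoundError:
        # fall back to the converted id
        return ZoneInfo(WINDOWS_TO_IANA[zone_id])

print(get_time_zone("GMT Standard Time").key)  # Europe/London
```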

Proto.Actor and me

Proto Actor is a new lightweight actor library (and I say library, not framework) by one of the creators of Akka.NET.

Before going further, you might want to read up on why one should consider using the Actor model in the first place. Also, the Akka.NET docs are good for use cases too. If you’re really into it, the original Akka project for the JVM should be given a look as well.

Why use Proto Actor over Akka.NET or Microsoft Orleans?

Akka.NET was the first time I tried using the actor model in any real anger. Coming from a traditional enterprise software development background (using Dependency Injection, unit tests, etc.), I found several things off-putting.

First, the Props concept for creating actors felt wrong. Using DI wasn’t recommended. Testing actors required its own testing framework. Lastly, and worst of all, using async/await is an anti-pattern there. Blasphemy!

I never got beyond a proof of concept with Akka.NET, but it felt promising. Still, it being a port of the original Akka, HOCON config and all, didn’t sit right with me.

Next, I took a look at Orleans.

Great! async/await is a first order thing. Virtual actors (or Grains) make lifetime management just a bit easier.

Downsides: the runtime silos still feel heavyweight and very closely tied to AppDomains. The generation of the serializable types is a bit clunky and very tied to the tooling provided. To be fair, they’re working on fixing much of what I didn’t like as their port to .NET Core requires much of it.

Again, I didn’t use Orleans for much more than a proof of concept, primarily because of a project direction change. I almost deployed it to production.

Proto Actor is a library

Proto Actor does not use a runtime. In fact, it’s a set of fine grained libraries and relies mostly on 3rd party technologies to do a lot of the extra stuff (transport, serialization, logging, etc.) that aren’t the core of the actor concept.

Being cross-platform is a first-order concern for this project. While I don’t think I would recommend using Go and C# in the same cluster (though you could), the constraint means Proto Actor can only do things that are well known on both platforms. Protobuf and gRPC for transport are high-performing, well-known projects.

In the dotnet space, using the Microsoft.Extensions namespaces for logging and configuration, which are pluggable with the new ASP.NET Core libraries, means your favorite logging and config libraries can be used.

So what now?

I’ve been using Proto Actor in a Winforms Application (I know!) to better model the threading and eventing not only from the user but from attached devices to the application. The interactions basically require queues (actor mailboxes) in order to process everything.

I’ve been using MediatR a lot recently to better organize the features of my cloud applications and this desktop application is no different. However, what is different is that things are much more stateful.

I have objects managing state and need a queue to read messages from. Guess what that is, an actor!

Each attached device is an actor. They have children which model the real world objects they’re tracking. HTTP connections are modeled as an actor with state. The UI forms are actors that accept messages to give user feedback and user actions are messages to other actors.

I think it works out nicely. I made a simple dispatcher to allow the Winform UI synchronization context to process messages. My UI code does not have to worry about calling Invoke.

Up next: Proto Actors and DependencyInjection

Targeting “unrecognized” portable .NET framework targets with VS2017


1>C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\Sdks\Microsoft.NET.Sdk\build\Microsoft.NET.TargetFrameworkInference.targets(84,5): error : Cannot infer TargetFrameworkIdentifier and/or TargetFrameworkVersion from TargetFramework='portable-net40+sl5+win8+wpa81+wp8'. They must be specified explicitly.
1>C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\15.0\Bin\Microsoft.Common.CurrentVersion.targets(1111,5): error MSB3644: The reference assemblies for framework ".NETFramework,Version=v0.0" were not found. To resolve this, install the SDK or Targeting Pack for this framework version or retarget your application to a version of the framework for which you have the SDK or Targeting Pack installed. Note that assemblies will be resolved from the Global Assembly Cache (GAC) and will be used in place of reference assemblies. Therefore your assembly may not be correctly targeted for the framework you intend.

In the Microsoft.NET.TargetFrameworkInference.targets file it helpfully says this:

    Note that this file is only included when $(TargetFramework) is set and so we do not need to check that here.

    Common targets require that $(TargetFrameworkIdentifier) and $(TargetFrameworkVersion) are set by static evaluation
    before they are imported. In common cases (currently netstandard, netcoreapp, or net), we infer them from the short
    names given via TargetFramework to allow for terseness and lack of duplication in project files.

    For other cases, the user must supply them manually.

    For cases where inference is supported, the user need only specify the targets in TargetFrameworks, e.g:

    For cases where inference is not supported, identifier, version and profile can be specified explicitly as follows:
       <PropertyGroup Condition="'$(TargetFramework)' == 'portable-net451+win81'">
       <PropertyGroup Condition="'$(TargetFramework)' == 'xyz1.0'">

    Note in the xyz1.0 case, which is meant to demonstrate a framework we don't yet recognize, we can still
    infer the version of 1.0. The user can also override it as always we honor a TargetFrameworkIdentifier
    or TargetFrameworkVersion that is already set.

In a project, I was targeting net45, netstandard1.3, and .NETPortable,Version=v4.0,Profile=Profile328.

The auto migration only goes so far:

There are some other properties added for portable40-net40+sl5+win8+wp8+wpa81, but the end result is that, on build, MSBuild doesn’t know what portable40-net40+sl5+win8+wp8+wpa81 means.

To fix this, translate Profile328 to what the comments in the targets file say. I also used this list from Microsoft as a guide for profile targets.

I added:

<PropertyGroup Condition="'$(TargetFramework)' == 'portable-net40+sl5+win8+wpa81+wp8'">
  <TargetFrameworkIdentifier>.NETPortable</TargetFrameworkIdentifier>
  <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
  <TargetFrameworkProfile>Profile328</TargetFrameworkProfile>
</PropertyGroup>

The name portable-net40+sl5+win8+wpa81+wp8 could be anything, really, as long as the condition matches; it’s the XML above that actually supplies the profile, version, and identifier to MSBuild.

Here’s the complete working csproj

Why couldn’t the migration do this for you? I don’t know.