Using the AWS SDK to login with MFA and Assume Role

A new-ish thing to me is having my IAM user in a centralized AWS account and then switching to a role in another AWS account. It’s a good way to manage users across a lot of accounts: Cross-Account Access in the AWS Management Console

What is definitely new is having to do this programmatically in a script. I’m using C# for this, but the guts are the same for any language, I’m sure.

Steps:

1) Load main credentials – either hard-coded or from ~/.aws/credentials
2) Get a Session Token from STS
3) Set up the MFA callback
4) Use the Session Token credentials and the MFA callback to build an AssumeRole credential set, then do the work.

Code

//needed info from target account
var targetRoleAccount = "<account id>";
var targetRoleName = "<role name>";
//needed info from main account about my user
var mainAccount = "<account id>";
var mainAccountUser = "<my user name>";
//my user creds
var mainAccountUserAccessToken = "<aws access token>";
var mainAccountUserSecretToken = "<aws secret token>";
//make some ARNs
var roleArn = $"arn:aws:iam::{targetRoleAccount}:role/{targetRoleName}";
var mfaArn = $"arn:aws:iam::{mainAccount}:mfa/{mainAccountUser}";

var basicCreds = new BasicAWSCredentials(mainAccountUserAccessToken, mainAccountUserSecretToken);

var stsClient = new AmazonSecurityTokenServiceClient(basicCreds);
var sessionResponse = await stsClient.GetSessionTokenAsync();

var sessionCreds = new SessionAWSCredentials(sessionResponse.Credentials.AccessKeyId,
    sessionResponse.Credentials.SecretAccessKey, sessionResponse.Credentials.SessionToken);

var options = new AssumeRoleAWSCredentialsOptions()
{
    MfaSerialNumber = mfaArn,
    MfaTokenCodeCallback = () =>
    {
        Console.WriteLine("Enter MFA");
        return Console.ReadLine();
    }
};

var assumeRoleCredentials = new AssumeRoleAWSCredentials(sessionCreds, roleArn, targetRoleName, options);

//time to work!
var client = new AmazonEC2Client(assumeRoleCredentials, RegionEndpoint.EUWest1);

The code roughly follows the steps I listed before. The trick is getting your MFA code in.

Here, I just have a console app, so I can just ReadLine() and enter the numbers from the two-factor app on my phone.
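
For comparison, the same ARNs and flow can be sketched with the AWS CLI. The account IDs and names below are made up, and the actual STS calls are left commented out because they need real credentials and a live MFA code:

```shell
#!/usr/bin/env bash
# Hypothetical account/user names -- substitute your own.
MAIN_ACCOUNT="111111111111"
MAIN_ACCOUNT_USER="my-user"
TARGET_ROLE_ACCOUNT="222222222222"
TARGET_ROLE_NAME="my-role"

# Same ARN construction as in the C# above.
ROLE_ARN="arn:aws:iam::${TARGET_ROLE_ACCOUNT}:role/${TARGET_ROLE_NAME}"
MFA_ARN="arn:aws:iam::${MAIN_ACCOUNT}:mfa/${MAIN_ACCOUNT_USER}"

echo "$ROLE_ARN"
echo "$MFA_ARN"

# 1) Get a session token using the MFA device:
# aws sts get-session-token --serial-number "$MFA_ARN" --token-code 123456
# 2) Assume the role in the target account using those temporary credentials:
# aws sts assume-role --role-arn "$ROLE_ARN" --role-session-name "$TARGET_ROLE_NAME"
```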

Took some figuring out as I didn’t know the AWS terminology and had to dig into the AWS SDK integration tests for STS to get it right. Wasn’t too bad.

SharpCompress 0.18

NuGet
GitHub Release

New
* Breaking change – Removed the ArchiveEncoding static class in favor of an instance on Options; ArchiveEncoding is now on the Options base class. This allows for more Encoding options, as well as a custom Func for decoding in more custom scenarios. Being instance-based also avoids multi-threading issues. See https://github.com/adamhathcock/sharpcompress/blob/master/src/SharpCompress/Common/ArchiveEncoding.cs

Fixes
* LeaveStreamOpen doesn’t work with TarWriter
* If a Zip file has a normal file header AND a post-data descriptor header, and a ZipReader attempts to skip the file, the data was skipped twice.
* AbstractReader.Skip() does not fully read bytes from non-local streams

.NET Core on Circle CI 2.0 using Docker and Cake

I’ve only just started with Circle 2.0, which just had its beta tag removed.

It’s completely Docker based which I adore. I refuse to package code any other way these days.

My goal was to build on what I previously did on Circle CI but use only an official Microsoft .NET Core SDK Docker image. Having to layer extra tools onto another image and manage that is extra work. I abhor extra work.

.circleci/config.yml

Circle 2.0 moves its YAML to a subdirectory, which seems to be in vogue these days, so we can have lots of files for specific services!

version: 2
jobs:
  build:
    working_directory: ~/api
    docker:
      - image: microsoft/dotnet:1.1.2-sdk-jessie
    environment:
      - DOTNET_CLI_TELEMETRY_OPTOUT: 1
      - CAKE_VERSION: 0.19.1
    steps:
      - checkout
      - restore_cache:
          keys:
            - cake-{{ .Environment.CAKE_VERSION }}
      - run: ./build.sh build.cake --target=restore
      - save_cache:
          key: cake-{{ .Environment.CAKE_VERSION }}
          paths:
            - ~/api/tools
      - run: ./build.sh build.cake --target=build
      - run: ./build.sh build.cake --target=test

The hard part with Circle CI 2.0 is that caching is done pretty manually and changes aren’t auto-detected. You have to version cache keys or hashes that act as cache keys. I haven’t mastered it yet.

Ideally, I’d cache the Cake tools directory and my .nuget folder on this running image but I’m not there yet.
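
One option I haven’t adopted here yet is keying the cache off a file checksum instead of a hand-bumped version number; Circle CI 2.0’s `{{ checksum }}` template makes the key change automatically when the file does. A sketch, assuming the same build.cake and tools layout as above:

```yaml
# Hypothetical alternative: key the cache on build.cake's checksum
# so it invalidates automatically when the script changes.
- restore_cache:
    keys:
      - cake-{{ checksum "build.cake" }}
- save_cache:
    key: cake-{{ checksum "build.cake" }}
    paths:
      - ~/api/tools
```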

The big thing to note is that the image is based on the official SDK image with all the necessary build tools.

Bootstrapping Cake

So it should be easy to do this now as I already have a build.sh to execute Cake right? Nope!

The bash script uses the unzip utility that usually exists – but not on this image. It’s needed to extract the NuGet package that gets downloaded. curl doesn’t exist either, by the way.

Fortunately, the dotnet CLI is here, and it can easily restore Cake. My new build.sh needs a csproj to restore Cake with. Since the new csproj XML is tiny, it’s easy to echo into a file.

#!/usr/bin/env bash
##########################################################################
# This is the Cake bootstrapper script for Linux and OS X.
# This file was downloaded from https://github.com/cake-build/resources
# Feel free to change this file to fit your needs.
##########################################################################

# Define directories.
SCRIPT_DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
TOOLS_DIR=$SCRIPT_DIR/tools
TOOLS_PROJ=$TOOLS_DIR/tools.csproj
CAKE_DLL=$TOOLS_DIR/Cake.CoreCLR.$CAKE_VERSION/cake.coreclr/$CAKE_VERSION/Cake.dll


# Make sure the tools folder exist.
if [ ! -d "$TOOLS_DIR" ]; then
  mkdir "$TOOLS_DIR"
fi

###########################################################################
# INSTALL CAKE
###########################################################################

if [ ! -f "$CAKE_DLL" ]; then
    echo "<Project Sdk=\"Microsoft.NET.Sdk\"><PropertyGroup><OutputType>Exe</OutputType><TargetFramework>netcoreapp1.1</TargetFramework></PropertyGroup></Project>" > $TOOLS_PROJ
    dotnet add $TOOLS_PROJ package cake.coreclr -v $CAKE_VERSION --package-directory $TOOLS_DIR/Cake.CoreCLR.$CAKE_VERSION
fi

# Make sure that Cake has been installed.
if [ ! -f "$CAKE_DLL" ]; then
    echo "Could not find Cake.dll at '$CAKE_DLL'."
    exit 1
fi

###########################################################################
# RUN BUILD SCRIPT
###########################################################################

# Start Cake
exec dotnet "$CAKE_DLL" "$@"

Note: I’ve moved the CAKE_VERSION variable out of the script to use it with Circle CI, but it can easily be added back.

SharpCompress 0.17 (LZip, XZ) and SharpCompress in .NET Core?

SharpCompress 0.17 released on NuGet!

New Features – Full LZip support!

To me, this was a big hole when considering how to compress. LZMA seems to be the best in the business at the moment, and lots of people want to use it. Jon Skeet provided LZip read support a while back, but I’ve only now fleshed it out for writing. It even works with tar.lz, just like tar.gz.

New Features – XZ read support!

XZ is another LZMA (well, LZMA2) archive format. Not sure where it came from but the LZip author isn’t impressed and I have to say, I’m not either: Xz format inadequate for long-term archiving

The XZ support is basic and it only supports one internal “stream” (e.g. file) even though multiple are possible.

There are a couple more fixes to read about on the GitHub page.

Recommendations

After poking around different file formats and compressors for a long time now, I decided to write up what I think people ought to use when considering archive formats and algorithms.

Recommended: Tar with GZip/BZip2/LZip

In general, I recommend GZip (Deflate)/BZip2 (BZip)/LZip (LZMA) as the simplicity of the formats lend to better long term archival as well as the streamability. Tar is often used in conjunction for multiple files in a single archive (e.g. .tar.gz).

Tar is aging a bit with a lot of extensions but it’s still simple. I believe there are Tar replacements but I don’t come across them often.
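
To illustrate the tar-plus-compressor pairing, here’s a round trip with stock GNU tar and gzip (lzip slots in the same way via tar’s --lzip flag where lzip is installed):

```shell
#!/usr/bin/env bash
set -e
# Create a throwaway file, pack it as .tar.gz, then list the archive contents.
dir="$(mktemp -d)"
echo "hello" > "$dir/a.txt"
tar -czf "$dir/demo.tar.gz" -C "$dir" a.txt
listing="$(tar -tzf "$dir/demo.tar.gz")"
echo "$listing"   # prints: a.txt
```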

Not recommended: Zip

Zip is okay, but it’s a very haphazard format, and the variation in headers and implementations makes it hard to get correct. It uses Deflate by default but supports a lot of compression methods.

Zip has been king for a while, so it’s not like this format is going away anytime soon.

Avoid: RAR

RAR is not recommended as it’s a proprietary format and the compression is closed source. Use Tar/LZip for LZMA. I’m not up to date on how RAR compares with other compressors, but its claim to fame was being better than Zip/DEFLATE. It’s probably not better than LZMA.

Avoid: 7Zip and XZ

7Zip and XZ both are overly complicated. 7Zip does not support streamable formats. XZ has known holes explained here: Xz format inadequate for long-term archiving

Use Tar/LZip for LZMA compression instead.

.NET Core and compression

Recently, there’s been discussion I’ve been following on the CoreFx repo. Mainly, I’ve wanted to push compressors into the core for native support but probably keep archive formats outside. Most implementations don’t support forward-only reading.

I’ve now opened a new issue about forward-only. I may push SharpCompress all the way in the core! Who knows?

The code probably needs a bit of a rewrite as the quality is all over the place. I’ve never been one to redo algorithms, but I’ve always been proud of the unified interface.

.NET Core 1.1 building with Docker and Cake (part 2)

This is a follow up to my original post: .NET Core 1.1 building with Docker and Cake That post has a bit more detail than here.

Essentially, over time, the build became too slow. Pulling a build container (with Mono) on Circle CI each time took too long.

I’ve moved away from a build container but still publish and create a Docker image.

Build Process Overview

  1. Dependencies:
    • Install dotnet SDK
    • dotnet restore via Cake
  2. Compile:
    • dotnet build via Cake
  3. Test:
    • dotnet test via Cake
  4. Deployment:
    • dotnet publish
    • Use Dockerfile to create image
    • Push built image to AWS ECS

Circle CI configuration

machine:
  environment:
    DOTNET_CLI_TELEMETRY_OPTOUT: 1
  services:
    - docker

dependencies:
  pre:
    - sudo sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/dotnet-release/ trusty main" > /etc/apt/sources.list.d/dotnetdev.list'
    - sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 417A0893
    - sudo apt-get update
    - sudo apt-get install dotnet-dev-1.0.4
  override:
    - ./build.sh build.cake --target=restore
  cache_directories:
    - ~/.nuget

compile:
  override:
    - ./build.sh build.cake --target=build

test:
  override:
    - ./build.sh build.cake --target=test

deployment:
  builds:
    branch: [master, dev]
    commands:
      - mkdir publish/
      - dotnet publish src/Api.Server -f netcoreapp1.1 -c Release -o ../../publish --version-suffix ${CIRCLE_BRANCH}-${CIRCLE_BUILD_NUM}
      - docker build -f Dockerfile -t server-api:latest .
      - docker tag server-api:latest $AWS_ACCOUNT_ID.dkr.ecr.eu-west-1.amazonaws.com/server-api:$CIRCLE_BUILD_NUM-$CIRCLE_BRANCH
      - ./push.sh

New: Build Phases

I still hang everything together with a Cake script but call the stages individually to better match the stages on Circle CI. It seems most build services work this way.

New: dotnet SDK installation

This is just a copy/paste from the https://dot.net site for Ubuntu. The current version of the SDK is now 1.0.4.

New: Caching NuGet dependencies

Circle CI and other services have a notion of caching. This was easy on Circle: I just tell it to save my .nuget directory and NuGet pulls are much faster. I should figure out something better for the SDK itself, but that probably means a base Docker image. Maybe this is better suited to Circle CI 2.0, which is all Docker based.

New: Branch tagging

Circle supports having the build number as well as the branch as an environment variable. Using this to tag is nicer for me as well.

Cake file

The cake file has changed since the last post too. Cake better supports the dotnet commands. I still have to manually glob for tests to run though.

var target = Argument("target", "Default");
var tag = Argument("tag", "cake");

Task("Restore")
  .Does(() =>
{
    DotNetCoreRestore(".");
});

Task("Build")
  .Does(() =>
{
    DotNetCoreBuild(".");
});

Task("Test")
  .Does(() =>
{
    var files = GetFiles("test/**/*.csproj");
    foreach(var file in files)
    {
        DotNetCoreTest(file.ToString());
    }
});

Task("Publish")
  .Does(() =>
{
    var settings = new DotNetCorePublishSettings
    {
        Framework = "netcoreapp1.1",
        Configuration = "Release",
        OutputDirectory = "../../publish",
        VersionSuffix = tag
    };

    DotNetCorePublish("src/Api.Server", settings);
});

Task("Default")
    .IsDependentOn("Restore")
    .IsDependentOn("Build")
    .IsDependentOn("Test");

RunTarget(target);

The deployment Dockerfile

FROM microsoft/dotnet:1.1.2-runtime

COPY ./publish/api /app
WORKDIR /app

EXPOSE 5000

ENTRYPOINT ["dotnet", "Api.Server.dll"]

I no longer hardcode the ASPNETCORE_ENVIRONMENT variable in the Dockerfile; instead, I put that in my ECS config using Terraform. That’s another subject though.
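
That ECS-side setting looks roughly like the following Terraform fragment – a sketch only, with hypothetical resource names and values:

```hcl
# Hypothetical container definition JSON referenced by an
# aws_ecs_task_definition resource; names/values are made up.
container_definitions = <<DEF
[{
  "name": "server-api",
  "image": "server-api:latest",
  "environment": [
    { "name": "ASPNETCORE_ENVIRONMENT", "value": "Production" }
  ]
}]
DEF
```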

Publishing to AWS ECR – push.sh

I could probably fold this into the circle.yml, but I like having it separate.

I added a git push for tagging to my GitHub repo.

#!/usr/bin/env bash

# more bash-friendly output for jq
JQ="jq --raw-output --exit-status"

configure_aws_cli(){
    aws --version
    aws configure set default.region eu-west-1
    aws configure set default.output json
}

push_ecr_image(){
    eval $(aws ecr get-login --region eu-west-1)
    docker push $AWS_ACCOUNT_ID.dkr.ecr.eu-west-1.amazonaws.com/server-api:$CIRCLE_BUILD_NUM-$CIRCLE_BRANCH
}

configure_aws_cli
push_ecr_image

git tag -a $CIRCLE_BUILD_NUM-$CIRCLE_BRANCH -m "Circle CI Build Tag"
git push origin --tags

SharpCompress 0.16.0 Released

Another release with some good changes. I’m still deciding where to take this. I’m still leaving SharpCompress without a 1.0 release, as I never feel confident enough to promise myself I won’t break the API when things change.

I have started a dotnet tool branch for fun as well as consume my own API again to get a better sense of how things feel. Take a look at the branch: dotnet tool

SharpCompress 0.16.0 on Nuget

Changelog

As always, more fixes and help are welcome!