Author: adamhathcock

Connecting to AWS DocumentDB with C# from your local machine

I’ve had a lot of trouble connecting to a test cluster of AWS DocumentDB from my local machine. Sure, I should probably just use a local MongoDB instance (which I will), but I needed to test a raw connection for my first attempt.

I was using Windows with Ubuntu WSL. Your mileage may vary.

Steps

  1. Create DocumentDB Cluster
  2. Bastion SSH instance
  3. Start SSH Tunnel
  4. telnet test!
  5. Programmatic Access

Steps 1 and 2

A decent how-to for steps 1 & 2 is here: https://medium.com/softinstigate-team/how-to-create-a-web-api-for-aws-documentdb-using-restheart-987921df3ced

It doesn’t explain everything and it sets up some extra components, but the AWS basics are there. Make sure the bastion host and the cluster are in the same VPC, that the subnets are reachable, and that the security groups allow access.

Step 3

ssh -A -p 2202 -L 27017:<cluster-dns>:27017 <bastion-user>@<bastion-dns>

The above is the SSH tunnel I used.

Step 4

telnet localhost 27017 should now work. If it doesn’t, it will error out instantly.

Using localhost:27017 for my Mongo connection should have just worked, right? Nope!

Step 5

The official documentation for how to connect to DocumentDB from .NET isn’t that great. However, the basic steps are still required: https://docs.aws.amazon.com/documentdb/latest/developerguide/connect.html

I didn’t like installing the RDS certificate on my machine, so I changed my code to look similar to this:


// Read the RDS CA bundle from disk and hand it to the Mongo driver.
string caContentString = System.IO.File.ReadAllText("rds-combined-ca-bundle.pem");
X509Certificate2 caCert = new X509Certificate2(Encoding.ASCII.GetBytes(caContentString));

var settings = MongoClientSettings.FromUrl(new MongoUrl(connectionString));
// Relax TLS validation because we're connecting through the localhost tunnel,
// not the cluster's real hostname.
settings.AllowInsecureTls = true;
settings.SslSettings = new SslSettings
{
    ClientCertificates = new[] { caCert }
};

I’ll likely embed the .pem file into the assembly for production.
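A minimal sketch of that, assuming the .pem is added to the project as an <EmbeddedResource> and using a placeholder resource name ("MyApp" stands in for the real root namespace):

// Hypothetical sketch: read the CA bundle from an embedded resource instead of the file system.
// "MyApp.rds-combined-ca-bundle.pem" is a placeholder; check the assembly's manifest resource
// names if the lookup returns null.
using System;
using System.IO;
using System.Reflection;
using System.Security.Cryptography.X509Certificates;

public static class EmbeddedCaCertificate
{
    public static X509Certificate2 Load()
    {
        var assembly = Assembly.GetExecutingAssembly();
        const string resourceName = "MyApp.rds-combined-ca-bundle.pem";

        using (var stream = assembly.GetManifestResourceStream(resourceName))
        {
            if (stream == null)
            {
                throw new InvalidOperationException($"Embedded resource '{resourceName}' was not found.");
            }

            using (var memory = new MemoryStream())
            {
                stream.CopyTo(memory);
                // Same trick as the file-based version above: construct the cert from the PEM bytes.
                return new X509Certificate2(memory.ToArray());
            }
        }
    }
}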

The problem

I’m not 100% sure what happened, but somehow the clustering logic of the Mongo driver learned about the AWS DNS names for the instances. I guess the clustering feature shares connection information, which is usually a good thing. In this case, however, it is not.

My symptom was constant timeouts while the clustering was configuring itself.

After digging around in the driver source, I found a solution: I needed to override the server selectors for the cluster with a custom IServerSelector.

The solution

public class LocalhostOnlySelector : IServerSelector
{
    public IEnumerable<ServerDescription> SelectServers(ClusterDescription cluster, IEnumerable<ServerDescription> servers)
    {
        // Only ever hand back servers reachable as "localhost" so the driver
        // never tries the AWS DNS names it discovers from the replica set.
        foreach (var server in cluster.Servers)
        {
            switch (server.EndPoint)
            {
                case DnsEndPoint dep:
                    if (dep.Host.Equals("localhost"))
                    {
                        yield return server;
                    }
                    continue;
                default:
                    continue;
            }
        }
    }
}

So now my complete test connection looks like:

string template = "mongodb://{0}:{1}@{2}/test?ssl=true&replicaSet=rs0&readpreference={3}";
string username = "Adam";
string password = "<password>"; // placeholder; not the real one
string readPreference = "secondaryPreferred";
string connectionString = String.Format(template, username, password, "localhost:27017", readPreference);

// Read the RDS CA bundle and relax TLS validation since we're going through the localhost tunnel.
string caContentString = System.IO.File.ReadAllText("rds-combined-ca-bundle.pem");
X509Certificate2 caCert = new X509Certificate2(Encoding.ASCII.GetBytes(caContentString));
var settings = MongoClientSettings.FromUrl(new MongoUrl(connectionString));
settings.AllowInsecureTls = true;
settings.SslSettings = new SslSettings
{
    ClientCertificates = new[] { caCert }
};

// Make the driver only ever select the tunnelled localhost endpoint.
settings.ClusterConfigurator = x =>
{
    x.ConfigureCluster(y => y.With(preServerSelector: new LocalhostOnlySelector(), postServerSelector: new LocalhostOnlySelector()));
};

var client = new MongoClient(settings);
var database = client.GetDatabase("test");
var collection = database.GetCollection<BsonDocument>("test1");
var count = await collection.CountDocumentsAsync(x => true);
Console.WriteLine(count);

My console prints 1!

CSPROJ Pain: ASP.NET Core Razor Pages with JetBrains Rider

Rider

I love Rider and they’ve generously provided me with a full license for my open source work. I can’t thank them enough.

The one downside to Rider is that Mono is used internally for many features instead of .NET Core. I know there are many reasons for this, but I ran into one issue because of it:

Build FAILED.
(default target) (2) ->
(RazorCoreCompile target) -> 
  /usr/local/share/dotnet/sdk/2.2.103/Sdks/Microsoft.NET.Sdk.Razor/build/netstandard2.0/Microsoft.NET.Sdk.Razor.Compilation.targets(155,10): error MSB4064: The "SharedCompilationId" parameter is not supported by the "Csc" task. Verify the parameter exists on the task, and it is a settable public instance property. [/Users/adam/**********..csproj]
  /usr/local/share/dotnet/sdk/2.2.103/Sdks/Microsoft.NET.Sdk.Razor/build/netstandard2.0/Microsoft.NET.Sdk.Razor.Compilation.targets(107,5): error MSB4063: The "Csc" task could not be initialized with its input parameters.  [/Users/adam/**********.csproj]

The fix?

To fix this, I found that I needed to add a reference to Microsoft.Net.Compilers. Sure, whatever. Little did I know this doesn’t work with the .NET Core compiler. So when building my Docker container I get this:

/root/.nuget/packages/microsoft.net.compilers/2.10.0/tools/Microsoft.CSharp.Core.targets(52,5): error MSB3883: Unexpected exception:  [/build/src/**********.csproj]
/root/.nuget/packages/microsoft.net.compilers/2.10.0/tools/Microsoft.CSharp.Core.targets(52,5): error : System.ComponentModel.Win32Exception (8): Exec format error [/build/src/**********.csproj]
/root/.nuget/packages/microsoft.net.compilers/2.10.0/tools/Microsoft.CSharp.Core.targets(52,5): error : at Interop.Sys.ForkAndExecProcess(String filename, String[] argv, String[] envp, String cwd, Boolean redirectStdin, Boolean redirectStdout, Boolean redirectStderr, Boolean setUser, UInt32 userId, UInt32 groupId, Int32& lpChildPid, Int32& stdinFd, Int32& stdoutFd, Int32& stderrFd, Boolean shouldThrow) [/build/src/**********.csproj]
/root/.nuget/packages/microsoft.net.compilers/2.10.0/tools/Microsoft.CSharp.Core.targets(52,5): error : at System.Diagnostics.Process.StartCore(ProcessStartInfo startInfo) [/build/src/**********.csproj]
/root/.nuget/packages/microsoft.net.compilers/2.10.0/tools/Microsoft.CSharp.Core.targets(52,5): error : at System.Diagnostics.Process.Start() [/build/src/**********.csproj]
/root/.nuget/packages/microsoft.net.compilers/2.10.0/tools/Microsoft.CSharp.Core.targets(52,5): error : at Microsoft.Build.Utilities.ToolTask.ExecuteTool(String pathToTool, String responseFileCommands, String commandLineCommands) [/build/src/**********.csproj]
/root/.nuget/packages/microsoft.net.compilers/2.10.0/tools/Microsoft.CSharp.Core.targets(52,5): error : at Microsoft.CodeAnalysis.BuildTasks.ManagedCompiler.ExecuteTool(String pathToTool, String responseFileCommands, String commandLineCommands) [/build/src/**********.csproj]

Great. It turns out the .NET Core SDK already ships the right Roslyn compiler, but Mono does not, so the Microsoft.Net.Compilers package that fixes Mono breaks the .NET Core build.

csproj

I’ve really enjoyed .NET Core since the early betas and appreciated how they started over with the tooling. I also understand that, to drag everyone into .NET Core, they still needed MSBuild… I guess.

Anyway: csproj, XML, MSBuild… we still have to deal with all of this. I now have to put in a conditional package reference!

Now, my complete csproj is this (with super awesome mono detection):

<Project Sdk="Microsoft.NET.Sdk.Web">

    <PropertyGroup>
        <TargetFramework>netcoreapp2.2</TargetFramework>
    </PropertyGroup>

    <ItemGroup>
        <PackageReference Include="IdentityModel" Version="3.10.5" />
        <PackageReference Include="Microsoft.AspNetCore.App" />
        <PackageReference Include="Microsoft.AspNetCore.Razor.Design" Version="2.2.0" PrivateAssets="All" />
        <PackageReference Condition="$(CscToolPath.Contains('mono'))" Include="Microsoft.Net.Compilers" Version="2.10.0" />
    </ItemGroup>

</Project>

Hope this helps someone else.

Using the Cake dotnet tool

Cake just released 0.30 which includes a package for a dotnet global tool.

I’ve been waiting for this for a long time, as I’ve always had “personal” issues with the bootstrapper script. Since I make it a good practice to Dockerize all the things, I can simplify my dotnet Dockerfiles a bit.

Here’s the new Dockerfile I have for the sample Conduit ASP.NET Core app that I maintain for fun:

#build container
FROM microsoft/dotnet:2.1.401-sdk as build

WORKDIR /build
COPY . .
RUN dotnet tool install -g Cake.Tool
ENV PATH="${PATH}:/root/.dotnet/tools"
RUN dotnet cake build.cake 

#runtime container
FROM microsoft/dotnet:2.1.3-runtime

COPY --from=build /build/publish /app
WORKDIR /app

EXPOSE 5000

ENTRYPOINT ["dotnet", "Conduit.dll"]

The one tricky bit is that the current dotnet base images don’t have the tools dir on the PATH as tracked here: https://github.com/dotnet/dotnet-docker/issues/520

The fix is just adding the single line: ENV PATH="${PATH}:/root/.dotnet/tools"
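For completeness, the build.cake that the Dockerfile runs only needs to end up publishing to /build/publish (dotnet publish restores and builds along the way). A minimal sketch, not my actual script and with a placeholder project path, would be something like:

var target = Argument("target", "Default");
var configuration = Argument("configuration", "Release");

Task("Publish")
    .Does(() =>
{
    // Publish into ./publish so the runtime stage can COPY --from=build /build/publish.
    DotNetCorePublish("./src/Conduit/Conduit.csproj", new DotNetCorePublishSettings
    {
        Configuration = configuration,
        OutputDirectory = "./publish"
    });
});

Task("Default")
    .IsDependentOn("Publish");

RunTarget(target);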

Now I can delete build shell/powershell scripts all day long!

SharpCompress 0.21.0 – RAR5!

Happy to say that SharpCompress finally supports RAR5 due to a large contribution from a user!

RAR5 support is a slightly incomplete port of the unrar code from C++. If I were more skilled and had more time, I’d finish it. Fortunately, the incomplete code paths are fully implemented in the old port, so I glued the two together to make it all work. Yay, coding.
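The nice part is that consuming code doesn’t have to change for RAR5; the existing RarArchive API reads both formats. A quick extraction sketch (the archive and output paths are placeholders):

// Extract every file from a RAR archive (RAR4 or RAR5) to a directory.
// "archive.rar" and "output" are placeholder paths.
using SharpCompress.Archives;
using SharpCompress.Archives.Rar;
using SharpCompress.Common;

using (var archive = RarArchive.Open("archive.rar"))
{
    foreach (var entry in archive.Entries)
    {
        if (!entry.IsDirectory)
        {
            entry.WriteToDirectory("output", new ExtractionOptions
            {
                ExtractFullPath = true,
                Overwrite = true
            });
        }
    }
}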

Full notes are on the GitHub release and it’s on NuGet!

Cake on .NET Core 2.0! – A simpler Docker build for CircleCI 2.0

Previously I wrote about having a Docker container with both .NET Core 1 and 2 specifically to run a build with Cake on CircleCI 2.0. That was because Cake could only run on .NET Core 1 while the rest of the .NET Core world had moved on to 2. I’m happy to say this is no longer the case.

Cake now targets .NET Standard 2.0 (and net46 for OmniSharp reasons), which lets me drop the custom container I made just for Cake builds.

Now my CircleCI 2.0 build uses the official Microsoft .NET Core 2.0 SDK container with one small addition: unzip. Maybe I’ll figure out how to drop that too one day:

version: 2
jobs:
  build:
    docker:
      - image: microsoft/dotnet:2.0.5-sdk-2.1.4
    steps:
      - checkout
      - run:
          name: Install unzip
          command: |
            apt-get update
            apt-get install -y unzip
      - run:
          name: Build
          command: ./build.sh

It’s nice to keep things simple and rely on other people to maintain images instead of maintaining my own container.

Terraform, API Gateway and Cognito

I’d like to control API Gateway as an HTTP Proxy to an ALB for an ECS Task.

Unfortunately, Terraform’s support of Cognito isn’t quite there.

There are some features missing. In this context, I need to add a Cognito authorizer for an existing user pool.

Currently, Terraform only supports creating an authorizer for a Lambda, so creating a Cognito authorizer is a manual step. Creating a Cognito authorizer is documented, but doing it in the AWS console is easy: just make it of type COGNITO and then select the user pool you want.
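If you’d rather script that manual step than click through the console, the underlying CreateAuthorizer API can do it. A hedged sketch with the AWS SDK for .NET (AWSSDK.APIGateway); the REST API id, user pool ARN, authorizer name and region are placeholders, and I’m assuming the SDK’s generated names for the CreateAuthorizer call:

// Hypothetical sketch: create a COGNITO_USER_POOLS authorizer on an existing REST API.
// The ids and ARNs below are placeholders.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon;
using Amazon.APIGateway;
using Amazon.APIGateway.Model;

class CreateCognitoAuthorizer
{
    static async Task Main()
    {
        var client = new AmazonAPIGatewayClient(RegionEndpoint.EUWest1);

        var response = await client.CreateAuthorizerAsync(new CreateAuthorizerRequest
        {
            RestApiId = "<rest-api-id>",
            Name = "cognito-authorizer",
            Type = AuthorizerType.COGNITO_USER_POOLS,
            ProviderARNs = new List<string> { "<cognito-user-pool-arn>" },
            // Where API Gateway reads the token from on incoming requests.
            IdentitySource = "method.request.header.Authorization"
        });

        // The returned id is what the aws_api_gateway_method resources below need.
        Console.WriteLine(response.Id);
    }
}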

Next you need to attach the authorizer to the aws_api_gateway_method resources desired. Your methods would look similar to this:

resource "aws_api_gateway_method" "api-gateway-method-post" {
  rest_api_id   = "${aws_api_gateway_rest_api.api-gateway.id}"
  resource_id   = "${aws_api_gateway_resource.api-gateway-resource.id}"
  http_method   = "POST"
  authorization = "COGNITO_USER_POOLS"
  authorizer_id = "${var.cognito-authorizer-id}"
}

variable "cognito-authorizer-id" {
  default = "9rvrci"
}

Setting authorization to COGNITO_USER_POOLS isn’t documented but it currently works.

The hard part here is finding the authorizer id. I found it by setting the authorizer manually in the AWS console and then running terraform plan to see which value Terraform wanted to change on an existing method from the current id back to empty. I’m sure there are better ways.
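One slightly better way, if you don’t mind a step outside Terraform, is to list the authorizers on the REST API and pick the id out by name. A hedged sketch with the AWS SDK for .NET, with the same placeholder names and region as the sketch above:

// Hypothetical sketch: look up the Cognito authorizer's id by name
// instead of fishing it out of a terraform plan diff.
using System;
using System.Linq;
using System.Threading.Tasks;
using Amazon;
using Amazon.APIGateway;
using Amazon.APIGateway.Model;

class FindAuthorizerId
{
    static async Task Main()
    {
        var client = new AmazonAPIGatewayClient(RegionEndpoint.EUWest1);

        var response = await client.GetAuthorizersAsync(new GetAuthorizersRequest
        {
            RestApiId = "<rest-api-id>"
        });

        var authorizer = response.Items.Single(a => a.Name == "cognito-authorizer");

        // This is the value to put into the cognito-authorizer-id variable above.
        Console.WriteLine(authorizer.Id);
    }
}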

Summary

Process for API Gateway with Cognito Authorizer

  • Create API Gateway (minus authorizer) with Terraform
  • Create Cognito User Pool (maybe without Terraform)
  • Create Cognito Authorizer on the API Gateway (without Terraform)
  • Add Cognito Authorizer details to the Terraform configuration then apply

One day soon, Terraform will support all this 🙂