Authenticating ASP.NET 5 to AD FS OAuth Part 3: Validating JWTs

In the first two parts of this series, we handled the response from AD FS and parsed the JWT that comes back to the application. This time, we'll validate the JWT itself.

OAuth 2.0 by nature depends on transport security (TLS). Without HTTPS, OAuth 2.0 is completely insecure. However, the JWT that AD FS returns is, in fact, signed. It’s signed by the Token Signing Certificate in AD FS, and using the public key we can validate it. You can get the certificate from AD FS by simply exporting it to disk and saving it as a .cer file.

Previously, we weren’t validating the JWT – which isn’t unreasonable if you have correct transport security in place. If you want to validate the signature of the JWT, you can modify your middleware configuration like so:

OnGetUserInformationAsync = context =>
{
    var handler = new JwtSecurityTokenHandler();
    var signingCert = new X509Certificate2(Path.Combine(_appEnv.ApplicationBasePath, "jwtToken.cer"), (string)null);
    SecurityToken securityToken;
    var validationOptions = new TokenValidationParameters
    {
        ValidateIssuerSigningKey = true,
        IssuerSigningKey = new X509SecurityKey(signingCert),
        ValidateAudience = true,
        ValidateIssuer = true,
        ValidAudience = "",
        ValidIssuer = "",
        NameClaimType = ClaimTypes.Upn,
        RoleClaimType = ClaimTypes.Role,
        AuthenticationType = "oauth2",
        RequireSignedTokens = true,
    };
    var principal = handler.ValidateToken(context.AccessToken, validationOptions, out securityToken);
    context.Principal = principal;
    return Task.FromResult(0);
}

The first thing we’ll need is the certificate – in this case I named it jwtToken.cer and put it in the application base path, one level up from wwwroot. You can change the validation as you like, such as not validating the audience or issuer.
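As an illustration of relaxing the validation, here is a hedged sketch of a helper that only verifies the signature. The helper name is made up; only the TokenValidationParameters properties come from the configuration above.

```csharp
using System.IdentityModel.Tokens;
using System.Security.Cryptography.X509Certificates;

public static class TokenValidation
{
    // Hypothetical helper: build parameters that check only the signature.
    // Audience and issuer validation are explicitly switched off, so the
    // ValidAudience and ValidIssuer values are no longer needed.
    public static TokenValidationParameters SignatureOnly(X509Certificate2 signingCert)
    {
        return new TokenValidationParameters
        {
            ValidateIssuerSigningKey = true,
            IssuerSigningKey = new X509SecurityKey(signingCert),
            ValidateAudience = false, // don't check the "aud" claim
            ValidateIssuer = false,   // don't check the "iss" claim
            RequireSignedTokens = true,
        };
    }
}
```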

Alternatively, you can obtain the certificate from a certificate store:

X509Store store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly); // the store must be opened before searching it
var thumbprint = "get-thumbprint-from-configuration";
var certificates = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
if (certificates.Count == 0)
    throw new System.Security.SecurityException($"Unable to find certificate with thumbprint \"{thumbprint}\".");
var certificate = certificates[0];

Authenticating ASP.NET 5 to AD FS OAuth Part 2: Claims

Last time we looked at using the ASP.NET Identity Framework to authenticate to AD FS with OAuth2. That handled simple authentication, but no claim information about the identity was known – we had a single claim for the token, and that’s all. Next, we are going to add some information about the user as claims on the identity.

The first step is to have AD FS send the claims that you want. This is done by configuring the Relying Party’s Claim Rules.


I’m sending three claims here – the UPN, Display Name, and “Token-Groups”. The UPN is the user principal name in Active Directory, used to identify the user in a unique way, like a username. The display name is used for a friendly “Hello, Kevin” in the header of the application. The Token-Groups in this case are simply the Active Directory groups the user belongs to. There are three different formats to choose from. Unqualified means it’s just the name of the group, like “MyGroup”. Next is short qualified, like “mydomain\MyGroup”, and lastly there is fully qualified, like “mydomain.local\MyGroup”. I’ve opted for unqualified; which you use is up to you, if you use them at all. You may not want to use unqualified names if you have more than one domain in the forest with a trust relationship: if there are two groups with the same name in different domains, you wouldn’t be able to tell them apart.

The outgoing claim type is the type of claim that the receiver, in this case our application, will see.
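Once these rules are in place, the claims arrive on the identity under their outgoing claim types. As a hedged sketch, assuming the standard UPN/Name/Role claim type URIs from System.Security.Claims:

```csharp
using System;
using System.Security.Claims;

public static class UserInfo
{
    // Sketch: read the three configured claims off an authenticated principal.
    // The exact claim types depend on the outgoing claim types chosen in AD FS.
    public static void Print(ClaimsPrincipal user)
    {
        var upn = user.FindFirst(ClaimTypes.Upn)?.Value;          // unique identifier
        var displayName = user.FindFirst(ClaimTypes.Name)?.Value; // "Hello, Kevin"

        Console.WriteLine($"UPN: {upn}, Display Name: {displayName}");
        foreach (var group in user.FindAll(ClaimTypes.Role))      // Token-Groups
            Console.WriteLine($"Member of: {group.Value}");
    }
}
```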

We need to get our application to handle this correctly now, which isn’t too hard. Ultimately, what AD FS does is encode all of this into a JWT. We need to handle the token as a JWT token, extract the claims, and create an identity with this information. To do so, we need to handle the OnGetUserInformationAsync notification, where we are given the raw token from AD FS.

options.Notifications = new OAuthAuthenticationNotifications
{
    OnApplyRedirect = context => { /* Content omitted */ },
    OnGetUserInformationAsync = context =>
    {
        var token = new JwtSecurityToken(context.AccessToken);
        var identity = new ClaimsIdentity(token.Claims, context.Options.AuthenticationScheme, "upn", "role");
        context.Principal = new ClaimsPrincipal(identity);
        return Task.FromResult(0);
    }
};

The JwtSecurityToken class does the heavy lifting. We simply give it the token, and it does the rest. I’m using the one from the “System.IdentityModel.Tokens” NuGet package.

We then create an identity using the claims from the JWT, then assign a new principal to the context. When constructing the ClaimsIdentity, the last two parameters are the names of the claims that contain the username and roles. If these don’t match the names of the claims, your identity will be authenticated, but it will have no username or roles. These values must match the outgoing claim types we chose when we originally set up the claim rules.
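To see the effect of those last two parameters in isolation, here is a minimal standalone sketch; the claim names and values are made up for illustration:

```csharp
using System;
using System.Security.Claims;

// The third and fourth ClaimsIdentity arguments name the claim types that
// feed Identity.Name and IsInRole, respectively.
var claims = new[]
{
    new Claim("upn", "someone@example.local"), // illustrative values only
    new Claim("role", "Administrators"),
};
var identity = new ClaimsIdentity(claims, "oauth2", "upn", "role");
var principal = new ClaimsPrincipal(identity);

Console.WriteLine(principal.Identity.Name);              // someone@example.local
Console.WriteLine(principal.IsInRole("Administrators")); // True
```

Had the identity been constructed with claim type names that don't exist in the token, both calls would come back empty/false even though the identity is authenticated.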

Putting it all together, we now have OAuth2 authentication with full support for claims.

Authenticating ASP.NET 5 to AD FS OAuth

One of the new things that Active Directory Federation Services supports starting in Windows Server 2012 R2 is OAuth2. I wanted to get ASP.NET 5 working with AD FS’s OAuth2 support (as opposed to WS-Federation or SAML).

To get this to work, we must first configure AD FS to support this. Use the AD FS management tool to ensure the OAuth2 service endpoint is enabled:


The OAuth2 specification makes no security promises by itself; instead, it relies on transport security (TLS).

Next, you will want to ensure you have a relying party configured. If you already have one you want to use, an existing relying party works fine.

Here we can set one up quickly for testing. Start with a manual configuration:


Next, specify an identifier for your relying party. This can be any valid URI, including a URN or URL. For purposes of OAuth2, this can be any URI so long as it is unique amongst all relying parties.


Continue through the wizard with the defaults or nothing selected since we will not be using SAML or WS-Federation, then add a claim rule for the user principal name.


Now that you have a relying party, use the Add-AdfsClient PowerShell cmdlet. This adds an OAuth2 client to the relying party. Each client has a unique identifier. How many clients you make per relying party is up to you – you can reuse one client for multiple applications, or make a distinct client per application.

  • -ClientId: This is a unique identifier that is the client ID that we will configure OAuth to use. Typically this is just a random GUID.
  • -Name: The name of the client.
  • -RedirectUri: This is a URI or array of URIs that AD FS is allowed to post back to. This must be a fully qualified URI.
  • -Description (optional): A description of the client.
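Putting those parameters together, a hypothetical invocation might look like the following – the name, redirect URI, and description are placeholders, and the client ID is the one used in the middleware configuration below:

```powershell
# Placeholder values - substitute your own name, redirect URI, and description.
Add-AdfsClient -ClientId "1bf8f5f1-c3c5-4a7c-993a-01d912409915" `
    -Name "My Web Application" `
    -RedirectUri "https://myapp.example.local/oauth-callback" `
    -Description "OAuth2 client for My Web Application"
```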

Getting into the ASP.NET 5 web application, we use the OAuth middleware, which performs the authentication. Because OAuth is just an authentication step, it must piggy-back on another authentication provider that can authenticate the entire browser session, like cookies.

Let’s say my application is being hosted on, and AD FS is located at, and we’ll see how this ties in to the AD FS configuration.

app.UseOAuthAuthentication("oauth2", options => {
    options.AutomaticAuthentication = true;
    options.SignInScheme = CookieAuthenticationDefaults.AuthenticationScheme;
    options.ClientId = "1bf8f5f1-c3c5-4a7c-993a-01d912409915";
    options.ClientSecret = "abc123";
    options.CallbackPath = new PathString("/oauth-callback");
    options.Notifications = new OAuthAuthenticationNotifications {
        OnApplyRedirect = context => {
            var parameter = new Dictionary<string, string>
            {
                ["resource"] = "https://test.local"
            };
            var query = QueryHelpers.AddQueryString(context.RedirectUri, parameter);
            context.Response.Redirect(query);
        }
    };
    options.ClaimsIssuer = "";
    options.AuthorizationEndpoint = "";
    options.TokenEndpoint = "";
});

The ClientId is the GUID we specified in the Add-AdfsClient cmdlet. The ClientSecret is a meaningless value: AD FS does not support client secrets, but the OAuth2 middleware requires one.

The CallbackPath is the relative path where the middleware expects AD FS to return the OAuth token. The application’s base URL combined with this path is the URI we specify as the -RedirectUri in the PowerShell cmdlet.

I did struggle with one thing for a bit, which was having to slightly modify the query string the ASP.NET 5 OAuth middleware used to go to the AD FS portal. AD FS expects a query string parameter of “resource” with a URI that matches one of the relying party trust URIs. The OAuth middleware allows you to intercept some events such as the redirection to the portal, handling the response back, and setting the claims up from the response.

The last step is to enable cookie authentication:

app.UseCookieAuthentication(config =>
{
    config.AutomaticAuthentication = true;
});

This is how the OAuth authentication “sticks” for the duration of the browser session.

That was enough to get OAuth2 working with ASP.NET 5 and AD FS from a pure authentication perspective. Next time we will look at setting up claims for roles and permissions.


The NuGet packages needed for all of this are as follows:

  • "Microsoft.AspNet.Identity": "3.0.0-beta4"
  • "Microsoft.AspNet.Authentication.Cookies": "1.0.0-beta4"
  • "Microsoft.AspNet.Authentication.OAuth": "1.0.0-beta4"
  • "Microsoft.AspNet.Authentication": "1.0.0-beta4"
  • "System.IdentityModel.Tokens": "5.0.0-beta4"

Keep in mind that given all of this is beta, it’s possible some of the NuGet packages needed will change; some may be removed, and others may be renamed. Finally, the namespaces used:

using Microsoft.AspNet.Builder;
using Microsoft.Framework.DependencyInjection;
using Microsoft.AspNet.Http;
using Microsoft.AspNet.Hosting;
using Microsoft.AspNet.Authentication.OAuth;
using Microsoft.AspNet.WebUtilities;
using Microsoft.AspNet.Authentication.Cookies;
using Microsoft.AspNet.Authentication;
using Microsoft.Framework.Runtime;
using System.Security.Claims;
using System.Threading.Tasks;
using System.IdentityModel.Tokens;
using Microsoft.AspNet.Authorization;

Digging into MVC 6 – Part 1: Tag Helpers

I’ve always found some of Razor’s syntax less idiomatic than I would like. HTML is a fairly ubiquitous thing amongst web developers. Whether you use ASP.NET, Symfony, Flask, ColdFusion, or WordPress, a firm grasp of HTML is required. So when I see views with this kind of markup:

@using (Html.BeginForm("Login", "Authentication"))
{
    @Html.TextBoxFor(m => m.UserName, new { @class = "LoginTextBox" })
    <!-- login form contents -->
}

I can’t help but cringe a little bit. This is a far cry from HTML. The form element is a using statement, the text box for the user name uses anonymously typed objects for applying attributes to the element. It all seems a little off. If I were a designer and all I wanted to do was apply a class attribute to the form and didn’t have strong ASP.NET MVC skills, I might be at a loss.

I’d much prefer something that actually resembled HTML. HTML is, after all, what we are trying to render here. This was one of the huge benefits of MVC over Web Forms. With MVC, you have complete control over what HTML gets rendered. No more crazy view state, enormous element IDs, or controls that require hours and hours of overriding default behaviors to get them to render the markup you want.

Tag helpers are the next step in the continuation of putting HTML back into the hands of MVC developers. Instead of all of the HTML helpers and extension methods as seen above, we can write natural HTML, and decorate them with some simple attributes that MVC recognizes. Here’s the above written as tag helpers:

<form method="post" asp-controller="Authentication" asp-action="Login">
    <input type="text" asp-for="UserName" class="LoginTextBox" />
    <!-- login form contents -->
</form>

Tag helpers let us write what we want using plain HTML. Jeff Fritz has a great blog post discussing the plumbing of tag helpers, and even how to develop your own. These are very powerful mechanisms that offer a lot of flexibility over how the final markup gets rendered.

A First Glance at Visual Studio 2015 RC and ASP.NET 5

I’ve had the chance to sit down and work with Visual Studio 2015 Release Candidate. While not the finished product, I suspect a lot of this is close to what we will be seeing in the final bits of Visual Studio 2015.


The first thing to point out is that while this is the Release Candidate for Visual Studio 2015, it is not a release candidate for ASP.NET 5 or the DNX framework and tools. Those are still very much at “beta” quality. MVC 6 and many other packages are on Beta 4.

These are going to take a bit longer to update, and as Microsoft has stated, they are going to continue to work on these NuGet packages even after Visual Studio 2015 ships. Gone are the days of huge monolithic releases – .NET, Visual Studio, and libraries all at once. The more components shift to NuGet packages, the easier it becomes for those components to be updated.

The upshot to all of this is we get more new things more frequently. MVC was really the trendsetter for this, and it has worked out quite well[1]. The downside is we may not be getting a perfectly seamless product when Visual Studio 2015 ships. That’s all still up in the air, and it shows with the current release.

DNX… Core… What’s going on here?

There have been some considerable changes in ASP.NET 5. It’s a significant departure from what we have today, and there are a lot of new concepts to wrap your head around. If you are keen on upgrading, you have a choice of upgrade paths for an existing ASP.NET project, or when starting a new one.


Along with ASP.NET 5, there is an ASP.NET 4.6. ASP.NET 4.6 is very similar to the ASP.NET you know and love today. It has Web Forms (.aspx), MVC, Web API, and any combination of the three. It still uses traditional Project files. This is a fairly frictionless upgrade path, but at the cost of not having the latest-and-greatest. MVC 6 will not work in this kind of project. MVC 5 is still there, and may receive updates still. MVC 6 is a pretty big departure from MVC 5 and prior.

ASP.NET 5 is where all of the big changes come in. Web Forms are a thing of the past – there are no Web Forms for ASP.NET 5. There isn’t Visual Basic .NET support, either. However, Microsoft has promised[2] they will add VB.NET support after ASP.NET 5 ships. If you make use of Web Forms or VB.NET, that limits you to ASP.NET 4.6 for now.

ASP.NET 5 brings some big changes. The biggest is a new “Core” runtime that allows cross-platform deployment, called DNX Core 5. DNX is short for .NET Execution Environment, and consists of several other tools. If you build your web application on the Core runtime, you should be able to run it on Windows, Mac, and certain *nix platforms with little to no special coding. The downside to this Core runtime is that it is a subset of the full .NET Framework – many library classes are not available. As the DNX Core framework matures, more things will be added and ported to it. To make this work, many parts of the .NET Framework that are available today have been broken up into NuGet packages. As more things are brought to the DNX Core framework, NuGet packages will get updated and new ones will become available.

If cross-platform isn’t on your radar, you can still do full .NET Framework development with ASP.NET on DNX 4.5.1. This name probably isn’t going to remain; it is likely to be changed to DNX 4.6[3].

Let’s try and summarize this.

  • ASP.NET 4.6 is very similar to what we have today: Web Forms, MVC 5, etc. It uses the .NET Framework 4.6.
  • For ASP.NET 5, there are two Frameworks: DNX Core 5 and DNX 4.5.1 (likely to be changed to DNX 4.6). The former is a subset of the full .NET Framework that supports cross platform. DNX 4.5.1 uses the full .NET Framework 4.6.
  • DNX is a “.NET Execution Environment” that consists of a runtime, framework, and tooling.

The next question might be: if DNX 4.5.1 / 4.6 uses the .NET Framework 4.6, how is that different from ASP.NET 4.6 using the .NET Framework 4.6?

One of the main goals of ASP.NET 5 is to completely decouple all components from the System.Web assembly. This assembly has served us well over the years, but it has a major flaw: it assumes the web server will always be Internet Information Services (IIS). This makes cross-platform support impossible, or a terrible hack, and it also prohibits “self hosting”. If you’ve ever wanted to run a small web server in a Windows Service or console application, you couldn’t, due to that IIS dependency. There have been a lot of attempts to get this working well over the years, but all of them ran into friction with System.Web.dll.

If this sounds familiar, it’s because a lot of this work has been going on for a long time already with OWIN. OWIN in a nutshell is an interface or specification that acts between various web frameworks and the underlying web server. This has already been achieved with SignalR 2 and WebAPI – both can be self hosted today, but only on Windows. However all of the legwork was done there – all that was missing was a runtime and CLR that implemented the OWIN specification on other platforms, and that is what we are getting with ASP.NET 5.

DNX, DNVM, and nuget

If you have been doing any work with Visual Studio 2015 CTPs or previous betas of MVC 6, or have been reading about ASP.NET 5, you will see all of the tools – “k”, “kre”, and “kpm” – aren’t there anymore. Where did they go? These were the previous names of the DNX tools, and have since been renamed[4].

There are a few tools to familiarize yourself with.

dnx is a command line tool that acts as an entry point to an application. If you are coming from a node.js background, or are familiar with it, you can liken it to the node command line tool. It actually bootstraps the execution of an application. Previously, this was called the KLR.

dnvm.ps1 (and .sh) is the version manager. This tool manages which DNXs are installed, and which is the primary one. For the node folks, this is similar to nvm, or rvm for Ruby.

nuget is the package manager. It can restore package dependencies and create new packages. This serves the same role as node.js’s npm tool.
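As a rough sketch of how these fit together on the command line (taken from the beta4-era tooling; the exact syntax may well change between betas):

```
# Version manager: install and select a DNX
dnvm upgrade   # install the latest DNX and make it the default
dnvm list      # show which DNX versions are installed

# Entry point: run the application from the project directory
dnx . run      # executes the "run" command defined in project.json
```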

You can see there is a lot of inspiration from how many other development stacks do it: you have your entry point, version manager, and package manager. This isn’t a mistake: the goal here is to use familiar experiences and terminology that exists today on other platforms and bring those terms to .NET, not introduce new ones.

OWIN’s King

As touched on previously, ASP.NET 5 has really made OWIN a front-and-center requirement. Everything must now use the OWIN specification to run in ASP.NET 5, including MVC 6.

OWIN itself is not a server; it is purely interfaces that act as a specification. The implementors of those interfaces allow web components to talk to the web server in an agnostic way. One such implementation is Katana. Katana has been around for a while. If you were using OWIN before ASP.NET 5 with IIS, chances are the OWIN implementation you were using was Katana. If you were self hosting SignalR, you were using HttpListener.

In ASP.NET 5, the new implementation of OWIN is Helios for IIS. The name Helios isn’t seen much anywhere, rather it has been renamed to Microsoft.AspNet.Server.IIS. Helios has several advantages over Katana. Katana was still heavily dependent on System.Web to accomplish anything. Helios on the other hand, has no dependency on System.Web. This gives quite a bit more flexibility with self hosting.

Since MVC is transitioning away from System.Web to OWIN, there are going to be breaking changes.

MVC 6 and getting started with Visual Studio 2015 RC

I’ll give three warnings on Visual Studio 2015 RC. The first is that this is a release candidate: Visual Studio may be unstable, and I have already filed a couple of bugs on it. Secondly, as I pointed out in the beginning, the NuGet packages themselves are still very much of beta quality. Things may break as they get closer to RTW. Thirdly, when I installed the Release Candidate, I had the option to import all of my settings and preferences from Visual Studio 2013. This did not go well – IntelliSense was completely broken for me. Undoing this got everything working again.

MVC 6 is similar in a lot of ways. Many of the conventions are still there, controllers are classes that have action methods on them, Views are .cshtml files with Razor in them. However many other things work differently, such as routing. ASP.NET 5 has no Global.asax or corresponding code behind for it. Instead with OWIN, we use the Startup.cs file. I’m going to start with an “Empty” ASP.NET vNext project in Visual Studio 2015 RC and add MVC 6 to it in a basic manner so we can see how to wire it up.

The empty project template includes a simple WriteLine that says Hello World! – go ahead and nuke that so that the Configure method is completely empty in Startup.cs. First, we need to add a dependency on MVC 6; this is done in the project.json file. There is a property called “dependencies”, which we can add NuGet packages to. We’ll add MVC 6 beta4[5] as a dependency. My dependencies property now looks like this:

"dependencies": {
    "Microsoft.AspNet.Server.IIS": "1.0.0-beta4",
    "Microsoft.AspNet.Server.WebListener": "1.0.0-beta4",
    "Microsoft.AspNet.Mvc": "6.0.0-beta4"

When you modify this file, Visual Studio will automatically pull down the package for you if it needs to. If for whatever reason Visual Studio doesn’t do that, you can use Ctrl+Shift+K, Ctrl+Shift+R to manually perform a package restore. Now that we have MVC added, jump back to the Startup.cs file. Adding the NuGet dependency on MVC puts some extension methods in place for ConfigureServices and Configure.

public void ConfigureServices(IServiceCollection services)

public void Configure(IApplicationBuilder app)

ConfigureServices is used to register services, or dependencies, for your application. Configure is used to set up your OWIN pipeline – really, to describe how a request should be handled. In Configure, we add MVC so that requests can be handled through our pipeline if MVC agrees that it can handle them.

The UseMvc method has an overload that lets us build our routes:

app.UseMvc(routes =>
{
    routes.MapRoute("Default", "{controller}/{action}");
});

Since there are no Web Forms here, the usual IgnoreRoute for *.axd is gone – AXDs are part of the legacy ASP.NET pipeline which we are leaving behind.

MVC itself is configured with the ConfigureMvc extension method off of IServiceCollection in ConfigureServices. This is where much of what you would normally do in Application_Start in previous versions of MVC goes, such as adding global filters, custom model binders, etc. This is optional for getting a basic MVC application up and running, but it’s common enough to use these features.

services.ConfigureMvc(config =>
{
    config.Filters.Add(new MyCustomSecurityFilter());
});

Next I added a folder called “Controllers” to my project, and added a controller called “DefaultController”.

public class DefaultController : Controller
{
    public IActionResult Index()
    {
        return View();
    }
}

This is all very familiar to the way previous versions of MVC worked. The exception to this is that ActionResult is now an IActionResult. You can still choose to declare your action methods as ActionResult instead of IActionResult.

Finally, I added a Views folder and a Default folder for my Index view.

Project Structure

After adding some simple Hello World HTML to my view, I can pull it up in my browser.

MVC Hello World

As far as Controllers and Views go, these are largely the same from MVC 5. If you use the built-in attribute routing, you don’t call routes.MapMvcAttributeRoutes any more. It just works.


  1. Except for those times Microsoft shipped Windows Updates that force-updated people’s NuGet package binaries using the GAC. Fortunately, this won’t be a problem going forward as the GAC starts to disappear and strong naming becomes a thing of the past.
  3. This is currently outstanding on GitHub. The naming of things has been a point of criticism.
  4. There was a long discussion on GitHub about the k* names.
  5. At some point more betas are going to come along. One problem with some of the beta packages is they are still in flux. Be careful when upgrading packages from one beta to another, it can break things even in other packages.