I blogged about user impersonation in Asp.Net MVC three years ago, and that article has been in the top 10 most visited pages on my blog.

Since Asp.Net Core is out and more or less ready for production, it is time to revisit the article and give guidance on how to do the same thing in Core. The approach has not changed at all – we do the same steps as before – but some of the APIs have changed, and it took me a bit of time to figure out all the updated parts.

Impersonation Process

Impersonation is when an admin user is logged in with the same privileges as another user, but without knowing their password or other credentials. I've used this in a couple of applications and it was invaluable for support cases and for debugging user permissions.

The process of impersonation in Asp.Net Core is pretty simple: we create an authentication cookie for the impersonated user and give it to the current admin user. We also need to add some information to the cookie saying that impersonation is happening, and give the admin a way to go back to their own account without having to log in again.

In my previous article I used a service-layer class to do impersonation. This time I'm doing everything in a controller, just because it is easier. Please don't be alarmed by this – I'm still a big believer in thin controllers: they should accept requests and hand control over to other layers. So if you feel you need an abstraction layer, adding one should be pretty simple; I'm skipping it here for simplicity's sake.

So this is the part that starts the impersonation:

[Authorize]
public async Task<IActionResult> ImpersonateUser(String userId)
{
    var currentUserId = User.GetUserId();

    var impersonatedUser = await _userManager.FindByIdAsync(userId);

    var userPrincipal = await _signInManager.CreateUserPrincipalAsync(impersonatedUser);

    userPrincipal.Identities.First().AddClaim(new Claim("OriginalUserId", currentUserId));
    userPrincipal.Identities.First().AddClaim(new Claim("IsImpersonating", "true"));

    // sign out the current user
    await _signInManager.SignOutAsync();

    await HttpContext.Authentication.SignInAsync(cookieOptions.ApplicationCookieAuthenticationScheme, userPrincipal);

    return RedirectToAction("Index", "Home");
}

In this snippet I'm creating a ClaimsPrincipal for the impersonation victim and adding extra claims to it, and then using this principal to create the authentication cookie.
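One thing the snippet does not show is where _userManager, _signInManager and cookieOptions come from. Here is a minimal wiring sketch, assuming cookieOptions is the IdentityCookieOptions instance obtained from an injected IOptions&lt;IdentityCookieOptions&gt;; ApplicationUser and the controller name are placeholders of mine, and the working sample on GitHub shows the real wiring:

// usings assumed: Microsoft.AspNetCore.Identity, Microsoft.AspNetCore.Mvc, Microsoft.Extensions.Options
public class AccountController : Controller
{
    private readonly UserManager<ApplicationUser> _userManager;
    private readonly SignInManager<ApplicationUser> _signInManager;
    private readonly IdentityCookieOptions cookieOptions;

    public AccountController(
        UserManager<ApplicationUser> userManager,
        SignInManager<ApplicationUser> signInManager,
        IOptions<IdentityCookieOptions> cookieOptions)
    {
        _userManager = userManager;
        _signInManager = signInManager;
        this.cookieOptions = cookieOptions.Value; // provides ApplicationCookieAuthenticationScheme used above
    }

    // ImpersonateUser and StopImpersonation actions from this article go here
}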

To stop the impersonation, use this method:

[Authorize]
public async Task<IActionResult> StopImpersonation()
{
    if (!User.IsImpersonating())
    {
        throw new Exception("You are not impersonating now. Can't stop impersonation");
    }

    var originalUserId = User.FindFirst("OriginalUserId").Value;

    var originalUser = await _userManager.FindByIdAsync(originalUserId);

    await _signInManager.SignOutAsync();

    await _signInManager.SignInAsync(originalUser, isPersistent: true);

    return RedirectToAction("Index", "Home");
}

Here we check that the impersonation claim is present, then get the original user id and sign back in as that user. You are probably wondering what the .IsImpersonating() method is – here it is, along with the GetUserId() method:

public static class ClaimsPrincipalExtensions
{
    //https://stackoverflow.com/a/35577673/809357
    public static string GetUserId(this ClaimsPrincipal principal)
    {
        if (principal == null)
        {
            throw new ArgumentNullException(nameof(principal));
        }
        var claim = principal.FindFirst(ClaimTypes.NameIdentifier);

        return claim?.Value;
    }

    public static bool IsImpersonating(this ClaimsPrincipal principal)
    {
        if (principal == null)
        {
            throw new ArgumentNullException(nameof(principal));
        }

        var isImpersonating = principal.HasClaim("IsImpersonating", "true");

        return isImpersonating;
    }
}

And that is pretty much it. You can see the full working sample on GitHub.

I'm building a prototype for a new project and it was decided to use DocumentDB to store our data. There will be very little data and even fewer relationships between the data, so a document database is a good fit. There is also a chance we will use DocumentDB in production.

There is comprehensive documentation about the structure and how it all ties together, yet not enough code samples showing how to use attachments, and I struggled a bit to come up with a working solution. So I'll explain it all here for future generations.

Structure

This diagram is from the documentation

And it is correct, but incomplete. Keep it in mind for a moment – I'll come back to this point later.

Ignore the left three nodes on the diagram and look at the Documents and Attachments nodes. This basically shows that if you create a document, it will be available at a URI like this:

https://{accountname}.documents.azure.com/dbs/{databaseId}/colls/{collectionId}/docs/{docId}

That's fine – you make an authenticated request to the correctly formed URI and you get JSON back as a result.

According to the schema, you can also get an attachment at this address:

https://{accountname}.documents.azure.com/dbs/{databaseId}/colls/{collectionId}/docs/{docId}/attachments/{attachId}

And this is correct. If you do an HTTP GET to this address, you'll get JSON back. Something like this:

{
    "contentType": "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
    "id": "1",
    "media": "/media/5VEpAMZpeasdfdfdAAAAAOFDl80B",
    "_rid": "5VEpAMZpeasdfdfdAAAAAOFDl80B=",
    "_self": "dbs\/5VEpAA==\/colls\/5VEpEWZpsQA=\/docs\/5VEpAMZpeasdfdfdAAAAAOFDl80B==\/attachments\/5VEpAMZpeasdfdfdAAAAAOFDl80B=",
    "_etag": "\"0000533e-0000-0000-0000-59079b8a0000\"",
    "_ts": 1493673393
}

It turns out there are two ways you can do attachments in DocumentDB – managed and (surprise!) unmanaged. Unmanaged is when you don't really attach anything, but just store a link to external storage. To be honest, I don't see much sense in doing it that way – why bother with an extra resource just to keep external links? It would be much easier to make these links part of the actual document, so you don't have to make another call to retrieve them.

Managed attachments are when you actually store the binaries in DocumentDB, and this is what I chose to use. Unfortunately I had to discover for myself that it is not straightforward.

Managed Attachments

If you look at the JSON above, you'll notice the line "media": "/media/5VEpAMZpeasdfdfdAAAAAOFDl80B". This is the link to the stored binary payload, and you need to query that URI to get the payload. So starting from a document id, you need two requests to get your hands on attached binaries:

  1. Get the list of attachments for the document
  2. Every attachment contains a link to its media – GET that.

So this /media/{mediaId} part is missing from the diagram above. Perhaps this is deliberate, so as not to confuse users – I'll go with that.

Code Samples

Now to the code samples.

I'm using the NuGet package provided by Microsoft to do the requests for me:

Install-Package Microsoft.Azure.DocumentDB

Let's start with the basics, to get them out of the way:

private async Task<DocumentClient> GetClientAsync()
{
    if (documentClient == null)
    {
        var endpointUrl = configuration["DocumentDb:EndpointUri"];
        var primaryKey = configuration["DocumentDb:PrimaryKey"];

        documentClient = new DocumentClient(new Uri(endpointUrl), primaryKey);
        await documentClient.OpenAsync();
    }

    return documentClient;
}

where documentClient is a private field on the containing class.
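For context, here is a sketch of the fields this snippet and the following ones assume; the configuration source and the database/collection names are placeholders of mine:

private DocumentClient documentClient;
private readonly IConfiguration configuration;          // however you get config – injected, for example
private const string DatabaseName = "MyDatabase";       // placeholder name
private const string CollectionName = "MyCollection";   // placeholder name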

Now let’s create a document and attach a binary:

var myDoc = new { id = "42", Name = "Max", City="Aberdeen" }; // this is the document you are trying to save
var attachmentStream = File.OpenRead("c:/Path/To/File.pdf"); // this is the document stream you are attaching

var client = await GetClientAsync();
var createUrl = UriFactory.CreateDocumentCollectionUri(DatabaseName, CollectionName);
Document document = await client.CreateDocumentAsync(createUrl, myDoc);

await client.CreateAttachmentAsync(document.SelfLink, attachmentStream, new MediaOptions()
    {
        ContentType = "application/pdf", // your application type
        Slug = "78", // this is actually attachment ID
    });

A few things are going on here. I create an anonymous class purely for the sample's sake – use strongly typed models in real code. Reading the attachment stream from the file system is also just for the sample; whatever source you have, you'll need to provide a Stream instance to upload an attachment.

Now this is worth paying attention to: var createUrl = UriFactory.CreateDocumentCollectionUri(DatabaseName, CollectionName);. The UriFactory class is not really a factory in the broad OOP sense – it does not produce other objects that do the actual work. It gives you a set of helpers that create URI addresses based on the names of the things you use. In other words, it is mostly String.Format calls with templates.

The UriFactory.CreateDocumentCollectionUri method gives you a link in the format /dbs/{databaseId}/colls/{collectionId}/. If you look at CreateAttachmentUri, it works with this template: dbs/{dbId}/colls/{collectionId}/docs/{docId}/attachments/{attachmentId}.

The next line, await client.CreateDocumentAsync(createUrl, myDoc), is doing exactly what you think it is doing – creating a document in Azure. No surprises here.

But when you look at the client.CreateAttachmentAsync() block, not everything is obvious. document.SelfLink is a URI that links back to the document – it is in the format dbs/{dbId}/colls/{collectionId}/docs/{docId}. The next big question is Slug – it actually works as the attachment ID. They might as well have called it Id, because that is what ends up in the id field when you look at the storage.

Retrieving Attachments

Once we've put something into the storage, some time in the future we'll have to take it out. Let's get our attached file back.

var client = await GetClientAsync();
var attachmentUri = UriFactory.CreateAttachmentUri(DatabaseName, CollectionName, docId, attachId);

var attachmentResponse = await client.ReadAttachmentAsync(attachmentUri);

var resourceMediaLink = attachmentResponse.Resource.MediaLink;

var mediaLinkResponse = await client.ReadMediaAsync(resourceMediaLink);

var contentType = mediaLinkResponse.ContentType;
var stream = mediaLinkResponse.Media;

Here we have some funky things going on again. UriFactory.CreateAttachmentUri(DatabaseName, CollectionName, docId, attachId) gives dbs/{dbId}/colls/{collectionId}/docs/{docId}/attachments/{attachmentId}, and a GET to this address returns the same kind of JSON as at the start of the article. The value of attachmentResponse.Resource.MediaLink will look like /media/5VEpAMZpeasdfdfdAAAAAOFDl80B3, and this is the path you GET to retrieve the actual attached binary – which is exactly what await client.ReadMediaAsync(resourceMediaLink) does. The rest should be self-explanatory.
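If you are handing the attachment back out of an MVC action, the pieces above compose into something like this – just a sketch, with an action name and parameters of mine:

public async Task<IActionResult> DownloadAttachment(string docId, string attachId)
{
    var client = await GetClientAsync();
    var attachmentUri = UriFactory.CreateAttachmentUri(DatabaseName, CollectionName, docId, attachId);

    var attachmentResponse = await client.ReadAttachmentAsync(attachmentUri);
    var mediaLinkResponse = await client.ReadMediaAsync(attachmentResponse.Resource.MediaLink);

    // mediaLinkResponse.Media is a Stream – hand it to MVC together with the stored content type
    return File(mediaLinkResponse.Media, mediaLinkResponse.ContentType);
}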

Conclusion

To be honest, the lack of an explanation of this /media/{mediaId} link in the documentation does not earn the team any kudos. And I feel the provided API is not straightforward and not easy to use – I had to decompile the API library and wander through it to see what is actually happening inside. There is also too much leakage of the implementation: I really could have lived without ever having to know about UriFactory.

I'm a big fan of unit tests, but not everything can or should be unit tested. I also love CQRS architecture – it provides awesome separation of reads and writes and isolates each read from the next via query classes. Usually my queries read data from a database, though they can use any persistence, e.g. files. About 90% of my queries run against a database with Entity Framework or with some micro ORM (currently I'm a fan of PetaPoco, though Dapper is also grand).

And no amount of stubs/mocks or isolation frameworks will help you test your database-access layer without an actual database. To test your database-related code you need to have a database. Full stop, don't even argue about this. If you say you can unit-test your db-layer, I say your tests are worth nothing.

A while ago I blogged about integration tests, and that article seems to be quite popular – in the top 10 by visitors over the last two years. So I decided to write an update.

Continue reading

UPD There is a part 2 of this blog-post explaining how to do roles and fixing a minor issue with authentication.

UPD If you are on Windows 10 and get “System.IO.FileNotFoundException: The system cannot find the file specified”, have a look at this page. Thanks to David Engel for this link.

A while back I had to implement a login system that relied on an in-house Active Directory. I spent some time figuring out how to do this in the nicest possible way.

One of the approaches I used in the past was to slam Windows Authentication on top of the entire site and be done with it. But this is not very user-friendly – before showing anything, you are hit with a nasty prompt for username/password, and in some cases you need to remember to include your domain name. I totally did not want that on a new green-field project. So here are the instructions on how to do nice authentication against your Windows users or an in-house hosted Active Directory.

For this project I'll use Visual Studio 2015, but the steps for VS2013 will be the same. I can't say anything nice about any earlier versions of Visual Studio – I don't use them anymore.

Continue reading

I first tried the TFS Build feature a few years ago, on the hosted TFS 2010 version. And I hated it. It was all confusing XML, and to change a build template I had to create a Visual Studio project. I barely managed to get NUnit tests to run and could not do any of the other steps I needed. So I abandoned it. Overall I spent about 3 days on TFS. Then I installed TeamCity and got the same result in about 3 hours. That is how good TeamCity was and how poor TFS Build was.

These days I'm looking for ways to remove all on-prem servers, including the source control and build servers. (Yep, it is 2016 and some companies still host version control in house.)

Visual Studio Team Services Build – now with NuGet feed

Recently I've been playing with Visual Studio Team Services Build (formerly Visual Studio Online). This is a hosted TFS server provided by Microsoft, and the good thing about it is that it is free for teams under 5 people. At the end of 2015 Microsoft announced a new Package Management service, so I decided to take it for a spin. Today I'll be talking about my experience with VSTS Build combined with Package Management.

Disclaimer

I'm not associated with Microsoft, nor paid for this post (that's a shame!). All opinions are mine.
VSTS is moving fast, with new features popping up every other week, so the information in this post might become out of date. Please leave a comment if things don't work for you as described here.

C# package in private NuGet feed

Today I actually need to create a C# library for internal use and publish it to a private NuGet feed, so I'll write down my steps for future generations.
For the purposes of this guide, I presume you already have a VSTS project with some code in it. You also know what CI is and why you need a build server.

Continue reading

This post is a summary of links I’ve studied about ARM.

One of the things we do for our project is automatic provisioning of services in MS Azure for new clients. A couple of years ago we used to do provisioning manually and it took days. Now we have a system that does it all for us – a web-based system that talks to the Azure API and asks for new websites/databases/storage accounts/etc. to be created for the system we work on. And I've been actively writing this system for the last 3-4 months.

I've been using the Azure Management C# libraries to access the Azure API for a couple of years now. As far as I can remember, these libraries never actually came out of preview and were never approved for production. And I had a lot of trouble with them, especially when I took half-year breaks from that project and came back to realise that half the APIs I had used had changed.

This time I came back to this problem and realised that I had missed the Azure Resource Management bandwagon and the plethora of new libraries, and need to start learning from scratch again (don't you hate that?).

Now there are two ways to access the Azure API: Azure Resource Management (ARM) and Classic Azure Management (the old way). ARM is the cool new kid on the block and looks like it is here to stay, because the new Azure Portal is totally based on this system. See the comparison of the new and old ways.

There are a lot of differences between new and old. The old system required you to authenticate through a certificate that you had to attach to your HTTP requests, and you had to manage these certs and all that. I'm sure there were other ways to authenticate, but when I started working with the Azure Management API this was the only way I knew. ARM allows you to authenticate via Azure AD, where you need to know a couple of Guids and a password. Here is the overview of ARM.

The most radical change is that ARM is based on Resource Groups, and everything you create must be in a group. So you need to create a Resource Group first, then resources. There are benefits to that: you can view billing per group – i.e. put all resources related to a project in one group and you can see how much that project costs you, without having to go through per-item subscription billing. Another massive benefit is access control: you can now give users access to just a specific group of resources, whereas before you could only give access to the whole subscription. (Read more about Role-Based Access Control and built-in roles.)

Authentication

But I digress – I'm working with the API at the moment. Authentication is slightly easier now. You need to create an application in Azure Active Directory, get a “client secret” from it, and execute about three lines of C# to get an authentication bearer token. Read more about the authentication process here (including a code sample). And this one shows how to create the AD application.
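To give an idea of those three lines (this is a sketch, not necessarily the exact code from the linked sample), with the ADAL NuGet package (Microsoft.IdentityModel.Clients.ActiveDirectory) token acquisition looks roughly like this; tenantId, clientId and clientSecret are values from the AD application you created, and the variable names are mine:

// acquire a bearer token for ARM using the AD application's credentials
var authContext = new AuthenticationContext("https://login.microsoftonline.com/" + tenantId);
var credential = new ClientCredential(clientId, clientSecret);
var token = (await authContext.AcquireTokenAsync("https://management.azure.com/", credential)).AccessToken;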

Then for every request to ARM you need to attach this token as an Authorization header on the HTTP request: request.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + token);

Requests

Now you don't even need any libraries – you can form the requests yourself pretty easily. You need the authentication token as a header and you need to know the URL to work with. Then you POST/PUT a JSON-formatted object, and to delete you issue a DELETE request to that URL. And a URL always maps to a resource – very RESTful indeed.

The URL you need to work with looks similar to this:

https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourcetype}/{resourceName}?api-version={apiversion}

Here is a sample URL for accessing a website resource:

https://management.azure.com/subscriptions/suy64wae-f814-3189-8742-b53d5a4532cd/resourceGroups/NewResourceGroup/providers/Microsoft.Web/sites/MyHappyWebsiteName?api-version=2015-08-01

But don't worry about this – there is a Resource Browser at https://resources.azure.com/ that will tell you exactly the URL you need: just navigate to an already existing item and look at the generated URL. This site will also give you the JSON format/data that you need to send.
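Putting the token and the URL together, a raw call is only a few lines of HttpClient code. A sketch, assuming the token variable from the earlier snippet and with the placeholders to be filled in with your own ids:

using (var http = new HttpClient())
{
    http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

    var url = "https://management.azure.com/subscriptions/{subscriptionId}" +
              "/resourceGroups/{resourceGroupName}/providers/Microsoft.Web/sites/{siteName}" +
              "?api-version=2015-08-01";

    var response = await http.GetAsync(url);                 // GET reads the resource
    var json = await response.Content.ReadAsStringAsync();   // JSON description of the website
    // PUT similar JSON back to the same URL to create or update; DELETE removes the resource
}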

Templates

Another great feature of ARM is Templates. Basically, you describe the resources you need in JSON, add parameters, and feed that to ARM (either programmatically or through the Azure Portal). Though I've not found a good programmatic way to use templates – I have seen samples where you need to upload the template JSON file and the parameters JSON file to Azure Storage and tell ARM where to look for them. But I'm not convinced by this – it sounds like too many steps with the uploading.

Here is the definition of templates. You can create them within Visual Studio 2015, or you can copy-paste JSON from existing objects in the Resource Browser and modify it to your needs.

Links

This post will be a dumping ground of links, problems and solutions related to Swagger. I'll be putting updates here as I go along.


I'm starting a new project and we would love to try new things. This time we would like to jump on the whole Azure API Applications bandwagon, with Swagger as the API descriptor.

Great article “What is Swagger”

Here is the whole Swagger Spec – have a look at what is possible.

Here is a pretty cool page with a Swagger file editor.

Tutorials for ASP.Net

Continue reading

I have seen a fair number of questions on StackOverflow asking how to prevent users from sharing their password. Previously, with the MembershipProvider framework, this was not a simple task and people resorted to all sorts of crazy procedures. One of the most common was to keep a static global list of logged-in users and deny a second login for any user already in that list. This worked to an extent – until you cleared the cookies in your browser and tried to log in again.

Luckily, the Asp.Net Identity framework now provides a simple and clean way of preventing users from sharing their details or logging in twice from different computers.
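One way to do this with Identity is via security stamps: regenerate the user's security stamp on every login and validate cookies against the stamp on every request, so any previously issued cookie stops working. A rough sketch, assuming Asp.Net Identity 2 on OWIN with type names from the default Visual Studio template (not necessarily line-for-line what the full post does):

// Startup.Auth.cs – validate the security stamp on every request, not every 30 minutes
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AuthenticationType = DefaultAuthenticationTypes.ApplicationCookie,
    LoginPath = new PathString("/Account/Login"),
    Provider = new CookieAuthenticationProvider
    {
        OnValidateIdentity = SecurityStampValidator
            .OnValidateIdentity<ApplicationUserManager, ApplicationUser>(
                validateInterval: TimeSpan.Zero,
                regenerateIdentity: (manager, user) => user.GenerateUserIdentityAsync(manager))
    }
});

// Login action – change the stamp just before issuing the new cookie,
// which invalidates cookies held by anyone else logged in as this user
await UserManager.UpdateSecurityStampAsync(user.Id);
await SignInManager.SignInAsync(user, isPersistent: false, rememberBrowser: false);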

Continue reading

I'm an avid user on StackOverflow in questions about Asp.Net Identity and I attempt to answer most of the interesting ones. The same question comes up quite often: users try to confirm their email via a confirmation link or reset their password via a reset link, and in both cases get an “Invalid Token” error.

There are a few possible solutions to this problem.

Continue reading