Using private NuGet feed with dotnet publish

In Azure DevOps you can have your own private NuGet feed and I’m using that extensively. But sometimes authentication there is a pain.

This time I ran into an authentication issue while executing dotnet publish for a project that used that private feed:

Unable to load the service index for source https://mycompany.pkgs.visualstudio.com/DefaultCollection/_packaging/MyFeed/nuget/v3/index.json. [D:\a\1\s\src\MyProject.csproj]
error :   Response status code does not indicate success: 401 (Unauthorized). [D:\a\1\s\src\MyProject.csproj]

The reason for that is that dotnet publish does a restore by itself, even though all the packages have already been restored via dotnet restore.

The solution is to do the obvious thing: don’t restore on publish, via --no-restore.

But that gave me more errors:

error NETSDK1047: Assets file 'D:\a\1\s\src\MyProject\obj\project.assets.json' doesn't have a target for '.NETCoreApp,Version=v2.0/win-x64'. Ensure that restore has run and that you have included 'netcoreapp2.0' in the TargetFrameworks for your project. You may also need to include 'win-x64' in your project's RuntimeIdentifiers. 

The solution for that was to add <RuntimeIdentifier>win-x64</RuntimeIdentifier> to my csproj file.

The overall process goes like this:

dotnet restore MyProject.csproj --configfile D:\a\1\Nuget\tempNuGet_2089.config --verbosity Detailed

Where the tempNuGet config file is generated by Azure DevOps and contains the credentials for the private feed.

dotnet build MyProject.csproj
dotnet publish MyProject.csproj --configuration Release --output D:\a\1\a\MyProject -r win-x64 --no-restore

And my csproj file looks like this:

<PropertyGroup>
  <OutputType>Exe</OutputType>
  <TargetFramework>netcoreapp2.0</TargetFramework>
  <RuntimeIdentifier>win-x64</RuntimeIdentifier>
  <StartupObject>MyProject.Program</StartupObject>
</PropertyGroup>

User impersonation in Asp.Net Core

I blogged about user impersonation in Asp.Net MVC three years ago, and that article has been in the top 10 most visited pages on my blog.

Since Asp.Net Core is out and more or less ready for production, it is time to review the article and give guidance on how to do the same thing in Core. The approach has not changed at all – we do the same steps as before – but some of the APIs have changed, and it took me a bit of time to figure out all the updated parts.

Impersonation Process

Impersonation is when an admin user is logged in with the same privileges as another user, but without knowing their password or other credentials. I’ve used this in a couple of applications and it was invaluable for support cases and for debugging user permissions.

The process of impersonation in Asp.Net Core is pretty simple: we create a cookie for the target user and give it to the current admin user. We also need to add some information to the cookie to flag that impersonation is happening, and give the admin a way to go back to their own account without having to log in again.
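To make this concrete, here is a minimal sketch of what the impersonation action can look like, assuming ASP.NET Core 2.x Identity defaults; the claim names, controller plumbing and _userManager/_signInManager fields are illustrative, not the exact code from the article:

// Requires Microsoft.AspNetCore.Authentication, Microsoft.AspNetCore.Identity and System.Security.Claims
[Authorize(Roles = "Admin")]
public async Task<IActionResult> ImpersonateUser(string userId)
{
    var currentUser = await _userManager.GetUserAsync(HttpContext.User);
    var targetUser = await _userManager.FindByIdAsync(userId);

    // Build a principal for the target user and mark the session as impersonated
    var principal = await _signInManager.CreateUserPrincipalAsync(targetUser);
    var identity = (ClaimsIdentity)principal.Identity;
    identity.AddClaim(new Claim("OriginalUserId", currentUser.Id));
    identity.AddClaim(new Claim("IsImpersonating", "true"));

    // Swap the admin's cookie for a cookie issued for the target user
    await HttpContext.SignOutAsync(IdentityConstants.ApplicationScheme);
    await HttpContext.SignInAsync(IdentityConstants.ApplicationScheme, principal);

    return RedirectToAction("Index", "Home");
}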

Continue reading

Working With Attachments in DocumentDB

I’m building a prototype for a new project and it was decided to use DocumentDB to store our data. There will be very little data and even fewer relationships within it, so a document database is a good fit. There is also a chance we’ll use DocumentDB in production.

There is comprehensive documentation about the structure and how it all ties together, yet not enough code samples on how to use attachments, and I struggled a bit to come up with a working solution. So I’ll explain it all here for future generations.

Structure

This diagram is from the documentation

And this is correct, but incomplete. Keep it in mind for a moment – I’ll come back to this point later.
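Before getting to that, here is a rough sketch of uploading a managed attachment with the .NET SDK’s DocumentClient; the account endpoint, key, database/collection/document ids and the file are made up for illustration:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Documents.Client;

class AttachmentSample
{
    static async Task UploadAttachmentAsync()
    {
        var client = new DocumentClient(new Uri("https://myaccount.documents.azure.com:443/"), "<auth key>");

        // Attachments hang off an existing document
        var documentUri = UriFactory.CreateDocumentUri("MyDatabase", "MyCollection", "my-document-id");

        using (var stream = File.OpenRead(@"C:\temp\invoice.pdf"))
        {
            // DocumentDB stores the blob itself and creates an Attachment resource pointing to it
            await client.CreateAttachmentAsync(
                documentUri,
                stream,
                new MediaOptions { ContentType = "application/pdf", Slug = "invoice.pdf" });
        }
    }
}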

Continue reading

Database Integration Tests in Visual Studio Team Services and Cake Build

I’m a big fan of unit tests. But not everything can or should be unit tested. I also love CQRS architecture – it provides an awesome separation of reads and writes and isolates each read from the next via query classes. Usually my queries read data from a database, though they can use any persistence, e.g. files. About 90% of my queries run against a database with Entity Framework or with some micro ORM (currently I’m a fan of PetaPoco, though Dapper is also grand).

And no amount of stubs/mocks or isolation frameworks will help you test your database-access layer without an actual database. To test your database-related code you need to have a database. Full stop, don’t even argue about this. If you say you can do unit tests for your db layer, I say your tests are worth nothing.
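To be clear about what I mean, a database integration test in my book looks roughly like this – NUnit plus PetaPoco against a real test database; the query class, table and connection string are made up:

[Test]
public void GetOverdueInvoices_Returns_Only_Overdue_Invoices()
{
    // Real (test) database, real SQL – nothing is mocked or stubbed
    using (var db = new PetaPoco.Database("Server=.;Database=MyApp_Tests;Trusted_Connection=True;", "System.Data.SqlClient"))
    {
        db.Execute("DELETE FROM Invoices");
        db.Execute("INSERT INTO Invoices (Number, DueDate) VALUES ('INV-1', '2000-01-01')");
        db.Execute("INSERT INTO Invoices (Number, DueDate) VALUES ('INV-2', '2099-01-01')");

        var query = new GetOverdueInvoicesQuery(db);   // hypothetical query class under test
        var result = query.Execute();

        Assert.That(result, Has.Count.EqualTo(1));
    }
}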

A while ago I blogged about integration tests, and that article seems to be quite popular – in the top 10 by visitors over the last 2 years. So I decided to write an update.

Continue reading

Using OWIN and Active Directory to authenticate users in ASP.Net MVC 5 application

UPD There is a part 2 of this blog-post explaining how to do roles and fixing a minor issue with authentication.

UPD If you are on Windows 10 and get “System.IO.FileNotFoundException: The system cannot find the file specified”, have a look on this page. Thanks to David Engel for this link.

A while back I had to implement a login system that relied on an in-house Active Directory. I spent some time figuring out how to do this in the nicest possible way.

One of the approaches I used in the past was to slam Windows Authentication on top of the entire site and be done with it. But this is not very user-friendly – before showing anything, the user is hit with a nasty prompt for username/password, and in some cases they need to remember to include the domain name. I totally did not want that on a new green-field project. So here are the instructions on how to do nice authentication against your Windows users or an in-house hosted Active Directory.
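The gist of the approach, as a rough sketch (not necessarily the exact code from the post): OWIN cookie middleware plus a credential check through System.DirectoryServices.AccountManagement. The domain name and the wrapper class here are illustrative.

using System.DirectoryServices.AccountManagement;
using System.Security.Claims;
using Microsoft.AspNet.Identity;
using Microsoft.Owin.Security;

public class AdAuthenticationService
{
    private readonly IAuthenticationManager _authenticationManager;

    public AdAuthenticationService(IAuthenticationManager authenticationManager)
    {
        _authenticationManager = authenticationManager;
    }

    public bool SignIn(string username, string password)
    {
        using (var context = new PrincipalContext(ContextType.Domain, "MYDOMAIN"))
        {
            // Validate the credentials against AD without a Windows Authentication prompt
            if (!context.ValidateCredentials(username, password))
            {
                return false;
            }

            var identity = new ClaimsIdentity(DefaultAuthenticationTypes.ApplicationCookie);
            identity.AddClaim(new Claim(ClaimTypes.Name, username));

            // Issue the auth cookie via the OWIN pipeline
            _authenticationManager.SignIn(new AuthenticationProperties { IsPersistent = false }, identity);
            return true;
        }
    }
}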

For this project I’ll use Visual Studio 2015, but the steps for VS2013 will be the same. I can’t say anything nice about earlier versions of Visual Studio – I don’t use them anymore.

Continue reading

Build Service in Visual Studio Team Services

I first tried the TFS Build feature a few years ago, on the hosted TFS 2010 version. And I hated it. It was all confusing XML; to change a build template I had to create a Visual Studio project. I barely managed to get NUnit tests to run and could not do any of the other steps I needed, so I abandoned it. Overall I spent about 3 days on TFS. Then I installed TeamCity and got the same result in about 3 hours. That is how good TeamCity was and how poor TFS Build was.

These days I’m looking for ways to remove all on-prem servers, including source control and the build server. (Yep, it is 2016 and some companies still host version control in-house.)

Visual Studio Team Services Build – now with NuGet feed

Recently I’ve been playing with Visual Studio Team Services Build (it used to be Visual Studio Online). This is a hosted TFS server provided by Microsoft. The good thing about it is that it is free for teams of under 5 people. At the end of 2015 Microsoft announced a new Package Management service, so I decided to take it for a spin. Today I’ll be talking about my experience with VSTS Build combined with Package Management.

Disclaimer

I’m not associated with Microsoft, nor was I paid for this post (that’s a shame!). All opinions are mine.
VSTS is moving fast, with new features popping up every other week, so the information in this post might become out of date. Please leave a comment if things don’t work for you as expected.

C# package in private NuGet feed

Today I actually need to create a C# library for internal use and publish it to a private NuGet feed, so I’ll write down my steps for future generations.
For the purposes of this guide, I presume you already have a VSTS project with some code in it. You also know what CI is and why you need a build server.

Continue reading

Azure Resource Management vs Classic Management

This post is a summary of links I’ve studied about ARM.

One of the things we do for our project is automatic provisioning of services in MS Azure for new clients. A couple of years ago we used to do provisioning manually and it took days. Now we have a system that does it all for us – a web-based system that talks to the Azure API and asks for new websites/databases/storage/etc. to be created for the system we work on. I’ve been actively writing this system for the last 3-4 months.

I’ve been using the Azure Management C# libraries to access the Azure API for a couple of years now. As far as I can remember, these libraries were never actually out of preview and approved for production. And I had a lot of trouble with them, especially when I took half-year breaks from the project and then came back to realise that half the APIs I had used had changed.

This time I came back to the problem and realised that I had missed the Azure Resource Management bandwagon and the plethora of new libraries, and needed to start learning from scratch again (don’t you hate that?).

Now there are 2 ways to access the Azure API: Azure Resource Management (ARM) and Classic Azure Management (the old way). ARM is the new cool kid on the block and looks like it is here to stay, because the new Azure Portal is entirely based on this system. See the comparison of the new and old ways.

There are a lot of differences between the new and the old. The old system required you to authenticate through a certificate that you had to attach to your HTTP requests, and you had to manage those certs and all that. I’m sure there were other ways to authenticate, but when I started working with the Azure Management API this was the only way I knew. ARM lets you authenticate via Azure AD, where you need to know a couple of Guids and a password. Here is the overview of ARM.

The most radical change is that ARM is built around Resource Groups, and everything you create must be in a group. So you need to create a Resource Group first, then resources. There are benefits to that: you can view billing per group – i.e. put all resources related to a project into one group and you can see how much that project costs you, without having to go through per-item subscription billing. Another massive benefit is access control: you can now give users access to only a specific group of resources, where before you could only give access to the whole subscription. (Read more about Role-Based Access Control and built-in roles.)

Authentication

But I digress – I’m working with the API at the moment. Authentication is slightly easier now. You’ll need to create an application in Azure Active Directory, get a “client secret” from it, and run about 3 lines of C# to get an authentication bearer token. Read more about the authentication process here (including a code sample). And this one shows the creation of the AD application.

Then for every request to ARM you need to attach this token as an Authorization header to the HTTP request: request.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + token);
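For reference, the few lines of C# I mean look roughly like this, using the ADAL library (Microsoft.IdentityModel.Clients.ActiveDirectory); the tenant id, client id and client secret are placeholders you get from your AD application:

// Acquire a bearer token for the ARM endpoint from Azure AD via ADAL
var authContext = new AuthenticationContext("https://login.microsoftonline.com/{tenantId}");
var credential = new ClientCredential("{clientId}", "{clientSecret}");
var authResult = await authContext.AcquireTokenAsync("https://management.azure.com/", credential);
var token = authResult.AccessToken;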

Requests

Now you don’t even need any libraries – you can form the requests yourself pretty easily. You need the authentication token as a header and you need to know the URL to work with. Then you POST/PUT a JSON-formatted object, and to delete you send a DELETE request to that URL. The URL always maps to a resource – very RESTful indeed.

The URL you need to work with looks similar to this:

https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourcetype}/{resourceName}?api-version={apiversion}

Here is a sample URL for accessing a website resource:

https://management.azure.com/subscriptions/suy64wae-f814-3189-8742-b53d5a4532cd/resourceGroups/NewResourceGroup/providers/Microsoft.Web/sites/MyHappyWebsiteName?api-version=2015-08-01
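Putting the token and the URL together, a raw request with HttpClient looks something like this (GET reads the resource, PUT with a JSON body creates or updates it, DELETE removes it):

// Requires System.Net.Http and System.Net.Http.Headers
using (var client = new HttpClient())
{
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

    var url = "https://management.azure.com/subscriptions/suy64wae-f814-3189-8742-b53d5a4532cd" +
              "/resourceGroups/NewResourceGroup/providers/Microsoft.Web/sites/MyHappyWebsiteName" +
              "?api-version=2015-08-01";

    var response = await client.GetAsync(url);
    var json = await response.Content.ReadAsStringAsync();
}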

But don’t worry about this – there is a Resource browser at https://resources.azure.com/ that will tell you exactly the URL you need: just navigate to an already existing item and see the generated URL. This site will also give you the JSON format/data to send through to the API.

Templates

Another great feature of ARM is Templates. Basically you describe the resources you need in JSON, add parameters, and feed that to ARM (either programmatically or through the Azure Portal). I’ve not found a good programmatic way to use templates, though – I have seen samples where you need to upload the template JSON file and the parameters JSON file to Azure Storage and tell ARM where to look for them. But I’m not convinced about this – it sounds like too many steps, with all the uploading.

Here is the definition of templates. You can create them within Visual Studio 2015, or you can copy-paste JSON from existing objects in the resource browser and modify it to your needs.

Links

Swagger Awesomeness

This post will be a dumping ground of links, problems and solutions related to Swagger. I’ll be putting updates here as I go along.


I’m starting a new project and we would love to try new things. This time we would like to jump on the whole Azure API Apps bandwagon, with Swagger as a descriptor of the API.

Great article “What is Swagger”

Here is the whole Swagger Spec – have a look at what is possible.

Here is a pretty cool page with a Swagger file editor.

Tutorials for ASP.Net

Continue reading

Prevent Multiple Logins in Asp.Net Identity

I have seen a fair number of questions on Stack Overflow asking how to prevent users from sharing their password. Previously, with the MembershipProvider framework, this was not a simple task and people went through all sorts of crazy procedures. One of the most common was to keep a static global list of logged-in users; if a user was already in that list, the system denied their second login. This worked to an extent, until you cleared the cookies in the browser and tried to re-login.

Luckily, the Asp.Net Identity framework now provides a simple and clean way of preventing users from sharing their details or logging in twice from different computers.
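One clean way to do it – and a condensed sketch, not the article’s exact code – is via the security stamp: regenerate it on every login and make the cookie middleware validate it on (almost) every request, so any previously issued cookie gets kicked out. The class names below match the default ASP.NET Identity project template:

// In Startup.Auth.cs – validate the security stamp on every request
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AuthenticationType = DefaultAuthenticationTypes.ApplicationCookie,
    LoginPath = new PathString("/Account/Login"),
    Provider = new CookieAuthenticationProvider
    {
        OnValidateIdentity = SecurityStampValidator.OnValidateIdentity<ApplicationUserManager, ApplicationUser>(
            validateInterval: TimeSpan.FromMinutes(0),
            regenerateIdentity: (manager, user) => user.GenerateUserIdentityAsync(manager))
    }
});

// In the login action – change the stamp so cookies issued on other machines become invalid
await UserManager.UpdateSecurityStampAsync(user.Id);
await SignInManager.SignInAsync(user, isPersistent: false, rememberBrowser: false);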

Continue reading