Today I spent quite some time figuring out why my new Azure Subscription Settings file was not being picked up by Octopus Deploy. All I was getting was this obscure error message:

Get-AzureWebsite : Communication could not be established. This could be due to an invalid subscription ID. Note that subscription IDs are case sensitive.

It turned out that the old Subscription Settings file was stuck in the user cache, and I had to “unstick” it by executing this script under the same user account I was running PowerShell as:

Remove-AzureSubscription 'Subscription Name' -Force

This does not actually do anything to the Azure subscription itself (I panicked about that at first). The documentation says it only deletes the local subscription data file so that PowerShell can’t use it – nothing to do with the actual Azure subscription.

After removing the old subscription data you can import the new Subscription Settings file:

Import-AzurePublishSettingsFile 'C:/path/to/subscription.publishsettings'

Select-AzureSubscription -SubscriptionName "Subscription Name"

Hope this helps someone!

This post is a summary of links I’ve studied about ARM.

One of the things we do for our project is automatic provisioning of services in MS Azure for new clients. A couple of years ago we did this provisioning manually, and it took days. Now we have a system that does it all for us – a web-based system that talks to the Azure API and asks for new websites/databases/storage/etc. to be created for the system we work on. I’ve been actively writing this system for the last 3-4 months.

I’ve been using the Azure Management C# libraries to access the Azure API for a couple of years now. As far as I can remember, these libraries were never actually out of preview and approved for production, and I had a lot of trouble with them – especially when I took half-year breaks from the project and came back to find that half the APIs I’d used had changed.

This time I came back to the problem and realised that I’d missed the Azure Resource Management bandwagon and the plethora of new libraries, and needed to start learning from scratch again (don’t you hate that?).

There are now two ways to access the Azure API: Azure Resource Management (ARM) and Classic Azure Management (the old way). ARM is the cool new kid on the block and looks like it is here to stay, because the new Azure Portal is entirely based on it. See the comparison of the new and old ways.

There are a lot of differences between new and old. The old system required you to authenticate with a certificate that you attached to your HTTP requests, and you had to manage those certificates and all that. I’m sure there were other ways to authenticate, but when I started working with the Azure Management API this was the only way I knew. ARM lets you authenticate via Azure AD, where you only need to know a couple of GUIDs and a password. Here is the overview of ARM.

The most radical change is that ARM is based on Resource Groups, and everything you create must be in a group. So you need to create a Resource Group first, then the resources. There are benefits to that: you can view billing per group – i.e. put all the resources related to a project into one group and see how much that project costs you, without having to go through per-item subscription billing. Another massive benefit is access control: you can now give users access to only a specific group of resources (before, you could only give access to a whole subscription). Read more about Role-Based Access Control and the built-in roles.

Authentication

But I digress – I’m working with the API at the moment. Authentication is slightly easier now. You’ll need to create an application in Azure Active Directory, get a “client secret” from it, and execute about three lines of C# to get an authentication bearer token. Read more about the authentication process here (including a code sample). And this one shows the creation of an AD application.
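In case it helps, here is a minimal sketch of that token acquisition, assuming the ADAL NuGet package (Microsoft.IdentityModel.Clients.ActiveDirectory); tenantId, clientId and clientSecret are placeholders for the GUIDs and client secret from your AD application:

using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class ArmAuth
{
    public static string GetToken(string tenantId, string clientId, string clientSecret)
    {
        // The authority is your AD tenant; the resource is the ARM endpoint
        var context = new AuthenticationContext("https://login.microsoftonline.com/" + tenantId);
        var credential = new ClientCredential(clientId, clientSecret);
        // On newer ADAL versions this call is AcquireTokenAsync instead
        var result = context.AcquireToken("https://management.azure.com/", credential);
        return result.AccessToken;
    }
}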

Then for every request to ARM you need to attach this token as an Authorization header on the HTTP request: request.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + token);

Requests

Now you don’t even need any libraries – you can form the requests yourself pretty easily. You need the authentication token as a header and the URL you want to work with. Then you POST/PUT a JSON-formatted object to create or update, and you issue a DELETE request to the same URL to delete. The URL always maps to a resource – very RESTful indeed.

The URL you need to work with looks similar to this:

https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourcetype}/{resourceName}?api-version={apiversion}

Here is a sample URL accessing a website resource:

https://management.azure.com/subscriptions/suy64wae-f814-3189-8742-b53d5a4532cd/resourceGroups/NewResourceGroup/providers/Microsoft.Web/sites/MyHappyWebsiteName?api-version=2015-08-01

But don’t worry about this – the Resource Explorer at https://resources.azure.com/ will tell you exactly the URL you need: just navigate to an already existing item and look at the generated URL. The site will also show you the JSON format/data to send through to the API.
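To make this concrete, here is a hedged sketch of a raw PUT and DELETE with plain HttpClient – the group name, site name and api-versions are just the illustrative values from above:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

public static class ArmRequests
{
    public static void Run(string token, string subscriptionId)
    {
        var client = new HttpClient();
        // Every ARM call carries the bearer token
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

        // PUT creates (or updates) the resource the URL points at -
        // e.g. a resource group, with just a location in the JSON body
        var groupUrl = "https://management.azure.com/subscriptions/" + subscriptionId +
                       "/resourceGroups/NewResourceGroup?api-version=2015-01-01";
        var body = new StringContent("{ \"location\": \"North Europe\" }", Encoding.UTF8, "application/json");
        Console.WriteLine(client.PutAsync(groupUrl, body).Result.StatusCode);

        // DELETE removes whatever resource the URL maps to
        var siteUrl = "https://management.azure.com/subscriptions/" + subscriptionId +
                      "/resourceGroups/NewResourceGroup/providers/Microsoft.Web/sites/MyHappyWebsiteName" +
                      "?api-version=2015-08-01";
        Console.WriteLine(client.DeleteAsync(siteUrl).Result.StatusCode);
    }
}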

Templates

Another great feature of ARM is templates. Basically, you describe the resources you need in JSON, add parameters, and feed that to ARM (either programmatically or through the Azure Portal). Though I’ve not found a good programmatic way to use templates yet – I have seen samples where you upload the template JSON file and a parameters JSON file to Azure Storage and tell ARM where to look for them. But I’m not convinced by this – it sounds like too many steps, what with the uploading.
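That said, the deployments endpoint seems to accept the template inline in the request body, which would avoid the uploading altogether – a sketch under that assumption (the deployment and group names are illustrative):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

public static class ArmTemplates
{
    public static void Deploy(string token, string subscriptionId, string templateJson, string parametersJson)
    {
        var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

        // A deployment is itself a resource, so you PUT it like any other
        var url = "https://management.azure.com/subscriptions/" + subscriptionId +
                  "/resourcegroups/NewResourceGroup/providers/Microsoft.Resources/deployments/MyDeployment" +
                  "?api-version=2015-01-01";

        // Template and parameters go inline in the body - no storage upload needed
        var body = "{ \"properties\": { \"mode\": \"Incremental\", " +
                   "\"template\": " + templateJson + ", " +
                   "\"parameters\": " + parametersJson + " } }";
        var response = client.PutAsync(url, new StringContent(body, Encoding.UTF8, "application/json")).Result;
        Console.WriteLine(response.StatusCode);
    }
}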

Here is the definition of templates. You can create them within Visual Studio 2015, or you can copy-paste the JSON from existing objects in the Resource Explorer and modify it to your needs.

For my own future reference: when I get a large SQL file – large enough that SSMS freaks out about a lack of memory – I need to use the sqlcmd tool. Here is how it is done:

sqlcmd -S localhost\sqlexpress -d MyDatabaseName -i .\InputFileName.sql

Note that localhost is important there: if you try to connect to .\sqlexpress, you will only get a connection error.

Tonight I gave a talk at my local Aberdeen Developers .NET User Group about CQRS architecture, and I think it was a success. After a 70-minute talk (I took my time on some of the code samples) we had another 30 minutes of discussion and Q&A. Some good-quality questions were raised, and some things were asked that I had not thought about before.

I really enjoyed giving the talk and getting the message across. I think I’ll try to do this more often and will look for opportunities to present these ideas in other cities. And if you think you and your team might benefit from CQRS architecture, let me know – we can organise something.

For those who were at the talk, here are the slides and code samples.

This post will be a dumping ground for links, problems, and solutions related to Swagger. I’ll be adding updates here as I go along.


I’m starting a new project, and we would love to try new things. This time we would like to jump on the whole Azure API Apps bandwagon, with Swagger as the descriptor of the API.

Great article “What is Swagger”

Here is the whole Swagger Spec – have a look at what is possible.

Here is a pretty cool page with a Swagger file editor.

Tutorials for ASP.NET

Continue reading

I have seen a fair number of questions on Stack Overflow asking how to prevent users from sharing their password. Previously, with the MembershipProvider framework, this was not a simple task, and people resorted to all sorts of crazy procedures. One of the most common was to keep a static global list of logged-in users and deny a second login to any user already on the list. This worked to an extent – until you cleared the cookies in your browser and tried to log in again.

Luckily, the ASP.NET Identity framework now provides a simple and clean way of preventing users from sharing their details or logging in twice from different computers.
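The full walkthrough is behind the cut, but the core of it is the security stamp. As a rough sketch (not necessarily the exact code from the post; ApplicationUserManager, ApplicationUser and GenerateUserIdentityAsync are the types from the default MVC template): refresh the user’s SecurityStamp on every login, then tell the cookie middleware to validate the stamp on every request, so any previously issued cookie on another machine immediately becomes invalid:

// On every successful login, refresh the stamp - this kills other sessions
await userManager.UpdateSecurityStampAsync(user.Id);
await signInManager.SignInAsync(user, isPersistent: false, rememberBrowser: false);

// In Startup.Auth.cs: check the stamp on every request instead of every 30 minutes
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AuthenticationType = DefaultAuthenticationTypes.ApplicationCookie,
    LoginPath = new PathString("/Account/Login"),
    Provider = new CookieAuthenticationProvider
    {
        OnValidateIdentity = SecurityStampValidator.OnValidateIdentity<ApplicationUserManager, ApplicationUser>(
            validateInterval: TimeSpan.Zero,
            regenerateIdentity: (manager, user) => user.GenerateUserIdentityAsync(manager))
    }
});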

Continue reading

My current setup for one of my projects requires a VM running in Azure – a tiny build server I set up just for myself. I only need it when I’m actually working on the project, so the idea is to shut the VM down when I’m not. The fear was that one day I’d forget to shut it down and be billed for time I had not used.

So I’ve created a tiny script in PowerShell:

# Get your publishsettings file here: https://manage.windowsazure.com/publishsettings/
Import-AzurePublishSettingsFile "c:\Path\to\subscription\file.publishsettings"

Select-AzureSubscription -SubscriptionName "MySubscriptionName"

Stop-AzureVM -Name myVmName -ServiceName myVmName -Force

NB: you need the -Force parameter, otherwise the script will prompt for Y/N and require user intervention.

Then I hooked the script into the shutdown sequence on my work PC:

  1. Run gpedit.msc
  2. Go to Computer Configuration -> Windows Settings -> Scripts -> Shutdown
  3. Open the Shutdown option and switch to the PowerShell tab; add your script there.
  4. PROFIT!

Now your VM will be shut down whenever you shut down your PC.

An alternative would be to have a free website running on Azure and deploy a WebJob there that constantly polls your PC to see whether it is on, shutting the VM down when the PC goes offline. But that sounds like a lot of work, and you’d have to expose some sort of publicly available endpoint on your PC. So nothing fancy!

I’m an avid user on Stack Overflow in questions about ASP.NET Identity, and I attempt to answer most of the interesting ones. The same question comes up quite often: users try to confirm their email via a confirmation link, or reset their password via a reset link, and in both cases get an “Invalid Token” error.

There are a few possible solutions to this problem.
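The details are behind the cut, but one of the most common culprits I’ve seen is the token being mangled on its way through the URL: the tokens contain characters like ‘+’ and ‘/’, so they must be URL-encoded when the link is generated. A minimal sketch (Identity 2.x method names; the link format is illustrative):

// Generating the confirmation link: encode the token before putting it in a URL
var token = await userManager.GenerateEmailConfirmationTokenAsync(user.Id);
var link = "https://example.com/account/confirm?userId=" + user.Id +
           "&code=" + HttpUtility.UrlEncode(token);  // HttpUtility lives in System.Web

// Confirming: MVC model binding decodes the query string for you, so pass
// the value through as received - encoding or decoding it a second time is
// exactly what produces the "Invalid Token" error
var result = await userManager.ConfirmEmailAsync(userId, code);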

Continue reading

TL;DR: Go here for a code sample.

This is a long-overdue post – I’m too busy these days with contract work and other commitments.

By the nature of my work, I have to do data migrations on a regular basis. Some of the migrations are from one database into another, but most are from Excel spreadsheets into a database.
A database-to-database migration is somewhat nerve-racking – usually the structure is broken, the data is dirty, fields have been re-purposed, and all the other sorts of terrible things you read about on The Daily WTF. The best I’ve seen so far are along the lines of people’s middle names stored in a field called CompanyName2, and employee information stored in a table called tbl_Jobs while info about actual jobs lives in tbl_Job – see the difference? That’s all to avoid confusion!

Database-to-database migrations are each unique in their own way, but I’ve found a lot of commonality in the process of exporting from Excel into SQL Server. So I’ll write those things down here – for my own future reference, and you, my dear reader, might find them useful as well.

Continue reading