Over the Christmas and New Year holidays I finished a book by Uncle Bob Martin, The Clean Coder. It was a pleasure to read and I got through it rather quickly. The overall feel of the book was like reading a big collection of blog posts from Uncle Bob.

The entire book is filled with advice for software developers, but I found that a lot of it is aimed at junior developers. Having been in a senior role for a while now, and doing project management as well, I did not find much that was new to me. There were a few interesting points of view on some things, like “being in The Zone is counter-productive”. I did not agree with this point at first, but thinking about it later while coding, I do get some of the arguments against being in The Zone.

Another thing I did not quite agree with was “QA Should Find Nothing”, meaning that when you send your system for testing, there should be no bugs. Yes, you should not let buggy software escape your desk. But testing it over and over again yourself – I find that counter-productive. If you release a feature, most likely you have been looking at that piece of the project for days or even weeks, and it works because you tested it. But I would not rely on your testing, because you know how it works and what input is expected. If somebody else looks at the system and feeds it unexpected input, they will often bring a different point of view to the feature. In my view the purpose of QA is exploratory testing: what happens if I type rubbish into this field, double-click the Save button and close the browser immediately instead of waiting for the transaction to complete? Or something like that. All the other testing can (and should) be automated.

And 100% code coverage by unit tests? Please don’t do it – that’s a waste of time. All critical business logic should be tested, but there are tons of boilerplate code that there is no point in testing, e.g. controller actions in MVC when using a CQRS architecture. OK, OK, I will not go there – there have been numerous debates about the purpose of 100% code coverage. I tried 100% coverage once on a smallish project and it was a waste of effort that the business did not appreciate.

I very much agree with continuous learning – this applies to all developers, however senior. I actually think all professionals, not just software developers, should do it. Software technology moves so fast now that you need to keep learning new stuff before you go out of fashion.

Not all the advice was for novices, though – the chapters on conflict management and “Saying No” were right to the point. I’m pretty good at saying “no”, but my diplomacy skills are lacking, so these chapters were very useful.

Overall this was a good read with some interesting points and insights, but if you have been following the industry for the last few years you will already know quite a lot of what is in this book.

I first tried the TFS Build feature a few years ago, on the hosted TFS 2010 version. And I hated it. It was all confusing XML, and to change a build template I had to create a Visual Studio project. I barely managed to get NUnit tests to run and could not do any of the other steps I needed, so I abandoned it. Overall I spent about 3 days on TFS Build. Then I installed TeamCity and got the same result in about 3 hours. That is how good TeamCity was and how poor TFS Build was.

These days I’m looking for ways to remove all on-prem servers, including the source control and build servers. (Yep, it is 2016 and some companies still host version control in house.)

Visual Studio Team Services Build – now with NuGet feed

Recently I’ve been playing with Visual Studio Team Services Build (formerly Visual Studio Online). This is a hosted TFS server provided by Microsoft, and a good thing about it is that it is free for teams of under 5 people. At the end of 2015 Microsoft announced a new Package Management service, so I decided to take it for a spin. Today I’ll be talking about my experience with VSTS Build combined with Package Management.

Disclaimer

I’m not associated with Microsoft, nor was I paid for this post (that’s a shame!). All opinions are mine.
VSTS is moving fast, with new features popping up every other week, so the information in this post might become out of date. Please leave a comment if things don’t work for you as described here.

C# package in private NuGet feed

Today I actually need to create a C# library for internal use and publish it to a private NuGet feed, so I’ll write down my steps for future generations.
For the purposes of this guide, I presume you already have a VSTS project with some code in it, and that you know what CI is and why you need a build server.

Continue reading

Last year, before Christmas, I had a jolly time in my office cleaning up old bits of code and other stuff. One of the things I “cleaned up” was some old, expired Azure management certificates. Then I went on a 2-week holiday travelling around England – until I got a text message from my colleague saying that one of our production systems was down and not coming back up. The error was something to do with Azure authentication.

Quickly doing the math, I realised that I had accidentally removed a current management certificate that was used by our system. The cert was used during the start-up phase of the app to check that all the Azure resources the system needed were available. Of course, at the time I deleted the certs there were no outages and all systems ran as normal – until one of them got restarted (because Azure). And when it went down it never came back up, because the certificate the system relied on had been removed by me.

After fixing the system (sorry, man!), a bit of investigation showed that our system had been running with an expired certificate and could still authenticate to the service with it. In other words, the Azure Management Portal did not check whether the management certificate had expired.

Now, I’m not sure if this is a bug or a feature, but I’m certainly not going to report it – because if it gets fixed, how many other systems with expired certs will go down without warning?

From now on, a rule of thumb: never delete management certificates, even expired ones!

Today I’ve spent quite some time figuring out why my new Azure Subscription Settings file was not being picked up by Octopus Deploy, and I kept getting an obscure error message:

Get-AzureWebsite : Communication could not be established. This could be due to an invalid subscription ID. Note that subscription IDs are case sensitive.

It turned out that the old Subscription Settings file was stuck in the user cache and I had to “unstick” it by running this command under the user account that was executing the PowerShell:

Remove-AzureSubscription 'Subscription Name' -Force

This does not actually do anything to the Azure subscription itself (I panicked about that at first). The documentation says it only deletes the local subscription data file so that PowerShell can no longer use it – nothing to do with the actual Azure subscription.

After removing the old subscription data you can import the new Subscription Settings file:

Import-AzurePublishSettingsFile 'C:/path/to/subscription.publishsettings'

Select-AzureSubscription -SubscriptionName "Subscription Name"

Hope this helps someone!

This post is a summary of links I’ve studied about ARM.

One of the things we do for our project is automatic provisioning of services in MS Azure for new clients. A couple of years ago we used to do the provisioning manually and it took days. Now we have a system that does it all for us – a web-based system that talks to the Azure API and asks for new websites/databases/storage accounts/etc. to be created for the system we work on. I’ve been actively writing this system for the last 3-4 months.

I’ve been using the Azure Management C# libraries to access the Azure API for a couple of years now. As far as I can remember, these libraries never actually came out of preview and were never approved for production. And I had a lot of trouble with them, especially when I took half-year breaks from the project and came back to find that half the APIs I had used had changed.

This time I came back to the problem and realised that I had missed the Azure Resource Management bandwagon and the plethora of new libraries, and needed to start learning from scratch again (don’t you hate that?).

There are now 2 ways to access the Azure API: Azure Resource Management (ARM) and Classic Azure Management (the old way). ARM is the new cool kid on the block and looks like it is here to stay, because the new Azure Portal is totally based on it. See the comparison of the new and old ways.

There are a lot of differences between the new and the old. The old system required you to authenticate with a certificate that you attached to your HTTP requests, and you had to manage those certs and all that. I’m sure there were other ways to authenticate, but when I started working with the Azure Management API this was the only way I knew. ARM lets you authenticate via Azure AD, where you just need to know a couple of GUIDs and a password. Here is the overview of ARM.

The most radical change is that ARM is based on Resource Groups, and everything you create must be in a group. So you need to create a Resource Group first, then the resources. There are benefits to that: you can view billing per group – i.e. put all the resources related to a project in one group and you can see how much that project costs you, without having to go through per-item subscription billing. Another massive benefit is access control: you can now give users access to only a specific group of resources (before, you could only give access to a whole subscription). (Read more about Role-Based Access Control and the built-in roles.)

Authentication

But I digress – I’m working with the API at the moment. Authentication is slightly easier now. You need to create an application in Azure Active Directory, get a “client secret” from it, and run about 3 lines of C# to get an authentication bearer token. Read more about the authentication process here (including a code sample). And this one shows the creation of the AD application.
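To give a flavour of those 3 lines, here is a minimal sketch using the ADAL library (Microsoft.IdentityModel.Clients.ActiveDirectory). The tenant ID, client ID and client secret are placeholders for the values from your AD application, and the exact method name may differ between ADAL versions:

// Placeholders – take these from the Azure AD application you created
var tenantId = "your-tenant-id";
var clientId = "your-client-id";
var clientSecret = "your-client-secret";

// Ask Azure AD for a token scoped to the ARM endpoint (inside an async method)
var authContext = new AuthenticationContext("https://login.microsoftonline.com/" + tenantId);
var credential = new ClientCredential(clientId, clientSecret);
var result = await authContext.AcquireTokenAsync("https://management.azure.com/", credential);
var token = result.AccessToken;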

Then for every request to ARM you need to attach this token as an Authorization header on the HTTP request: request.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + token);

Requests

Now you don’t even need any libraries – you can form the requests yourself pretty easily. You need the authentication token as a header and you need to know the URL to work with. Then you POST/PUT a JSON-formatted object, and to delete you do a DELETE request to the same URL. The URL always maps to a resource – very RESTful indeed.

The URL you need to work with looks similar to this:

https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourcetype}/{resourceName}?api-version={apiversion}

Here is a sample URL for accessing a website resource:

https://management.azure.com/subscriptions/suy64wae-f814-3189-8742-b53d5a4532cd/resourceGroups/NewResourceGroup/providers/Microsoft.Web/sites/MyHappyWebsiteName?api-version=2015-08-01

But don’t worry about this – there is a Resource browser at https://resources.azure.com/ that will tell you exactly the URL you need: just navigate to an already existing item and look at the generated URL. This site will also give you the JSON format/data to send.
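Putting the token and the URL together, a raw request is just a few lines of HttpClient code. This is only a sketch – the subscription ID, resource group and site name are placeholders, and it assumes the token variable from the authentication step above (again, inside an async method):

// Attach the bearer token to every request
var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

// Placeholder URL – substitute your own subscription, resource group and site name
var url = "https://management.azure.com/subscriptions/{subscriptionId}" +
          "/resourceGroups/{resourceGroupName}/providers/Microsoft.Web/sites/{siteName}" +
          "?api-version=2015-08-01";

// GET returns the JSON description of the resource
var response = await client.GetAsync(url);
var json = await response.Content.ReadAsStringAsync();

// PUT with a JSON body creates or updates the resource; DELETE removes it:
// await client.PutAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
// await client.DeleteAsync(url);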

Templates

Another great feature of ARM is templates. Basically you describe the resources you need in JSON, add parameters, and feed that to ARM (either programmatically or through the Azure Portal). Though I’ve not found a good programmatic way to use templates – I have seen samples where you upload the template JSON file and the parameters JSON file to Azure Storage and tell ARM where to look for them. But I’m not convinced about this – it sounds like too many steps, with all the uploading.

Here is the definition of templates. You can create them in Visual Studio 2015, or you can copy-paste the JSON from existing objects in the resource browser and modify it to your needs.
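For what it’s worth, the same raw-request approach above should also work for deployments: as far as I can tell, you can PUT a deployment with the template JSON inlined in the request body rather than uploading files anywhere. Treat this as an unverified sketch – the api-version and the exact payload shape are my assumptions and worth checking in the resource browser:

// Read the template you wrote in Visual Studio (or copied from the resource browser)
var templateJson = File.ReadAllText("azuredeploy.json");

// Placeholder URL – Microsoft.Resources/deployments is the deployments resource type
var deploymentUrl = "https://management.azure.com/subscriptions/{subscriptionId}" +
                    "/resourceGroups/{resourceGroupName}" +
                    "/providers/Microsoft.Resources/deployments/{deploymentName}" +
                    "?api-version=2016-02-01";

// The template goes straight into the body; parameters can be inlined the same way
var body = @"{ ""properties"": { ""mode"": ""Incremental"", ""template"": " + templateJson + @" } }";

// Same HttpClient with the bearer token as before
var deployResponse = await client.PutAsync(
    deploymentUrl,
    new StringContent(body, Encoding.UTF8, "application/json"));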

Links

For my own future reference: when I get a large SQL file – large enough that SSMS freaks out about lack of memory – I need to use the Sqlcmd tool. Here is how it is done:

sqlcmd -S localhost\sqlexpress -d MyDatabaseName -i .\InputFileName.sql

Note that putting localhost there is important; if you try to connect to .\sqlexpress, you will only get a connection error.

Tonight I did a talk at my local Aberdeen Developers .Net User Group about CQRS architecture, and I think it was a success. After a 70-minute talk (I took my time on some of the code samples) we had another 30 minutes of discussion and Q&A. Some good-quality questions were raised, and some things were asked that I had not thought about before.

I really enjoyed doing the talk and getting the message across. I think I’ll try doing this more and will look for opportunities to present these ideas in other cities. And if you think you and your team might benefit from a CQRS architecture, let me know – we can organise something.

For those who were at the talk, here are the slides and code samples.

This post will be a dumping ground of links, problems and solutions related to Swagger. I’ll be putting updates here as I go along.


I’m starting a new project and we would love to try new things. This time we would like to jump on the whole Azure API Apps bandwagon, with Swagger as the descriptor of the API.

Great article “What is Swagger”

Here is the whole Swagger Spec – have a look at what is possible.

Here is a pretty cool page with a Swagger file editor.

Tutorials for ASP.Net

Continue reading

I have seen a fair number of questions on Stack Overflow asking how to prevent users from sharing their password. Previously, with the MembershipProvider framework, this was not a simple task and people resorted to all sorts of crazy procedures. One of the most common was to keep a static global list of logged-in users and, if a user was already in that list, deny their second login. This worked to an extent – until you cleared the cookies in the browser and tried to log in again.

Luckily, the ASP.NET Identity framework now provides a simple and clean way of preventing users from sharing their details or logging in twice from different computers.
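As a taste of the kind of thing Identity makes possible, here is a sketch based on the security stamp mechanism in ASP.NET Identity 2 with OWIN cookie middleware. The ApplicationUserManager / ApplicationUser names are the Visual Studio template defaults, so adjust them to your project:

// In Startup.Auth.cs – re-check the security stamp almost every request,
// so a cookie issued before the stamp changed is rejected quickly
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AuthenticationType = DefaultAuthenticationTypes.ApplicationCookie,
    LoginPath = new PathString("/Account/Login"),
    Provider = new CookieAuthenticationProvider
    {
        OnValidateIdentity = SecurityStampValidator.OnValidateIdentity<ApplicationUserManager, ApplicationUser>(
            validateInterval: TimeSpan.FromMinutes(1),
            regenerateIdentity: (manager, user) => user.GenerateUserIdentityAsync(manager))
    }
});

// In the login action – once the password checks out, change the stamp
// before signing in; this invalidates cookies held by any other browser or computer
var user = await UserManager.FindAsync(model.Email, model.Password);
if (user != null)
{
    await UserManager.UpdateSecurityStampAsync(user.Id);
    await SignInManager.SignInAsync(user, isPersistent: false, rememberBrowser: false);
}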

Continue reading