Moving from Entity Framework 5 to Entity Framework 6 and back or “Could not determine storage version”

Recently we upgraded our project to run on EF6. This is still in beta in the Dev branch, so it has not been deployed to most of our customers. And today I had to fix a bug in an older version of our system that is still on EF5.

Long story short: I tried to connect with EF5 to a database that had already been migrated with EF6. And cryptic error messages emerged. EF loves to throw cryptic error messages, and this time was no exception, excuse the pun.

This time the exception said The provider did not return a ProviderManifest instance, with an inner exception saying Could not determine storage version; a valid storage connection or a version hint is required. This made no sense at that point, and it took a couple of hours to figure out what was happening.

Let me go slightly into the details of the EF migrations system. EF stores database state information in the __MigrationHistory table, which looks like this for EF5:

EF5_MigrationHistory

For EF6 the structure of the table has changed slightly:

EF6_MigrationHistory

Notice on the second screenshot there is an extra column for the name of the DbContext type used for the migration. This is a new feature in EF6: it allows more than one context to live in one project, and you can run migrations for each of them. I have not tried this one yet, but the time will come pretty soon!
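Since I have not used multiple contexts myself yet, treat this as a hedged sketch of how the feature is wired up in EF6; all the type names here are made up for illustration:

```csharp
using System.Data.Entity;
using System.Data.Entity.Migrations;

// Two contexts living in one project (EF6+). The entities and contexts
// are illustrative, not from this post.
public class Order { public int Id { get; set; } }
public class LogEntry { public int Id { get; set; } }

public class SalesContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
}

public class AuditContext : DbContext
{
    public DbSet<LogEntry> Entries { get; set; }
}

// Each context gets its own migrations configuration; EF6 records which
// configuration produced a migration in the new column of __MigrationHistory.
internal sealed class SalesConfiguration : DbMigrationsConfiguration<SalesContext>
{
    public SalesConfiguration()
    {
        MigrationsDirectory = @"Migrations\Sales";
    }
}

internal sealed class AuditConfiguration : DbMigrationsConfiguration<AuditContext>
{
    public AuditConfiguration()
    {
        MigrationsDirectory = @"Migrations\Audit";
    }
}
```

Migrations are then run per configuration from the Package Manager Console, e.g. `Update-Database -ConfigurationTypeName SalesConfiguration`.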

Also, once we ran migrations in EF6, the version recorded next to the migration record changed from 5.xx to 6.xx; this is also visible on the last screenshot.
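For reference, the two table shapes look roughly like this; I am sketching these from memory, so treat the exact column sizes as approximate:

```sql
-- EF5: MigrationId alone is the primary key
CREATE TABLE __MigrationHistory (
    MigrationId    nvarchar(255)  NOT NULL PRIMARY KEY,
    Model          varbinary(max) NOT NULL,
    ProductVersion nvarchar(32)   NOT NULL
);

-- EF6: a ContextKey column is added and becomes part of the key,
-- which is what lets several contexts share one history table
CREATE TABLE __MigrationHistory (
    MigrationId    nvarchar(150)  NOT NULL,
    ContextKey     nvarchar(300)  NOT NULL,
    Model          varbinary(max) NOT NULL,
    ProductVersion nvarchar(32)   NOT NULL,
    PRIMARY KEY (MigrationId, ContextKey)
);
```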

And if you try to connect to this database with an older version of EF, the table structure throws the system off its tracks. I tried updating the version number in the data, but it did not change anything, so don't bother.

I have not found a solution to this issue other than re-creating the database migrations in EF5. So be careful when you upgrade to EF6 and run migrations: there is no way back to EF5.

OData controllers in WebApi 2.1: getting 404 page not found?

Note to self: when you register an OData route in the Web API configuration, it must be placed before the default catch-all route. I.e.:

Global.asax.cs:

    protected void Application_Start()
    {
        // .. do stuff

        // This must be before registering all other routes    
        GlobalConfiguration.Configure(WebApiConfig.Register);

        AreaRegistration.RegisterAllAreas();

        RouteConfig.RegisterRoutes(RouteTable.Routes);

        // do other stuff...
    }

If you place the Web API route registration after the MVC route registration, the MVC routes will take priority, and since you probably don't have an ODataController matched by your MVC routes, you'll get a 404.

And no, there is nothing magic about OData here; these are just standard routes registered in an ASP.NET application.
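To make the ordering concrete, here is a minimal sketch of a WebApiConfig.Register that puts the OData route ahead of the default one. The Product entity set is illustrative, and depending on your OData package version the mapping method may be named MapODataRoute or MapODataServiceRoute:

```csharp
using System.Web.Http;
using System.Web.Http.OData;
using System.Web.Http.OData.Builder;

public class Product { public int Id { get; set; } }

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        var builder = new ODataConventionModelBuilder();
        builder.EntitySet<Product>("Products"); // illustrative entity set

        // Register the OData route FIRST...
        config.Routes.MapODataRoute(
            routeName: "odata",
            routePrefix: "odata",
            model: builder.GetEdmModel());

        // ...and only then the catch-all Web API route.
        config.Routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional });
    }
}
```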

Azure Deployment: The feature named NetFx451 that is required by the uploaded package is not available in the OS * chosen for the deployment.

I updated one of our Azure-hosted projects from MVC3 to MVC5, and at the same time I bumped the target framework from .NET 4 to 4.5.1. After deployment I ran into a cryptic error:

The feature named NetFx451 that is required by the uploaded package is not available in the OS * chosen for the deployment.

A quick search revealed that if your service is based on anything below Windows Server 2012 R2, you'll get this error for .NET 4.5.1. To fix it, go into all your *.cscfg files: at the very top of the file, in the opening <ServiceConfiguration> tag, there is an osFamily="3" attribute. Update it to osFamily="4":

image_476D46C2
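For clarity, here is roughly what the top of the .cscfg should end up looking like; the service and role names are placeholders:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- osFamily="4" maps to Windows Server 2012 R2, which supports .NET 4.5.1 -->
<ServiceConfiguration serviceName="MyCloudService"
                      xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
                      osFamily="4"
                      osVersion="*">
  <Role name="MyWebRole">
    <Instances count="1" />
  </Role>
</ServiceConfiguration>
```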

Don’t forget to update this in all of your .cscfg files. By the way, this is recommended by Scott Gu in one of his announcements.

MVC bundling and minification: a BetterStyleBundle

After using MVC bundling and minification for a bit I ran into a few problems. All of them are well known, and all of them will cause you trouble.

The first problem I encountered was reordering of files in the bundle. When I add Main.css to a bundle and then add MainOverrides.css, I bloody well mean them to stay in that order, because the overrides will not do anything if they come before the main file. The same goes for jQuery and its plugins.
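One way I know to stop the reordering is to plug in a pass-through IBundleOrderer. This is a hedged sketch against System.Web.Optimization (the AsIsBundleOrderer name is mine, and in older versions of the library OrderFiles works with FileInfo rather than BundleFile):

```csharp
using System.Collections.Generic;
using System.Web.Optimization;

// A pass-through orderer: files stay in the order they were Include()d.
public class AsIsBundleOrderer : IBundleOrderer
{
    public IEnumerable<BundleFile> OrderFiles(BundleContext context, IEnumerable<BundleFile> files)
    {
        return files; // no alphabetical/built-in reordering
    }
}

// Usage: Main.css is now guaranteed to come before MainOverrides.css.
// var bundle = new StyleBundle("~/bundles/css") { Orderer = new AsIsBundleOrderer() };
// bundle.Include("~/Content/Main.css", "~/Content/MainOverrides.css");
```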

Another issue I ran into was image URLs inside of CSS. When CSS files are bundled, they are referenced as:

<link href="/bundles/jquery-ui_css?v=U5E9COcCYIcH9HU1Qr9aiQotqSzLGT8YYDi6qhOpObk1" rel="stylesheet"/>


Return Exception ??

Exceptions are usually thrown by parts of the code that encounter an unrecoverable situation, like the famous NullReferenceException, thrown when code tries to access members of an object that is null. I'm sure you know this; no need to repeat programming 101 here. My point is that exceptions are always thrown, and it is down to other parts of the code to catch them and do something with them. Because if exceptions are not caught, they'll bubble up to the top level and crash your application.

I’ve never seen any methods that return exceptions. Never thought of exceptions as something that can be returned by a method. Have you?

Today I spent some time working on bundling and minification in an MVC5 project. I opened the dotPeek decompiler to check how it actually works, and I came across this bit of code:

  Exception exception = this.Items.IncludePath(virtualPath, transforms);
  if (exception != null)
  {
    throw exception;
  }
  // do other stuff

I'm not posting the full source code here, but it is available from the MS Symbols server. That's where dotPeek got the sources from.

Note that in the block above a method does something and an exception is returned, not thrown. The exception is thrown one level closer to the user: the method I'm showing you is public and I'm going to use it, while the method that returns the exception is internal and not accessible to the consumer.

My first reaction was that this breaks the Clean Code convention: a method either modifies the state of the application and returns nothing, or it returns information but does not change anything. Basically commands and queries, as in CQS (command-query separation): commands for writing, queries for reading data.
Here we have a clear violation: the method modifies program state and returns an exception.

I went exploring deeper and looked through some other methods. The convention in this library is: private methods return exceptions, public methods throw exceptions. This looked strange and new to me.

Remembering my university course on machine code (a.k.a. assembler; yes, I took that one and loved it!): throwing an exception involves walking the execution stack, which costs a lot of CPU cycles for unwinding and tracing (the details escape me now). This makes a thrown exception expensive in terms of CPU cycles. A quick search revealed a stack of answers and blog posts confirming my understanding.

Given this, I presume the technique is used in the library to avoid the performance penalty of throwing an exception; and the deeper in the execution stack an exception is thrown, the slower it is. On the other hand, this technique adds a lot of extra boilerplate code to methods. If you used a traditional throw new Exception(), then instead of this:

    Exception exception = this.IncludePath(virtualPath, (IItemTransform[]) null);
    if (exception != null)
      return exception;

you’d simply have this:

this.IncludePath(virtualPath, (IItemTransform[]) null);

where the private method would just throw the exception.
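To make the shape of the pattern concrete, here is a small self-contained sketch; the PathValidator name and the validation rules are mine, not from the decompiled source:

```csharp
using System;

public static class PathValidator
{
    // Internal-style helper: reports failure by RETURNING an exception,
    // so callers on the happy path only pay for a null check.
    private static Exception IncludePathCheck(string virtualPath)
    {
        if (string.IsNullOrEmpty(virtualPath))
            return new ArgumentException("Path must not be empty.");
        if (!virtualPath.StartsWith("~/"))
            return new ArgumentException("Path must be app-relative (start with ~/).");
        return null; // success
    }

    // Public surface: converts the returned exception into a thrown one,
    // which is what consumers of the library actually see.
    public static void IncludePath(string virtualPath)
    {
        Exception exception = IncludePathCheck(virtualPath);
        if (exception != null)
            throw exception;
    }
}
```

The throw happens only at the outermost, public level, so the internal calls never pay the cost of an actual throw while validating.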

Bottom line: unless this is used for purposes other than optimisation, don't use this technique; it's confusing and against conventions. And if you find your exceptions to be enough of a bottleneck to be worth optimising, you are throwing way too many exceptions and need to rethink your application. If there is another purpose to this approach, please tell me!

Check-in Nuget packages into your TFS

Yesterday the NuGet server was down for a couple of hours. That wasn't nice. I got stuck for a bit, because I blew up my environment and needed to re-download all the packages again, and some of them were not in the local cache. That same night Mr. Mark Seemann blogged that auto-restore in NuGet is considered harmful, and I could not disagree with him. The amount of time I've spent troubleshooting missing packages or package restore over the last year mounts to probably a week. Most of that time went on package restore on build servers. And just for these little shits I had to dig very deep into the TFS build template and modify it so it could restore packages before attempting a build. It was not a nice experience.

Anyway, long story short: today I decided to give that advice a go and check in the packages for one medium-sized project hosted on TFS. I went to Source Control Explorer, added the existing packages folder (it added a big number of files) and checked in.

The build server crunched for a moment and failed the build: references were not found. Strange. I connected to the build server and discovered that TFS by default had ignored the *.dll files but included all the other crap that comes with NuGet packages (XML files, readme.md, source code, etc.). After a bit of digging I found that TFS has a list of file types that it looks for.
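One way around the default filter, assuming a TFS 2012+ local workspace, is a .tfignore file that force-includes the binaries while keeping out the rest of the package payload; the patterns below are illustrative:

```
# .tfignore sketch: check in package binaries, skip the rest
!packages\*.dll
packages\*.pdb
packages\*.xml
```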


Run xUnit in hosted Team Foundation Service

Now, as promised: my other project is hosted and built on the hosted Team Foundation Service, and it took me a bit of fiddling before I managed to run NUnit tests on it. There is an excellent walk-through on how to do that: http://www.mytechfinds.com/articles/software-testing/6-test-automation/72-running-nunit-tests-from-team-foundation-server-2012-continuous-integration-build

In much the same way you can do xUnit:

  1. Download xunit.runner.visualstudio-x.x.x.vsix from Codeplex.
  2. Change the file extension from .vsix to .zip and open it with a zip archiver.
  3. Copy all the xunit.*.dll files to a folder that is mapped to TFS and check them in. I prefer to keep these files in a /BuildFramework folder:

xUnit_VS

  4. Now, pretty much the same as with the NUnit setup (if you have not done that before), you need to update the settings for the Build Controller. In VS go to Team Explorer -> Builds -> Actions -> Manage Build Controllers, and set “Version control path to custom assemblies” to the folder where you saved the adapters for xUnit:

Build_controller

And you are pretty much done. You can now check in, and your xUnit tests will run on the hosted TFS (presuming TFS is set up correctly to run your tests).

Convert your projects from NUnit/Moq to xUnit with NSubstitute

UPDATE: After the initial excitement about xUnit, I had to deal with some of its specifics, and now I don't advise moving ALL your tests to xUnit. I'd recommend going into it slowly and reading through my rant here before you commit to converting.

NUnit is a great testing framework. Only it is a little old now, and some newer testing approaches no longer fit into NUnit concepts. I'm talking about using AutoFixture for generating test data. So I decided to see how xUnit would pan out for our projects and whether we should convert all our tests to xUnit or not.

The benefits of xUnit over NUnit are simple for me: the AutoData attribute from AutoFixture does not play nicely with NUnit due to architectural wirings. There are ways to have AutoData for NUnit, but they do not work nicely with ReSharper 8.1 and NCrunch.


Visual Studio 2013 *-package did not load correctly

Today, for no reason, Visual Studio threw a quirk at me: on an attempt to connect to TFS I faced a strange error: Page not found. The projects did load, but with a bunch of errors, like the Microsoft.VisualStudio.Web.PasteJson package did not load correctly, and many others saying that other packages did not load correctly.

A quick search revealed that reinstalling VS helps, but I did not fancy that: too much hassle. Further searching suggested that running devenv /setup as admin from the command line should fix the issue. And it did! Thanks to this answer on SO.

Also there was a suggestion to run devenv /resetuserdata if the above fails. But I have not tried that, because the first option worked for me.

And since I’m talking about devenv, here is the list of all the switches applicable to Visual Studio 2013.

You can see options there like /build, /clean or /rebuild; these do not start up Visual Studio, but rather work like MSBuild: they build/clean/rebuild solutions. The more useful flags are:

  • /safemode – start VS in Safe Mode. In case one of the extensions messes up big time.
  • /ResetSettings – resets user settings to default
  • /setup – forces Visual Studio to merge the resource metadata that describes menus, toolbars, and command groups from all available VSPackages. Whatever that means, it fixes the package-loading problems I described above.
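For reference, the switches above are run from a Visual Studio command prompt like this (the solution name is made up):

```
REM Repair the package/menu metadata (fixes the loading errors above)
devenv /setup

REM Start VS with all extensions disabled
devenv /safemode

REM Build a solution without opening the IDE
devenv MySolution.sln /build Release
```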

Bundling all your assemblies into one or alternative to ILMerge

Before today I had only heard of the ILMerge tool. It allows you to bundle all your small .dll files into one, so if you deploy stand-alone applications, you only have to give a single file to your users. Today I decided to give it a go. And I was disappointed: the error messages were endless. ILMerge did not like this, that, or some other assembly.

I quickly looked around and found the Costura.Fody package, which did exactly what I needed, only I did not have to do any work. All you need to do is install the NuGet package:

Install-Package Costura.Fody

And you are done. No, really. Build your project and check the output. You'll still see a bunch of DLL files as before, because the input files are not deleted automatically. But if you check your executable, it is much larger now; you can move it away from the rest of the library files and it will still work.

There are some settings available. Check the details on https://github.com/Fody/Costura.

Package installation creates a FodyWeavers.xml file in the root of your project – put your settings there. I needed to exclude the debug symbols from the resulting file. Here is what my FodyWeavers.xml looks like:

<?xml version="1.0" encoding="utf-8" ?>
<Weavers>
  <Costura IncludeDebugSymbols='false' />
</Weavers>

Also, I did not care about merging the libraries into one file for Debug builds, but I needed it for Release builds. So at the very end of my .csproj file I have this:

  <Import Project="Fody.targets" Condition=" '$(Configuration)' == 'Release' " />
  <!-- Below is the clean-up - removing the input dll files-->
  <Target AfterTargets="AfterBuild;NonWinFodyTarget" Name="CleanReferenceCopyLocalPaths" Condition=" '$(Configuration)' == 'Release' ">
    <Delete Files="@(ReferenceCopyLocalPaths->'$(OutDir)%(DestinationSubDirectory)%(Filename)%(Extension)')" />
  </Target>

By default the package only adds one line:

  <Import Project="Fody.targets" />

And usually this is good enough. But I added a check for the build configuration, and my OCD also required a clean output folder. All this is explained at the very bottom of the GitHub page.

Overall, very nice package! It just works.