Quartz.Net in Azure with Autofac. Smoothness.

Quartz.Net is an awesome scheduler for the .Net world, ported from its Java namesake. The documentation is pretty useless, as it has not been updated for a while: it is fine for understanding the general principles of operation, but the code examples usually don’t work. Instead, the download provided on the site contains a lot of examples. Use them as guidance. Read the code; the documentation sucks.

Anyway, I’ve spent a couple of days trying to get Quartz.Net to work inside of Azure and with Autofac dependency injection. And there are a few lessons I’ve learned that I would like to share.

Lesson 1. Don’t run Quartz inside of IIS.

Quartz is a service and it must be run as a service (think of a Windows Service or a Linux daemon). IIS is a service as well, but it does not run web-sites in service mode: IIS itself runs as a service and starts up web-sites on demand. You can tell IIS never to shut down your application pool, but even that is not reliable. If you are limited by your hosting, where you have no control over IIS but still need a scheduler, you can start Quartz from Application_Start() — but make sure your jobs are tested against firing at the wrong time, because you will miss scheduled fire times whenever the app pool is recycled. After all, if nobody looks at the web-site, nobody will know that the data is not updated accordingly.
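If that is the corner you are in, a minimal sketch of kicking Quartz off from Application_Start() could look like the following. This is an in-memory scheduler with default settings; treat it as a sketch of the last-resort option, not a recommendation:

```csharp
// Global.asax.cs -- last-resort option for shared hosting.
// The scheduler dies whenever IIS recycles the app pool, so every job
// must tolerate firing late or not at all.
using Quartz;
using Quartz.Impl;

public class MvcApplication : System.Web.HttpApplication
{
    private static IScheduler scheduler;

    protected void Application_Start()
    {
        scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();
    }

    protected void Application_End()
    {
        if (scheduler != null)
        {
            // don't wait for jobs: IIS gives the app very little time to die
            scheduler.Shutdown(waitForJobsToComplete: false);
        }
    }
}
```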

Microsoft Azure has a few ways to set up your systems. The most basic is a Web Site, where you deploy only the web-site. Think of a shared hosting and a shared IIS. In this scenario you would not be able to execute Quartz as a service. The next step up is a Web Role and a Worker Role.

A Web Role is your own virtual machine configured to host your web-app in IIS. A Worker Role is your VM, but with no IIS. In a Worker Role you do background operations that can’t (or should not) be done in the IIS context. Your Quartz.Net installation must be run within a Worker Role. But your application deployed in a Web Role already has the ability to act as a Worker Role. Look at the WebRole.cs file in the root of your Azure application. You should see this:

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.

        return base.OnStart();
    }

    public override void Run()
    {
        base.Run();
    }

    public override void OnStop()
    {
        base.OnStop();
    }
}

This is the part of the Web Role that runs as a service. Your service initialisation goes in OnStart(), your main operation should happen in Run(), and the shut-down sequence is inside OnStop(). You need to start your scheduler inside OnStart(). I’ll show code examples later.

Lesson 2. Your WebRole is running as a stand-alone app.

When I configured Quartz.Net to run inside of the WebRole, I tried to access the Autofac DI container configured inside of IIS. And exceptions started to fly all over the place. Then I realised that the Worker Role has nothing to do with the IIS configuration or objects. The WebRole class does not have access to any of the objects inside of your Asp.Net context. Think of your WebRole.cs as being executed on a separate machine from your IIS, even if your WebRole is a part of your MVC application. Azure spawns a service thread separately for your WebRole operations: a different AppDomain.

When I realised that, I thought “OK, you cheeky, I’ll just create the DI container again and get on with it”. Yep, sure that would work. But our previous configuration of DI involved HttpContext.Current somewhere, and some other MVC stuff that was injected. So the next exception I got from Autofac was that it could not inject dependencies hanging off HttpContext.Current, because of a NullReferenceException. That took a bit of thinking before realising “THERE IS NO HTTP CONTEXT!”. Of course: the WebRole is a different application, outside of IIS. So it took me another day to pick apart our DI configuration and separate everything into modules: dependencies that are required only if the thread is inside of MVC; domain object dependencies, i.e. objects required for the domain, like IRepository and IService; and then a module to configure dependencies to be run inside of the WebRole thread.
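The split I ended up with looks roughly like this. Module names and the exact registrations here are illustrative — your grouping will differ:

```csharp
// Dependencies shared by the web and worker sides: repositories, domain services.
public class DomainModule : Autofac.Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.RegisterType<PersonContractRepository>().As<IPersonContractRepository>();
        // ... the rest of the IRepository / IService registrations
    }
}

// Anything that touches HttpContext.Current lives only here,
// and this module is never loaded in the WebRole thread.
public class MvcModule : Autofac.Module
{
    protected override void Load(ContainerBuilder builder)
    {
        // HttpContext-bound registrations go here
    }
}

public static class AutofacConfig
{
    public static IContainer ConfigureForWorkerRole(IConfiguration settings)
    {
        var builder = new ContainerBuilder();
        builder.RegisterInstance(settings).As<IConfiguration>();
        builder.RegisterModule(new DomainModule());
        builder.RegisterModule(new QuartzModule());
        // NB: no MvcModule here -- there is no HTTP context in the WebRole.
        return builder.Build();
    }
}
```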

Lesson 3. WebRole is not a web-app.

Fine, when I got the DI refactored and re-configured, I thought “happy days! let’s finish this!”. And got an exception along the lines of “Can’t load assembly: the NLog assembly version required does not match the existing assembly version”. Again, that puzzled me for a bit. I checked web.config and found an assembly binding redirect:

  <dependentAssembly>
    <assemblyIdentity name="NLog" publicKeyToken="5120e14c03d0593c" culture="neutral" />
    <bindingRedirect oldVersion="0.0.0.0-2.0.1.0" newVersion="2.0.1.0" />
  </dependentAssembly>

And that worked fine in the MVC app. I used the awesome CheckAsm tool to find which assembly requires the older version of NLog. It turned out that the adapter from NLog to Logentries was compiled against an older version of NLog. Yeah, but the binding redirect was in place and it worked in MVC. And with yet another light-bulb moment I shouted “THIS IS NOT A WEB-APP!” and created an app.config, placing the assembly binding redirect there:

<?xml version="1.0"?>
<!--This file is required for WebRole worker process. See WebRole.cs-->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="NLog" publicKeyToken="5120e14c03d0593c" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-2.0.0.0" newVersion="2.0.1.0" />
      </dependentAssembly>
      <dependentAssembly>
        <assemblyIdentity name="Common.Logging" publicKeyToken="af08829b84f0328e" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-2.1.2.0" newVersion="2.1.2.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

After that I hit F5 and did not get the “Assembly not found” exception. That was orgasmic! OK, just exciting. Oh well, it just worked, what’s the big deal? But wait a minute, let’s jump a bit further. When I deployed the app into Azure and tried to run it, I got the same exception again: the Azure VM could not bind the assembly and it ignored the binding redirect. In fact, it ignored app.config completely. After a bit of googling it turned out that I’m not alone on this issue. But the solution was simple: rename your app.config into assemblyName.dll.config. And that works both locally and on Azure.
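To avoid repeating the rename by hand, you can let Visual Studio do it on every build with a post-build event (Project properties → Build Events). This is my own suggestion, not part of the original setup:

```
copy /Y "$(ProjectDir)app.config" "$(TargetDir)$(TargetFileName).config"
```

For a project building MyApp.dll this produces MyApp.dll.config next to the assembly, which is what both the local runtime and Azure will pick up.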

Lesson 4. Serialise your objects

OK, I have DI configured and Quartz running inside of Azure as a service. How do I create jobs and schedule them? My first reaction was to create classes, populate properties with data, create the logic in the Execute() method, give it dependencies from DI and place the objects into the scheduler. Yeah, sure. But Quartz does not serialise your objects. You can’t simply pass data into your jobs and have it stored until the execution time comes. You have to wrap your jobs into JobDetails and, for your data, create a Dictionary<String, String> with the required values. Then, when the job gets executed, you pull the data out of that dictionary. That looked shit to me: plenty of scope for developer errors and bugs, and it goes against the Single Responsibility Principle — objects should not be responsible for populating their own data.
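To be clear about what I was avoiding, the stock Quartz way looks roughly like this (identifiers here are illustrative):

```csharp
// Scheduling side: every piece of state is pushed in by hand under a magic key.
var jobDetail = JobBuilder.Create<ContractOnEffectiveDateJob>()
    .WithIdentity("42", "ContractOnEffectiveDateJob")
    .UsingJobData("PersonContractId", "42") // stringly-typed, key is a magic string
    .Build();

// Execution side: the same plumbing in reverse, repeated in every job.
public void Execute(IJobExecutionContext context)
{
    var id = Int32.Parse(context.MergedJobDataMap.GetString("PersonContractId"));
    // a typo in the key only blows up at run time
}
```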

I poked about Google for a solution and did not find anything I liked. The solution came easily. I already had the Json.Net nuget package installed, so I just used that. I serialise the object into JSON, store that text in the Quartz dictionary and pass it into the job. On the other end I take the JSON string and populate it back into an in-memory object. That saves a lot of plumbing code for getting data out of dictionaries and managing magic strings.

Alternatively, if you don’t want to take a dependency on Json.Net, you can use the standard C# way of serialising objects: the [Serializable] attribute, converting the objects into strings and storing the strings in the dictionary. The problem here is that JSON takes much less space: an object with 4 Guid properties is about 0.4Kb in JSON, while serialised the C# way it takes 1.6Kb, all because of the massive overhead of meta-information for the class. Also, I’m not sure how C#-serialisation will react to class refactoring. Say a job is placed in a queue for the next month, but while it sits in the queue the implementation of the job changes. What happens when you try to de-serialise the stored string into the class? Json.Net would not care: as long as the property names match, it’ll just do it, even if the class was renamed. So for a fast-growing application I’d say JSON is the safer bet.
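If you want to see the size difference for yourself, a quick probe along these lines will do. The numbers will vary with your exact types, and note that BinaryFormatter output also needs Base64 to fit into a string-only JobDataMap:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using Newtonsoft.Json;

[Serializable]
public class SizeProbe
{
    public Guid A { get; set; }
    public Guid B { get; set; }
    public Guid C { get; set; }
    public Guid D { get; set; }
}

class Program
{
    static void Main()
    {
        var probe = new SizeProbe
        {
            A = Guid.NewGuid(), B = Guid.NewGuid(),
            C = Guid.NewGuid(), D = Guid.NewGuid()
        };

        // JSON carries only property names and values.
        var json = JsonConvert.SerializeObject(probe);

        // BinaryFormatter drags along the type name, assembly version and
        // field metadata, and the bytes still need Base64 to become a string.
        string binary;
        using (var stream = new MemoryStream())
        {
            new BinaryFormatter().Serialize(stream, probe);
            binary = Convert.ToBase64String(stream.ToArray());
        }

        Console.WriteLine("JSON: {0} chars, binary+Base64: {1} chars",
                          json.Length, binary.Length);
    }
}
```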

Code

And finally some code for you to copy-paste.

Here is the WebRole.cs

public class WebRole : RoleEntryPoint
{
    private IContainer autofacContainer;
    private IConfiguration settingsService;
    private ISchedulerService schedulingService;

    // error handling and logging is shortened for brevity
    public override bool OnStart()
    {
        // wrapper to get settings from Azure config files
        settingsService = new WebConfiguration();

        // Create DI container
        autofacContainer = AutofacConfig.ConfigureForWorkerRole(settingsService); 

        // wrapper for Quartz.Net scheduler
        schedulingService = autofacContainer.Resolve<ISchedulerService>();

        // start the service
        schedulingService.StartScheduler();

        return base.OnStart();
    }


    public override void Run()
    {
        // nothing Quartz-related to do here, but call base.Run():
        // if Run() ever returns, Azure recycles the role.
        base.Run();
    }



    public override void OnStop()
    {
        schedulingService.StopScheduler(); // gracefully finish all the tasks before shutting down.
        base.OnStop();
    }
}

Here is my SchedulerService class

public class QuartzService : ISchedulerService
{
    private readonly IConfiguration settings;
    private readonly ISchedulerFactory schedulerFactory;
    private readonly IScheduler scheduler;


    public QuartzService(IConfiguration settings, IJobFactory autofacJobFactory)
    {
        this.settings = settings;

        var properties = new NameValueCollection();
        properties["quartz.scheduler.instanceName"] = "MyScheduler";
        properties["quartz.scheduler.instanceId"] = "instance_one";
        properties["quartz.jobStore.type"] = "Quartz.Impl.AdoJobStore.JobStoreTX, Quartz";
        properties["quartz.jobStore.useProperties"] = "true";
        properties["quartz.jobStore.dataSource"] = "default";
        properties["quartz.jobStore.tablePrefix"] = "QRTZ_";
        // if running MS SQL Server we need this
        properties["quartz.jobStore.lockHandler.type"] = "Quartz.Impl.AdoJobStore.UpdateLockRowSemaphore, Quartz";
        properties["quartz.dataSource.default.connectionString"] = this.settings.GetConnectionString(); // gets sql server connection string
        properties["quartz.dataSource.default.provider"] = "SqlServer-20";
        properties["quartz.jobStore.misfireThreshold"] = "60000";  // a trigger counts as misfired if it is more than 1 minute late

        schedulerFactory = new StdSchedulerFactory(properties);
        scheduler = schedulerFactory.GetScheduler();

        // assign job factory injected by autofac, so we can have DI injections into jobs.
        scheduler.JobFactory = autofacJobFactory;
    }


    /// <summary>
    /// Creates a job to be run only once, at the date/time specified. 
    /// If the scheduler misses the execution time, the job is fired once the scheduler is back online.
    /// Jobs are identified by name and group name; names must be unique within a group. 
    /// Here we use the job class type as the group name and the ID of the entity requiring the job as the name.
    /// If you try to create another job with a duplicate name, an exception is thrown.
    /// </summary>
    public void CreateScheduledJob<T>(T jobObject, object jobIdentity, DateTime executionTime) where T : IJob
    {
        var jsonData = JsonConvert.SerializeObject(jobObject);

        var jobDetail = JobBuilder.Create<T>()
            .WithIdentity(jobIdentity.ToString(), typeof(T).Name)
            .UsingJobData("data", jsonData)
            .Build();

        var trigger = TriggerBuilder.Create()
            .WithIdentity(jobIdentity.ToString(), typeof(T).Name)
            .StartAt(executionTime)
            .WithSimpleSchedule(x => x.WithMisfireHandlingInstructionFireNow())
            .Build();

        scheduler.ScheduleJob(jobDetail, trigger);
    }

    public void StartScheduler()
    {
        if (!scheduler.IsStarted)
        {
            scheduler.Start();
        }
    }

    public void StopScheduler()
    {
        if (scheduler.IsStarted)
        {
            scheduler.Shutdown(true);
        }
    }
}

And this is quite an important bit: the JobFactory for the scheduler. Here we create job objects for Quartz to execute. Because we would like our dependencies injected into jobs, we create the jobs from the DI container. This is also where we populate the job's state from the JSON data stored in Quartz.

public class AutofacInjectorJobFactory : IJobFactory
{
    private readonly ILifetimeScope lifetimeScope; // this is Autofac LifetimeScope. Basically reference to the container

    public AutofacInjectorJobFactory(ILifetimeScope lifetimeScope)
    {
        this.lifetimeScope = lifetimeScope;
    }


    /// <summary>
    /// Implementation of the process that re-creates new jobs objects:
    /// here we inject dependent objects from Autofac
    /// </summary>
    /// <param name="bundle"></param>
    /// <param name="scheduler"></param>
    /// <returns></returns>
    public IJob NewJob(TriggerFiredBundle bundle, IScheduler scheduler)
    {
        Type jobType = null;
        try
        {
            var jobDetail = bundle.JobDetail;
            jobType = jobDetail.JobType;

            // resolve job object from DI - populate all services and repositories
            var job = lifetimeScope.Resolve(jobType);   

            // extract json serialised state of the object from quartzDictionary
            var jsonData = jobDetail.JobDataMap.GetString("data"); 

            // now populate the object from the json data
            JsonConvert.PopulateObject(jsonData, job);

            return (IJob)job;
        }
        catch (Exception exception)
        {
            var message = String.Format("Problem instantiating class {0}", jobType != null ? jobType.Name : "UNKNOWN");
            throw new SchedulerException(message, exception);
        }
    }


    public void ReturnJob(IJob job)
    {
        // do nothing here. Don't really care. Quartz.Net SimpleJobFactory does not have anything here.
    }
}

This is pretty much it for the plumbing code for Quartz. The rest is wiring up your DI container and placing jobs into the scheduler. And here is a sample of one of the jobs I’ve created:

public class ContractOnEffectiveDateJob : IJob
{
    public int PersonContractId { get; set; }

    private readonly IPersonContractRepository personContractRepository;

    // used by autofac
    public ContractOnEffectiveDateJob(IPersonContractRepository personContractRepository)
    {
        this.personContractRepository = personContractRepository;
    }


    public ContractOnEffectiveDateJob()
    {
        // nothing here for manual creation
    }

    public void Execute(IJobExecutionContext context)
    {
        var newContract = personContractRepository.Find(PersonContractId);

        // making sure the object still exists in DB
        if (newContract != null) 
        {
            newContract.IsEffective = true;
            personContractRepository.Update(newContract);
            personContractRepository.Save();
        }
    }
}

And here is how you create that job:

var scheduledJob = new ContractOnEffectiveDateJob()
    {
        PersonContractId = 42
    };

schedulerService.CreateScheduledJob(scheduledJob, newContract.PersonContractId, EffectiveDate);

And here is the Autofac module for registration of classes:

public class QuartzModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.RegisterType<QuartzService>().AsImplementedInterfaces();

        builder.RegisterType<AutofacInjectorJobFactory>().As<IJobFactory>().InstancePerLifetimeScope();

        // register all Quartz Jobs as themselves.
        builder.RegisterTypes(Assembly.GetAssembly(typeof(QuartzModule)).GetTypes())
               .Where(t => !t.IsAbstract && typeof(IJob).IsAssignableFrom(t))
               .AsSelf()
               .InstancePerLifetimeScope();
    }
}

Nice and simple. Testable, following SOLID principles, easy to understand and create new jobs.

UPDATE 12 Jul 2014

Since I wrote this article a year ago, I have ripped Quartz out of my projects and replaced it with Azure Scheduler as soon as it was out of preview. And I don’t regret it. It works great for most of my scenarios.

Also I’ve been informed that a project has been started dedicated to the integration of Autofac and Quartz. Here is the Github link: https://github.com/alphacloud/Autofac.Extras.Quartz; a nuget package is also available. I have not tried it myself (see above, I no longer use Quartz), but I did look through the source code. And I have criticism. At the moment of writing the package has dependencies that could be avoided. First of all there is a dependency on the Resharper.Annotations nuget. Why?? Not everyone uses Resharper (currently I’m trying to move away from it, as it becomes slower and slower with every release), and I certainly don’t want to bring this package into my project. Another avoidable dependency is Common.Logging, which in turn brings in Common.Logging.Core. I have worked with it (it was forced upon me by another nuget package) and I hated it with a passion, mostly because of .dll version incompatibility: three other nugets depended on Common.Logging, every single one on a different version, and that gave me endless headaches. The idea behind it is nice, but it just does not work in real life. There are better ways to enable logging in your nuget module without forcing an endless number of dependencies upon the consuming developer.

Other than that, if you don’t mind the extra packages in your project, just use it. It should just work, without the headache of setting everything up yourself.