
Precompiling ASP.NET MVC applications with Teamcity & Octopus

Notice how the first time you open a page or view in your ASP.NET MVC application, it takes quite a bit longer, and subsequent loads are faster? This is because views are compiled on demand by IIS the first time someone tries to access them – each one dynamically turned into an alphanumerically named DLL. There are quite a few problems with this process:

  • Some errors in your Razor code won’t become apparent until the view is compiled after being accessed for the first time. If you follow the principle of “crash early”, then you’ll agree the web server is much too late for this to happen!
  • Web servers are meant to serve web requests, not compile code. Compiling views comes with a performance overhead that may affect the performance of concurrent requests.
  • If a user is unlucky enough to be the first to access a view they will be met with a long load time, giving a poor impression that something may be wrong.

In this post I will show you how to set up true precompilation for your ASP.NET MVC application. The goal is to package our entire web application, including views, into one or more DLL files. This comes with many benefits:

  • Any compilation errors in your Razor code are found well before any code is deployed to a web server.
  • Compilation is done on your build server, allowing you to create a deployment package that requires no additional compiling on the web servers.
  • Users are no longer victim to long load times the first time a view is accessed.

I am assuming that you already have a build and deploy process set up using TeamCity and Octopus. I will be showing you the small tweaks to that process necessary to make precompilation work.

Set Up a Publishing Profile

We’re going to leverage publishing profiles as a way of instructing MSBuild on how to compile our project.

  1. Start by right-clicking your web project in Visual Studio and clicking Publish…
  2. You will be asked to select a publish target. Select Custom and enter a profile name when prompted.
  3. Under publish method, select File System.
  4. Under target location, enter $(ProjectDir)precompiled and click Next.
  5. Select the build configuration you want to apply, and under File Publish Options make sure the options to delete all existing files prior to publish and to precompile during publishing are both checked.
  6. Click the Configure button next to the precompile during publishing option. All the options in this window are documented on MSDN. For now, make sure the allow precompiled site to be updatable option is unchecked. Select the option to Merge all outputs to a single assembly and enter a name for the DLL file, for example MyWebProject.Precompiled.
  7. Close out of the dialogs. You can push the Publish button to test your profile. Once the compile is complete, you should be able to go into your project directory and see a new folder called precompiled. Inside it you will find the bin folder, where you will see some new compiled DLLs that weren’t there before. Those are your precompiled views.

If you look in the Properties folder of your project you should have a new folder called PublishProfiles containing an XML file with the profile configuration. Here is a sample of what it may look like:

<?xml version="1.0" encoding="utf-8"?>
<!--
This file is used by the publish/package process of your Web project. You can customize the behavior of this process
by editing this MSBuild file. In order to learn more about this please visit http://go.microsoft.com/fwlink/?LinkID=208121.
-->
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <WebPublishMethod>FileSystem</WebPublishMethod>
    <LastUsedBuildConfiguration>Release</LastUsedBuildConfiguration>
    <LastUsedPlatform>Any CPU</LastUsedPlatform>
    <SiteUrlToLaunchAfterPublish />
    <LaunchSiteAfterPublish>True</LaunchSiteAfterPublish>
    <PrecompileBeforePublish>True</PrecompileBeforePublish>
    <EnableUpdateable>False</EnableUpdateable>
    <DebugSymbols>False</DebugSymbols>
    <WDPMergeOption>MergeAllOutputsToASingleAssembly</WDPMergeOption>
    <UseMerge>True</UseMerge>
    <SingleAssemblyName>MyWebProject.Precompiled</SingleAssemblyName>
    <ExcludeApp_Data>False</ExcludeApp_Data>
    <publishUrl>$(ProjectDir)precompiled</publishUrl>
    <DeleteExistingFiles>True</DeleteExistingFiles>
  </PropertyGroup>
</Project>

MSBuild Precompiling Views in TeamCity

Now that we have a publishing profile set up, the next step is to automate the precompilation step in TeamCity.

  1. Add a new MSBuild step to your current build configuration (you do have one set up already to compile your project, right?). We will want this to be one of the last steps in our configuration.
  2. Give it a name, point the build file path to your solution file, and set the command line parameters to the following:
/p:DeployOnBuild=true
/p:PublishProfile=<YourPublishProfileName>.pubxml
/p:VisualStudioVersion=14.0
/p:Configuration=Release
/p:AspnetMergePath="C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools"

And that’s it: TeamCity will invoke MSBuild using the publishing profile we created earlier and generate the precompiled DLLs.

If you are going to be deploying using Octopus, make sure the Run OctoPack option is checked in the build step.

Creating an Octopus Package

The last step is to take our precompiled application and package it up for Octopus to deploy. The first thing we need to do is create a .nuspec file in our project and make sure its Build Action is set to Content. This will tell OctoPack how and what to package in our project. Name the .nuspec file the same as your web project and enter the following:

<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
 <metadata>
  <id>MyWebProject</id>
  <title>MyWebProject</title>
  <version>0.0.0.0</version>
  <authors>Me</authors>
  <description>The MyWebProject deployment package</description>
  <releaseNotes></releaseNotes>
 </metadata>
 <files>
  <file src="precompiled\**\*.*" target=""/>
  <file src="Web.*.config" target=""/>
 </files>
</package>

Basically we’re telling OctoPack some basic information about our project, and to include everything in the precompiled folder in our package. We are also asking OctoPack to include any extra config transforms; this is optional, but necessary if you wish to perform config transformation during your Octopus deploy process.

That should be it. Now when TeamCity runs, it will tell MSBuild to precompile all your views into one or more DLLs using the publishing profile you created. Once that is done, it will invoke OctoPack, which will look at the .nuspec file in your project and create an Octopus package containing the contents of the precompiled folder. You can then push that package to your Octopus server, where it can be deployed to your web servers.
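As an aside, if you’d rather not push the package manually, OctoPack can publish it to your Octopus server’s built-in feed as part of the same build via its MSBuild properties. A sketch (the server URL and API key below are placeholders):

/p:RunOctoPack=true
/p:OctoPackPublishPackageToHttp=http://your-octopus-server/nuget/packages
/p:OctoPackPublishApiKey=API-XXXXXXXXXXXXXXXX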

Compile-Time View Validation in ASP.NET MVC

Open up your favorite .cshtml file, put the mouse cursor in the middle of some Razor code, and have a cat walk across your keyboard. If you don’t have a cat nearby, rolling your face on your keyboard will also suffice. You should start seeing things highlighted and underlined in red. Now go ahead and build your project.

Build Succeeded – Really?

Unlike all the other code in your project, your view files are not compiled when you hit the Build button in your IDE. Instead, they are compiled on demand by IIS the first time someone tries to access them – dynamically turned into an alphanumerically named DLL. The problem is that any errors in your views won’t become apparent until IIS tries to compile them, at which point the user who requested the view will see an error page. So how do you protect yourself from this happening?

Pre-compilation To The Rescue

Pre-compiling Razor views is possible; there are projects out there that will allow you to turn your views into DLL files before they even touch an IIS server. However, doing so in this case would be overkill; we just want to know if there are obvious errors in our views.

To let you find those compile-time bugs, there’s a flag you can set in your .csproj file:

<MvcBuildViews>true</MvcBuildViews>

This will cause your views to be test compiled when your project is built. Why do I emphasize test compiled? Because they aren’t compiled in the traditional sense, where you end up with resulting DLL files; they will still need to be dynamically compiled by IIS later on. It’s just a test to see whether any errors will be thrown when IIS compiles them later.
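For the curious, the property is consumed by a target that ships in the ASP.NET MVC project templates; it looks roughly like this (your template’s version may differ slightly):

<Target Name="MvcBuildViews" AfterTargets="AfterBuild" Condition="'$(MvcBuildViews)'=='true'">
  <!-- Test compile the views in the build output using the ASP.NET compiler -->
  <AspNetCompiler VirtualPath="temp" PhysicalPath="$(WebProjectOutputDir)" />
</Target>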

You will find that this setting is false by default, and there’s a good reason for that – view compilation takes time. In a large enough project it could take enough time to seriously annoy a developer who is used to quick compiles. A medium-sized project of around 70 views saw its compile time grow by 36 seconds when this feature was enabled.

But there’s a compromise: instead of having your views test compile during every build, we can set it to only test compile when performing a release build. If you look in your .csproj file, you will find a PropertyGroup block for each build configuration in your project. Find your release build configuration and add the MvcBuildViews property. In this example my build configuration is simply called Release.

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
    <MvcBuildViews>true</MvcBuildViews>
    ...
</PropertyGroup>

This way the debug builds you do on your machine will run fast, while the builds that run on your build server will take a bit longer and validate that all your views compile. If a view can’t be compiled, the build fails, and the code is never deployed to an IIS server.

Ignoring time when filtering dates in Telerik Kendo grids

This is what the filter looks like on a Telerik Kendo grid when Filterable(true) is set on a DateTime column.

If I were a user, I would expect the grid to return all rows that match the date (8/13/14) regardless of the time associated with that date. Whether it’s 8/13/14 02:00 or 8/13/14 17:41, the expectation is that they should all appear, because I am asking the grid to show me all the data that occurred on that date.

Instead, the Kendo grid defies that expectation and will only return data that precisely matches the date at midnight, i.e. 8/13/14 00:00:00. I’ve had users who were convinced this behavior was actually a defect, when it was just a case of it being really unintuitive.

So the goal is to modify the filtering behavior in the grid to effectively ignore the time and only use the literal date when filtering data, while still preserving the ability to sort the data by time.

After doing the prerequisite search around the Telerik forums and StackOverflow, it became quite clear that the existing solutions are messy hacks that either involve some trickery in the underlying model that is bound to the grid (ewww, no) or some nasty JavaScript (for the love of kittens, no).

The basis of my solution involves making use of a custom DataSourceRequest attribute that implements a custom model binder. The custom model binder will iterate through the filters being applied to the grid and transform them accordingly.

What do I mean by transform? Here are some examples of what happens:

IsEqual("08/13/14")

becomes:

IsGreaterThanOrEqual("08/13/14 00:00:00") AND IsLessThanOrEqual("08/13/14 23:59:59")

And another example:

IsLessThanOrEqual("08/13/10") AND IsEqual("08/13/14")

becomes:

IsLessThanOrEqual("08/13/10 23:59:59") AND IsGreaterThanOrEqual("08/13/14 00:00:00") AND IsLessThanOrEqual("08/13/14 23:59:59")

Using the same logic, I apply the transformation to all the other possible filter operators (is not equal to, is greater than, and so on).

So first, let’s start by extending the default Kendo DataSourceRequest attribute.
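The stock [DataSourceRequest] is just a custom model binder attribute, so all ours needs to do is hand back our custom binder. A minimal sketch (class and namespace names are mine; the full version is in the Gist linked at the end of this post):

using System.Web.Mvc;
using Kendo.Mvc.UI;

namespace MyApplication.Common
{
	// Behaves like [DataSourceRequest], but wires up our custom model binder
	public class CustomDataSourceRequestAttribute : DataSourceRequestAttribute
	{
		public override IModelBinder GetBinder()
		{
			return new CustomDataSourceRequestModelBinder();
		}
	}
}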

We will use this attribute to decorate our request data when reading data for our grid. Next is the heart of our solution, the custom model binder.
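Here is a trimmed-down sketch of the binder; it leans on Kendo’s stock DataSourceRequestModelBinder to do the actual request parsing and only rewrites the DateTime filters afterwards. The “Is Null” and high-precision handling mentioned in the updates below are omitted here; see the Gist for the full version:

using System;
using System.Collections.Generic;
using System.Web.Mvc;
using Kendo.Mvc;
using Kendo.Mvc.UI;

namespace MyApplication.Common
{
	public class CustomDataSourceRequestModelBinder : IModelBinder
	{
		public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
		{
			// Let the stock Kendo binder parse the request, then rewrite its date filters
			var request = (DataSourceRequest)new DataSourceRequestModelBinder()
				.BindModel(controllerContext, bindingContext);
			TransformFilterDescriptors(request.Filters);
			return request;
		}

		private static void TransformFilterDescriptors(IList<IFilterDescriptor> filters)
		{
			for (var i = 0; i < filters.Count; i++)
			{
				// Composite filters (two or more conditions on one field) are walked recursively
				var composite = filters[i] as CompositeFilterDescriptor;
				if (composite != null)
				{
					TransformFilterDescriptors(composite.FilterDescriptors);
					continue;
				}

				var filter = filters[i] as FilterDescriptor;
				if (filter == null || !(filter.Value is DateTime))
					continue;

				var dayStart = ((DateTime)filter.Value).Date;    // 00:00:00
				var dayEnd = dayStart.AddDays(1).AddSeconds(-1); // 23:59:59

				switch (filter.Operator)
				{
					case FilterOperator.IsEqualTo:
						// "equals 8/13" becomes ">= 8/13 00:00:00 AND <= 8/13 23:59:59"
						filters[i] = new CompositeFilterDescriptor
						{
							LogicalOperator = FilterCompositionLogicalOperator.And,
							FilterDescriptors = new FilterDescriptorCollection
							{
								new FilterDescriptor(filter.Member, FilterOperator.IsGreaterThanOrEqualTo, dayStart),
								new FilterDescriptor(filter.Member, FilterOperator.IsLessThanOrEqualTo, dayEnd)
							}
						};
						break;
					case FilterOperator.IsNotEqualTo:
						// "not equal to 8/13" becomes "< 8/13 00:00:00 OR > 8/13 23:59:59"
						filters[i] = new CompositeFilterDescriptor
						{
							LogicalOperator = FilterCompositionLogicalOperator.Or,
							FilterDescriptors = new FilterDescriptorCollection
							{
								new FilterDescriptor(filter.Member, FilterOperator.IsLessThan, dayStart),
								new FilterDescriptor(filter.Member, FilterOperator.IsGreaterThan, dayEnd)
							}
						};
						break;
					case FilterOperator.IsGreaterThan:
						// "after 8/13" should exclude the entire day
						filter.Value = dayEnd;
						break;
					case FilterOperator.IsGreaterThanOrEqualTo:
						filter.Value = dayStart;
						break;
					case FilterOperator.IsLessThan:
						// "before 8/13" should exclude the entire day
						filter.Value = dayStart;
						break;
					case FilterOperator.IsLessThanOrEqualTo:
						filter.Value = dayEnd;
						break;
				}
			}
		}
	}
}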

First, notice the recursive calls to TransformFilterDescriptors(); this handles cases where the user may be requesting two or more different filters for a field. If you read through the comments in the code, you will see where the original filter logic is translated into a single or composite filter with the time set to 00:00:00 or 23:59:59 to match the appropriate situation.

Finally, we decorate the Kendo DataSourceRequest being passed into our Actions with our new [CustomDataSourceRequest] attribute.
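A basic Action would look something like this, assuming a standard ToDataSourceResult read method (the db context and Orders entity are placeholders):

public ActionResult Orders_Read([CustomDataSourceRequest] DataSourceRequest request)
{
	// By the time we get here, the binder has already rewritten any DateTime filters
	var result = db.Orders.ToDataSourceResult(request);
	return Json(result, JsonRequestBehavior.AllowGet);
}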

The added benefit of this is that there is absolutely no front-end work – no JavaScript or view model tweaking, and no page- or model-specific modifications. The solution is generic enough to work across all the grids and models in your application.

The full code from this post is available on GitHub as a Gist.

Update (2015/05/03): While at the Build 2015 conference I had a chance to speak with some of the folks at Telerik working on Kendo UI. While they do acknowledge that the current DateTime filter behavior isn’t very intuitive, their concern with making it the default is that it will affect people who expect that functionality in existing applications. So it looks like we have to make do with the solution above, at least for now.

Update (2017/11/28): Updated the code to handle the “Is Null” and “Is Not Null” filters for nullable dates. Also updated the logic to support high-precision DateTime values. I also want to note that if you are filtering UTC DateTime objects, you will need to add a call to .ToUniversalTime() at the end of any DateTime constructors inside the main switch loop of the TransformFilterDescriptors() method.

Dynamically Generating Lambda Expressions at Runtime From Properties Obtained Through Reflection on Generic Types

Lately I’ve been having to export some of my data entities into CSV files, and I’ve been using the CsvHelper NuGet package to achieve this. As is common, property names don’t translate well into readable column headers, so you have to provide some kind of property-to-string mapping.

This is how CsvHelper handles it:

namespace MyApplication.CSVMapping
{
	public class MyModelCsvMap : CsvClassMap<MyModel>
	{
		public override void CreateMap()
		{
			Map(m => m.Id).Name("Model Id");
			Map(m => m.Description).Name("Model Description");
			Map(m => m.StartDate).Name("Start Date");
			Map(m => m.EndDate).Name("End Date");
			Map(m => m.RunDate).Name("Run Date");
		}
	}
}

Nothing too fancy: just passing my model type into the base class, then going through each class member and setting the Name property.

However, as is also common, I may also have a form tied to this model, and I want to use the built-in DataAnnotations to set the form labels for each field, like so:

namespace MyApplication.Models
{
	public partial class MyModel
	{
		[DisplayName("Model ID")]
		public int Id { get; set; }
		[DisplayName("Model Description")]
		public string Description { get; set; }
		[DisplayName("Start Date")]
		public DateTime StartDate { get; set; }
		[DisplayName("End Date")]
		public DateTime EndDate { get; set; }
		[DisplayName("Run Date")]
		public DateTime RunDate { get; set; }
	}
}

Noticing some redundancy here? Could I perhaps have CsvHelper get the property column header names from the DisplayName attribute in the model, rather than having to create a separate CsvClassMap? That way I wouldn’t have to repeat my property-to-string mappings.

For this I will have to create a generic version of the CsvClassMap class, which takes in my entity type. From there I can get all the properties of that type and start iterating through them. For each property, I check if it has a DisplayName attribute, and if it does, get its value. The tricky part is passing the property into CsvHelper’s Map method, which expects an Expression<Func<TEntity, object>>. Here’s the complete code:

using System;
using System.ComponentModel;
using System.Linq;
using System.Linq.Expressions;
using System.Reflection;
using CsvHelper.Configuration;

namespace MyApplication.Common
{
	public class BaseCsvMap<TEntity> : CsvClassMap<TEntity> where TEntity : class
	{
		public override void CreateMap()
		{
			// Reflect over every public property on the entity type
			PropertyInfo[] props = typeof(TEntity).GetProperties();
			foreach (PropertyInfo prop in props)
			{
				// Only map properties that carry a [DisplayName] attribute
				var displayAttribute = prop.GetCustomAttributes(false).FirstOrDefault(a => a.GetType() == typeof(DisplayNameAttribute)) as DisplayNameAttribute;
				if (displayAttribute != null)
				{
					// Build x => (object)x.Property by hand, since the property is only known at runtime
					var parameterExpression = Expression.Parameter(typeof(TEntity), "x");
					var memberExpression = Expression.PropertyOrField(parameterExpression, prop.Name);
					var memberExpressionConversion = Expression.Convert(memberExpression, typeof(object));
					var lambda = Expression.Lambda<Func<TEntity, object>>(memberExpressionConversion, parameterExpression);
					Map(lambda).Name(displayAttribute.DisplayName);
				}
			}
		}
	}
}

That should be fairly self-explanatory. The only strange “gotcha” is having to call Expression.Convert() before constructing the lambda expression. This is because the expression explicitly expects object as its type, while your entity likely contains typed members, i.e. strings, ints, decimals, etc.

You can also modify the above class to work with any custom attributes that you may have defined; just remember to pass true into the GetCustomAttributes() method.
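To put the map to use, register it with CsvHelper’s configuration before writing records. A quick sketch against the CsvHelper 2.x-era API that matches the CreateMap() override above (the file name and record list are placeholders):

using (var writer = new StreamWriter("models.csv"))
using (var csv = new CsvWriter(writer))
{
	// BaseCsvMap reflects over MyModel and maps every [DisplayName] property
	csv.Configuration.RegisterClassMap<BaseCsvMap<MyModel>>();
	csv.WriteRecords(myModels);
}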

Converting System.Data.Linq.Binary to an ASCII-Encoded String

I ran into a situation where I was storing files in a database as a binary field. Some of these files were in HTML format, and I wanted to dynamically display them inside of some ASP pages I was developing. After a bit of research, I found that a lot of the solutions other people were using involved timestamp conversions, which are not ASCII-based like HTML. The following code will convert your binary object into ASCII so it can be displayed as it was originally uploaded.

// System.Data.Linq.Binary exposes its raw bytes via ToArray()
byte[] myByteArray = myBinaryObj.ToArray();
// Decode those bytes as ASCII to recover the original text
System.Text.ASCIIEncoding enc = new System.Text.ASCIIEncoding();
string result = enc.GetString(myByteArray);

Releasing Files In Use By Other Processes

When developing an application that uses a SQL Server Compact Edition database, you may run into a problem getting your application to build if you frequently compile it to test changes. Specifically, the following error:

Problem generating manifest. The process cannot access the file ‘C:\…\mydb.sdf’ because it is being used by another process.

The problem is that your application didn’t properly release its lock on the SQLCE database file the last time you ran it. I find this especially happens when you’re debugging and hit an unhandled exception. Since your application runs as a child of the devenv.exe (Visual Studio) process, closing and reopening Visual Studio will release the lock on the SDF file and allow you to successfully compile again. Obviously, restarting Visual Studio every time you want to test your application isn’t very convenient.

There is an easier solution to this problem. You’ll need to download Process Explorer, a free utility provided by Microsoft. According to the website, “Process Explorer shows you information about which handles and DLLs processes have opened or loaded”. This is precisely what we need to release the SDF file that Visual Studio has taken hostage.

So open up Process Explorer and, using the “Find Handle or DLL…” feature, search for “sdf”. You may end up with several results, but what you’re looking for is the SDF file that you use in your application. Once you find it, double-click it. The file will then appear highlighted in the bottom half of the window; right-click it and select “Close Handle”. The lock on the file will be released, allowing you to successfully build your application without getting manifest generation errors.