Release Management Using VSTS

If you have been tracking Azure at all lately, you know that it is growing at a breakneck pace. Microsoft has really turned up the volume on their enterprise cloud at all levels. Just diving in can sometimes be a rough experience: Azure is a wide-open field with many paths to solve your problem. I would like to show you the path I have found to get release management up and running for my complex micro-modularity and microservices.

In the last post we created an ASP.NET Project for Visual Studio Team Services (VSTS) Release in a minimal way.  Now we will check ‘the check box of continuous integration’ and ‘push the button of continuous deployment’. Then we will add a second deployment environment to get your creative juices flowing.

Pre-requisites

So to be clear, there is some configuration you need to do up front that I won’t cover here. Some of the setup is clearly part of the awkward dance of preview services, but if you want to get ahead of the curve and take advantage of these services, I can attest to it being well worth it. They only need to be set up once.

We will:

  • Create an Azure Build definition
  • Create an Azure Release definition
  • Deploy an Azure Resource Group containing a Web Site
  • Configure a second environment

For the purpose of this tutorial you need the following:

  • A project to deploy with Azure Resource Templates (follow the tutorial, or fork it)
  • A VSTS account with a project, the RELEASE hub enabled, and a version control system (VCS). It is easy enough to use the hosted Git, GitHub, or most any Git repo visible to the Internet.
  • A service endpoint configured in Team Services or TFS with rights to connect to your Azure subscription.
  • An agent that is capable of running the deployment tasks. I am just using the one on VSTS.
  • If you are using GitHub for version control, you can connect it using a personal access token.

Once you have the above resources you should be able to walk through the rest of this Release Management tutorial for Visual Studio Team Services.

Azure Build Definition

In your project, click over to Build. You will see a nice little green plus on the left-hand side that will create a new build definition. Select the Visual Studio template and select the repo you are going to use.

[Screenshot: New Build]

[Screenshot: Connect Version Control System]

[Screenshot: Edit Build]
If you are using GitHub, you can specify it later when you edit the configuration. This template gives you a nice list of tasks that will handle our simple situation. It is easy to add and remove tasks; I would encourage you to do so when you finish this tutorial. The build configuration is very robust, able to trigger on-prem builds and even use crazy automation tools like Ant, Gradle, Grunt, Gulp, Maven… go nuts.

Version Control

The easiest thing to do is use the Git repo built into the project, but I am going to build the example code from GitHub. You can follow the link above to connect it to your VSTS account for use. The configuration looks like the following using the built-in VCS or GitHub.

[Screenshot: Repository configuration]

Web Deploy Package

To create the deployment package, MSBuild essentially deploys to a zip file. To trigger this you need to add ‘MSBuild Arguments’ on the Visual Studio Build step. In that field type:

/p:DeployOnBuild=true;PublishProfile="Web Deploy Package"

After you make the change, click Save. A great feature of the system is version control of the build definitions: you are allowed to rename the build here and add a comment. If you comment every save you create a rich history of changes, each of which can be scrutinized on the History tab.

[Screenshot: Save dialog]

Continuous Integration (CI)

At this point, verify you have checked ‘the check box of continuous integration’. Note that you can also schedule builds, if you prefer ‘Nightly’ to ‘Continuous’.

[Screenshot: the check box of continuous integration]

Build

Now comes the fun part: click Queue Build and watch it do its thing. When it is done you will find that the logs are saved, and you can browse the artifacts created or download them as a zip.

Azure Release definition

Now click over to the Release tab. Release uses the same engine as Build, so you will see many similarities, and you can do many of the same tasks. The layout should be familiar, so click the green plus sign over on the left that says it is used to ‘Create a release definition’. Choose the Azure Website Deployment template and click OK. This starts you out with a single environment that has a couple of tasks.

Here the term ‘environment’ means no more than a single set of tasks, variables and sign-offs that can be strung together to make an automated release process.

You will note that you are encouraged to test often, since the second task you are given is a test task. I personally run unit tests during build and integration tests after a release. If you wanted, you could even have a little sanity test that runs when you release to production.

First, configure the Azure Web App Deployment task so you can save. Simply select your Azure subscription endpoint, give your app a name, and select a location near you for now. Give the release configuration a name and click Save.

Infrastructure Deployment

Right now you could click Deploy, and the thing would just go, use defaults, and you would get a website in your Azure account. However, you would not have any real control over how it would be billed, and it would use whatever web.config you checked in. So in this step we will take control of all of this with a couple of little JSON files.

Click Add Task. Under Deploy, click Add next to Azure Resource Group Deployment. I will admit that when I first saw this I thought it sounded grand and complex. However, a couple of clicks later I was elated with how simple it is.

When you click Close you will notice the task is incompletely configured by default and sits at the end of your tasks. Make it the first task by dragging it to the start of the list. Then select your Azure subscription service endpoint (I use a Service Principal for this). Then name the resource group. Later, when you master variables, you can name everything by convention using an environment-specific variable. I always add ‘-resources’ to the resource group name for additional clarity.

Now click the ellipsis on the Template line. It will tell you that you don’t have any build definitions linked. Click ‘Link to a build definition’, select the build definition you made earlier as the Source, and click the Link button.

Now you have a tree of the artifacts created by that build when you ran it last. The files you want will be under ‘drop’, in the folder named for your Resource Group project. Then under bin/Release/Templates select the WebSite.json file and click OK. Assign the Template Parameters value by clicking the ellipsis again, browsing to the same location, and selecting WebSite.parameters.json.

Now you are at the point where the magic starts and where things really started to click for me. Because you defined the website name and connection string as parameters, you can assign them via the Override Template Parameters field. In this field set your values like this:

-hostingPlanName "myPlanName" -websiteName "myWebsiteName" -connectionString "Server=tcp:mydatabase.database.windows.net,1433;Database=myDbName;User ID=myUsername;Password=MyPassword;Encrypt=True;"

Release

One last thing to check before you release is that your Locations match between the Resource Group Deployment and the Web App Deployment. Then click Save.

Now the point of all this is to get you to continuous deployment. To enable it, click the Triggers tab, check the check box, select your artifact source, and select your environment. Then click Save. Now when you check in a change, it will build; when done building, it will release.

[Screenshot: Release Trigger]

To give it a test without checking anything in, click the Release drop-down and select Create Release. You can choose the artifacts from the build you ran earlier, select the environment you just configured, and click Create. If the process succeeds you can verify by viewing the resource groups in the Azure Portal. The great thing about resource groups is that to remove everything you just released to Azure, you simply delete the resource group. To deploy it again, make a commit or release it manually.

The great thing about using the resource template is that if you make changes, the environment will be updated to reflect them, while you keep the progression of the environment under version control.

Configure a second environment

To understand why I think Release environments are cool, click the ellipsis on the Default Environment and choose ‘Configure variables’. You will notice there are a variety of them predefined. Let’s create one of our own.

[Screenshot: environment ellipsis menu]

At the bottom of the list click Add Variable. Name it ‘websiteName’ and give it a value you like. Click OK, and go back to your Resource Group Deployment task. In the Resource Group field type:

$(websiteName)-resources

Then in your Override Template Parameters put $(websiteName) after -websiteName. Select the Azure Web App Deployment task and put the variable $(websiteName) in the Web App Name field.
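With the variable in place, the Override Template Parameters value from earlier becomes something like this (the plan name and connection string are still the placeholder values from above):

-hostingPlanName "myPlanName" -websiteName "$(websiteName)" -connectionString "Server=tcp:mydatabase.database.windows.net,1433;Database=myDbName;User ID=myUsername;Password=MyPassword;Encrypt=True;"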

For a second, imagine you have a much more complex release environment configured. Now, click the ellipsis next to Default Environment again, click ‘Clone environment’, and call it something like QA.

[Screenshot: clone environment menu]

Now configure variables on that environment and change the value next to websiteName, perhaps appending ‘-qa’ or something. Click OK and save the definition. I don’t know about you, but the first time I did something like this I giggled enough to make everyone in the cubes around me feel uneasy.

Where to go from here

From here you can add more variables, parameters, and infrastructure. Add a database or other services and make a self-contained set of services so you can spin up tests of all sorts. There are many tutorials out there for expanding on your JSON templates, and great functionality built into Visual Studio (maybe Code too) to help you edit these configurations. I would be interested in where you personally take things from here.

In a future post I will dig a little deeper into how I have overcome the issues of managing many tiny libraries using private NuGet repositories and multiple Git repositories.

After looking at VSTS Release, how does this compare to other tools you have used?

ASP.NET Project for VSTS Release

So you have discovered the intense desire to manage your infrastructure as code and continuously deploy with your eyes closed. It is true: continuous integration and continuous deployment, once implemented, open a whole new set of opportunities. For me personally, the most attractive of these is micro-functionality. Microservices are all the rage, but on a more fundamental level, it is hard to manage complex modularity without CI and CD.

For this reason I am going to assemble a reference ASP.NET project that will demonstrate a common scenario in which the developer and Visual Studio are removed from the deployment process, and endless deployments are made possible through Azure Resource Group template parameters. In a second post I will walk through setting up the Azure side of things.

Connection Strings and Migrations with EF6

Entity Framework 7 seems to be shaping up as a key part of my development stack. However, there are a host of things it does not yet do, so I have stuck with Entity Framework 6 for most of my current development. One of the flaws I have found in EF6 is that it is difficult to get it to use a specific connection string for code migrations. This makes automated migrations with VSTS Release even more difficult. There are even Stack Overflow questions about it that venture down some dark paths to get it right.

The method I have chosen when using EF6 is to make sure my migrations are clean and to selectively enable automatic migrations. Then I assign a connection string per deployment as described here. So far this is the best and only way I can find that makes sense. If you have other ideas, let me know in the comments.

Looking at Infrastructure as Code, it can seem like a daunting task, especially if you are like me and have had a brush with the sickening array of tools that are needed in some environments to get the job done. Tools like Jenkins, Chef, Puppet, Ansible… sure, they are OSS, so they deserve support. But on Azure, all you need for IaC is a couple of JSON files: no extra software, and no admin involvement to get the right packages installed or make sure the OS is configured correctly.

So if you want to skip to the end, grab the JSON and jump to the next post to see where VSTS uses them. I will try my best to give you a concise walk through from here on out so you can understand the things I like about this solution.

Visual Studio Solution

I will assume a level of comfort with Visual Studio, and save bits on the Internet by being terse here. From VS, choose to create a new project. From the New Project dialog select ASP.NET Web Application. For fun, select MVC in the New ASP.NET Project dialog and then check Web API. Leave ‘Host in the cloud’ unchecked because we will be taking a different path. Click OK to create the solution.

Next we need to configure the web project with a publish profile that creates a deployment package when built on VSTS. Right-click the web application project and select ‘Publish’. On the Publish Web dialog select ‘Custom’. When prompted for a name, I always use ‘Web Deploy Package’ so I can reuse the same build template over and over without a bunch of reconfiguring. On the next screen select a publish method of ‘Web Deploy Package’. For a package location, choose somewhere that will be picked up by the artifacts step(s) in your build task. This usually defaults to a mask like “**\bin\$(BuildConfiguration)\**”, so if you choose your “bin\Release” directory you can get going quickly. Then go ahead and click Publish to see what happens. When you check things in, it is best to leave the bin directory out of your commit, so this configuration will save you that way too.

You will see that deploying a Web API or OData project is as easy as deploying a website. Using this method you can add any service offered by Azure, like Service Fabric, or set up VMs if you like mucking about at the OS level. I hope that after you take this first step you will try some other crazy things. Just remember to delete the resource group after you play with it. As a side note, deleting a resource group in Azure removes everything in it, so when you get down to deploying this you can simply delete the resource group and not worry about unknown things hanging out to penny-and-dime you through the month.

After you have your solution with a website, right-click it and add a new project. In the Add New Project dialog select Azure Resource Group. In the Select Azure Template list select ‘Web app’ and click OK. There are other interesting options here, but for this demo I want to show that there is already a DB up and running that this app will connect to.

Infrastructure as Code (IaC)

Don’t be afraid. As I have written in the past, there is no need for name soup or acronym mayhem. Just a JSON file and a mouse.

At the time of writing, the Azure Resource Group template that I have installed is using API version ‘2015-08-01’. You can verify this by opening WebSite.json in the Azure Resource Group project. The default templates can certainly get you going as is. However, there is no way to pass in a connection string or application configuration. By default the template also makes some junked-up website name that I dislike, so we will tweak that too.

In WebSite.json we want to add the website name and connection string as parameters so we can assign them during release. First, add parameters for `websiteName` and `connectionString` as shown in this Gist. You can simply delete the `webSiteName` variable and replace all instances of `variables('webSiteName')` with `parameters('webSiteName')`. Then you need to add the section that inserts the values into the Web App environment in the `Microsoft.Web/sites` section, as seen in the Gist.
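The Gist itself is not reproduced here, but here is a minimal sketch of the shape those edits take; the parameter names and the ‘DefaultConnection’ key are assumptions, so adapt them to your own template and web.config. First the parameters:

"parameters": {
    "hostingPlanName": { "type": "string" },
    "webSiteName": { "type": "string" },
    "connectionString": { "type": "securestring" }
}

Then, nested in the resources array of the `Microsoft.Web/sites` resource, a config resource injects the connection string:

{
    "apiVersion": "2015-08-01",
    "type": "config",
    "name": "connectionstrings",
    "dependsOn": [
        "[resourceId('Microsoft.Web/sites', parameters('webSiteName'))]"
    ],
    "properties": {
        "DefaultConnection": {
            "value": "[parameters('connectionString')]",
            "type": "SQLAzure"
        }
    }
}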

NEXT

I hope this gets you on your way, and perhaps lets you see the potential of parameterized Azure IaC. Next, you should commit this to your project on VSTS (although you could just as easily use GitHub with VSTS Build and Release). In the next article I will walk you through configuring Azure to deploy the same code to different environments, each with its own connection string.

git the code here


Implementing IQueryable on a Business Logic Layer – Part 2

In Part 1 of this series I explained the sort of situation I am building towards. In short, I am building a highly modularized scaffolding system geared toward a microservice architecture.

Central to the functionality are a data object decorator and a gateway class that completely abstract away Entity Framework. It was trivial to implement Take() and Skip(), passing the values to an internal IQueryable from EF. Then I implemented ToList() to do a select into the decorator. As I mentioned in the previous post, that is great for doing LINQ against the gateway, but you can’t cast it to an IQueryable as is expected in various places like Web API.

Querying about the web I found a project with great potential called Relinq. The project claims to be used by Entity Framework 7 and NHibernate, after all. However, as I read through the introduction page I noted the phrase “specification framework” and realized I might be reaching too deep by starting there. So I kept looking.

Moving up the stack I found an interesting repository containing what was called QueryInterceptor. The goal of QueryInterceptor is to intercept the query just before Execute(). This is an interesting concept. Even so, I decided I would keep looking for something simpler. I may need to optimize later, so I am keeping these projects bookmarked.

Several times I came across a project called Linq To Anything. I will be honest and say that the name prompted me to overlook it on several occasions. When I was not finding what I knew must exist, I took a closer look. Linq To Anything is built on QueryInterceptor and System.Linq.Dynamic. This prompted me to try an implementation and take a look at the code. It looks deceptively easy to use, but in my experiments so far it fits my needs.

Looking at unit tests is my favorite way to see how a project is meant to be used. What I found was that I could simply implement the properties needed for IQueryable and use LinqToAnything.QueryProvider<T> to expose a protected method, which I called ApplyQueryInfo(). Inside this method I specify how to pass Where, OrderBy, Take and Skip.

It ended up looking something like this:

#region IQueryable implementation

// The expression the query starts from: a constant pointing at this gateway.
public System.Linq.Expressions.Expression Expression => System.Linq.Expressions.Expression.Constant(this);

// The element type exposed to consumers of the IQueryable.
public Type ElementType => typeof(IMySpecificDecorator);

// LinqToAnything's provider: one delegate for data access, one for Count().
public IQueryProvider Provider => new LinqToAnything.QueryProvider<IMySpecificDecorator>(this.ApplyQueryInfo, (qi => this.ApplyQueryInfo(qi).Count()));

protected IEnumerable<IMySpecificDecorator> ApplyQueryInfo(LinqToAnything.QueryInfo queryInfo)
{
    // Forward each Where clause to the gateway's own Where().
    foreach (var clause in queryInfo.Clauses.OfType<LinqToAnything.Where>())
    {
        this.Where((System.Linq.Expressions.Expression<Func<IMySpecificDecorator, bool>>)clause.Expression);
    }

    // OrderBy is not implemented on the gateway, so apply it to the
    // underlying query before Take() and Skip().
    if (queryInfo.OrderBy != null)
    {
        var orderBy = queryInfo.OrderBy.Name;
        if (queryInfo.OrderBy.Direction == LinqToAnything.OrderBy.OrderByDirection.Desc)
            orderBy += " descending";

        this.CurrentQuery = this.CurrentQuery.OrderBy(orderBy);
    }

    if (queryInfo.Take != null && queryInfo.Take.Value > 0) this.Take(queryInfo.Take.Value);
    if (queryInfo.Skip > 0) this.Skip(queryInfo.Skip);

    // Materialize and return the results.
    return this.ToList();
}

#endregion

The key to what is happening here is this.CurrentQuery. It keeps a reference to the IQueryable as its various LINQ methods are called. Notice I have not implemented OrderBy() on the gateway, so I simply add it to the current query and apply it before Take() and Skip().

I have to admit, I was headed for a complex implementation before this. This will work well for my first implementation. At some point I will re-evaluate it for possible refactoring. When I do, you can be sure I will post about it here.

What do you think of my solution? What has been your experience with IQueryable?


Implementing IQueryable on a Business Logic Layer – Part 1

So you might be thinking this is a bad approach from the start; that this sort of functionality belongs in the DAL, or at least in a Repository. However, in this age of microservices and in the context of complex applications, business logic will exist in several layers. Think of it in MVVM or N-Tier style, where there are complex validations and business logic that runs faster when closer to the DAL in a multi-tier environment. In this particular instance I am exposing this sort of module via OData and as an Actor via Azure Service Fabric.

Getting the internals right is particularly important for me because I am using extensive code generation via T4. If I get it right, it quickly extends to my whole infrastructure of microservices.

Early on I thought I could implement just parts of IQueryable and get away with it. I tried only implementing Where(), Skip() and Take() without the host of classes needed for an actual IQueryable. This worked great while my code was loosely coupled to the implementation and blindly executed only these operations.

The catch is that I couldn’t just cast a partial implementation to IQueryable for things like Web API to use. It would be great to just implement these three operations and have some sort of generic decorator that bridges the implementation to an IQueryable. Alas, there is no such bridge in native .NET. Thus, we must help ourselves.

Poking around the web you will find an ancient Microsoft secret: Walkthrough: Creating an IQueryable LINQ Provider. Many posts about this subject fail to address the fact that you may not be using any sort of IQueryable under the hood. The MSDN walkthrough shows you how without directly addressing it.

At a high level: during the Execute() phase you will need to figure out what you can pass on, do so, and then execute the underlying query to return your list.  This list then becomes the subject of the remainder of the query.
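To make that concrete, here is a minimal sketch of the rewriting step, assuming a hypothetical gateway object (the names here are mine, not from the walkthrough). An ExpressionVisitor swaps the custom source out of the expression tree for the materialized list, so LINQ to Objects can evaluate whatever remains:

using System.Linq;
using System.Linq.Expressions;

// Replaces the constant referencing the custom IQueryable source with the
// materialized in-memory list so the rest of the tree runs in memory.
internal class SourceSwapper : ExpressionVisitor
{
    private readonly object _source;
    private readonly IQueryable _materialized;

    public SourceSwapper(object source, IQueryable materialized)
    {
        _source = source;
        _materialized = materialized;
    }

    protected override Expression VisitConstant(ConstantExpression node) =>
        ReferenceEquals(node.Value, _source) ? Expression.Constant(_materialized) : node;
}

// Inside the provider's Execute(): forward what you can (Where/Take/Skip),
// materialize with ToList(), then re-run the full expression in memory:
//
//   var list = ExecuteUnderlyingQuery(expression).AsQueryable();
//   var rewritten = new SourceSwapper(gateway, list).Visit(expression);
//   return list.Provider.Execute(rewritten);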

The following post will walk through my implementation thought process.


T4 Templates in Visual Studio

There are not many places on the web to learn T4. As I have been poking around T4 lately, here are a few notes I have gathered. One thing of interest is Razor Generator, which I have talked about before. It can be used as a T4-like preprocessor template, although I feel it is less mature.

Output Multiple Files

Generating multiple files from a single template seems to be a fundamental need in the community. I personally prefer multiple files because you can see inclusions and deletions quickly in the git log instead of having to do the extra step of a diff. Some folks think that it shouldn’t be done because it doesn’t work well. Despite it being such a simple thing, it seems everyone has a solution, or at least an opinion. As always I want to reuse the components closest to the core, so I start by reusing the EF file manager.

Entity Framework has a nifty file manager that you can use even if you are not using EF in the assembly you are generating into. You start by adding `<#@ include file="EF.Utility.CS.ttinclude" #>` to the beginning of your .tt file. Then create an instance via `var fileManager = EntityFrameworkTemplateFileManager.Create(this);`. Every time you want to start a new file you simply call `fileManager.StartNewFile(newFileName);`. At the end, tell it to process by calling `Process()` on the `fileManager` object.
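Assembled, that flow looks something like this minimal sketch (the file names are just examples):

<#@ template language="C#" hostspecific="true" #>
<#@ include file="EF.Utility.CS.ttinclude" #>
<#
    // Create the EF file manager for this template.
    var fileManager = EntityFrameworkTemplateFileManager.Create(this);

    // Everything rendered after this call lands in First.generated.cs.
    fileManager.StartNewFile("First.generated.cs");
#>
// contents of First.generated.cs
<#
    fileManager.StartNewFile("Second.generated.cs");
#>
// contents of Second.generated.cs
<#
    // Write the collected files to disk and add them to the project.
    fileManager.Process();
#>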

This method works fine, but I had trouble with my files disappearing every other time I hit save. That is, the first time I execute the script, it makes the file; the second time, the file is deleted; the third time, it is generated again. If anyone has a solution to that, leave me a comment.

To investigate further I decided to try the Tangible T4 template called TemplateFileManagerV2.1. It is intended to operate similarly: include `TemplateFileManagerV2.1.ttinclude`, instantiate `var fileManager = TemplateFileManager.Create(this);`, and use it the same way. The downside was it had the same problem.

Then I tried the DamienG (GitHub) solution, with the same results. That version really only adds a `fileManager.EndBlock()` after each new file’s content.

The next step was to go completely low-tech with Oleg’s SaveOutput() function. The issue there is that it doesn’t add/remove the file to/from the solution.

Since T4 Toolbox conflicts with the Tangible T4 editor I was using at the time, the T4 Toolbox option was the last I tried. It is also the most different. The documentation pushes you to use its template class method. This isn’t much of an issue, but it is a change. The first step is to create a T4 class that extends `Template`, with your output encapsulated within a method called `TransformText()`. Because of scope, you need to pass any values you need into the constructor of the class or otherwise assign them to the instance. Once your class is created, to send output to a particular file you set the property `<instance>.Output.File` to your file name. Then you call `Render()` on your instance. This method was predictable, but it took away my T4 editor, since Tangible is the only T4 editor for VS 2015.

Since I want an independent solution, and I like my T4 editor at the moment (I will investigate others as they become available for VS 2015), I returned to Oleg’s post, downloaded the source for `MultiOutput.tt`, and tried it. To my delight it worked, by simply calling `SaveOutput(<filename>)` at the end of the file content as described in the blog post mentioned above. But did it have the issue where the file disappeared every other time I hit save? Not at all, after extensive testing. Thus this is the solution I am using currently, and it seems to be rock solid.

I like the template method used in the T4 Toolbox. It seems a little cumbersome at first, but it brings the added benefit of template reuse and of decoupling the template from the manner in which you acquire the objects you are generating from. One could also create a custom class to pass into the template to further insulate the result from how you acquire the source data. At some point in the future I may do this and start putting my templates up on GitHub to share. If you are interested, let me know.

Generation Based on Existing Classes

Often you may want to generate factories, decorators, bridges, facades, or just interfaces for existing generated classes. One proven method is to use System.Reflection to iterate over the types. This requires a compiled assembly: you can use the one in the current project or an external one. Once you have the path to the assembly you can call `Assembly.LoadFile(<dll.path>)` to load the assembly and then call GetTypes() on it to iterate over the public types. This looks something like this:

// Load the compiled assembly from its path, then walk its types.
var assembly = Assembly.LoadFile(location);
foreach (var type in assembly.GetTypes())
{
    // ... do stuff here with each type
}

Get The Current Namespace

Something commonly done is outputting namespace information into a generated class. To get the current namespace:

string currentNamespace = System.Runtime.Remoting.Messaging.CallContext.LogicalGetData("NamespaceHint").ToString();

More

As I find more I will share them here.  If you have any you would like to share, leave them in the comments below.


Sticking to Default Routes in ASP .NET MVC

When dealing with shared code on a project, I feel that having conventions and standards, and following them, is of the utmost importance. Only by following the patterns can someone else walk in and understand what is going on. It also helps newer programmers by letting their acquired knowledge apply across environments.

Occasionally I see people really get into routing when dealing with a web MVC framework. I have even seen some who place custom routing in their top 5 must-haves when looking for a web MVC framework. While it can be fun and rewarding, the most elegant solution sometimes is the default one. Further, in my environment we hold to the standard that the simplest is usually the best. This means using default routing if possible, period. The first choice we made was already ASP.NET MVC, and therefore adhering to its internal conventions is implied. For the sake of this conversation we will assume we are using MVC as a UI; API functionality beyond REST may or may not require a different approach. Let us take a walk down the path of:

‘What does that mean practically?’

Here is the default route registered in a new project:

[Screenshot: Default Route]
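That screenshot shows the stock registration from RouteConfig.cs, which looks like this:

routes.MapRoute(
    name: "Default",
    url: "{controller}/{action}/{id}",
    defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
);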

Imagine you publish to a place that is reachable via http://example.com/. Say you have a controller called MyController that resides within the Controllers folder in your project. It has two methods: Index() and World(). If you go to http://example.com/My you will see the output of the Index() action on MyController, since the default action is Index(). You can also execute the same action by going to http://example.com/My/Index. Hopefully this aids in making sense of the fact that if you visit http://example.com/My/World, the action MyController.World() is executed. Also, by understanding how defaults work, you understand that when you visit http://example.com/, the routing expects you to have a controller named Home with an action method of Index().
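Concretely, that controller might look like this minimal sketch (the action bodies are placeholders):

using System.Web.Mvc;

public class MyController : Controller
{
    // GET /My or /My/Index
    public ActionResult Index()
    {
        return Content("Hello from Index");
    }

    // GET /My/World
    public ActionResult World()
    {
        return Content("Hello from World");
    }
}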

This brings us to the first point of contention I will acknowledge: that of the third parameter, ‘id’. If we leave the default routing and simply use this value, we end up having method parameters named id when they may be used as something like ‘name’. Here is where my team and I have made the call that if it is any sort of identifier (like a name or GUID, which it usually is), we leave the default routing and reassign the value to a more specific variable inside the method.

Note that this routing pattern matches the URL and not any parameter values, like those sent in an HttpPost. For example, the Login() method of the default AccountController takes a LoginModel and a returnUrl without changing the default route. The LoginModel is set via POST values, and often the returnUrl is added to the URL query string, like http://example.com/Account/?returnUrl=some_value_here. On my team these are the preferred methods of assigning values when building with .NET MVC.

Does this mean we never add routes? The short answer is: we never add routes. We may make exceptions in extreme situations, but we have not come across any that need it. This quickly brings us to Areas. Yes, we use the defaults here too.

Areas

When you create an Area by the name of AreaName, you are given the routing:

[Screenshot: Default Area Route]
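That is the MapRoute call scaffolded into the area’s AreaRegistration class:

context.MapRoute(
    "AreaName_default",
    "AreaName/{controller}/{action}/{id}",
    new { action = "Index", id = UrlParameter.Optional }
);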

So in this case, if you had a controller named MyController like the one mentioned above, but in the project folder Areas\AreaName\Controllers, you could get to the World() action method by visiting http://example.com/AreaName/My/World. Because we often build our controllers for areas in external libraries, we add a little salt to our area routes. *Gasp* However, we still don’t actually modify the default route.

[Screenshot: Area Route Modified]
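Here is a sketch of that modification, assuming the external assembly’s controllers live in Included.Areas.AreaName.Controllers and that My should be the default controller:

context.MapRoute(
    "AreaName_default",
    "AreaName/{controller}/{action}/{id}",
    new { controller = "My", action = "Index", id = UrlParameter.Optional },
    new[] { "Included.Areas.AreaName.Controllers" }
);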

As you can see above, we add the namespace for the area and a default controller. This is still the only routing we need.

Conclusion

You still may be asking yourself: where does this rule stop? The answer is simple: when making things work with the default routes is more complicated than route clutter. I have yet to see a case where that is true and doesn’t warrant a whole new project. Keeping it simple is an art.


MVC Area in an External Assembly

The majority of ASP .NET MVC sites I build, I treat the site itself as a single UI. I don’t bother breaking it down into pieces; it is MVC, after all, so it already addresses separation of concerns. But what happens when your site gets large and you want to break it into application verticals within one MVC host? The first thing you do is break the site into Areas. This makes it possible to keep things organized. Now let’s take it a step further: you would like your area to be in a separate assembly. Maybe it is shared between applications, or you just want to track the development and control deployment a bit more.

I Googled all over trying to find a good solution. The most helpful post came from Patrick Boudreaux a couple of years ago. Working your way through Patrick’s post, you see that JavaScript and CSS are still an issue:

To make a fully self-contained portable area, static content such as css, images and javascript needs to be embedded in the assembly, and found in a special way, just as the view templates are.

Thus we will include a way to solve that issue in my own ‘special way’. Mileage may vary, so let me know in the comments.

You can find the example code on GitHub. This series will be known as The Poor Man’s Modular Area for ASP .NET MVC, or PMMA for short.
[Image: Works on My Machine badge]

We will try to make this as simple as possible. I feel like ‘simple’ is a forgotten art form in the .NET world. It would be great to just include a reference to another MVC application and use its areas. If you try that, though, you will find issues with both projects trying to take the wheel and drive. We can avoid that by simply removing redundant pieces of the hosted Area. Also, I want to simply and clearly prove the way the application is running, so we will assemble a proof-of-concept project. We will perform a simple string assignment in the Area’s controller, print that string in a View, and load a JavaScript file and a CSS file that all live in the Included project.

[Screenshot: Add Area]

The Project

To start the project, create a new MVC 5 solution and call it ‘MasterUI’. Then add a second MVC 5 project called ‘Included’. Next add an area by right-clicking the Included project, selecting Add > Area, and naming it “TestArea”. Then within the area right-click the Controllers folder and add a controller named ExampleController.cs. Inside the Index action add the line:

ViewBag.Message = "SUCCESS!!!";

Now right-click on the Index() method name and select Add View. My view code looks like this:

@{
    ViewBag.Title = "Index";
}

@Styles.Render("~/Included/Content")

<h1>EXAMPLE AREA</h1>
<h2>@ViewBag.Message</h2>

@section scripts
{
    @Scripts.Render("~/Included/Scripts")
}

You will notice the Styles.Render() under the default ‘Title’ assignment refers to a bundle with a URL that includes the Area name. I did this to keep convention and make it clear where the content is coming from. It may make sense to be even more explicit, but this will be OK for this context. Then note the view places the message inside H2 tags to prove the controller is talking to the view. The Scripts.Render() call will place some scripts in the ‘scripts’ layout section of the master layout; it too refers to the Area name.

From the Included project delete Global.asax, Controllers, fonts, Models and Views. In the ‘MasterUI’ project add a reference to the ‘Included’ project.  Run MasterUI and navigate to /TestArea/Example and prove you get a 404.

The Magic

[Screenshot: Custom Tool property]

The first thing you want to do is add a Visual Studio extension that allows you to precompile your MVC Razor views. Open Extensions and Updates and search for and install Razor Generator. Razor Generator has two pieces: a VS extension and a NuGet package. These enable you to compile your Razor views into the library. Enabling this is a little manual, but the payoff is well worth the effort. Once you install the VS extension, manage the NuGet packages on the Included project and search for and add RazorGenerator.Mvc. As long as we are thinking about our views, browse to TestArea\Views in Solution Explorer. Locate the _ViewStart.cshtml and view its properties. Here is where the manual part happens: you want to type “RazorGenerator” into the ‘Custom Tool’ field. This should cause a .cs file to be generated from your view. There is no need to edit the .cs file; simply make your changes in the .cshtml and it will be updated for you. Next do the same to Example\Index.cshtml.

Now it is time to set up our static content. ASP .NET MVC has great static content handling built in. We will leverage this by using a custom bundle transform. You may not consider my method to be “correct”, but it seems to work. I am open to suggestions; let me know in the comments. The class implements IBundleTransform as follows:

using System;
using System.Collections.Generic;
using System.IO;
using System.Reflection;
using System.Web.Optimization;

public class StringResourceTransform : IBundleTransform
{
    private List<String> _resourceFiles = new List<String>();
    private string _contentType;

    public StringResourceTransform(List<String> resourceFiles, String contentType = "text/javascript")
    {
        _resourceFiles = resourceFiles;
        _contentType = contentType;
    }

    public void Process(BundleContext context, BundleResponse response)
    {
        string result = String.Empty;

        // Read each embedded resource out of the assembly and concatenate
        // the contents into a single response body.
        foreach (var resource in _resourceFiles)
        {
            using (Stream stream = Assembly.GetExecutingAssembly()
                .GetManifestResourceStream(resource))
            {
                using (StreamReader reader = new StreamReader(stream))
                {
                    result += reader.ReadToEnd();
                }
            }
        }

        response.ContentType = _contentType;
        response.Content = result;
    }
}

This allows you to specify a Bundle by giving a list of resource paths. When the request comes in, the resources are read into a string and passed back out via the response. Given the above class, you would then specify your bundles in the Included project’s bundle config similar to this:

var scriptsBundle = new Bundle("~/Included/Scripts", 
  new StringResourceTransform(new List<String>() 
    { 
      "Included.Scripts.test1.js" 
    }));
scriptsBundle.Transforms.Add(new JsMinify());
bundles.Add(scriptsBundle);

var cssBundle = new Bundle("~/Included/Content", 
  new StringResourceTransform(new List<String>()
    { 
      "Included.Content.Test.css" 
    }, "text/css"));
cssBundle.Transforms.Add(new CssMinify());
bundles.Add(cssBundle);

The first set of lines above specifies a JS file located in the Included project under the Scripts folder by the name of test1.js. Notice that the contentType defaults to “text/javascript”, so it doesn’t need to be specified for these sorts of files. The second set of lines does the same for a CSS file located in the Content folder. The path is specified as the location within the project, with the files set to be an Embedded Resource.

First place a file in the Scripts directory by the name of test1.js.  Place the following text into it:

alert('success!!');

Then in its properties set ‘Build Action’ to “Embedded Resource”. Next, add a file by the name of Test.css to the Content directory and also set its Build Action to “Embedded Resource”. Add the following text to it:

h2
{
    color: red;
}

Now head back to the MasterUI. There are only a couple of changes that need to be made to make it aware of Included. With all the work we did in the bundle config, simply add the following to the end of the BundleConfig in MasterUI:

Included.BundleConfig.RegisterBundles(bundles);

This will ensure the bundles fire. Now you just need to make sure the routing can find the controllers. The following line could be placed in several places, but I just put mine in Global.asax:

ControllerBuilder.Current.DefaultNamespaces.Add("Included.Areas.TestArea.Controllers");

Success

This bit of magic lets the routing know to look in our Included project’s area namespace and find controllers. This way you don’t have to make any crazy changes to routing or build a custom anything else. Now run the application and go to /TestArea/Example. If you see an alert that says ‘success!!’ and a result like the screenshot, you did it right.

Enhancements

As you may be thinking, you could certainly do this without the sub-folder ‘Areas’. The reason I did it this way is to make it clear we are dealing with an area, and to keep the namespace a little more organized.

If you are running in debug mode, optimizations are off, so you have to enable them with:

BundleTable.EnableOptimizations = true;

But the problem then is that you have all your minification turned on too.  This is addressed by simply surrounding your minify transforms with #if(!DEBUG) compiler directives.
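For example, in the Included project’s bundle config:

#if !DEBUG
    // Only minify outside of debug builds so local scripts stay readable.
    scriptsBundle.Transforms.Add(new JsMinify());
    cssBundle.Transforms.Add(new CssMinify());
#endif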

Conclusion

Using this method I plan on creating a multi-faceted application that will be a one-stop shop for custom tools where I work. I will keep you posted on how it works out for me; feel free to do the same or ask questions in the comments.
