Tuesday, 23 October 2012

Build errors with TeamCity, MSBuild, MVC Build Views and .Net 4.5

Recently I moved an application from .Net 4.0 & MVC3 to .Net 4.5 & MVC4 which for the most part went fairly painlessly.

Previous to the .Net 4.5 upgrade the code was already being built and tested on TeamCity, but as part of the upgrade exercise it was decided to implement deployment from TeamCity using web deploy to a test server.

Getting web deploy set up on TeamCity is fairly painless, and Troy Hunt has a nice set of posts about it; what it generally boils down to is using MSBuild to package the code and then calling web deploy to push the newly created package.
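In outline, the two steps look something like this (a sketch only; the package path and server address are placeholders I've chosen for illustration, not values from my build):

```
REM Package the web application
msbuild XYZ\XYZ.csproj /t:Package /p:Configuration=Release /p:PackageLocation=C:\Packages\XYZ.zip

REM Push the package to the target server with web deploy
msdeploy -verb:sync -source:package=C:\Packages\XYZ.zip -dest:auto,computerName=https://testserver:8172/msdeploy.axd
```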

I set up the necessary build steps only to find that I was getting build failures on TeamCity when using the MSBuild runner (previously I had used the Visual Studio solution runner), with the following error:

ASPNETCOMPILER error ASPRUNTIME: Could not load file or assembly 'System.Web.Razor, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
Project XYZ\XYZ.csproj failed.
Project XYZ.Tests\XYZ.Tests.csproj failed.
Project ABC.sln failed.

I was able to solve this by adding an additional MSBuild property, /p:VisualStudioVersion=11.0, which I found in this Stack Overflow answer.
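On TeamCity this means adding the property to the MSBuild runner's command line parameters, roughly as follows (the configuration name is illustrative):

```
msbuild ABC.sln /p:Configuration=Release /p:VisualStudioVersion=11.0
```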

Once that was done the code built successfully, but when it attempted to package and deploy the web site it failed with the error:

MvcBuildViews AspNetCompiler
AspNetCompiler C:\TeamCity\buildAgent\work\f2cfe4b9b1db787a\XYZ\obj\uat\csautoparameterize\original\web.config(27, 0): error ASPCONFIG: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS.

I quickly discovered that this was an issue with the package command and happened because the obj folder wasn't deleted. Normally you get around this by altering the project file and telling it to output files to a different folder, but this didn't work for me.

After much searching I finally found the answer in this comment on a Phil Haack blog post: I needed to add another MSBuild parameter, /p:BaseIntermediateOutputPath=<your path here>, which then allowed the TeamCity agent to create the package and successfully execute web deploy.
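Putting both properties together, the MSBuild arguments ended up along these lines (a sketch; the output path here is a placeholder I've chosen for illustration):

```
msbuild ABC.sln /t:Build;Package /p:Configuration=Release /p:VisualStudioVersion=11.0 /p:BaseIntermediateOutputPath=obj-build\
```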

This took me a couple of days to resolve; the initial problem was that I had just upgraded to TeamCity 7.1 and incorrectly suspected it of causing the problem. I was wrong.

Hopefully this post will help you if you too run into the problem.

Thursday, 18 October 2012

Why does agile fail?

There is more than one company out there that has tried to "go agile" but has ultimately failed and come away with the impression that "agile just doesn't work" or "agile's great but just not for us". You also have the situation where a company has been working in an agile manner and then agile breaks down.

So why do people fail to successfully implement agile practices? What are the common reasons cited? Who is to blame? In this post I'm going to cover some of the factors I've seen cause an agile adoption to fail. I'm not going to offer suggestions on how to combat them, but you could consider this a list of symptoms that may indicate problems if you want to implement agile in your organisation.

Tuesday, 9 October 2012

Stop using Priority 1!

It will come as no surprise to hear that I read a lot of blogs about agile and agile methodologies, and when it comes to ordering/prioritising the backlog you will usually have people talking about priority 1 this, priority 1 that, with the stories numbered in sequence, e.g. 1, 2, 3, 4, etc.

To be honest I'm surprised that, after all the years agile has been around, this is still what people suggest you do, since it's actually a complete pain to work with.

What am I talking about? Well, if you have a backlog of, say, 30 items (I'm assuming they are in some sort of electronic system) and they are all numbered sequentially, then if the business suddenly discovers something that must be "Priority 1" you have to alter every other story in the backlog to adjust its priority, and the work required to do this is pure waste.

A better way of handling priority is to invert the numbers and make the highest number the highest priority. This way, if you want a new highest-priority item you simply find the current top priority and add to its number: no need to reorganize the backlog, no change to what you already have, no waste. Plus, people often like the idea of giving a big number to a top priority, so you are likely to get little resistance there.

An added bonus is that with more numbers to "play with" you are less likely to get product owners/stakeholders trying to make everything "Priority 1" and more likely to see them engage in ordering the stories appropriately for the team to work on.

Also, to make your life even easier, don't number your stories sequentially; leave space between the numbers so you can slot new stories in between existing ones if you need to.

So instead of 1, 2, 3, 4 try using 4000, 3000, 2000, 1000; it will relieve you of a little pain in your backlog management.
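The scheme is simple enough to sketch in a couple of hypothetical helpers (these are purely illustrative, not from any tool): slotting a story between two others is just taking the midpoint of their numbers, and a new top priority is the current maximum plus a healthy gap.

```csharp
// Hypothetical helpers illustrating gap-based backlog priorities
static int SlotBetween(int lower, int higher)
{
    // The midpoint leaves room on both sides for future inserts
    return lower + (higher - lower) / 2;
}

static int NewTopPriority(int currentMax)
{
    // Biggest number = highest priority, so just jump above the max
    return currentMax + 1000;
}

// e.g. SlotBetween(2000, 3000) gives 2500; NewTopPriority(4000) gives 5000
```

No existing story needs to be renumbered in either case, which is the whole point.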

Thursday, 6 September 2012

DI != IoC

At the weekend I gave a presentation, "Inversion of control containers Vs HandRolled", showing the difference between using an Inversion of Control (IoC) container and doing the dependency injection yourself.

I was asked a question on Twitter on Tuesday:

@philpursglove: @nathangloyn meant to chat with you at #ddd10 about why DI != IoC. Have you got a blog on this?

There was a brief exchange on Twitter about it, but I wanted to put my thoughts down here.

For me, Dependency Injection is a pattern: a class that uses other classes, i.e. is dependent on them, has those dependencies injected into it, whether via the constructor, a property or a method.
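A minimal constructor-injection sketch (the types here are invented for illustration):

```csharp
public interface IMessageSender
{
    void Send(string message);
}

public class EmailSender : IMessageSender
{
    public void Send(string message) { /* send an email */ }
}

public class OrderProcessor
{
    private readonly IMessageSender sender;

    // The dependency is injected via the constructor;
    // OrderProcessor never news up its own IMessageSender
    public OrderProcessor(IMessageSender sender)
    {
        this.sender = sender;
    }

    public void Process()
    {
        sender.Send("Order processed");
    }
}
```

Nothing here requires a container; the pattern is just about how the dependency arrives.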

When people talk about IoC they most commonly think of a container such as Ninject, StructureMap or Autofac: a mechanism to handle the injection of dependencies so they don't have to write all the wiring code themselves.
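To make that concrete, here is the difference in miniature, assuming a hypothetical OrderProcessor whose constructor takes an IMessageSender (the Ninject calls shown are that container's basic registration/resolution API, used here as a sketch):

```csharp
// Hand-rolled: you are the container, composing the object graph yourself
IMessageSender sender = new EmailSender();
var processor = new OrderProcessor(sender);

// Container-based (Ninject-style): declare the binding once,
// then let the container build the graph for you
var kernel = new StandardKernel();
kernel.Bind<IMessageSender>().To<EmailSender>();
var processorFromContainer = kernel.Get<OrderProcessor>();
```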

I believe IoC isn't just the mechanism used to implement DI, but its exact definition is a little more difficult to pin down. If you read this wiki page in the Castle project documentation then IoC is all about frameworks; if you read Martin Fowler's article then it's a principle to bear in mind when building your application. One thing that Martin Fowler does say in his 2005 article:

There is some confusion these days over the meaning of inversion of control due to the rise of IoC containers; some people confuse the general principle here with the specific styles of inversion of control (such as dependency injection) that these containers use.

He is describing DI as a specific style of IoC, but I believe it is simpler to think of DI as a separate pattern that can stand by itself yet could easily fall under the general principle of IoC; hence DI != IoC.

Interestingly, the closer you look at IoC the more similar it seems to the Dependency Inversion Principle, but that's another conversation :)

Tuesday, 4 September 2012

DDD10 Slides & code from the presentation

On Saturday I presented "Inversion of control containers Vs HandRolled" at DDD 10. I had a really good time and wanted to thank everybody that came to the session; this post provides links to the slides and code.

The slides can be found here, although the font seems to have been altered so it doesn't appear as it did in the presentation; I didn't create the slides in Comic Sans, honest!

At the beginning of the talk I said I didn't intend to upload the code, but I was talking to Rory Becker at the geek dinner and he convinced me to upload it so that others could look at and play with it, so you can find the code here.

As always, any questions, feel free to ping me :)

Wednesday, 1 August 2012

Book Review: Continuous Integration in .Net

Searching on Amazon for the phrase continuous integration (CI) you'll get over 2000 results, but looking at them you'll find very few books actually on the subject.

This book has a unique slant in that it focuses on the .Net world and how you can introduce CI into your development practices and, once implemented, build upon that foundation to extend the functionality of your build server.

The authors have structured the book to take a CI novice from installing a CI server through to automating deployment, or, as Paul Stack describes in his talks, how to climb the CI ladder.

Tuesday, 10 July 2012


Every day we need and use feedback to help us understand that what we are doing is working:
  • Executing a build tells us if the code compiles
  • Unit tests passing tells us the code does what we expect
  • Acceptance tests passing tells us the code is doing what the business expects
Day in and day out we utilise feedback in our work to help us write our code and improve what we are doing.

But what about outside of code? Do you utilise feedback to help you improve?

Do you:
  • Ask people to review your code?
  • Talk to your manager regularly about any targets you've been set?
  • Ask companies you interview with to provide feedback if you are unsuccessful?
  • Ask your peers if there is anything you can do to improve?
What TDD has shown us is the tighter the feedback loop the quicker we can adjust our behavior and improve what we are doing. 

Feedback may just be the most important mechanism there is to help you improve, as receiving feedback, in whatever form, tells you how you are doing. The feedback may not always be what you want to hear, it may even be painful, but if you want to improve, listen to it, as it could be the quickest way for you to get better at not only what you do but possibly how you do it.

Conversely, if you are a manager or you conduct interviews, provide honest feedback, as anything less doesn't actually help the person you manage or interviewed to improve.

Tuesday, 26 June 2012

Book Review: The Art of Agile Development

I have decided to go back and re-read some of the agile books from my list of books I recommend to people wanting to know about agile.

The first book I picked was the last on my list. It is one of the larger books I recommend, at 440 pages, so it has taken me a while to get through, although I'm glad that I have.

The title of the book is a little misleading, as the book is not about agile development in general but is in fact all about Extreme Programming (XP); its aim is to provide the reader with information, based on the authors' experience, not only about what XP is and its practices and principles, but also some real-world advice about agile projects.

Thursday, 17 May 2012

Is definition of done no longer needed?

A while ago there was a discussion about the "definition of done" (DoD), with critics and supporters alike commenting on how they saw DoD and the need, or lack of need, for it.

The critics seemed to fall into two categories: those that believe work isn't done until it's actually deployed to production, and those that did not see a need for DoD at all, whereas the supporters generally reiterated the need for DoD and tried to explain its worth (which on Twitter in 140 characters can be difficult). The most memorable tweet from these exchanges came from Hadi Hariri, suggesting that DoD was akin to "agile mental masturbation".

So are the critics right? Is a definition of done not needed?

Tuesday, 15 May 2012

Is software craftsmanship a luxury most can’t afford?

I'm guessing that for a lot of developers out there the idea of software craftsmanship sounds fantastic and that they'd love to practise it, but every day they face deadlines to deliver software to a client and as such are under pressure to "just do it" rather than take the care they would wish to.

I firmly believe that software development is a craft; whilst it has engineering principles it can at the same time be very close to art, much like the work of a master carpenter who, when creating furniture, has specific techniques for creating each part of a piece, yet what they produce when it is all put together is often viewed as a form of artwork.

What a master carpenter usually has, though, is the time to exercise his skills and produce the best "product" that he can; people can see and appreciate what he is producing, and therefore the time it takes to create is better understood by the customer, as they can see what has gone into the finished product.

Software, on the other hand, is viewed completely differently. 

Tuesday, 8 May 2012

Developing my first WP7 app

Back in February I learned that Nokia & Microsoft had teamed up to provide free phones to developers who could show that they were developing applications for Windows Phone 7 (WP7), so I decided this was too good an opportunity to miss and set about creating my first application.

I already had an idea about what I was going to write, as I had downloaded an application on my existing WP7 phone, an HTC Radar, to help my son with his maths, but was frustrated by certain shortcomings, such as no tombstoning, and I believed I could write my own app that would keep all the good parts of the existing app but fix the things I saw as wrong.

Friday, 13 April 2012

Windows Phone & WinRT….why?

As I mentioned in my last post my free time, when coding, is taken up with getting my first Windows Phone application completed and published.

So whilst doing this and getting to grips with WP7 development I started hearing about how Windows Phone will be changing (Mary Jo Foley's blog, Windows Phone dev blog) and that for Apollo, the next Windows Phone OS release, Microsoft will make Windows Phone use the Windows 8 kernel.

My first reaction was why? Why change it? Microsoft as a company needs to get as many companies/people developing apps on

Wednesday, 4 April 2012

Windows Phone 7 Test Helper

Just a short post, as currently all my time is taken up trying to get my first Windows Phone 7 app completed, which I'll blog about once it's all done.

When I started my WP7 development, me being me, I went TDD from the off but ran straight into an issue where I couldn't fake IsolatedStorageSettings.ApplicationSettings, which I've been using for data storage.

As I'm new to the platform I decided to live with it, but last night I wrote code around storing data that I needed to test, so I decided enough was enough and created a little wrapper class so I could fake the ApplicationSettings class.
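The idea is along these lines (a sketch, not the exact code in the repo): hide the static IsolatedStorageSettings.ApplicationSettings behind an interface so a fake can be substituted in tests. The interface shape is my illustration.

```csharp
using System.IO.IsolatedStorage;

public interface ISettings
{
    bool TryGetValue<T>(string key, out T value);
    void Set<T>(string key, T value);
    void Save();
}

// Production implementation delegating to the real WP7 settings store;
// tests use a dictionary-backed fake implementing the same interface
public class IsolatedStorageSettingsWrapper : ISettings
{
    public bool TryGetValue<T>(string key, out T value)
    {
        return IsolatedStorageSettings.ApplicationSettings.TryGetValue(key, out value);
    }

    public void Set<T>(string key, T value)
    {
        IsolatedStorageSettings.ApplicationSettings[key] = value;
    }

    public void Save()
    {
        IsolatedStorageSettings.ApplicationSettings.Save();
    }
}
```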

Thus was born my WP7.TestHelper assembly, to which I'll add any other classes I find that I need to fake for testing and can't out of the box.

As usual you can find the code in a repo on my GitHub.

Monday, 6 February 2012

NDepend – final thoughts

Over the last 4 posts I've covered the different editions of NDepend, installing and starting off, the report, and Visual NDepend; I want to wrap up this series by summarising that information and adding some of my own thoughts.

The Product

I have to say that the fact NDepend offers a free version for individual developers is brilliant; there used to be several tools in the .Net world that did this, but gradually they all seem to have become commercial and only offer time-limited trial versions.

There is a small catch in that the version you download is itself time limited, but there is nothing to stop you downloading another trial edition (the web page even tells you when the next free download will be available). I can imagine this becoming a pain after a while, though, and it could lead people to look for an alternative.

One thing that surprised me was the amount of help and documentation available on the NDepend website, ranging from simple FAQs to video tutorials, which helps not only with how to use NDepend but also with understanding the information NDepend provides.

The Functionality

As I mentioned in my post on the report when you first start using NDepend you are most likely to focus on the data in the report and although it contains a lot of information I still believe that the report is of most benefit in a CI situation allowing you to track the evolution of your code base.

In day to day work Visual NDepend or the visual studio AddIn is where I believe the most value is to be found for individual developers allowing them to identify the points to focus on.

Although you can use the stand alone application (Visual NDepend) in my opinion for most developers it is the visual studio integration that will provide the most value since you don’t have to leave the IDE to utilise the functionality.

The Code Query Language is what differentiates NDepend from other static analysis tools, allowing you to understand how it identifies code that violates its rules, as well as making it easy to customize existing rules or create new ones.
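For example, a rule to flag overly complex methods can be expressed in CQL along these lines (syntax as I recall it from this era of NDepend; the threshold of 20 is an arbitrary choice):

```
WARN IF Count > 0 IN SELECT METHODS WHERE CyclomaticComplexity > 20
```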


The fact that NDepend provides a way for you to customise the rules it runs against a code base is, I believe, one of its biggest strengths. You also get a fair amount of configuration over how it behaves and looks.

However, there is no easy way to share this customization/configuration between projects, let alone with other members of the team. For me this is probably the biggest failing of NDepend. Yes, you can manually alter each project, and if you add the ndproj file to source control then the next developer to pull/get the code will get the settings.

A couple of issues…

One issue is that you need to care about what NDepend is showing you about your code base. It provides a lot of information, and with the amount of help on the website it is fairly easy to understand the various metrics that NDepend's rules identify and use. Even with all this information, though, you have to care about it: for example, if you have methods with a high cyclomatic complexity, you may understand what that means, but if you don't care then the tool is of little use.

I feel I have to mention cost: for a larger enterprise organisation the licencing cost is unlikely to be a big issue, but for smaller companies, individual developers, or hobbyists, the cost of the professional edition could simply be too high.

I would also mention that I believe if you want the best results from NDepend you need everybody on your team to have it, as having it on only a single machine, even if it's the build machine, makes it hard to follow up and fix violations.


As anybody who knows me will know, I'm a big fan of SOLID and clean code, and I like to be able to use metrics to understand my code and where I may need to improve it; I find that NDepend helps me with this.

It is simple to get started with, but therein lies its deceptive nature, as it can not only tell you a lot about your code base but also teach you more about the various principles behind good code bases (I learnt all about efferent & afferent coupling).

NDepend is a tool you need to spend time with to understand what it can show you and how to get the most out of it; if you do decide to invest the time, I believe it will repay you in cleaner code and increased knowledge.

Note: I was provided with a copy of the professional edition to review at the beginning of last year; this hasn't affected what I have said about the product, which hopefully you can see in this series of posts.

Friday, 3 February 2012

NDepend–Visual NDepend

The last post went over the report that NDepend generates; this post is going to focus on Visual NDepend, the standalone program, and the functionality it supports.


The difference between the report and Visual NDepend (from now on I'll just call it NDepend) is that you have a much greater ability to drill down into the code base, allowing you to jump between the various graphs/diagrams; you can right-click on an item and jump to a different view with that specific item as the focus of the graph/matrix.

Wednesday, 1 February 2012

NDepend – Analysis Report

In my last post I discussed installing NDepend and getting your first report out, this post is about that report.

I’m not intending to go into every single aspect of the report as it contains a huge amount of information and the report itself provides help and assistance on nearly everything that it contains by linking to resources on the NDepend website.

Note: All details discussed relate to the Professional edition of NDepend; you may not see all this functionality if you are only using the trial edition.

The Report

As I mentioned at the end of the last post with the Visual Studio AddIn in place when you build your application NDepend will automatically kick in, analyse the code and create a report for you detailing metrics, adherence to some standard developer rules, graphs displaying further information about the code base in different formats, etc.  Your browser will be launched and the html based report will be displayed but this isn’t a simple static report as it provides the ability to drill down into various sections to look at the analysis data in more detail.

Monday, 23 January 2012

NDepend–getting started

In my last post I outlined the various editions of NDepend that are available; in this post I'm going to go through the install process and first steps with NDepend.


Open up the Extension Manager in Visual Studio (Tools –> Extension Manager) and search for NDepend; you should then see the following:


Tuesday, 17 January 2012

An introduction to NDepend

This is a series of 5 posts about NDepend:

  1. An introduction to NDepend
  2. NDepend - getting started
  3. NDepend -  Analysis Report
  4. NDepend - Visual NDepend
  5. NDepend - final thoughts
Starting with an introduction....

If you are anything like me then you love to have information about the code you write, whether it's test coverage, cyclomatic complexity, or other metrics that can give you some insight into your code.

Before the shouting starts: I'm not talking about metrics as KPIs or using them as targets that developers have to meet; I'm simply talking about information you can use to better understand your code, which may in turn help you to improve it.

If you are lucky enough to have the Premium or Ultimate edition of VS2010, or VS2008 Team Suite/Developer, then you get this functionality as part of Visual Studio; if not, you need another tool to provide this information. Even if you do have the native VS analysis functionality, you may find that there are tools out there that provide a lot more information than the built-in tools.


NDepend is one such tool. It provides metrics on your code, but that isn't all: it allows you to delve inside the architecture of your application, lets you diff the code, and has test coverage analysis and build server integration built in.

NDepend will also work with VS2005 upwards, and as it can be used independently of Visual Studio you can use it even if you only have the Express editions of Visual Studio.

Disclaimer: I do not work for NDepend, but I was lucky enough to be provided with the software to "play with"; the opinions in this and any subsequent post are mine and nobody else's.

I'm going to put together a series of posts, this being the first, on NDepend trying to give you an idea of the product and what you can, and can't, do with it.

In this post I’m going to talk about the different editions of NDepend and give some information on each of them:


Currently there are 3 editions of NDepend:
  1. Trial/Open-Source/Academic Edition
  2. Professional Developer Edition
  3. Professional Build Machine Edition

Trial/Open-Source/Academic Edition

The Trial/Open-Source/Academic Edition, referred to as Trial from this point, is a good starting place for anybody interested in trying NDepend.

As you may expect, this edition has limited functionality, but to be fair to NDepend it provides the majority of the features that the Professional Developer edition does, so you can get a really good idea of what you can do, though there are limits on the level of functionality supported for each feature.

You are able to use the trial version freely for non-commercial software, but it is time limited (the current downloadable version will stop working on 12th March 2012), although you can download another version once the software has passed its expiry date. The site provides guidance on using the trial version so you can decide if you really should be using it or not.

Professional Developer Edition

This is the full version that you'll really want if you like NDepend; not only does it provide unfettered access to the majority of the functionality in the software (more on this in a bit) but it also comes with additional features:
  • Test coverage import – imports coverage data from NCover or Visual Studio Coverage
  • Reflector integration – provides an add-in for Reflector that allows you to jump into NDepend from a context menu entry
  • Build Comparison/Code Diff – provides a way to diff code & builds to understand what's changed
This edition is the one you’d most likely use day to day to check on how your code is doing and to be able to target any areas that you can see could do with some attention.

Professional Build Machine Edition

If you have a Continuous Integration server then you can enhance it with this edition that will allow you to generate reports from NDepend in your build process on the quality of the code when compared with the metrics you’ve defined.

This integration will not stop your build, it will only generate the report that you have configured.

Edition summary

Above I've given you a run-down of some of the differences between the versions and what they do; a full comparison can be found here on the NDepend site.

As I alluded to, the Professional version doesn't appear to support some of the features that the trial version does, namely those related to build server integration.

I can understand why this has been done as it allows people to download a single version of the software and try out both the interactive and build integration facets of the application.

It would be nice, though, to be able to get the Professional edition to provide this functionality, even if only for trial purposes; it is possible it already does, but the aforementioned comparison chart seems to indicate that it doesn't have those features.

What’s next?

The next post will probably just be a short one as I’m going to be looking at the installation of NDepend.

Tuesday, 3 January 2012

2011 into 2012

Just as I did in last year's post, this one is effectively a retrospective on 2011, including looking at whether I achieved what I wanted to achieve, along with some goals for this year.



In my post from last year my goals were: