
A simple trick to rationalize your code environment and build process


My consulting
job mainly consists of auditing real-world project structures. To properly
analyze a third-party code base, I prefer to install and recompile it on my laptop
instead of working on a machine that might not have the tools I need installed. I
figured out that this first step is a good way to give advice on how
to rationalize the code environment and build process.


How long
does it take to gather the entire code base into a single zip file (including
source code, resources, debug/release compiled assemblies, PDB files, test code and
resources, build files, third-party assemblies, XML config files…)? Are there any
files residing outside the root path that need to be manually copied?
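The zip step itself can be scripted and timed in a few lines. This is only a sketch, in Python for portability (the post doesn't prescribe any tooling); the root folder path is an assumption:

```python
import time
import zipfile
from pathlib import Path

def zip_code_base(root: str, out_zip: str) -> float:
    """Zip every file under `root` into `out_zip`; return elapsed seconds.

    If this takes long, or if files must be copied in manually first,
    that is already a signal about the state of the code environment.
    """
    root_path = Path(root)
    start = time.time()
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in root_path.rglob("*"):
            if path.is_file():
                # Store paths relative to the root, so the archive
                # unzips cleanly on another machine.
                zf.write(path, path.relative_to(root_path))
    return time.time() - start
```

If any file has to be fetched from outside the root before this runs, that file is a candidate for being moved under the root.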


How big is
the zip file? Unless the project has especially large resources
(videos, large bitmaps, sounds…), the size of the zip file should be limited. For
example, we spent time rationalizing the NDepend code base. It currently
weighs 38 MB once zipped, for around 44K lines of code. Thus 1 KB per LOC seems to be
a decent value, and an easy one to remember. Are there multiple occurrences of the same
file that artificially inflate the zip file? For example, using the Copy Local
option for referenced assemblies in Visual Studio can lead to a huge zip file, since
most assemblies end up unnecessarily duplicated.
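Duplicate copies of the kind Copy Local produces can be spotted by hashing file contents. A minimal sketch, assuming the duplicates are byte-identical copies:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicate_files(root: str) -> dict:
    """Group files under `root` by content hash.

    Returns only the groups that contain more than one file,
    i.e. the files that artificially inflate the archive.
    """
    by_hash = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            by_hash[digest].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

Each reported group is a candidate for a single shared copy (for instance, a common `lib` folder referenced by all projects instead of per-project Copy Local output).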


How long
does it take to successfully re-compile the code base on my machine once the
files are unzipped? This can take quite a while (>1h) if I encounter unexpected
problems that must be solved manually, such as:

  • hard-coded absolute paths that need to be made relative;
  • broken Visual Studio assembly/project references;
  • missing third-party assemblies or resources;
  • outdated (or absent) build scripts;
  • build steps that require manual work;
  • build steps that rely on third-party tools that need to be installed (obfuscator, code generator…);
  • build steps that require admin rights;
  • build tools that require Visual Studio or another special process to be running;
  • build scripts that don't immediately check whether all output files can be created, destroyed, or overwritten;
  • special build tasks, such as delay-signing, that require external resources…
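The first problem on that list, hard-coded absolute paths, lends itself to a quick automated scan. A sketch, assuming Windows-style paths (`C:\...`) and a hypothetical short list of build-file extensions worth inspecting:

```python
import re
from pathlib import Path

# Naive pattern for Windows absolute paths such as "C:\Libs\Foo.dll".
# This is an assumption: real project files may encode paths differently.
ABS_PATH = re.compile(r"[A-Za-z]:\\[^\"'<>\s]*")

def find_absolute_paths(root: str,
                        extensions=(".csproj", ".config", ".props")) -> list:
    """Report (file, match) pairs for suspicious absolute paths in build files."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            text = path.read_text(errors="ignore")
            for m in ABS_PATH.finditer(text):
                hits.append((path, m.group()))
    return hits
```

Any hit is a path that will likely break the build on the next machine; turning it into a path relative to the code base root removes that class of failure.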


The icing on
the cake would be to readily run all the automatic tests (or even better, to
run them automatically after the build) and get a comprehensive test-pass/code-coverage
report.
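Chaining the test run after the build is itself scriptable. A minimal, tool-agnostic sketch; the actual commands (MSBuild, a test runner, a coverage tool…) are placeholders to be filled in per project:

```python
import subprocess

def build_and_test(build_cmd: list, test_cmd: list) -> tuple:
    """Run the build, then run the tests only if the build succeeds.

    Returns (build_ok, tests_ok). `build_cmd` and `test_cmd` are
    argument lists for the project's own tooling.
    """
    build = subprocess.run(build_cmd, capture_output=True)
    if build.returncode != 0:
        return False, False
    tests = subprocess.run(test_cmd, capture_output=True)
    return True, tests.returncode == 0
```

Once this runs unattended from a fresh unzip, generating the test-pass and coverage report is just one more command appended to the chain.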


We found
out that making this simple test work as seamlessly as possible on our
code base makes us de facto more productive.


  • http://www.NDepend.com Patrick Smacchia

    I completely agree with you, Fabrice. The point is that not everybody knows how to set up a continuous build server, but everybody knows how to zip a folder (hence 'simple trick').

    For example, I know that in your company department a super expert is fully dedicated to setting up and maintaining your continuous builds 😉

  • http://weblogs.asp.net/fmarguerie Fabrice

    Setting up a build server for your projects helps you to improve this. In fact, as soon as you have to build your projects on a second machine, you have to ensure that all the dependencies can be resolved and all the required tools are available.
    If you use continuous integration, your projects don't build only on your development machine. You have already taken a step in the right direction towards code mobility :-)