A typical effect of setting CopyLocal = true

If you have read me in the past, you certainly know that I have a problem with the Visual Studio default option that sets CopyLocal = true on project reference assemblies. I mean this option:


I’ve already explained the problem on my blog, and also in the white books available on the NDepend website (see the white book Partitioning code base through .NET assemblies and Visual Studio projects, page 4, “CopyLocal = true is evil”).


For example, in 2009, by fixing this problem in the NUnit code base, I was able, in less than an hour, to shrink the compilation time from 32 seconds to 12 seconds.

So what is the consequence of the Nancy framework relying on CopyLocal = true? A huge number of cloned copies of Nancy.dll, one for each VS project referencing it. After a recompilation, I can count 56 occurrences of Nancy.dll, each weighing 842 KB, which means more than 46 MB wasted. Apparently it doesn’t affect the VS compilation duration that much; I guess the VS team made VS 2012 smarter, in the sense that once it has parsed a referenced assembly for one project, it doesn’t parse it again for the others, which is a good thing.


I tried to post something on the Nancy user group, without success, hence this post. Fixing this issue is as simple as:

  • making sure that each project compiles into a ..\bin\Debug or ..\bin\Release folder
  • from the VS projects, referencing the DLLs in the ..\bin\Debug directory, and setting CopyLocal = false for each reference
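In a .csproj, these two steps boil down to overriding the output path and setting the Private flag (MSBuild’s name for Copy Local) to false on each reference. A hypothetical fragment, with illustrative paths and the Nancy assembly as the example reference:

```xml
<!-- Illustrative .csproj fragment: compile every project into a shared
     ..\bin\Debug or ..\bin\Release folder -->
<PropertyGroup Condition=" '$(Configuration)' == 'Debug' ">
  <OutputPath>..\bin\Debug\</OutputPath>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)' == 'Release' ">
  <OutputPath>..\bin\Release\</OutputPath>
</PropertyGroup>

<!-- Reference the DLL in ..\bin\Debug; <Private>False</Private> is the
     MSBuild equivalent of CopyLocal = false in the VS properties grid -->
<ItemGroup>
  <Reference Include="Nancy">
    <HintPath>..\bin\Debug\Nancy.dll</HintPath>
    <Private>False</Private>
  </Reference>
</ItemGroup>
```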

Additionally, I found ..\bin to be a good folder to host test assemblies. This way they can reference the application assemblies that live in ..\bin\Debug, and the indirection can be handled with an App.config file like:
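A minimal sketch of what such an App.config could look like, using the standard assembly probing element; the privatePath value is an assumption matching the ..\bin / ..\bin\Debug layout described above:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <!-- Let test assemblies living in ..\bin resolve application
         assemblies from the Debug sub-folder at load time -->
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <probing privatePath="Debug" />
    </assemblyBinding>
  </runtime>
</configuration>
```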

On Nancy code structure

One very good point is that Nancy.dll is a single DLL that depends only on .NET Fx assemblies, so it is optimally packaged.


On the other hand, this large DLL, made of more than 7,000 lines of code and almost 300 classes, doesn’t abide by any kind of layered architecture. The consequence is that each namespace depends on all the other namespaces.


As a consequence, classes are not layered at all: there are no high-level or low-level classes.

Let’s have a look at the dependency matrix below. It is made of the 287 Nancy.dll classes, in indirect-dependency mode. A blue cell means that the class in the column depends (directly or indirectly) on the class in the row, a green cell means the opposite, and a black cell means that the two classes depend on each other. A perfectly layered set of classes (which is not desired, but should be approached) would yield a lower triangle entirely blue and an upper triangle entirely green. Here, blue and green cells are mixed equally in both the lower and upper triangles, which is another way to see that the classes are not layered.


It means that it must be difficult to unit-test a set of cohesive classes (i.e. a component) in isolation from the others (even mocks don’t help in the case of re-entrant dependencies). It also means that maintenance and evolution must be impaired by this absence of code structure, because touching a single class can have an unexpected effect virtually anywhere in the code base.

This entry was posted in Acyclic componentization, Code Dependency, code structure, Code visualization, CopyLocal syndrome, Dependency Cycle, Dependency Graph, Dependency Matrix, graph, Graph of Dependencies, Lines of Code, Maintainability, Partitioning, Visual Studio, VS, VStudio.
  • Visual Smarter

    Nice article. Setting Copy Local to true can cause a lot of issues, such as API lib conflicts, loading issues, and so on. By the way, Visual Smarter has a lot of widgets which can set reference properties automatically and can also help copy, add, remove, and update references automatically. The Multi-Reference Tweaker utility is also nice to try.



  • Filip Kinský

    I still think this is just a kind of workaround for an MSBuild weakness rather than the only right approach to take. Wouldn’t it be better if we could instruct MSBuild to create symlinks instead of hard copies of referenced assemblies? Because many times you do need the referenced assemblies present in the output folder of a project – when the project is “executable” and needs to load all the referenced assemblies (that includes test projects as well).

  • El Servas

    In the project references vs assembly references issue, why not just keep using project references, BUT with Copy Local set to false?
    This way, you don’t lose the project build order, as you do if you switch to assembly references. Is there something I’m missing?
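    For reference, a project reference with Copy Local disabled would look like this hypothetical .csproj fragment (the Private metadata applies to ProjectReference items as well as plain references):

    ```xml
    <!-- Hypothetical fragment: keep the project reference (and its build
         ordering) but disable the copy-local behavior -->
    <ItemGroup>
      <ProjectReference Include="..\Nancy\Nancy.csproj">
        <Private>False</Private>
      </ProjectReference>
    </ItemGroup>
    ```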

    Great posts!

  • http://blog.sublogic.com/ James Manning

    I just learned about this today, so apologies if this has already been mentioned and rejected and I just missed it, but it appears that, starting with 4.x, there are some properties we can set so that the copies will be hard links (at least in the typical cases).


    The particular set of properties:
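    The property names are not reproduced in the comment; assuming they are the hard-link switches present in the v4 Microsoft.Common.targets, a sketch would be:

    ```xml
    <!-- Assumed hard-link properties from the v4.x Microsoft.Common.targets;
         set in the .csproj, or pass on the command line via /p: -->
    <PropertyGroup>
      <CreateHardLinksForCopyLocalIfPossible>true</CreateHardLinksForCopyLocalIfPossible>
      <CreateHardLinksForCopyFilesToOutputDirectoryIfPossible>true</CreateHardLinksForCopyFilesToOutputDirectoryIfPossible>
      <CreateHardLinksForCopyAdditionalFilesIfPossible>true</CreateHardLinksForCopyAdditionalFilesIfPossible>
      <CreateHardLinksForPublishFilesIfPossible>true</CreateHardLinksForPublishFilesIfPossible>
    </PropertyGroup>
    ```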


    The targets file using them (%windir%\Microsoft.NET\Framework*\v4.0.30319\Microsoft.Common.targets) has this disabled when BuildingInsideVisualStudio == true, although it’s not clear to me off-hand whether that’s being defensive or whether there’s a known problem introduced by doing so.

    If your primary concern is building inside VS and you wanted to use this, it would seem you’d have to either figure out how to get BuildingInsideVisualStudio == false (which might break other things, since it seems to be used by many targets files) or modify Microsoft.Common.targets to remove the BuildingInsideVisualStudio == true condition that forces the CopyHardLinks* properties to false.

  • http://www.joshka.net/ Joshua McKinney

    Hi Patrick,
    Do you have a time comparison (an apples-to-apples comparison on your machine) of CopyLocal = false with project references instead of assembly references? It’s been said that the concerns are orthogonal, and I’m curious which of the suggestions bears the most performance gains.

  • PatrickSmacchia

    Indeed, when converting project references to binary references, you need to right-click the VS projects in Solution Explorer and select the option “Project Dependencies”, which leads to “Project Build Order”. That requires a bit of maintenance but offers a bit of flexibility.

  • PatrickSmacchia

    No, I don’t have specific recommendations for web projects; it would be interesting to have some, though.

  • MBR

    What tool was used to generate the class diagram and the dependency matrix?

  • Søren Trudsø Mahon

    Do you have any recommendations on Copy Local with web projects? Visual Studio seems to require the “bin” folder with all references to be in the “iis” folder?

  • Filip Kinský

    There is a much bigger problem in converting project references to binary references – you lose the project build dependency information, so VS/MSBuild doesn’t know the order in which to build projects, and you’ll probably end up with some obscure build error messages when you make a change in some shared project.

  • Dan Puzey

    Without having taken the time to go away and test this, surely you run into a different problem here? If, as you say, “Whatever the VS config, the DEBUG version of the bins are referenced”, surely that means I can’t locally debug my application unless I’ve first built in Debug? This could be quite an overhead in a case where you need to use an alternative configuration (remembering that “Debug” and “Release” aren’t the only two options…)

  • Alexandre Defalque

    Unfortunately it still doesn’t work in VS2012 :( Yes, I had seen that ReSharper solves that; unfortunately it’s a bit heavy to use, especially in large solutions. Ideally, a smaller VS add-in would handle just the F12 part.

  • PatrickSmacchia

    In VS2012 with R#, F12 works perfectly with referenced assemblies. I am not sure, though, whether it is an R# facility or whether they fixed that in VS2012. With R# + F12 it is even possible to download the .NET Fx source code when you want to jump to the definition of a .NET Fx code element!

  • Alexandre Defalque

    One of the main drawbacks of referencing assemblies instead of projects is that F12 (Go To Definition) no longer works. It displays the Object Browser instead of the type’s source code. Quite annoying; I’m not sure if there is a workaround for this.

  • PatrickSmacchia

    Tim, my recommendation in the post is to have two bin folders, ..\bin\Debug and ..\bin\Release. There is no risk of mismatch doing so: if you choose the Visual Studio config DEBUG, it runs binaries from ..\bin\Debug; if you choose the Visual Studio config RELEASE, it runs binaries from ..\bin\Release.

    Whatever the VS config, the DEBUG version of the bins is referenced, but because CopyLocal = false, there is no risk of a DEBUG assembly being copied inadvertently into ..\bin\Release.

    If you have a different public API between Debug and Release assemblies, then indeed there can be some breaking change at compilation time (but still no risk of losing precious time on a hidden problem). But if you have a different public API between Debug and Release assemblies, then you are living on the edge anyway :)

  • Tim Wilde

    The wasted size and time point has been discussed already, but I’m concerned that your recommendation is to add a reference to the binaries in the debug folder rather than referencing the project output.

    If you reference the binary directly rather than the project output, you essentially bypass the build configuration, which can cause all sorts of problems, such as debug binaries getting deployed to live environments; the wrong configuration files being copied; speed and security issues…

    Don’t do that, please.

    Sorry to nit-pick, but that’s one of those things which can take hours to find when things go wrong.

  • PatrickSmacchia

    >46MB isn’t much by today’s standards.

    It is not much, but it is a waste because it could easily be avoided. Also, 6 seconds of compilation could become 2 or 3 seconds by using CopyLocal = false. Modifying all the CopyLocal settings would take half an hour, an hour at most, and every contributor would get a win on their machine. Is there any other 60-minute to-do action with a bigger benefit for all?

    The problem with having super-powerful machines today is that developers don’t count resources anymore. Too often, it leads to a mythical monster code base that is costly to touch and maintain.

    Nancy is not yet a huge application, and it is OSS; still, I don’t see a reason why such an obvious waste should remain unfixed.

  • PatrickSmacchia

    Sure, now that everything is public and used massively, there is no point in making breaking changes just to enhance the code structure. But if, at the time this method was written and committed, a rule or a project spec had detected that Nancy.Helpers is low-level and must remain low-level, then the project structure would be a little bit better by now.

  • http://www.philliphaydon.com Phillip Haydon

    What is the problem with it being “duplicated” or copied to the referencing projects? 46 MB isn’t much by today’s standards.

    Besides, it’s just the code base; the assemblies we use often come from NuGet, and we only reference them once. There’s no issue outside the code base. This complaint to me is nothing more than nit-picking, and I’m disappointed.

    We aren’t dealing with hundreds of projects. There are currently 57 projects, a lot of which are tests, samples, or plugins.

    Deleting all the bin directories and doing a Rebuild takes 6 seconds. That doesn’t warrant the time gained by referencing a single assembly.

  • marcusoftnet

    That sounds awesome, Patrick!

    3 simple abstractions. You could whip that PR together in no time :)

  • PatrickSmacchia

    Marcus, these are two facts: Nancy.dll is duplicated 56 times, and Nancy.dll has no layered structure in its architecture. That doesn’t mean it is not a successful OSS project with plenty of talented contributors and an awesome community.

    >Just because a graph comes out strange… doesn’t mean that the code is a mess.

    Indeed, but this necessarily has some impact on how the code can be approached, tested, touched, maintained…
    While we can discuss how the lack of structure might or might not impact maintainability, I cannot imagine any advantage in having the main DLL duplicated 56 times, especially when it weighs more than 800 KB.

    My guess is that the project is lacking some abstractions (not necessarily tons of abstractions) that would clarify how it is structured. For example, the namespace Nancy.Helpers is clearly a low-level namespace (as all Helpers namespaces are). Still, it uses the main Nancy namespace. If we look carefully, we can see that only CacheHelpers.ReturnNotModified() (in Nancy.Helpers) uses NancyContext, Request, and RequestHeaders from Nancy. Abstracting these 3 types, or moving CacheHelpers up into Nancy, would remove the Nancy.Helpers -> Nancy dependency, and this would already be a win for the project.

  • marcusoftnet

    Interesting post,

    But I must ask: have you even looked into the code? Nancy is one of the best maintained code bases I’ve seen. I have been looking through it a lot, and things always make sense and behave as I expect.

    It’s also a model of how to run a great OSS project (the over 110 contributors also seem to like it). Quick responses to PRs and issues. Awesome (!) community and all-around nice people involved.

    Just because a graph comes out strange… doesn’t mean that the code is a mess.

    My two cents