A quick analysis of the .NET Fx v4.0 Beta 1





Let’s have a look at what the tool NDepend shows when comparing .NET Fx v4.0 Beta 1 assemblies with v3.5 SP1 ones. I will use a few CQL queries to get facts from the code base of the Framework.


A view of the work achieved




The following treemap shows in blue the new methods (i.e. WasAdded) and the methods that have been refactored (i.e. where CodeWasChanged). A rectangle represents a method, and the size of the rectangle is proportional to the size of the method (in terms of IL). It is pretty impressive to see that a big part of the framework is still evolving, even core stuff in mscorlib and System.
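The blue rectangles correspond to methods matched by a CQL query along these lines (a sketch; the exact condition names may vary slightly between NDepend versions):

```
SELECT METHODS WHERE WasAdded OR CodeWasChanged
```

Both conditions only make sense when NDepend compares two builds, here v4.0 Beta 1 against v3.5 SP1.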




New Core Public Types


SELECT TYPES FROM ASSEMBLIES "mscorlib", "System", "System.Core"
WHERE IsPublic AND WasAdded


221 new public core types, including:

  • some concurrent collections in System.Collections.Concurrent
  • the new parallel API in System.Threading and System.Threading.Tasks
  • a new SortedSet<T> collection class
  • the long-awaited contract API in System.Diagnostics.Contracts
  • a new class System.Numerics.BigInteger (finally!)
  • new support for System.TimeZoneInfo
  • new delegate types Action<…> and Func<…> with higher generic arity (up to 8 generic type parameters)
  • new Tuple<…> structures plus System.Collections.IStructuralEquatable/IStructuralComparable/
    StructuralComparisons, whose main use seems to be handling cases where a method
    returns multiple values (I am not 100% sure on this, and I am not sure it is
    wise performance-wise to use an extra generic structure instead of the out/ref keywords in such a situation).

Update: I forgot System.Core.dll; here is what’s new in this assembly:

  • more expression types in System.Linq.Expressions (dynamic, switch, goto, label, try, catch …)
  • Parallel LINQ types in System.Linq (ParallelQuery, ParallelEnumerable…)
  • 20 binder types in System.Dynamic
the new namespace System.IO.MemoryMappedFiles, to finally have access to this cool Windows feature from the managed world (+ associated safe handle types)
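To list what is new in System.Core.dll alone, the core-types query shown above can simply be restricted to that assembly (a sketch following the same pattern):

```
SELECT TYPES FROM ASSEMBLIES "System.Core"
WHERE IsPublic AND WasAdded
```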




New Assemblies




I can see 28 new assemblies, but the installation of the .NET Fx assemblies in C:\Program Files\Reference Assemblies\Microsoft\Framework + C:\Windows\Microsoft.NET\Framework is pretty messy and I might have missed some important new ones. However, the following ones are worth mentioning:

  • Microsoft.CSharp.dll: a pretty big assembly that exposes only 17 public types, related to the
    new dynamic C# 4.0 keyword (see advanced explanations in this post).
  • System.Dynamic.dll: mainly internal types that seem to represent the internal
    plumbing linking dynamic invocation with COM invocation.
  • System.Windows.Forms.DataVisualization.dll and System.Web.DataVisualization.dll,
    which contain the new Microsoft charting API.
  • System.Xaml.dll, with a new API to play with XAML, explained here.
  • System.Activities.XXX.dll and System.ServiceModel.XXX.dll
    assemblies for WF 4.0 (more info here).
  • System.Caching.dll for a new caching API (code name Velocity) exposed here.
  • Microsoft.Build.XXX.dll assemblies, apparently for new MSBuild tasks
    (DataDrivenTasks, FileTracker), but I didn’t find more info on this on the web.
  • FSharp.Core.dll: when searching for new assemblies, I apparently missed this big
    assembly, which resides in its own directory,
    C:\Program Files\Reference Assemblies\Microsoft\F#\v4.0.
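The new assemblies themselves can be listed with a query of this shape (a sketch; when comparing two builds, the WasAdded condition applies to assemblies as well):

```
SELECT ASSEMBLIES WHERE WasAdded
```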



New Public Types




All in all we have 1 560 new public types. I take the chance here to show a new NDepend v2.12 feature where code elements matching a CQL query can be viewed hierarchically. This feature makes it very convenient to explore a large set of code elements matching some criteria, like these 1 560 new public types.
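The count of 1 560 comes from running the query over all the assemblies compared, without the FROM ASSEMBLIES restriction used earlier (a sketch):

```
SELECT TYPES WHERE IsPublic AND WasAdded
```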



New Public Namespaces




In the assemblies considered, NDepend found 89 new namespaces containing at least one public type.
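A query along these lines yields the new namespaces (a sketch; the "at least one public type" part was checked through the types each namespace contains):

```
SELECT NAMESPACES WHERE WasAdded
```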




Rambling on the .NET Fx evolution

The .NET framework is getting richer but also bigger. Getting richer is positive, because programmers can benefit from more APIs backed by Microsoft's level of support. But I now see some issues:

  • Bigger code means more CPU time for loading and more precious in-process memory
    to host code. For example, just using the new dynamic feature results in loading
    2 assemblies (Microsoft.CSharp.dll 478KB + System.Dynamic.dll 119KB) + the natively
    compiled code, so at least 1MB. One lesson learnt from the Vista fiasco
    is that programmers cannot count on better hardware anymore. Nowadays
    the doing-more-with-less tenet makes more sense than it did a decade ago.
  • I didn’t include this in the comparison analysis, but here and there I saw several
    assemblies suffixed with v3.5 and v4.0. Microsoft addresses the flaws of the past
    one by one, which is great, but this means more superseded code kept for backward
    compatibility only (non-generic collections, the old threading API, .NET Remoting…).
    The Fx keeps the traces of its history.
  • A bigger Fx also means more complexity for programmers to tackle. While the core
    of the Fx (mscorlib + System) is still pretty neat, seasoned programmers cannot
    master all aspects of the Fx anymore (WinForms + WPF + ASP.NET + WCF + WF +
    ADO.NET + Entity Fx…).
  • By developing a tool like NDepend, we are stuck working with .NET Fx 2.0. When you
    are an ISV targeting multiple clients worldwide, it is still too early to assert
    that the bulk of users have .NET Fx 3.5 installed. Fortunately some new features
    are available on .NET Fx 2.0, like contracts and some LINQ stuff, but they are
    more the exception than the rule.
All in all, the growth of the .NET Fx is a blessing for programmers. We must keep in mind that today, if one is using a third-party charting API for example, this API requires more code loaded at runtime plus the learning process of the API. By extending the .NET Fx with new features like a charting API, Microsoft is aggregating features that would require these resources anyway.

The question is: should the Fx grow indefinitely? What would a .NET Fx v10 look like? What are Microsoft's plans for the long term, for the next 10 years?


  • http://www.dynconcepts.com TheCPUWizard

    I find the comments about the size of the framework quite funny (in a very sad sort of way).

    If Microsoft had decided to develop and release 20 DIFFERENT products for the different needs, then each one would be smaller and “simpler”; but think about the result.

    People would be screaming about having to learn different environments for different tasks.

    I have been doing heavy .NET development for nearly a decade (I own a consulting firm), and have not had a single instance where the size impact of assemblies was a concern [and I deploy some applications down to machines with 512MB total memory].

  • Daniel

    I think a large portion of the changes in existing mscorlib methods is due to the C# 4.0 compiler generating different code than the C# 3.0 compiler.

    For example, the “lock” statement now produces different code; and events also are implemented differently (instead of locking on “this”, they now use a lock-free algorithm to add/remove handlers).

  • Kris Williams

    I don’t mind so much that they are adding new features to the framework. I do feel they need to factor legacy APIs out of the core assemblies, which will make them smaller and load faster. For instance move some of these APIs to [orginal name].legacy.dll.

    For one, I’d like to see them move all GDI-based p/invokes out of mscorlib; which would allow basic .Net to be used on a minimally configured HyperV host or an embedded system.

  • A. T. Sesan

    Interestingly, no doubt that this calls for a new challenges, to boost our application performance with virtual realities.

    But I think the release do NOT permit developers to fully utilize the functionality in 2008 before releasing 2010 and who knows when the next version will be released.

    I think this area should be look into.
    Nevertheless, this is awesome. Thank you…

  • http://www.tomergabel.com Tomer Gabel

    Re tuples, I think this was intended to enhance functional-style programming, which would require conveniently accessible (read: part of the BCL), immutable tuple class. If I understand correctly, in C# 4.0 you can create a simple stream of anonymous tuples with the dynamic keyword, something like

    dynamic result = from x in source select new { x.Name, x.Url, … };

    Syntactic sugar though it is, this is most sweet and is also language-interoperable (F# and IronPython natively support tuples, so you can natively use tuples with Python/F# libraries). Maybe this will bring algorithm and software developers closer…

  • http://geekswithblogs.net/jbrayton/ Jeremy Brayton

    I second the ‘never been cooler’.

    I realized something as I read this a second time. What hasn’t been addressed is the client framework stuff. Using a bootstrapper or ClickOnce I think you can just install the assemblies your app references, no more, no less. They snuck that one in and if it works even remotely like it looks then it looks like the answer to the bloat problem, at least for now.

    Couple it with targeting and you can skip the “fluff” 3.0/3.5 W*F additions and the like considering it’s all just added to the 2.0 runtime. I don’t think you get the newer stuff like lambdas though so there is a trade off.

  • http://codebetter.com/members/Patrick-Smacchia/default.aspx Patrick Smacchia

    I got requests from NDepend users who wish to perform the .NET Fx comparison analysis by themselves.

    It won’t work as-is; you need to tweak the .NET projects’ folders. Indeed, by default, NDepend appends the folders of the .NET Fx version (the highest on your machine) to the list of .NET projects’ folders. This is a problem because NDepend will then find several .NET Fx assemblies with different versions (like System v3.5 and System v4, for example).

    Concretely, what you need to do is to go to:
    VisualNDepend > Project Properties > Code to Analyze > Folders
    and provide the folders of the .NET Fx version you wish to analyze. These folders can be found under:
    C:\Program Files\Reference Assemblies\Microsoft\Framework

    An even better solution, which I use, is to copy all the assemblies of a .NET Fx version into a custom folder and do the analysis from this folder.

  • http://codebetter.com/members/Patrick-Smacchia/default.aspx Patrick Smacchia

    >The time will come when specialization for the developers will be needed e.g. winforms developer, or wpf developer.

    I think the time has come since .NET 3.5 was released

  • Dimce

    Now it comes to the point "Size does matter" :)

    Yes, .NET is big… very big. As a developer in a transitional country, where customers have Celerons with 1 GB of RAM, installing .NET 3.5 is a problem. What about .NET 4?

    And another thing you have written about is specialization (ASP.NET, WPF, WF, WCF, WinForms…): it’s too much. It’s excellent to know all the tools that Microsoft provides to satisfy the needs of the market… but for developers it’s too much. The time will come when specialization will be needed, e.g. a WinForms developer or a WPF developer.

  • http://www.deannolan.co.uk Dean Nolan

    Interesting analysis. I think most of the added stuff is worthwhile and size isn’t a problem yet.

    I also can’t imagine say .NET 10 being much bigger. By then we’ll probably be using a new lighter weight framework anyway 😉

  • http://codebetter.com/members/Patrick-Smacchia/default.aspx Patrick Smacchia

    Ibrahim, don’t misunderstand me. The .NET Fx and VS have never been as cool as they are now. IMHO there is no better development platform so far.

    I just see some indirect problems, and surprisingly nobody seems to worry about new .NET version adoption or the giant, growing size of the Fx and its impacts.

  • ibrahim

    Is this correct to continue with microsoft? what about the fate of our future developers with .net x?

  • http://codebetter.com/members/Patrick-Smacchia/default.aspx Patrick Smacchia

    Jonathan, the picture in PNG would weigh 20MB, hence the choice of a degraded JPG format.

    Thomas, Mono is certainly a bit smaller just because it doesn’t implement everything, but it won’t solve the problem. The issues are not related to the framework itself but to its size.

  • http://blogs.southworks.net/jdominguez Julian Dominguez

    Nice article… it has a nice summary (although subjective, which I like) of the new features in the FX, and made me discover the upcoming things in XAML for example :)

  • Jonathan Gilbert

    Can you re-generate the image and save it as a PNG instead of a JPEG? JPEG is absolutely horrible at pure additive primaries, and especially bad with blue. There is a lot of JPEG artifacting in both the thumbnail and the fullsize image. :-(

  • http://www.wagnerblog.com Thomas Wagner

    I wonder how Mono would come out of an analysis like this. Seems that it is a smaller framework which would address the size concerns

  • http://codebetter.com/members/Patrick-Smacchia/default.aspx Patrick Smacchia

    Marc, do you have a link showing the use of Tuple for multiple-segment keys in a Dictionary/Hashtable?

    Brian, despite the caveats I raise about the size of the Fx, I disagree with:

    >Did we really need all the new API’s etc, for most the answer is no.

    We all have different needs, and I am sure that all these APIs will find their users. I am mostly concerned with the complexity of all this and also the weight of the code at runtime, even though assemblies are loaded on demand.

    Most machines are still 32-bit, where having more than 1GB dedicated to a managed process is pretty dangerous (an OutOfMemory exception is lurking at any time). I did the experiment of playing a bit with VS 2010 Beta 1 and saw with the debugger that more than 500 assemblies were loaded in memory!

  • http://musingmarc.blogspot.com Marc Brooks

    Tuple is really useful for multiple-segment keys in a Dictionary/Hashtable.

    Also, you’ve misspelled parallel as parralel in several places.

  • Brian

    .NET has lost its way; it’s now so large and so diverse that it’s the Vista of the programming world. Did we really need all the new APIs etc.? For most, the answer is no. Does it make life easier for the user (the developer!)? No. It’s slow, it’s complicated, and that spells problems.

  • Tomasz Masternak

    Very informative article. When it comes to memory consumption maybe Microsoft should use OSGi?

  • http://mosesofegypt.net Muhammad Mosa

    Really cool, I should play with NDepend and try those queries out on .Net 4.0 Beta 1

  • http://blog.alvarezdaniel.com.ar Daniel Alvarez

    Useful information. Thanks!

  • http://www.davidezordan.net Davide Zordan

    Great article, thanks :)