Why is it that we abhor large numbers of references in our .NET projects? Beyond the obvious, that is (I don’t care about startup time, or any other performance hogwash). Early in my career I thought 10 assembly references was absurd; now my tolerance is MUCH higher (almost without limit), but that view is decidedly not the one held by the larger .NET community. When I look at communities outside of .NET, I see people doing a much better job of pulling together a multitude of libraries to construct even simple sites, versus the vast majority of .NET projects I have seen, which seem to go out of their way to avoid a dependency hit. Not to mention the enterprise projects that build massive systems while taking no more dependencies than exist in the BCL.
This bias against importing a larger number of references has impacted the way I design my .NET code, and I am not sure I would do the same in another language. For instance, I have a library called Magnum that is a junk drawer of functionality for my other projects. Every now and then someone will see the reference in another project and go take a look at it. To their surprise, it contains a vast and deep amount of code that could be very useful to them. But to import it means to import a LOT of stuff. Why don’t I break up this codebase? Because of my negative view on assembly references. “Le sigh.” Should I just start splitting things up more? How does this splintering of Magnum affect other things?
A part of me wonders if this isn’t caused by the same effects described in “The Tale of J. Random Newbie” in The Art of Unix Programming, pg. 415. Go read the four pages or so. I will wait.
It would not surprise me if the lack of OSS experience and transparency in a large part of the .NET community isn’t responsible for this.
Of course, I can’t, in practice, ignore the impacts of additional assembly references on the performance of my applications. Has anyone actually measured this?
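For anyone who wants to answer that question themselves rather than argue from folklore, here is a minimal sketch of how you might measure it. This is my own illustration, not a benchmark from any published study: the assembly names are placeholders you would swap for your real dependencies, and a serious measurement would also account for JIT warm-up and cold vs. warm disk cache.

```csharp
using System;
using System.Diagnostics;
using System.Reflection;

class AssemblyLoadTimer
{
    static void Main()
    {
        // Placeholder list -- substitute the assemblies your project actually references.
        string[] names = { "System.Xml", "System.Data", "System.Configuration" };

        foreach (string name in names)
        {
            Stopwatch sw = Stopwatch.StartNew();
            Assembly asm = Assembly.Load(name);   // forces the load now, rather than on first use
            sw.Stop();

            Console.WriteLine("{0}: {1} ms", asm.GetName().Name, sw.ElapsedMilliseconds);
        }
    }
}
```

Run it a few times: the first run (cold) and subsequent runs (warm, with the assemblies in the disk cache) tell very different stories, which is exactly why hand-waving about the “cost” of references without numbers is so unconvincing.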