.NET Fringe, defining the future


There's a long history in OSS of projects emerging from the community that start on the fringes. The people working on these projects are motivated by a desire for change, for doing things in a way that is not necessarily the norm. The fact that they exist is a sign of a healthy community.

As a very salient example, take jQuery. At one point it was a small JavaScript library that a few passionate developers worked on; it then grew to be the de facto library for developers everywhere. Thanks to a rich ecosystem around consuming and contributing to open source, what was once fringe became the mainstream.

Until recently, this ecosystem has not really existed in the .NET world. However, the times they are a-changin'! In the past 5 to 10 years there have been major positive changes. One really important one was the groundwork laid by ALT.NET, which made a very loud call for many of the changes we're seeing. Another is the bold steps Microsoft has taken to level the playing field for OSS libraries and tools. Regardless, the important point is that the ecosystem I described is now here in .NET and growing.

There are too many examples to name them all, but I'll list a few recent projects that illustrate this: jQuery, NuGet, GitHub, JSON.NET, Automapper, Xamarin, NancyFX, and .NET vNext. This is just a sampling that does not do it justice, as there are many, many other examples.

This change is important. This is just the beginning, but it's a great beginning. A group of us think this is so important that we're putting together an event focused on this topic: .NET Fringe.

We're bringing together members of the .NET OSS community who have been working to define the future. They are going to share their work, share their lessons learned, and share their passion. And it's happening in Portland, a place rich in OSS culture.

Be part of something amazing, come to .NET Fringe!

Posted in .net, oss | 10 Comments

A short note on build tasks

Surely I'm behind the curve here, but I've been thinking about the typical build process at work. For a long time I've been operating off the classic model from back in my NAnt days, where it was all about Build -> Unit Test -> Integration Test. Maybe, if you are feeling fancy, you have some extra steps for running database migrations or other code analysis tools.

However, I’ve been working a lot with the concept of microservices in my side project. Trying to figure out where they break down and where they shine. Also, just trying to figure out the practical issues with running them. There is so much going on in our domain right now that trying to keep up can seem like a tidal wave. So if you are new to some of these concepts, well – so am I. :) The following is a brain dump of ideas for managing all of this crazy.

Package

First, I've started adding a few new steps to my build process. The first is really a reminder of how important a 'packaging' step is. Rob and I built this into UppercuT, but it's really coming back to me how important this is. I think it's critical to realize that your default build output may not be packaged well enough. In Visual Studio (well, for me anyway) there is a very common behavior of just grabbing the 'bin' output and running with it. Because I need to package up my source better, I am now running my build step and then moving/copying all of the deployable content to something like a 'build_output' folder.

For those of you doing classic Visual Studio / C# / .NET development, I strongly recommend that you break out of your IDE for this. I would invite you to look at the power contained in your command line tools; even CMD can be put to good use. From there, look at PowerShell. For me it's bash, but I really need to look at ZSH.

Now that everything is in 'build_output', my next step has been to run various HTML/ASPX/CSS/JS minification programs (a litany of lilliputian tools) to compress and optimize my application. The step after that has been to package all of this build_output content into a deployable unit. For my .NET apps this is a NuGet package via OctoPack, and for my side project it has been Debian packages.
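To make the shape of this step concrete, here is a rough sketch of it as a Node script (just one option for scripting outside the IDE). Every path, project name, and tool below is a placeholder for your own setup, not a prescription:

    // package.js – a sketch of a "package" build step.
    // Assumes Node 16.7+ (for fs.cpSync) and that msbuild and
    // html-minifier are on the PATH; all paths are placeholders.
    const { execSync } = require('child_process');
    const fs = require('fs');

    // 1. Run the normal build. For my .NET apps this is msbuild,
    //    and OctoPack hooks into it to produce a .nupkg later on.
    execSync('msbuild MyApp.sln /p:Configuration=Release /p:RunOctoPack=true', { stdio: 'inherit' });

    // 2. Copy only the deployable content out of bin/ into build_output/,
    //    rather than shipping the raw bin folder.
    fs.rmSync('build_output', { recursive: true, force: true });
    fs.cpSync('src/MyApp/bin/Release', 'build_output', { recursive: true });

    // 3. Minify/optimize the static assets in place. html-minifier is just
    //    one of the litany of lilliputian tools; swap in your favorites.
    execSync('html-minifier --input-dir build_output --output-dir build_output --file-ext html', { stdio: 'inherit' });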

Push

This leads me to my newest build task: push. Push, for me, takes all of my nice new build output and makes it available to a larger audience. Note that this step could be run by me or by my automated build tooling. Either way, it simply takes the assets and publishes them: for my .NET projects it throws them into Octopus's NuGet repo, and for my side project it uses fpm/deb-s3 to generate my own Debian repository. I can then pull these assets down, deploy them to testing / staging / production, and have a consistent experience.
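A similarly rough sketch of push, with the feed URL, API key, bucket, and file names all standing in for real values:

    // push.js – a sketch of a "push" build step that publishes the
    // packaged artifacts produced by the package step above.
    const { execSync } = require('child_process');

    // .NET projects: push the OctoPack-produced .nupkg to the Octopus NuGet
    // feed. The URL and key are placeholders for your own Octopus server.
    execSync(
      'nuget push build_output/MyApp.1.0.0.nupkg ' +
      '-Source https://octopus.example.com/nuget/packages -ApiKey MY-API-KEY',
      { stdio: 'inherit' }
    );

    // Side project: publish the .deb to a self-hosted apt repository on S3
    // via deb-s3, so every environment installs from the same place.
    execSync('deb-s3 upload --bucket my-apt-repo build_output/myapp_1.0.0_amd64.deb', { stdio: 'inherit' });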

So, nothing really earth-shattering here, but I wanted to share my thoughts. Also, I find write-ups on these kinds of topics hard to come by out on the interwebs, so if you have any great articles on the subject please do share them in the comments.

Posted in Uncategorized | 1 Comment

Changing roles and focus – but not goodbye

I've been presented with a great opportunity to work on the NeuronESB team (www.neuronesb.com). I've always wanted to work on a product team, which is a big change from my work over the past 23+ years, where my focus was on custom application development and related consulting. I've been on the Neuron team since 1/5 and it has been a great experience.

So… what does this mean as far as my community activities? Candidly, I'll be cutting back quite a bit. I will try to get to one or two shows this year. If you follow me on Twitter, you've seen my travel schedule pick up. We're not through February yet and I've already racked up 26K miles. Given that trend, it doesn't leave time for conferences, which is difficult because it means I won't get to see good friends throughout the year. But it's also a good thing, as I've been speaking at shows non-stop for 6 years. Even without this job change, I was likely to cut back on the shows. It's time for others to step up and share their knowledge with the community.

I'll still continue to write for CODE Magazine and to produce videos for WintellectNOW. I'll do my best to support the Code Camps near me (Philly, NYC, Central Penn). With all of this travel I may occasionally be in town for a user group; if possible, and if you would like to have me, I'd love to speak at your group. I'll post my travel plans on Twitter as they become known. Most of my contributions are going to be in the legal arena, specifically on open source and intellectual property as they relate to technologists, contracts, etc.

Posted in Uncategorized | 2 Comments

Typescript Support in Atom Editor for Windows

Recently I was trying to get TypeScript support working inside the Atom editor on Windows.

In my attempt to get things working I went to the Atom site and found the TypeScript package. Per the documentation I ran 'apm install typescript'. After about 15 seconds it appeared that I was good to go. Sadly, this was not the case. When I opened Atom (by typing atom at the cmd prompt) I would receive this error:

[Screenshot: Atom error – "These are now installed. Best you restart atom just this once."]
Because I like to follow directions, I restarted Atom (again via the CMD prompt). Sadly, I received the same error again… WTF.

Well, a quick Google search for 'These are now installed. Best you restart atom just this once.' yielded exactly one result. However, when I clicked on the link I was taken to the GitHub 404 page; it seems that link is dead. What to do now? Lucky for me there was a cached version of the page I could look at (thank you, Google).

Looking through the source file, I was able to find the block of code that was throwing this message (seen below):
[Screenshot: the block of source code that throws the restart message]

It appears that both linter and autocomplete-plus are required in order for TypeScript support to work. I assumed these would have been installed by default, but I guess not.
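In case that cached page disappears too, here is a paraphrase of what the check boils down to. The helper name below is my own invention for illustration, not the package's actual API:

    // Illustrative paraphrase of the dependency check in the typescript
    // package's startup code (the real source differs in the details).
    const required = ['linter', 'autocomplete-plus'];
    const missing = required.filter((name) => !atom.packages.isPackageLoaded(name));

    if (missing.length > 0) {
      installViaApm(missing); // hypothetical helper that shells out to apm
      console.warn('These are now installed. Best you restart atom just this once.');
    }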

I thought I would simply try to install these Atom packages in hopes the error would go away. To accomplish this I ran the following two commands:

  • apm install linter
  • apm install autocomplete-plus

Once I had both of these packages installed I reopened Atom. To my excitement, the TypeScript message was no longer present. To ensure my fix worked I decided to edit a .ts file and yup, my stuff compiled down to JS…

Hope this helps,

Till next time,

Posted in Uncategorized | 3 Comments

JavaScript Code Coverage using Karma-Coverage w/ Grunt

As part of our ongoing effort at my client to set up a testing environment for our JavaScript code, I wanted to also set up the ability to do code coverage on our files.

To accomplish this I am going to integrate Istanbul coverage reporting w/ our Karma test runner via the karma-coverage plugin.

** I am going to assume you already have JS tests running w/ Karma and Grunt **

First, we need to install the following npm packages:

  • npm install istanbul --save-dev
  • npm install karma-coverage --save-dev

Next, we need to open our karma.conf.js file and make some changes to it.
1) Update the reporters configuration:

reporters: ['progress', 'coverage'],



2) Add a preprocessor section to the configuration.

    preprocessors: {
        // source files to generate coverage for;
        // do not include tests or libraries
        // (these files will be instrumented by Istanbul)
        '**/js/page/**/*.js': ['coverage']
    },


3) Set up the coverage reporter. This determines the output format of the results:

    coverageReporter: {
        dir: '../../../grunt/js.coverage/',
        reporters: [
            { type: 'html', subdir: 'report-html' },
            { type: 'teamcity', subdir: '.', file: 'teamcity.txt' }
        ]
    },



In my setup I am doing two things:

  • I am placing my coverage files inside my Grunt working directory, which means I need to navigate back up to that folder (hence the ../../../ in dir).
  • I am outputting to both HTML and TeamCity formats. You do not need to specify more than one format if you do not want or need to.



4) I added the karma-coverage plugin to the plugins section. When I left this out I would get an error; adding it resolved the missing-plugin error.

    plugins: [
        'karma-phantomjs-launcher',
        'karma-jasmine-jquery',
        'karma-jasmine',
        'karma-coverage',
    ],


After you have made these changes you should be able to run Karma via Grunt as you normally would and boom, you have code coverage for your Jasmine JavaScript files.
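For completeness, if you do not already have the Grunt side wired up, a minimal Gruntfile.js using the grunt-karma plugin might look like this (the configFile path is an assumption about your layout):

    // Gruntfile.js – minimal Karma wiring via grunt-karma.
    module.exports = function (grunt) {
      grunt.initConfig({
        karma: {
          unit: {
            configFile: 'karma.conf.js', // adjust to where your config lives
            singleRun: true // run once and exit, which is what CI wants
          }
        }
      });

      grunt.loadNpmTasks('grunt-karma');
      grunt.registerTask('test', ['karma:unit']);
    };

Running 'grunt test' then launches Karma with the coverage setup above, and the reports land wherever coverageReporter's dir points.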

Till next time,

Posted in Grunt, Jasmine, Testing | 3 Comments