A new adventure at Splunk after an amazing 8 years at Microsoft

[Update: I removed the names I had listed below, as there is no way I could capture all the people I worked with and I don't want to miss anyone]


Two weeks ago I left Microsoft and joined Splunk. After 8 wonderful years this was not an easy decision. Why leave, then? One reason: I am ready for a change. I looked within myself and felt the time was right. I am ready to take these years of learning and experience and apply them outside Microsoft, at a smaller, growing company.

Joining Splunk

Splunk offers an amazing product which helps IT pros and developers analyze machine-generated data in real time. It is an invaluable resource (from what customers tell me) for diagnosing problems that occur at the hardware, system, or application level. It has many uses far beyond that, but everything revolves around operational intelligence.

I have joined the team working on the developer experience for Splunk. My team produces the SDKs (Python, Java, JavaScript, C#, Ruby, and PHP) for talking to Splunk as well as the new application framework. If your applications integrate with Splunk, or you are considering it, we should talk!

I am really excited to have joined Splunk and to be working on such a cool, powerful, and innovative product. I really like the team and the vision. I am looking forward to helping to take the developer story to the next level and to working with you to make it happen.

Microsoft and moving on

Almost 8 years ago I received my blue badge and started at Microsoft. I came after 10 years of working in the industry, and about 10 years of wanting to work at Microsoft. :-) I was so excited to have made it; I remember thinking “You have arrived”. I was passionate and driven to help change the world through technology. I was looking forward to working with amazing people on important projects. Fast forward, and I can say for sure that I found what I was looking for, and much more than I would have imagined.

Microsoft really was a land of opportunity for me. I joined at a time when developers were being very critical of Microsoft’s platform. The common criticism, which echoed very loudly from groups like ALT.NET, was that the Microsoft platform/.NET was too heavy, did not embrace good software development practices the community cared about, like TDD, and did not play well with open source.

I found myself in a place where there was an opportunity to make a difference. It somehow became my personal mission to help change the way we built the platform and looked at open source. I worked toward this throughout most of my time at Microsoft in various teams like Prism (patterns & practices), MEF, Web API, and the Windows Azure/Node SDK. I was not alone; I worked internally toward this goal with a long list of wonderful people (too long to do justice to).

That mission took me places I never imagined. I got to work across the company with various teams, helping them adopt a newer approach. I got to travel the world and talk about the work we were doing, and I met with (and worked with) fantastic people both inside and outside of Microsoft. I learned a lifetime’s worth. I couldn’t have hoped for better!

I am really happy with the progress that has been made at Microsoft. The DNA has changed in much of the dev platform. Microsoft did a complete 180 in its attitude toward OSS and supporting OSS tools. NuGet has been a big help in making open source much easier to consume in .NET. Several SDK projects are on GitHub, Apache 2 licenses for Microsoft projects are everywhere, CodePlex supports git, and many platform projects are taking pull requests from the community. In addition, Microsoft is proactively supporting other open source projects like Node and jQuery.

With all this change, there are new critiques. In .NET there is the concern that Microsoft’s OSS efforts overshadow other efforts in the community. Some are concerned that Microsoft is investing too much in other stacks (like Node, PHP, and recently Java). Others are worried about the future of .NET itself. I understand the concerns, but for me personally, things are much better than they were.

As to my experience at the company, was it always rosy? No, there were plenty of challenges and I had plenty of my own tough moments. Look, it’s not a perfect company (I am not sure there is one) by any means, and it definitely has its problems. I had my own particular style and passions that didn’t always gel with my management. I definitely had my frustrations and felt plenty of times like a square peg trying to fit into a round hole. Regardless, the pros definitely outweighed the cons. I couldn’t have dreamed such an experience was possible. I would do it again.

I want to thank everyone (too many to name) who worked with me over the years and supported my efforts at Microsoft, both inside the company and out in the community. You made my work possible and I learned a lifetime’s worth from you.

Thank you to all my managers and mentors, both internal and external, who invested their personal time and energy to help me; you all know who you are!

One person I need to mention: Scott Guthrie. The things you’ve done with .NET and Windows Azure have been fantastic and I look forward to the future. You’ve been a personal force in my career, and it’s fair to say that I would have left Microsoft a long time ago had I not believed in your leadership. Thank you!

Posted in Uncategorized | 32 Comments

Scripting ease with Script Packs

Script Packs are a really cool extensibility point we added to scriptcs. A pack delivers a bundle of functionality that makes a framework more palatable to consume from script. Packs ship as NuGet packages, which makes them very easy to install.

For example, if you look at our Web API sample, you’ll see there’s a bunch of friction if you try to get Web API working from scratch.

  • You need to add using statements for each namespace you want to use. This is a lot more painful than one might think when you don’t have IntelliSense.
  • You need to configure Web API, which involves creating a host, defining default routes, etc. All that object creation starts to make the script pretty hairy. Not impossible, but painful when there’s no template.
  • You need to teach Web API how to resolve controllers in script by implementing a custom controller resolver.

Now pull in the Web API script pack (scriptcs --install scriptcs.webapi), call the Require<WebApi>() function, and your boilerplate code evaporates to this:
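A sketch of what that script looks like, based on the shape of the scriptcs.webapi sample (the controller name, greeting, and port are illustrative):

```csharp
// Sketch of a scriptcs Web API script using the script pack.
// No using statements needed — the pack supplies the common namespaces.
public class TestController : ApiController
{
    public string Get()
    {
        return "Hello from scriptcs Web API!";
    }
}

var webApi = Require<WebApi>();                            // pull in the script pack
var server = webApi.CreateServer("http://localhost:8080"); // host + default routes
server.OpenAsync().Wait();

Console.WriteLine("Listening on port 8080...");
Console.ReadLine();
```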

The script pack does all of the following to make the experience better:

  • Removes the need for using statements for common namespaces. The script pack provides those, which is why you don’t have to add the Web API namespaces in the example above.
  • Adds dll and NuGet package references that bring in the dependencies the framework needs.
  • Removes general boilerplate code. In the from-scratch sample you need to create a host, define routes, etc., as I mentioned. Here the script pack creates the host for you and configures it with the default routes. You can still customize it if you need to.
  • Provides APIs to fill gaps that prevent the framework from working well in script, such as supporting dynamically emitted assemblies. The Web API script pack brings in and configures a custom controller resolver for you.

We’re just getting started with script packs, but they are a really nice extensibility point and really take advantage of NuGet as a delivery mechanism. The community has been rising to the occasion and building out quite the gallery as well.

There are some great posts about script packs, covering topics like how to build them or even use them from the REPL, that you should really check out.

Have fun exploring the new world of scripting in C# with scriptcs!

Posted in c#, scriptcs | Leave a comment

scriptcs gets a REPL!

Hello C# scripters!

Before you go further, if you are wondering what all the scriptcs hype is about please check out Scott Hanselman’s great post and his new Tekpub video.

The last few days I’ve been working on a new REPL experience for scriptcs, and now it’s in! REPLs are nothing new to dynamic languages, but they have not really been available for C#, with one exception: Mono has a great REPL. Roslyn also introduces an interactive window in Visual Studio, which is a REPL that runs in the editor.

This REPL is different from both in that it is specific to scriptcs, and, like the rest of the scriptcs experience, there’s no IDE required. Basically it combines the goodness of scriptcs (NuGet) with an interactive experience. You can just install some NuGet packages and type code which instantly executes. For example, imagine just pulling in HttpClient and then doing some HTTP requests!
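A hypothetical session along those lines (the prompt and output formatting are illustrative):

```text
> var client = new System.Net.Http.HttpClient();
> var html = client.GetStringAsync("http://scriptcs.net").Result;
> Console.WriteLine(html.Length);
```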

Below you can see I am installing the mongo NuGet package, then running scriptcs by itself and typing in some simple code to work with the package.


And then we have pretty error handling.


Thanks to the Roslyn team’s efforts, layering a REPL on top was AMAZINGLY easy. The REPL part itself was like 15 lines of code! And thanks to the NuGet team for even making this a remote possibility by building an awesome package ecosystem. I look forward to ALSO seeing the Mono version soon :-)

If you wanna try it, you can grab the latest from GitHub at http://github.com/scriptcs/scriptcs in the dev branch.

It will also be in our Chocolatey nightly builds tomorrow: http://www.myget.org/F/scriptcsnightly. It should be on our public feed very soon thereafter.

Posted in scriptcs | 8 Comments

Debugging node.js errors in Windows Azure

A common problem you might encounter in Windows Azure is seeing a big old “The page cannot be displayed” page due to an error occurring in your Azure-deployed node app.

If you haven’t seen it, it looks exactly like this:

[Screenshot: the generic “The page cannot be displayed” error page]

This happens because, by default, we don’t allow errors to propagate back to the user. This prevents you from publishing to production and leaking details of your system.

Let’s assume I just published a simple express app, but I made an error. Below you can see that instead of requiring winston, I required win$ton.

[Screenshot: the app code requiring win$ton instead of winston]
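To see concretely what node does with that typo, here is a plain-node sketch of the failing require (the real app used express, omitted here):

```javascript
// The app asked for 'win$ton' (a typo) instead of 'winston'.
// node throws a MODULE_NOT_FOUND error at startup, which Azure
// swallows by default — hence the unhelpful error page.
try {
  require('win$ton');
} catch (err) {
  console.log(err.code); // MODULE_NOT_FOUND
}
```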

When I publish, I’ll see the “page cannot be displayed” error.

iisnode.yml to the rescue

To find out what is going on, you can enable displaying errors using our YAML configuration file, otherwise known as iisnode.yml. If you use our azure-cli tool and create a site in a directory where there is a node app, we automatically create it for you. If not, you can create one really easily with any text editor. The entries you need to put in the file / update are below.
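Something like the following (assuming the standard iisnode settings; loggingEnabled and devErrorsEnabled are the two that matter here):

```yaml
# iisnode.yml
loggingEnabled: true
devErrorsEnabled: true
```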

Once you save that file, republish to Windows Azure. If the file did not exist and this is an Azure Web Site, do an “azure site restart”. If the file already existed, you won’t need to do anything.

Now with dev errors enabled, I see the following:

[Screenshot: the detailed error, including the failed require, shown in the browser]


Don’t go to production this way, but DO use tail for websites.

Make sure to disable dev errors when you go to production. You can leave logging enabled, however, because another command you can still use is “azure site log tail”, which will stream logs directly to your console in real time. Also, if you are using a cloud service, you can remote in to view the logs written locally.

Using “azure site log tail”, watch what happens when I refresh the page.

[Screenshot: azure site log tail streaming the error to the console]

“azure site log tail” has many more uses. You can use it to capture any log output your applications produce, in real time. For example, here’s a screenshot of me tailing an app using socket.io and Service Bus in Azure.

[Screenshot: tailing a socket.io + Service Bus app]

Yes it is awesome!

PS: Today “tail” works only for node apps, but that is going to change VERY shortly.

Posted in azure, cli, node.js | Leave a comment

azure-scripty – Azure CLI scripting made even easier

Note: if at any time you want to just get it and jump in, “npm install azure-scripty” and start scripting.

If you are using our azure-cli, then you might have thought of creating automation scripts to package up common tasks. A while ago I posted on how you can achieve some scriptability just using bash tools. Basically, the techniques I listed there involve piping text from command to command and using tools like grep and awk to parse the results in order to feed them to the next command.

It turns out there’s another way, and one that is much closer to the kind of fidelity you can achieve with PowerShell. Sound interesting? Keep reading. It turns out the majority (like 99%) of our commands can return JSON objects if you apply the --json switch.

For example, if I do “azure site list --json”, here’s what I get.

[Screenshot: JSON output of azure site list --json]

That means I can parse that result into an object which I can then easily manipulate. Potentially I could even pipe results from one command to another. I could even imagine defining commands as JSON objects so that I could script out my tasks in a more object-friendly fashion. I could then wrap that all up in a nice pretty box to be used by myself or others.
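The parsing step is simple enough to sketch in plain node. Here a canned string stands in for the real CLI output (in a script you’d get it from something like child_process.exec; the site names are made up):

```javascript
// Sketch: parse --json CLI output and work with it as plain objects.
// The canned string below stands in for real `azure site list --json` output.
var output = JSON.stringify([
  { Name: 'mysite1', State: 'Running' },
  { Name: 'mysite2', State: 'Stopped' }
]);

var sites = JSON.parse(output);

// Now it's just JavaScript: filter, map, feed into the next command.
var running = sites
  .filter(function (s) { return s.State === 'Running'; })
  .map(function (s) { return s.Name; });

console.log(running); // [ 'mysite1' ]
```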


And that is what azure-scripty is about. (Kudos to @JpScripter for reminding me I need to post.)

azure-scripty gives you a JSON-oriented API that you can use within a node script for automation tasks. It lets you combine the power of raw node with the capabilities of our azure-cli. One thing that is also really nice is that it leverages the knowledge you already have of the CLI. As it says on the GitHub page, “if you know the CLI you know scripty.”

azure-scripty offers you the following:

  • Works anywhere node works, i.e. Windows, Linux, Mac, Nodecopter :-)
  • Author using pure string commands or JSON object commands.
  • Uses the standard node callback model.
  • Batches multiple commands together.
  • Pipes results from one command to another.

Here are a few examples to give you an idea:

String command style

This will list out my websites in JSON format.
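A sketch of the string command style, based on the shape of the README (assumes azure-scripty is npm-installed and the azure CLI is configured, so it isn’t runnable standalone):

```javascript
var scripty = require('azure-scripty');

// 'site list' maps directly to `azure site list`; results come back
// as parsed JSON via the standard node callback.
scripty.invoke('site list', function (err, results) {
  if (err) throw err;
  console.log(results);
});
```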

Object command style

This sample creates a new mobile service using the object command style. Notice you can pass fixed positional args, while all other named args are matched by convention.
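A sketch of that object command style (the service name, credentials, and parameter names are illustrative; check the README for the exact shape):

```javascript
var scripty = require('azure-scripty');

var cmd = {
  command: 'mobile create',
  positional: ['mymobileservice', 'sqlAdmin', 'myP@ssw0rd!'], // fixed-position args
  sqlLocation: 'West US'                                      // named arg, matched by convention
};

scripty.invoke(cmd, function (err) {
  if (err) throw err;
  console.log('mobile service created');
});
```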


This sample shows creating a web site and a mobile service together, using the object command style.
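Batching presumably looks like passing an array of commands (again a sketch with illustrative names; the exact shape is in the README):

```javascript
var scripty = require('azure-scripty');

// Run several commands in one call; results arrive together in the callback.
scripty.invoke([
  { command: 'site create', positional: ['mysite'], location: 'West US' },
  { command: 'mobile create', positional: ['mymobileservice', 'sqlAdmin', 'myP@ssw0rd!'] }
], function (err, results) {
  if (err) throw err;
  console.log('site and mobile service created');
});
```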


One of my favorites: this example uses piping to feed the list of sites into the stop command. Notice the :Name; that plugs in the Name property from each returned site.
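A sketch of that piping, with :Name substituted from each site returned by the first command (the exact piping syntax per the README may differ):

```javascript
var scripty = require('azure-scripty');

// Pipe the output of `site list` into `site stop`; ':Name' is filled in
// with the Name property of each returned site.
scripty.invoke(['site list', 'site stop :Name'], function (err) {
  if (err) throw err;
  console.log('all sites stopped');
});
```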

Get it and please give feedback

  1. npm install azure-scripty
  2. Go check the README.
  3. Start scripting!

Looking forward to your feedback and your contributions!

Posted in azure, node.js | 1 Comment