
Code the Town! Investing in the next generation of programmers in Austin, TX

Austin, TX is a hot-bed for technology.  You can find a user group for just about any technology and purpose meeting almost any day of the week.

And now, there is a group that intersects with giving back to the community and helping the next generation of programmers.  Code the Town is a group that does just that. Clear Measure, and other companies, are sponsors of the group.  The official description is:

“This is a group for anyone interested in volunteering to teach Hour of Code https://hourofcode.com/us in the Austin and surrounding area school districts. The goal is to get community volunteers to give the age appropriate Hour of Code to every student at every grade level. We want to have our own community prepare students for a technology-based workforce. We also want to build a community of professionals and students that have a passion for coding and teaching. We want to begin the Hour of Code in the high schools first. High school students would then be prepared to teach the younger students.  Once this group has momentum, it will be able to form motivated teams and use software projects done for local non-profit organizations to not only reinvest in our community but also to help our youth gain experience in software engineering.  Whether you are a student, parent, educator, or software professional, please join our Meet Up! This will be fun! And it will have a profound impact on the next generation.”

The long-term vision is to create a sustainable community of professionals, educators, parents, and students that continually gives back to local community organizations through computers and technology while drawing the next generation of students into computer programming.
It all starts with some volunteers to teach students the basics of computer programming.  In the 1990s, the web changed the world.  Now, we have hand-held smartphones and other devices (TVs, bathroom scales, etc.) that are connected to computer systems via the internet.  In the next decade, almost every machine will be connected to computer systems, and robotics will be a merging of mechanical engineering and computer science.  Those who know how to write computer code will have a big advantage in a workforce where the divide between those who build and create and those who service what is created may grow even wider than it already is.
Code the Town will focus on introducing students to computer programming and then pull them together with their parents, their teachers, and willing community professionals to work on real software projects for local non-profits.  In this fashion, everyone gets something.  Everyone gives something, and everyone benefits.  If you are interested in this vision, please come to the first meeting of Code the Town by signing up for the Meetup group.


Learn XHtml and CSS from the pros

We (Clear Measure) are a client of PSD2HTML.  We have a designer, but we have found it more cost-effective to have PSD2HTML take our initial UI designs and create XHtml and CSS out of them.  From there, we will add these screens to the custom web application we are building.

PSD2HTML has several examples showing how a design is converted into XHtml and CSS.  You can examine all the code on their examples page.  It’s interesting to see the techniques used by a company whose core competency is XHtml and CSS.



Find me on twitter @jeffreypalermo

If you read twitter, you can find me at http://twitter.com/jeffreypalermo

Since I tend not to write short blog posts (like this one), I put the small ones on twitter.

If you do twitter RSS, you can use this feed: http://twitter.com/statuses/user_timeline/13253882.rss

And normal blog feed:  http://feeds.feedburner.com/jeffreypalermo


Realistically achieving high test coverage – MvcContrib

Since Eric Hexter and I started the MvcContrib project, we have mandated high test coverage.  If a patch comes without tests, we reject it.

Given that MvcContrib exists for the purpose of supplementing a presentation library, ASP.NET MVC, you might think it’s not possible to achieve such a high percentage.

If you were ever curious about how this is done, I invite you to take a look at the project.  The project has 1058 tests at this point and the main MvcContrib.dll assembly has 99% test coverage. 

As an aside, when code is test-driven, high test coverage falls out of the process naturally.
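
For illustration only, here is the kind of trivial, test-first pairing I mean.  The names are hypothetical (this is not actual MvcContrib code), and NUnit is assumed as the test framework.  The production class exists only because the test demanded it, so its coverage comes for free.

```csharp
using NUnit.Framework;

[TestFixture]
public class ControllerNameParserTests
{
    [Test]
    public void Strips_the_controller_suffix_from_a_type_name()
    {
        var parser = new ControllerNameParser();

        Assert.AreEqual("Home", parser.Parse("HomeController"));
    }
}

// The production class is written only to make the test above pass,
// so every line of it is exercised by a test.
public class ControllerNameParser
{
    public string Parse(string typeName)
    {
        const string suffix = "Controller";

        return typeName.EndsWith(suffix)
            ? typeName.Substring(0, typeName.Length - suffix.Length)
            : typeName;
    }
}
```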

By the way, Eric just released version 0.0.1.118 of MvcContrib to CodePlex.

 

Keep tabs on MvcContrib by following my feed:  http://feeds.feedburner.com/jeffreypalermo


Objectively evaluating O/R Mappers (or how to make it easy to dump NHibernate)

I’m amazed that there is so much talk about object/relational mappers these days.  Pleased, but amazed.  I tend to be in the “early adopter” part of the Rogers technology adoption curve. (Subscribe to my feed:  http://feeds.feedburner.com/jeffreypalermo)

In the .Net world, I didn’t hear much talk about O/R Mappers in the early 2000s.  I started working with NHibernate in 2005 while on a project with Jeremy Miller, Steve Donie, Jim Matthews, and Bret Pettichord.  I researched, but never used, other O/R Mappers available at the time.  Now, in 2008, I find that O/R Mappers in the .Net world are still in the early adopter part of the adoption curve.  We have not yet hit early majority, but we have left the innovators section.

Microsoft has single-handedly pushed O/R Mapping to the center of conversation, and we struggle to objectively differentiate between the choices.  Arguments like “Tool X rocks”, or “Tool Y sucks” are hard to understand.  I’d like to more objectively discuss the basis on which we should accept or reject an O/R Mapper.  As always, it depends on context. 

Context 1:  Small, disposable application:  In this case, we would put a premium on time to market while accepting technical debt, given the application has a known lifespan.  For this type of situation, I think it depends on the skill set of the team we start with.  If the team already knows an O/R Mapper, it should probably stick with it, since the learning curve of any other tool would slow down delivery.

Context 2:  Complex line-of-business application:  Here, the business is making an investment by building a system that is expected to yield return on the engineering investment.  The life of the application is unbounded, so maintainability is king.  We still want to be able to build quickly, but long-term cost of ownership has a heavy hand in decisions.  Here, we have to objectively think about the tools used by the system.

I'll use O/R Mappers in this example.  On the right is a common Visual Studio solution structure.  We would probably leverage the O/R Mapper in the DataAccess project.  I consider the O/R Mapper to be infrastructure since it doesn't add business value to the application.  It is merely plumbing that helps the application function.  By following the references, we find that our business logic is coupled to the data access approach we choose as well as the infrastructure we employ.  Often we can build the system like this, and we can even keep the defect rate really low.  This is very common, and I'd venture to guess that most readers have some experience with this type of structure.  The problem with this structure is long-term maintainability.  In keeping with the O/R Mapper decision: five years ago, I was not using NHibernate.  If I ask myself whether I'll be using NHibernate five years from now, I have to assume that I probably won't be, given the pace of technology.  If this system is to have a chance of being maintainable five years from now, I need to be able to upgrade the parts of the system most affected by the pace of technology, like data access.  My business logic shouldn't be held hostage by the data access decision I made back in 2008.  I don't believe it's a justified business position to say that when technology moves on, we will rewrite entire systems to keep up.  Sadly, most of the industry operates this way.

On the left is the general solution structure I’m more in favor of.  You see that the core of the application doesn’t reference my other projects.  The core project (give it whatever name you like) contains all my business logic, namely the domain model and supporting logical services that give my application its unique behaviors.  Add a presentation layer for some screens, and the system delivers business value.  Here, you see I’ve lumped data access in with infrastructure.  Data access is just not that interesting, and system users don’t give a hoot how we write to the database.  As long as the system is usable and has good response times, they are happy.  After all, they are most happy when they are _not_ using the system.  They don’t spend their leisure time using our system.

I consider data access to be infrastructure because it changes every year or two.  I also consider communication protocols like ASMX, remoting, and WCF to be infrastructure.  WCF, too, will pass in a year or ten for the next wave of communication protocols that "will solve all our business problems".  Given this reality, it's best not to couple the application to infrastructure.  Any application today that is coupled to Enterprise Library data access will likely have to be completely rewritten in order to take advantage of any newer data access method.  I'd venture to say that the management that approved the budget for the creation of said system didn't know that a rewrite would be imminent in just four short years.

How do we ensure the long-term maintainability of our systems in the face of constantly changing infrastructure?  The answer:  don't couple to infrastructure.  Regardless of the O/R Mapping tool you choose, don't couple to it.  The core of your application should not know or care what data access library you are using.  I am a big fan of NHibernate right now, but I still keep it at arm's length, banished to forever live in the Infrastructure project in the solution.  I know that when I want to dump NHibernate for the next thing, it won't be a big deal.

How do I ensure I’m not coupled to my O/R Mapper?

  • The project my domain objects reside in doesn't have a reference to NHibernate.dll or your O/R Mapper of choice
  • The unit tests for my domain model don’t care about data access
  • My domain objects don't contain infrastructure code specific to the O/R Mapper

The key is the flipped project reference:  have the infrastructure project reference the core, not the other way around.  My core project has no reference to NHibernate.dll.  The UI project has no reference either; only the infrastructure project references it.  A minimal sketch of what this looks like follows.
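
To make that concrete, here is a minimal sketch with hypothetical names (Customer, ICustomerRepository, NHibernateCustomerRepository are invented for illustration): the core project defines the domain object and the persistence interface it needs, and only the infrastructure project knows that NHibernate exists.

```csharp
// --- Core project: no reference to NHibernate.dll ---
namespace MyApp.Core
{
    public class Customer
    {
        // Virtual members so a persistence tool can proxy the entity if it needs to.
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
    }

    // The domain declares what it needs from persistence...
    public interface ICustomerRepository
    {
        Customer GetById(int id);
        void Save(Customer customer);
    }
}

// --- Infrastructure project: references Core *and* NHibernate ---
namespace MyApp.Infrastructure
{
    using MyApp.Core;
    using NHibernate;

    // ...and infrastructure supplies the implementation behind that interface.
    public class NHibernateCustomerRepository : ICustomerRepository
    {
        private readonly ISession _session;

        public NHibernateCustomerRepository(ISession session)
        {
            _session = session;
        }

        public Customer GetById(int id)
        {
            return _session.Get<Customer>(id);
        }

        public void Save(Customer customer)
        {
            _session.SaveOrUpdate(customer);
        }
    }
}
```

When the time comes to dump NHibernate, only NHibernateCustomerRepository gets rewritten against the new library; the core and UI projects never change.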

Keep it easy to dump NHibernate when its time has come

For now, NHibernate is the O/RM of choice in .Net-land.  When its time comes, don't go to management and recommend a rewrite of the system because it's completely tightly coupled to NHibernate.  Keep NHibernate off to the side so you can slide in the next data access library that comes along.  If you tightly couple to your O/RM, you'll sacrifice long-term maintainability.

When choosing an O/R Mapper, the objective criterion I find most compelling is whether the library allows isolation.  If the tool forces you to build the application around it, move on to a better one.  The good libraries stay out of the way.  If your O/RM always wants to be the center of attention, dump it for one that's more humble.  Use an O/RM that plays well behind a wall of interfaces, and beware the O/RM that doesn't allow loose coupling.  If you tightly couple, you're guaranteeing a rewrite when you decide to change data access strategies.
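
As a small illustration of "behind a wall of interfaces", continuing the hypothetical names from the earlier sketch: a service in the core project depends only on the interface, so it neither knows nor cares which O/RM sits behind it.

```csharp
// Hypothetical consumer in the Core project. It depends only on ICustomerRepository,
// so swapping the O/RM behind that interface never touches this class.
namespace MyApp.Core
{
    public class CustomerRenamingService
    {
        private readonly ICustomerRepository _repository;

        public CustomerRenamingService(ICustomerRepository repository)
        {
            _repository = repository;
        }

        public void Rename(int customerId, string newName)
        {
            var customer = _repository.GetById(customerId);
            customer.Name = newName;
            _repository.Save(customer);
        }
    }
}
```

In a unit test, that interface can be stubbed with an in-memory fake, which is also why the domain's tests don't have to care about data access.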
