Yesterday my team was talking about how often the CruiseControl.Net build ran each day, and the question arose: how often should we check in? For example, our team of four developers creates about 20 successful builds in a full day of coding, and I think that sounds all right. I can't imagine there being a definitive answer, but you should strive to check in every time you hit a stopping point, and come to a stopping point as often as you can. Much of the point of doing Continuous Integration is to get more feedback from the system and work in smaller steps. Stopping to do the full check-in dance (update from the trunk, run the NAnt script with tests) verifies that your code can be integrated into the trunk with everyone else's new code. Frequent check-ins keep those nasty merge problems away.
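For the curious, the check-in dance can live in the build script itself. Here's a minimal NAnt sketch of the idea; the solution name, test assembly path, and reliance on svn being on the path are all made up for illustration, so your build file will certainly differ:

```xml
<?xml version="1.0"?>
<project name="checkin-dance" default="checkin">

  <!-- Pull down everyone else's new code first. -->
  <target name="update">
    <exec program="svn">
      <arg value="update" />
    </exec>
  </target>

  <!-- Compile the solution; MyApp.sln is a stand-in name. -->
  <target name="build" depends="update">
    <solution solutionfile="MyApp.sln" configuration="Debug" />
  </target>

  <!-- Run the unit tests; a failing test fails the build. -->
  <target name="test" depends="build">
    <nunit2>
      <formatter type="Plain" />
      <test assemblyname="bin\Debug\MyApp.Tests.dll" />
    </nunit2>
  </target>

  <!-- If we get this far, the code is safe to commit. -->
  <target name="checkin" depends="test">
    <echo message="Clean build and green tests - OK to commit." />
  </target>

</project>
```

One `nant checkin` from the command line and you know whether your code plays nicely with the trunk before you commit.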
Some tasks are going to tear up the code for longer stretches. If you've got stale code because of one of these Herculean tasks, update from the trunk as often as possible to reduce the risk and burden of merging later. And if your changes might ripple out into other areas of the code, run the entire suite of unit tests frequently as you work.
The key concept for me in all software development activities is rapid feedback: shortening the cycle between doing something and verifying that something. Frequent check-ins à la Continuous Integration are definitely in line with that philosophy.
I'm going to assume that we all agree that checking in code that doesn't compile or pass its unit tests is a bad thing. Broken Windows and all that stuff. The unit tests don't provide much value if you ignore them or don't run them.
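That agreement is easier to keep when the build server runs the same script on every check-in, so a commit that breaks the compile or the tests breaks the build in plain sight. A rough sketch of a CruiseControl.Net project block is below; the server URL, working directory, and project name are all hypothetical:

```xml
<cruisecontrol>
  <project name="MyApp">
    <!-- Poll Subversion every minute for new check-ins. -->
    <triggers>
      <intervalTrigger seconds="60" />
    </triggers>
    <sourcecontrol type="svn">
      <trunkUrl>http://svnserver/repos/myapp/trunk</trunkUrl>
      <workingDirectory>C:\builds\myapp</workingDirectory>
    </sourcecontrol>
    <tasks>
      <!-- Run the same NAnt script the developers run locally. -->
      <nant>
        <baseDirectory>C:\builds\myapp</baseDirectory>
        <buildFile>default.build</buildFile>
        <targetList>
          <target>test</target>
        </targetList>
      </nant>
    </tasks>
  </project>
</cruisecontrol>
```

The nice part of pointing the server at the developers' own build file is that "works on my machine" and "works on the build server" mean the same thing.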