This entry sounds like project management 101 and can be summarized as: don't assume, because when you do… and you know the rest of that one. The trick is to uncover your implicit assumptions; the explicit ones are easier to identify, whereas the ones lurking in your subconscious tend to surface only when they can trip you up.
In my previous post I discussed various types of testing and alluded to what each type entails. Implicit in each of the test execution activities was the assumption that the necessary preparations were complete. The testing you are doing will determine the kind of prerequisite preparation work you need, and there's no escaping some upfront work.
I always think there is a level of theoretical abstraction that goes with project management, because in an ideal world you can imagine a mechanistic approach where you have your tasks and your deliverables, and as long as you sequence them correctly your project will complete. Reality has a way of distorting theory and throwing a few challenges into a project. I've not worked on many (any?) large projects where the timing and coordination of deliverable completion have been perfect, so every project has to make a judgment call on whether enough of each deliverable set is available to meaningfully support the next set of activities.
Enough of the theoretical abstraction: let's discuss an example. I once worked on a project where the team wanted to do some real-world simulation testing, basically a day-in-the-life (DITL) test in a pre-production environment. The testing was to include functional configuration, reports, automatically executing interfaces, end-user security and authorizations, master data extracted from legacy systems and converted and loaded, and overnight periodic processing. Unfortunately, certain aspects of the design weren't complete: report specifications varied from non-existent to in-development; interface triggers had not always been included in the design and construction; no data conversion exercise had achieved more than ~60% success; and the batch schedule had not been defined. So, aside from these shortcomings, everything was good!
Any one of these issues in isolation could probably have been managed and worked around during testing; after all, most of us can get by without a report for a while. In the end, though, so many of the pieces required for the DITL testing were incomplete that the testing was seriously delayed. In many respects the project became a slave to the timeline, and the quality of the deliverables was allowed to slide until this testing activity late in the project lifecycle. A more thorough and dispassionate ongoing review of each of the components required for testing would/should/could have revealed their status and enabled a better deployment of resources to get the deliverables designed, built, and individually tested. Instead the project went through a period of triage to assess which deliverables were in the worst condition, which would take longest to fix, and which dependencies existed between the parts. The happy ending is that some weekend work and some dedicated brute force and intellectual effort resolved the situation.
The Takeaways
- When you get parachuted into a project situation and are told you need to go execute an activity, take the time to lay out your assumptions and push to get them validated. It is far better to spend a couple of days on due diligence than to have the situation implode all around you.
- If the required parts are not complete, you need to make a judgment call on whether to proceed with testing on a smaller scale, go ahead regardless, or wait until everything is ready. Generally, I'd recommend testing whatever you can, but only if you are proving out something that hasn't already been shown to work in an earlier type of testing.
- If you proceed with testing before all the parts are ready, you need to build in iterations of testing (a kind of regression testing) to ensure that as each newly finished part (code, configuration, etc.) is introduced it does not break anything previously tested.
- Be prepared for longer elapsed times in the overall project phase when deliverables that need to be part of your testing activities are completed on a staggered schedule.