Dispatch Integration’s project management philosophy is built on our DIVE methodology, which is an acronym for Discover, Innovate, Validate, and Empower. Part 3 of this blog series focuses on Validate and the critical role of testing throughout the entire project lifecycle.
In traditional software development approaches, testing is back-end loaded: develop a solution, throw it over the wall for the business to test, and have the bugs thrown back over the wall to remediate. This aspect of the “waterfall” method has proven disastrous and is responsible for many project failures. Waterfall has now been largely replaced by Agile, especially in product companies. Agile’s philosophy of iterative build -> test -> learn -> adjust has been shown to increase project velocity, reduce cost, and produce far better software.
Despite the obvious and proven benefits of Agile, it has remarkably failed to take hold in professional services projects, even though many of these projects are extraordinarily expensive and business-critical. Instead, professional services firms often default to one of two variants of the traditional approach:
- Dear client, you must provide us with a massive business requirements document containing detailed specifications of precisely what you want. What you specify is what you will get. If it turns out you couldn’t perfectly articulate what you needed upfront, we’ll charge you extra for every change you request.
- Dear client, we are the experts. Because we are the experts, we will do this project “to” you. Yes, I know you think you have unique requirements, but we’ve done this a thousand times, and the best thing is for you to change to fit our model. Your input isn’t helpful because it will cause us to deviate from our proven plan.
There are many issues with these approaches, but they persist in professional services because they provide an illusion of certainty in a proposal or statement of work. The problem is that transformative projects, by definition, come with many assumptions, variables, and uncertainties to manage.
Dispatch’s DIVE project management approach is a hybrid model that integrates Agile elements with more traditional methods. We believe it provides sufficient certainty regarding budgets and planning while leveraging the value of the Agile methodology.
Perhaps the most significant benefit we see with DIVE is in test and validation. The DIVE approach involves our clients early and throughout the entire project lifecycle. At our very first Discovery sessions, we begin documenting the definition of “quality” from different stakeholders’ perspectives and defining how we will validate that we’ve achieved those objectives.
Development happens in Agile sprints, with continuous unit testing. Client teams are involved in each sprint, which serves as a learning vehicle for both the client and us. An essential part of validation is checking whether the originally conceived ideas about design and functionality still hold when demoed. Quite often, “a-ha” moments happen when the business team can see something working, necessitating revisions. This is not a bad thing; it’s normal and valuable to get that feedback early and often. This approach also shakes out bugs throughout the project.
Our Agile approach doesn’t replace final end-to-end testing and User Acceptance Testing (UAT). But DIVE tends to lower the stakes of UAT, with far less likelihood of negative surprises emerging at that stage.
We find UAT can still be stressful for business users – after all, they each have day jobs, and UAT is usually squeezed in around regular responsibilities. Business users are also not professional testers, and the entire process of testing can be unfamiliar and confusing.
We try to streamline the UAT process in a few ways. We believe UAT should ideally be a final check that the solution works as it should across a spectrum of use cases when operated by typical business users, without causing unintended consequences. It shouldn’t be an “over the wall” step that replaces comprehensive testing by our team. Whenever possible, we run through each UAT test ourselves to uncover bugs before taking up business users’ time. This is far more efficient than scheduling a UAT session with a business team, only to have it stopped and rescheduled because issues were found that prevented testing from continuing.
We know testing is not in most business users’ job descriptions and is an unfamiliar activity. To help UAT testers be comfortable and efficient in this role, we deploy “Test Sherpas” during UAT. A Test Sherpa is typically a business analyst who trains the team on how to test, helps testers understand what they are testing for, prepares templates to ease documentation of issues, and generally helps client teams complete this activity efficiently and thoroughly, so that confidence is high when exiting UAT.
We find two key advantages to this approach: 1) the overall quality of the solution is higher, with far less chance of critical bugs slipping through into production, and 2) users are more confident because they were involved throughout the entire lifecycle.
In general, Validation is not a distinct time-based phase of our approach but rather a component of our project plan that continues from Discovery through to Empowerment. Some of the test concepts and artifacts we use include:
- Approved Test Scenarios: “Testing” begins in Discovery, when high-level test scenarios are formulated to communicate back to the client our understanding of the acceptance criteria for the solution’s critical behaviours.
- Documented/Executed Unit Tests: Unit testing exercises individual modules of an application in isolation, without external interactions or dependencies. This confirms that each isolated piece of code works as intended (see the sketch following this list).
- Documented/Executed Integration and End-to-End Tests: Integration (functional) testing verifies that separate code modules work correctly together. This type of testing proves whether an entire slice of functionality in the system works end to end.
- Documented/Executed Acceptance Test Cases: Acceptance testing ensures a system fully meets the needs of the business by evaluating the system’s compliance with the business requirements and assessing whether it is acceptable for delivery.
- Defect Log: A defect log is a living document that records defects and their triage history, usually updated by the team conducting User Acceptance Testing. In most cases, our clients own and maintain this log to track their testing history, often using the templates and methods we recommend.
- Smoke Testing: Post-deployment testing to ensure the migration and move to production have been successful.
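To make these distinctions concrete, here is a minimal, illustrative pytest sketch contrasting a unit test, an integration test, and a smoke test. The names used (map_employee_record, InMemoryTargetSystem, SMOKE_BASE_URL, the /health endpoint) are hypothetical stand-ins for this example, not artifacts from an actual Dispatch project.

```python
"""Illustrative only: a minimal pytest sketch contrasting unit, integration,
and smoke tests. All module, class, and variable names are hypothetical."""

import os

import pytest


# --- Unit test target: one module in isolation, no external dependencies ----
def map_employee_record(record: dict) -> dict:
    """Hypothetical mapping logic under test."""
    return {"id": record["employee_id"], "name": record["full_name"].strip()}


def test_map_employee_record_trims_whitespace():
    mapped = map_employee_record({"employee_id": "E42", "full_name": "  Ada Lovelace "})
    assert mapped == {"id": "E42", "name": "Ada Lovelace"}


# --- Integration test: two modules exercised together as a slice ------------
class InMemoryTargetSystem:
    """Stand-in for a downstream application, kept in memory for the sketch."""

    def __init__(self):
        self.records = []

    def upsert(self, record: dict) -> None:
        self.records.append(record)


def sync_employee(record: dict, target: InMemoryTargetSystem) -> None:
    """A slice of functionality: map the record, then load it into the target."""
    target.upsert(map_employee_record(record))


def test_sync_employee_slice():
    target = InMemoryTargetSystem()
    sync_employee({"employee_id": "E42", "full_name": "Ada Lovelace"}, target)
    assert target.records == [{"id": "E42", "name": "Ada Lovelace"}]


# --- Smoke test: quick post-deployment check that the environment is alive --
# Register the "smoke" marker in pytest.ini to avoid unknown-marker warnings.
@pytest.mark.smoke
def test_deployed_health_endpoint():
    base_url = os.environ.get("SMOKE_BASE_URL")  # e.g. the newly deployed environment
    if not base_url:
        pytest.skip("Set SMOKE_BASE_URL to run post-deployment smoke checks")
    import requests  # assumes the 'requests' package is available

    response = requests.get(f"{base_url}/health", timeout=5)  # hypothetical endpoint
    assert response.status_code == 200
```

In a flow like the one described above, the unit and integration tests would run continuously during sprints, while a small smoke suite like the last test would run immediately after each deployment to confirm the move to production succeeded.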
Communication regarding validation is crucial throughout the project. Developers are responsible for unit tests of their work product, but otherwise, we prefer a clear division of responsibility where someone other than the developer does the testing. Because we feel testing is all about learning, solid communication between the tester(s) and developer(s) is vital to ensure efficient bug squashing. We also need to have complete transparency with the client regarding defects found so that they can participate in prioritization and re-validation.
Clients entrust us to deliver high-quality solutions – many of which are mission-critical. We take that responsibility very seriously, and baking Validation into our projects is one of the key ways we work to deliver value on time, on budget, and with high quality.
The next blog in this series will be Empower – stay tuned for this final phase in our DIVE project methodology.
To learn more and to explore how we can help make your application ecosystem your competitive advantage, please contact us now.
Dispatch Integration is a software development and professional services firm that develops, delivers, and manages advanced data integration and workflow automation solutions. We exist to help organizations effectively deal with the complex and ever-changing need to integrate data and optimize end-to-end workflows between cloud-based, mission-critical applications.
Read More from Dispatch Integration:
DIVE IN – Integration Project Management: Part 1
DIVE IN – Integration Project Management: Part 2
The Role of Project Managers in Software Integration Projects
Stephen Fontana is a seasoned Project Manager with experience in managing projects for international financial services institutions. He specializes in the software development life cycle and Scrum/Agile Methodology.