
Tuesday, 28 August 2012

“Is Everything Alright?” may not be Alright

Incremental deliveries by themselves are not good enough. You should ensure that each one actually generates constructive feedback towards your long-term goals, rather than just aiming to hear “Everything is Alright” after every incremental delivery.

When our team first started with the iterative model of development, the information we exchanged with the customer used to be something like this:
  • Implemented ABC, XYZ, … features
  • Fixed the Bugs: XXXX, YYYY, ….
In the follow-up calls with the customer, we used to ask very generic questions like “Is Everything Alright?” or “Are there any issues?”, and we would be eager to hear “Okay, everything is just perfect” from the customer side. We didn’t pay enough attention to how they were (or should be) testing the deliveries. Often we would give a demo, but one cannot cover everything in a demo.

The customer came back much later in the project cycle reporting a few serious bugs that should, ideally, have been reported earlier. They cost us more, considering the amount of refactoring and other overhead (planning, prioritizing, additional testing, etc.) required. This meant that the iterative development didn’t completely serve its purpose, though it did make a few things better in handling the project internally.

It took us some time before we realized a few facts:
  • ‘Incremental deliveries are not good enough by themselves’; we need to do everything right to ensure the customer tests those deliverables diligently and provides the right feedback at the right time.
  • Don’t aim for an “All is Well” reply from the customer in the early cycles of the project. It is not something to cheer about. Welcome specific and tangible feedback, and the rework it brings; don’t try to avoid it. You would be working on it later anyway, and you cannot escape it.
  • We also became conscious of the addendum information that we had to deliver along with the incremental deliveries:
-Demo
-Exact list of testcases (acceptance tests); scenarios that the customer can explore
-What’s pending (anything for which you delivered a temporary or incomplete solution) and ‘known failures’
-‘In process’ user documentation, if any
-Any other useful, contextual notes; the kind your Agile coach doesn’t teach you
  • Welcome those short-term hits for the long-term Quality goals.
  • Communicate ‘What has to be Tested’ along with ‘What is Added and Fixed’ because ‘the later a bug is caught, the costlier it is’.
  • ‘You test it Now’ is more practical than ‘You should have reported it before’.
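If your delivery pipeline allows it, the “exact list of testcases” above can even be shipped as something runnable instead of a document, so the customer sees in one place what works, what to explore, and what is a known failure. A minimal sketch in Python with pytest; the login feature, its bug, and the scenario names are all hypothetical:

```python
# acceptance_tests.py -- shipped alongside the incremental delivery so the
# customer knows exactly what to test. All names here are invented examples.
import pytest


def login(username, password):
    # Stand-in for the delivered feature under test.
    # KNOWN BUG: an empty username is not validated yet.
    return password == "secret"


def test_valid_login_succeeds():
    assert login("alice", "secret") is True


def test_wrong_password_is_rejected():
    assert login("alice", "guess") is False


@pytest.mark.xfail(reason="Known failure: empty username not yet validated")
def test_empty_username_is_rejected():
    # Documented as a 'known failure' in the delivery notes; the customer
    # should not re-report it, and we must not forget to fix it.
    assert login("", "secret") is False
```

Run with `pytest acceptance_tests.py`; the `xfail` marker makes the known failure visible in the report without failing the build.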

When your requirements are highly unstable and volatile, use the incremental deliveries as artifacts to decide the direction you need to travel, rather than just as an instrument to validate what was done. This simple change of perception makes a big difference in the approach and methods you follow to receive feedback. The same applies, to a lesser degree, in scenarios where your requirements are fairly stable.

"Remember that incremental deliveries are, in general, production-quality code drops and not miniature or prototype versions". So even the validation has to be done along similar lines.

Sometimes it is important to know who will be testing your deliveries on the customer side. It is very useful to get in direct contact with them, rather than receiving second-hand information from someone who works with you on requirements, estimations, etc. This helps you process the information better and faster.




Verification Questions

Are you ensuring that your incremental code drops are being tested diligently on the customer side?

Are you talking to the right person from the customer side who is validating your deliveries?

Is the feedback you are receiving ‘Specific and Constructive’?




Monday, 16 July 2012

TDD is a Control System

Apart from executing Test Driven Development as a "Test First, Test Early" approach, it can be seen as a very basic control system. So it is not just about the "when" (developing the testcases before writing the code) but also about the "what" and the "how" of it.

- What input are you considering for developing the testcases? Is it just the feature under development, or is there something else?
- How are you implementing the iterative model for progressive development towards Quality, by constantly improving the Quality of your testcases over the project cycle?

What is it about?

Let us say you are developing the testcases for your current iteration (or what you may formally call a sprint in Agile terminology). Then, how about these feedback channels which go into your TDD model?

- First Level feedback from the internal testing done by the Team
- Customer Feedback for the Current Sprint
- Customer Feedback from the previous Sprint
- Customer Feedback from the Future Sprint


The first two items need no explanation; if I wrote something about them here, it would just be to fill up space on this post. The last two items deserve a mention, as they are sometimes missed in project execution.


How about a small example here?

Let us say that you had delivered the Login feature to the customer in your previous iteration, and the customer reported a couple of errors around the error-handling mechanism in the Login module. Okay, no big deal; you fixed them even quicker than the customer reported them. :) Cool.

Then the customer kept reporting similar issues with the other features as well. Well, this is definitely not good news for either the customer or you, right?


- Customer Feedback from the previous Sprint

--- How is the customer trying to use the application?
--- What are they expecting from the application: functional, non-functional, error handling, usability, etc.?
--- What are the lessons you need to carry into future iterations that could impact the design, testing coverage, etc.?

We used to call this "Bug Analysis" or "Generic Analysis". It doesn't matter what name you give it, or whether you do it in the sprint review meeting or in some other informal meeting. What matters is ensuring that the generic lessons are communicated to the whole team. In the example above, if the error-handling issue is communicated only to the respective developer, and another developer then makes the same mistake in another piece of code, the purpose of TDD (and of iterative and incremental development) falls short.


- Customer Feedback from the Future Sprint

Assume, in the example above, that you had delivered a Registration feature before the Login feature, but the error-handling issue was reported on the Login module which you delivered just now. The issue is valid for the Registration module as well; it is just that the customer had not caught it there. Does that mean you don't need to fix it in the Registration module because it was not reported there? Obviously not.

This is what I mean by "feedback from a future cycle": it fixes errors before they are reported. 'Bug is not reported' doesn't mean 'bug is not present'. Just like we do, customers tend to report different types of errors at different times during the project cycle. So "don't change the working code" cannot be applied literally here.
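One concrete way to push a reported lesson both backward and forward across modules is a single parametrized test that runs against every delivered feature, not just the one where the bug surfaced. A minimal sketch in Python with pytest; the functions here are hypothetical stand-ins, not the author's actual code:

```python
# One generic lesson ("empty usernames must raise a clear error, not crash")
# asserted against every delivered module. The customer reported it only
# against Login; we proactively apply it to Registration too.
import pytest


def login(username, password):
    if not username:
        raise ValueError("username must not be empty")
    return "welcome"


def register(username, password):
    if not username:
        raise ValueError("username must not be empty")
    return "registered"


@pytest.mark.parametrize("feature", [login, register])
def test_empty_username_is_handled(feature):
    # "Feedback from the future sprint": the same check runs on modules
    # where the bug was never reported, catching it before the customer does.
    with pytest.raises(ValueError):
        feature("", "secret")
```

When the next feature ships, adding it to the `parametrize` list is enough to carry the lesson forward.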



Are you ensuring that the team participates actively in the generic feedback analysis and shares that knowledge, rather than treating it as a mere 'bug fix and verification' cycle?

How useful was the feedback on the current iteration for the Quality of the iteration and for the overall Quality of the project itself?

How much better is your TDD coverage in this iteration than in the last (going forward)?

Are you ensuring that a quick revisit of the previous iterations is done based on the feedback from the current iteration (going backward)?


"An effective analysis of a bug reported can help you fix multiple bugs in your software before they are reported"


Tuesday, 10 July 2012

Understanding Test Driven Development

One of my friends told me that they were implementing Agile practices and Test Driven Development in their organization. When asked "What is Test Driven Development?", he answered, "the testcases are developed first and the software is written to the testcases". He also explained that it helps catch errors sooner rather than later. This is true. But beyond this DNA of Test Driven Development, he couldn't explain what other benefits you can reap from TDD if you implement it the right way.

 
Well, let’s not do something just for the sake of it.


It is perfectly true that the TDD model helps you catch errors sooner rather than later. But more than that, TDD enables a "microscopic" approach to project implementation through incremental and iterative cycles.



First, what are testcases all about?
  • Testcases are a direct means of communication among the developers, users, and other stakeholders: a way to understand the system correctly and comprehensively and arrive at a common, formal contract.
  • They are the language in which the stakeholders talk to one another at the lower level of implementation.
  • Testcases are nothing but "executable specifications" of the system you are developing. A good set of testcases is nothing but "working code in plain English". Did we get that right before jumping into TDD?
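To make "executable specifications" concrete, here is a minimal sketch in Python: each test name reads like a plain-English sentence any stakeholder can follow, and running the suite checks the contract. The discount rules are invented purely for illustration:

```python
# An "executable specification": the test names are the spec sentences,
# and the assertions make them checkable. The pricing rules are made up.
def discounted_price(price, is_member):
    """Members get 10% off; a negative price is rejected."""
    if price < 0:
        raise ValueError("price must be non-negative")
    return price * 0.9 if is_member else price


def test_members_get_ten_percent_off():
    assert discounted_price(100.0, is_member=True) == 90.0


def test_non_members_pay_full_price():
    assert discounted_price(100.0, is_member=False) == 100.0
```

Read the test names aloud and you have the specification; run them and you have verified it.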

Theoretically, there are different models practised for implementing TDD.
-----------------------------------------------------------------------------------------
  • In a pure, by-the-book Agile model, the developer owns the responsibility for testcase development and execution, along with developing the code to those testcases.
  • In other models, there is still a dedicated Testing team, which develops the testcases in parallel with code construction.
-----------------------------------------------------------------------------------------

Irrespective of the model your team follows, you need to ensure you develop a comprehensive set of testcases (test for good and test for bad; functional and non-functional) before jump-starting the coding of your iteration, sprint, or feature (whatever the TDD in question covers). Otherwise, TDD adds no additional value to your project.
 
To illustrate with an example from my experience, we used to implement TDD as follows:
 -----------------------------------------------------------------------------------------
  • Depending on the team composition, team skills, or the interest of individuals (which you need to respect), team members used to play varying roles:
    • Development-Only
    • Testing-Only
    • Both Development and Testing
  • There were no hard and fast rules on the above composition; the team decided it through "collaborative planning". As a side note, "team collaboration" should be applied in the planning phases as well.
  • Developers used to wear Testing hats for the development of the critical pieces of the software, whether critical from a business-requirements or a technical-implementation point of view.
  • The Test Engineer or the TDD developer would come up with his first set of testcases and call a "review" meeting with the other team members.
  • And believe me, these review meetings were where a lot of brainstorming happened; people got into interesting discussions, raised a lot of questions, and got answers. They also proved to be a very effective informal platform for knowledge transition.
  • Interestingly, the take-away items from these TDD meetings used to be:
    • Expected Results answered (DONE)
    • Expected Results unanswered (TODO item for the Developer, TODO for the Test Engineer, or a QUESTION to the customer)
    • New Testcases developed
    • New Testcases to be developed
    • Overall, an "executable specification" taking shape on a "collaborative platform".
    • Everyone understanding the system a little more and a little better.
    • ...
  • The developer would then start on the code, possibly with those TODO comments which would need to be answered shortly.
-----------------------------------------------------------------------------------------
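The take-away items from such a review map naturally onto the code base itself: answered expected results become real tests, while unanswered ones become skipped tests carrying a TODO or QUESTION, so they cannot be silently forgotten. A minimal sketch using Python's unittest; the Registry class and scenario names are hypothetical:

```python
# Review take-aways captured directly as tests: DONE items run, open
# questions are skipped with a visible reason until they are answered.
import unittest


class Registry:
    """Minimal stand-in for the feature under development."""

    def __init__(self):
        self.users = set()

    def register(self, username):
        if username in self.users:
            raise ValueError("duplicate username")
        self.users.add(username)


class RegistrationSpec(unittest.TestCase):
    def test_duplicate_username_is_rejected(self):
        # DONE: expected result was answered in the review meeting.
        reg = Registry()
        reg.register("alice")
        with self.assertRaises(ValueError):
            reg.register("alice")

    @unittest.skip("TODO: QUESTION to the customer -- maximum username length?")
    def test_username_length_limit(self):
        pass  # expected result still unanswered after the review


if __name__ == "__main__":
    unittest.main()
```

Every run of the suite then reports the open questions as skips, keeping them visible until someone converts them into real checks.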

Refer to my article "TDD is a Control System" to understand a few more practical facts of paying continuous attention to improve the Quality of TDD beyond an iteration cycle.



If you are a Project Manager:
  • What are you doing to ensure your team is practising the right TDD principles, rather than a mere "Test First, Test Early" approach?
  • How are you ensuring the collaboration of individuals, so that the "wisdom of the group" is used for building up the "executable specifications", formally the testcases?
If you are a Developer or Test Engineer:
  • Did you ensure your testcases were "correct and complete" before constructing the code to that list? Remember, TDD is all about minimizing rework and refactoring.
  • Are you ensuring you communicate with the Test Engineer beyond the formal meetings?
  • Did you get those unanswered questions into your TODO list and your TODO comments in the code, before they end up as a BUG reported by someone else at a later stage?
  
"Right practice is what you need to ensure Quality; the name of the process makes no difference"

