Showing posts with label Test Driven Development. Show all posts

Thursday, 2 August 2012

Discover the Design, Don’t Impose

"When we are asked to carry out the software design, the thing that immediately strikes to some of us is - a big picture of Design Patterns, Block Diagrams, UML Diagrams, CASE tools etc. And this is where some of us either shy away from or misinterpret the design job. But design is mainly about our approach to the problem and all the above are merely tools and means that may complement the job."

Let us look into why that is so.

In the article “What is TDD made for?”, we discussed the approach to TDD by doing a small exercise on “building an elevator system”.

Let me copy & paste some text from that article for our quick reference:

 
 
What are the different requests and how do they need to be processed?
(External Request) The elevator has to go to a floor if there is a User request from that floor.
----The elevator has to move UP if the request is from an upper floor.
----The elevator has to move DOWN if the request is from a lower floor.
(Internal Request) The elevator has to go to a floor if there is a request from a User (inside the elevator) to go to that floor.
----The elevator has to move UP if the request is to an upper floor.
----The elevator has to move DOWN if the request is to a lower floor.

When does the elevator stop?
----The elevator has to STOP at a floor if there is an external or internal request to stop at that floor.

What happens when the elevator stops at a floor?
---- Open the door
---- Update the internal request as DONE or DISCARD
---- Allow time for passengers to get in and get out
---- Handle max-load checking, if any
---- Close the door
---- Update the external request as DONE or DISCARD
---- (Well, I am not listing all the other features (light, fan, displays, etc.) of our modern elevators, as we are not really going to write the software here)


Well, this has also ended up as a TODO for us, but we have added a very good set of testcases in the form of “executable specifications”. And what do good “executable specifications” do? They feed the design, and that is what has happened now – we need another Black Box (a decision-making method) to decide whether to move up or move down. Also, we have identified the parameters (variables) that these decision-making methods need.

Along with this, we have identified the different actions (just execute them – moveUp, moveDown, Open, Close). Also, we have almost identified the data structures to store the User requests (internal and external). Overall, our skeleton code is pretty much visible now.




 Now, let us analyze what we have done purely from a design point of view.

The first step in design is problem analysis - Get it Right in Plain Text.

If we look at the above example, software design can be seen as a hidden characteristic of a given problem. The more we try to understand the problem's behavior and its constraints (you could formally call this analysis), the better the input that comes out for the design. Thus the design is discovered rather than imposed.

--Let us not try to solve the problem before understanding it. A problem that is well understood is half-solved.

--Problem analysis doesn’t have a language or technology associated with it. It is purely about our cognitive (analytical and reasoning) approach, which is always of paramount importance.

--A good problem analysis would give us better requirements for the design.

Here, in this example, we have derived the following design requirements in plain text, like this:

---- We need "decision making" mechanisms for:
------- Deciding the Direction the Elevator moves
------- Deciding whether the Elevator stops at a Floor

---- We need "action" mechanisms for:
------- Open
------- Close
------- moveUp
------- moveDown

(Please note that the list above is conceptual, not complete)
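
To make that concrete, here is a minimal sketch (the interface and method names below, such as ElevatorController and decideDirection, are my own illustrations, not something prescribed by the exercise) of how those plain-text requirements might map onto signatures before any implementation exists:

-----------------------------------------------------------------------------------------
// Hypothetical sketch: the plain-text design requirements restated as signatures.
public interface ElevatorController {

    enum Direction { UP, DOWN, IDLE }

    // "Decision making" mechanisms
    Direction decideDirection();       // decide which way the elevator moves next
    boolean shouldStopAt(int floor);   // decide whether the elevator stops at a floor

    // "Action" mechanisms
    void open();
    void close();
    void moveUp();
    void moveDown();
}
-----------------------------------------------------------------------------------------

Notice that nothing here commits us to an implementation yet; the plain-text analysis has simply been restated as decisions and actions.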



The next step is program analysis - Get it Right with your technology.

Now that we know what is expected of the design, it is time for program analysis - how do we implement the design in our programming language or with our technology? This is where our technical skillset comes into the picture.

For example, once we have understood the basic in-built design of the above elevator problem...

---- If we are implementing in a procedural language, we would probably be passing different data structures from method to method, or perhaps handling them as global variables or something similar, and reading and manipulating them from different methods.

---- If it is OOAD, all the above actions would be encapsulated in an Elevator object, and through our problem analysis it has already come out as a State Machine model for us.
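
As a rough illustration of that State Machine view (again a sketch with assumed names and states; the real transition rules still depend on the unanswered customer questions), the elevator's behaviour could be captured like this:

-----------------------------------------------------------------------------------------
// Hypothetical sketch of the elevator as a simple state machine; names are illustrative only.
public class Elevator {

    enum State { IDLE, MOVING_UP, MOVING_DOWN, DOORS_OPEN }

    private State state = State.IDLE;
    private int currentFloor = 0;

    // One "tick" of the state machine: decide, then act.
    public void step() {
        switch (state) {
            case IDLE:
                state = decideNextState();                              // black box: where to go next
                break;
            case MOVING_UP:
                currentFloor++;
                if (shouldStopAt(currentFloor)) state = State.DOORS_OPEN;
                break;
            case MOVING_DOWN:
                currentFloor--;
                if (shouldStopAt(currentFloor)) state = State.DOORS_OPEN;
                break;
            case DOORS_OPEN:
                close();                                                // action: close the door after boarding
                state = State.IDLE;
                break;
        }
    }

    private State decideNextState() { return State.IDLE; }      // TODO: decision black box
    private boolean shouldStopAt(int floor) { return false; }   // TODO: decision black box
    private void close() { }                                    // TODO: action mechanism
}
-----------------------------------------------------------------------------------------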


So, the design has to be seen as a problem analysis followed by program analysis. Merely knowing a set of design patterns or frameworks cannot make someone good at design. 

It’s about a smooth transition from a business problem into a technical problem, and then into a technical solution, by applying analytical and reasoning skills.
Of course, there might be requirements that are technical in nature, where our problem analysis and technical analysis may overlap to some extent, but our logical approach would be the same. The same goes for high-level requirements, where the design decisions would be of the nature of which web service or which XML library to use, etc.

What if a proper Problem Analysis is NOT done?
--If a proper (I'm not saying perfect) problem analysis is not done, we are adding more variables to the equation. We are mixing up a number of business-logic variables and programming-logic variables. And I don't need to explain further what that costs.
--There would be considerable refactoring and rework, adding up to longer implementation cycles.




Has your team discussed the problem enough before discussing the solution (the design)?

Are you ensuring that your team does enough exercise to get the 'design requirements' down in plain text?

Is your team making a smooth transition from problem to technology, rather than jumping straight to a technical solution?



(Attribution: Images on this post are downloaded from FreeDigitalPhotos.Net)

Saturday, 28 July 2012

What is TDD made for? Requirements? Design? or Code?

What is Test Driven Development made for? Is it for requirements? Design? or Code?

First, TDD conveys the message – “You know it well before you code it”. It basically encourages that Quality is built into our software proactively by “development through testing” rather than reactively by “testing after development”. So, we use the testcases for constructing rather than validating.

Next, the question that may arise is – “How would I implement the TDD cycle within a given iteration or sprint?”.
---Should I implement a Waterfall cycle within the iteration (Get all testcases ready first, then design, code and test)?
---OR should I write the code test-by-test basis, and progressively refactor the code so that the design evolves through refactoring? (Lean within lean development)

Oops, the seminar which I attended on Agile didn’t cover this? :(

It is always easy to suggest one approach or the other, but before we come to any conclusion, let us do a small exercise on TDD.

Let us say our current iteration is “implementing an elevator (lift)” for an ongoing software project. Don’t ask me 'what is that project about?' :). I’ve picked this example as I thought it would be better to convey the concept through a real-world example that almost all of us see and use day in and day out.

Our Architect now kick-starts writing the testcases with a brainstorming session. Let our Mr. Architect speak it out in his own words…
--------------------------------------------------------------------------
Pass-1:
What are the different requests and how do they need to be processed?
(External Request) The elevator has to go to a floor if there is a User request from that floor.
----The elevator has to move UP if the request is from an upper floor.
----The elevator has to move DOWN if the request is from a lower floor.
(Internal Request) The elevator has to go to a floor if there is a request from a User (inside the elevator) to go to that floor.
----The elevator has to move UP if the request is to an upper floor.
----The elevator has to move DOWN if the request is to a lower floor.

When does the elevator stop?
----The elevator has to STOP at a floor if there is an external or internal request to stop at that floor.

What happens when the elevator stops at a floor?
---- Open the door
---- Update the internal request as DONE or DISCARD
---- Allow time for passengers to get in and get out
---- Handle max-load checking, if any
---- Close the door
---- Update the external request as DONE or DISCARD
---- (Well, I am not listing all the other features (light, fan, displays, etc.) of our modern elevators, as we are not really going to write the software here)

...(Okay, my requirements are evolving well. TDD is proving good, thanks to my Agile coach.)...

Then, one junior engineer in my team brought up a question:

“What if the elevator is going down and there is a request from a floor (on its route) to go up? Should the elevator stop to board the User on its way down, or should it stop on its way up?”

(This junior joined the industry hardly a year ago and doesn’t carry a “Software Architect” title as I do, but has brought up a very good design point. Is this what “Wisdom of the Group” means?)

Well, the answer is: “Let’s check with the customer on what he wants, but there is a need for a mechanism to decide whether or not to stop on the way. We would take this as a Black Box (a decision-making method), and it would be handy from a code-maintenance point of view too, even if the requirement changes in future”.

Then, someone else, handling another feature, raised another question:

“What if there are requests from upper and lower floors at the same time? How do we decide which direction the elevator has to move?”

I seriously envy her for asking this question, but I have started building up a few more testcases on top of what she has asked, and 'it is only getting better'…
  • ----If both the requests are to move UP
  • ----If both of them are to move DOWN
  • ----If one of them is to move UP and the other is to move DOWN
  • ----Is there a Least Travel requirement to be handled?…
Well, this has also ended up as a TODO for us, but we have added a very good set of testcases in the form of “executable specifications”. And what do good “executable specifications” do? They feed the design, and that is what has happened now – we need another Black Box (a decision-making method) to decide whether to move up or move down. Also, we have identified the parameters (variables) that these decision-making methods need.

Along with this, we have identified the different actions (just execute them – moveUp, moveDown, Open, Close). Also, we have almost identified the data structures to store the User requests (internal and external). Overall, our skeleton code is pretty much visible now.
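
Just to visualise that skeleton (a hypothetical sketch; the class and member names here are my own, not actual project code), it might look something like this, blanks and all:

-----------------------------------------------------------------------------------------
import java.util.Set;
import java.util.TreeSet;

// Hypothetical skeleton derived from the testcases so far; all names are illustrative only.
public class ElevatorSkeleton {

    public enum Direction { UP, DOWN, IDLE }

    // Data structures for the User requests identified so far.
    private final Set<Integer> externalRequests = new TreeSet<>();  // landing (floor) buttons
    private final Set<Integer> internalRequests = new TreeSet<>();  // cabin buttons

    public void addExternalRequest(int floor) { externalRequests.add(floor); }
    public void addInternalRequest(int floor) { internalRequests.add(floor); }

    // Black Box #1: stop at this floor on the way? (TODO: awaiting the customer's answer)
    public boolean shouldStopAt(int floor, Direction travelling) {
        return internalRequests.contains(floor) || externalRequests.contains(floor); // TODO: refine
    }

    // Black Box #2: which way to move when there are requests both ways? (TODO: least-travel rule?)
    public Direction decideDirection(int currentFloor) {
        return Direction.IDLE; // TODO
    }

    // Actions: just execute them.
    public void moveUp() { }
    public void moveDown() { }
    public void open() { }
    public void close() { }
}
-----------------------------------------------------------------------------------------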

Of course, there are still a few questions to be answered, but they are not impediments, in that we still have enough work to do for the next few days before the questions are answered. We don’t need to blame anyone for keeping us waiting for answers before we can start coding. We have “blanks” (TODOs) to fill in our design, and we know exactly where those blanks are.
--------------------------------------------------------------------------

Now, let’s get back to those questions that we had earlier:

---Should we implement a Waterfall cycle within the iteration?
----Even if the scope is a single iteration, practically we cannot get all the testcases right, or wait until they are right, before starting on the design and code. Let us first organize our testcases properly, and the design will be discovered automatically.
----This also helps with proper work distribution if multiple people are involved in the implementation.
----A random organization of testcases wouldn't lead to a good design or to proper work distribution.

---OR should we write the code on a test-by-test basis and progressively refactor it so that the design evolves through refactoring?
----It looks like one size does not fit all. It may still work, but there would be a lot of code-refactoring and test-and-retest cycles, and they consume a lot of time.
----Also, remember that automated unit tests are not possible for every scenario of software development; let us not ignore this fact.
----Refactoring doesn’t just mean abstracting out the reusable code and thereby evolving the design with a test-by-test approach. Let us not make it too lean in the name of lean development.
----If a few minutes of 'collective thinking' now could save hours later, then do it NOW.

So, let us go with the facts (rather than with what was taught) and use the right mix of the waterfall and test-by-test models as the context demands.
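
For instance, writing one of those “executable specifications” first might look roughly like this (a JUnit 4 sketch; ElevatorSkeleton and Direction are the hypothetical names from the skeleton sketched earlier, not real project code):

-----------------------------------------------------------------------------------------
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Sketch of a test-first "executable specification" (JUnit 4 assumed).
public class ElevatorDirectionTest {

    @Test
    public void movesUpWhenTheOnlyRequestIsFromAnUpperFloor() {
        ElevatorSkeleton elevator = new ElevatorSkeleton();
        elevator.addExternalRequest(5);                                            // request from floor 5
        assertEquals(ElevatorSkeleton.Direction.UP, elevator.decideDirection(2));  // elevator at floor 2
    }
}
-----------------------------------------------------------------------------------------

The test fails at first, of course, because decideDirection still returns IDLE - and that is exactly the kind of “blank” (TODO) the developer knows he has to fill.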

Note: If you observe the elevator functionality derived from these testcases, you can easily see that the elevator works like a state machine (an OOAD design). But the objective of this article is not to explain any particular design pattern. Maybe we will take that up some other time.


Are your testcases well-organized for a good TDD implementation?

Are they proven good for organized task distribution? Don't forget this project-management concern.

Is your TDD approach feeding enough for a good design?

Are you ensuring that 'team collaboration' and the 'wisdom of the group' are applied in the right way to produce well-organized testcases?

Do you have an eye for minimizing rework by spending the right effort at the right time?

(Attribution: Images on this post are downloaded from FreeDigitalPhotos.Net)

Monday, 16 July 2012

TDD is a Control System

Apart from executing Test Driven Development as a "Test First, Test Early" approach, it can be seen as a very basic Control System. So, it is not just about the "when" part (developing the testcases before writing the code); it is also about the "what" and "how" parts.

- What input are you considering for developing the testcases? Is it just the feature under development, or is there something else?
- How are you implementing the iterative model for progressive development towards Quality, by constantly improving the quality of your testcases across the project cycle?

What is it about?

Let us say you are developing the testcases for your current iteration (or what you may formally call a sprint in Agile terminology). Then, how about these feedback inputs that go into your TDD model?

- First Level feedback from the internal testing done by the Team
- Customer Feedback for the Current Sprint
- Customer Feedback from the previous Sprint
- Customer Feedback from the Future Sprint


The first two items need no explanation, and if I wrote anything about them here it would only be to fill up space on this post. The last two deserve a little mention, as they are sometimes missed out during project execution.


How about a small example here?

Let us say that you delivered the Login feature to the customer in your previous iteration, and the customer reported a couple of errors around the error-handling mechanism in the Login module. Okay, no big deal, and you fixed them even quicker than the customer could report them. :) Cool.

Then the customer kept reporting similar issues with the other features as well. Well, this is definitely not good news for either the customer or you, right?


- Customer Feedback from the previous Sprint

--- How is the customer trying to use the application?
--- What is he expecting from the application - functional, non-functional, error handling, usability, etc.?
--- What lessons do you need to carry into the future iterations that could have an impact on the design, test coverage, etc.?

We used to call this "Bug Analysis" or "Generic Analysis". It doesn't matter what name you give it, or whether you do it in the sprint review meeting or in some other informal meeting. What matters is ensuring that the generic lessons are communicated to the team. In the example above, if the error-handling issue is communicated only to the respective developer, and another developer makes the same mistake in another piece of code, the purpose of TDD (and of Iterative and Incremental development) falls short.


- Customer Feedback from the Future Sprint

Assume, in the example above, that you had delivered a Registration feature before the Login feature, but the error-handling issue was reported on the Login module, which you delivered just now. The issue is valid for the Registration module as well; it is just that the customer has not caught it there yet. Does that mean you don't need to fix it in the Registration module because it was not reported there? Obviously not.

This is what I mean by "Feedback from the Future Sprint". It fixes errors before they are reported. 'Bug not reported' doesn't mean 'bug not present'. Just as we do, customers also tend to report different types of errors at different times during the project cycle. So, "Don't change the working code" cannot be applied literally here.
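
One simple way to carry such a generic lesson across modules (a sketch only; the module names and validators below are hypothetical stand-ins, not from any real project) is to capture the reported bug as a single test that runs against every affected feature:

-----------------------------------------------------------------------------------------
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Predicate;
import org.junit.Test;
import static org.junit.Assert.assertFalse;

// Sketch: the error-handling lesson from the Login bug report, captured once
// and run over every feature - including Registration, where it was not yet reported.
public class ErrorHandlingRegressionTest {

    // Each module's input validation, keyed by module name (stand-ins for the real modules).
    private final Map<String, Predicate<String>> moduleValidators = new LinkedHashMap<>();
    {
        moduleValidators.put("Login",        input -> input != null && input.matches("[A-Za-z0-9_]+"));
        moduleValidators.put("Registration", input -> input != null && input.matches("[A-Za-z0-9_]+"));
    }

    @Test
    public void noModuleAcceptsMalformedInput() {
        for (Map.Entry<String, Predicate<String>> module : moduleValidators.entrySet()) {
            assertFalse(module.getKey() + " must not accept malformed input",
                        module.getValue().test("<script>boom</script>"));
        }
    }
}
-----------------------------------------------------------------------------------------

The point is that the lesson lives in one place and fails loudly for Registration before the customer ever reports it there.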



Are you ensuring that the team participates actively in this generic feedback analysis and shares the knowledge, rather than treating it as a mere 'Bug Fix and Verification' cycle?

How useful was the feedback on the current iteration for the Quality of the iteration and for the overall Quality of the project itself?

How much better is your TDD coverage in this iteration than in the last one (going forward)?

Are you ensuring that a quick revisit of the previous iterations is done based on the feedback from the current iteration (going backward)?


"An effective analysis of a bug reported can help you fix multiple bugs in your software before they are reported"

(Attribution: Images downloaded from FreeDigitalPhotos.Net)

Tuesday, 10 July 2012

Understanding Test Driven Development

One of my friends once told me that they were implementing Agile practices and Test Driven Development in his organization. When asked “What is Test Driven Development?”, he answered, “the testcases are developed first and the software is written to the testcases”. He also explained that it helps catch the errors sooner rather than later. This is true. But beyond this DNA of Test Driven Development, he couldn’t explain what other benefits you can reap from TDD if you implement it in the right way.

 
Well, let’s not do something just for the sake of it.


It is perfectly true that the TDD model helps you catch errors sooner rather than later. But more than that, TDD offers a “microscopic” approach to project implementation through Incremental and Iterative cycles.



First, what are testcases all about?
  • Test cases are a direct means of communication among the developers, users and other stakeholders, used to understand the system correctly and comprehensively and to arrive at a common, formal page of contract.
  • They are the language in which the stakeholders talk to one another at the lower level of the implementation.
  • Test cases are nothing but “executable specifications” of the system that you are developing. A good set of testcases is nothing but “working code in plain English”. Did we get that right before jumping on to TDD?
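
As a tiny illustration of “working code in plain English” (a sketch, assuming JUnit 4; ElevatorSkeleton refers to the hypothetical skeleton sketched in the elevator posts above, not to real code), a well-named testcase reads almost like the specification it encodes:

-----------------------------------------------------------------------------------------
import org.junit.Test;
import static org.junit.Assert.assertTrue;

// Sketch: the test name itself states the specification in plain English.
public class ElevatorSpecificationTest {

    @Test
    public void elevatorStopsAtAFloorWhenThereIsAnInternalRequestForThatFloor() {
        ElevatorSkeleton elevator = new ElevatorSkeleton();
        elevator.addInternalRequest(3);
        assertTrue(elevator.shouldStopAt(3, ElevatorSkeleton.Direction.UP));
    }
}
-----------------------------------------------------------------------------------------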

Theoretically, there are different models practised for implementing TDD.
-----------------------------------------------------------------------------------------
  • In a pure and religious Agile model, the developer owns the responsibility for test case development and execution, along with developing the code to those testcases.
  • In other models, there is still a dedicated Testing team, and the Testing team develops the testcases in parallel with the code construction.
-----------------------------------------------------------------------------------------

Irrespective of the model your team follows, you need to ensure that you are developing a comprehensive set of testcases (test for good and test for bad; functional and non-functional) before jump-starting the coding of your iteration, sprint or feature - whatever the TDD in the picture is for. Otherwise, TDD adds no additional value to your project.
 
To illustrate with an example from my experience, we used to implement TDD as follows:
 -----------------------------------------------------------------------------------------
  • Depending on the team composition, team skills or the interest of the individuals (which you need to respect), the team members used to play varying roles.
    • Development-Only
    • Testing-Only
    • Both Development and Testing
  • There were no hard and fast rules on the above team composition but the team would do it through "collaborative planning". Yes, "team collaboration" should be applied in planning phases as well, as a side note.
  • Developers used to wear the Testing Hat in the development of the critical pieces of the software. This could be from a business-requirements point of view or a technical-implementation point of view.
  • The Test Engineer or the TDD-Developer would come up with his first set of testcases and call for a "review" meeting with the other team members.
  • And believe me - these review meetings were the places where a lot of brainstorming used to happen; people got into a lot of interesting discussions, raised a lot of questions and got answers. This also proved to be a very effective informal platform for Knowledge Transition.
  • Interestingly, the "take away" items from these TDD meetings used to be:
    • Expected Results answered (DONE)
    • Expected Results unanswered (TODO item for the Developer, TODO for the Test Engineer, or a QUESTION to the customer)
    • New Testcases developed
    • New Testcases to be developed
    • Overall, "executable specifications" coming into shape on a "collaborative platform".
    • Everyone understanding the system a little more and a little better.
    • ...
  • The developer would now start on the code, possibly with those TODO comments which will need to be answered shortly (see the sketch after this list).
-----------------------------------------------------------------------------------------
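
For example (a rough sketch with hypothetical names, not actual project code), the code the developer starts with might carry the meeting's open questions as TODO comments:

-----------------------------------------------------------------------------------------
// Sketch: open questions from the testcase review carried into the code as TODO comments.
// ReportExporter is a hypothetical feature used only for illustration.
public class ReportExporter {

    public byte[] exportAsPdf(long reportId) {
        // TODO (question to customer): which page size is expected - A4 or Letter?
        // TODO (test engineer): add a testcase for a report with zero rows
        byte[] rendered = render(reportId);
        // TODO (developer): expected result for a very large report is still unanswered
        return rendered;
    }

    private byte[] render(long reportId) {
        return new byte[0]; // placeholder until the blanks above are filled
    }
}
-----------------------------------------------------------------------------------------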

Refer to my article "TDD is a Control System" to understand a few more practical facts of paying continuous attention to improve the Quality of TDD beyond an iteration cycle.



If you are a Project Manager:
  • What are you doing to ensure your team is practising the right TDD principles rather than a mere "Test First, Test Early" approach?
  • How are you ensuring the collaboration of individuals so that the "Wisdom of the Group" is used for building up the "Executable Specifications", or formally, the "Testcases"?
If you are a Developer or Test Engineer:
  • Did you ensure you got the testcases "correct and complete" before constructing the code to that list? Remember, TDD is all about minimizing rework and refactoring.
  • Are you ensuring you are communicating with the Test Engineer beyond the formal meetings?
  • Did you get those unanswered questions into your TODO list and into the TODO comments in your code before they end up as a BUG reported by someone else at a later stage?
  
"Right Practice is what you need to ensure Quality, name of the Process makes no difference" 


(Attribution: Images on this post are downloaded from FreeDigitalPhotos.Net)