
Tuesday, 28 August 2012

"Is Everything Alright?” may not be Alright

Incremental deliveries alone are not good enough. You should ensure that they actually generate Constructive feedback that serves your long-term goals, rather than just aiming for an "Everything is Alright" with every incremental delivery.

When our team first started with the Iterative model of development, the information we exchanged with the customer used to be something like this:
  • Implemented ABC, XYZ, … features
  • Fixed the Bugs: XXXX, YYYY, ….
In the follow-up calls with the customer, we used to ask very generic questions like “Is everything alright?” or “Are there any issues?”, and we would be eager to hear from the customer's side, “Okay, everything is just perfect”. We didn't pay enough attention to how they were (or should have been) testing the deliveries. Often we would give a demo, but one cannot cover everything in a demo.

The customer came back much later in the project cycle reporting a few serious bugs which should, ideally, have been reported earlier. They cost us more, considering the amount of refactoring and the other overhead (planning, prioritizing, additional testing, etc.) required. This meant that iterative development didn't completely serve its purpose, though it did make a few things better in handling the project internally.

It took us some time to realize a few facts:
  • ‘Incremental deliveries alone are not good enough’; we need to do everything right to ensure the customer tests those deliverables diligently and provides the Right feedback at the Right time.
  • Don't even aim for an “All is Well” reply from the customer in the early cycles of the project. It is not something to cheer about. You should welcome Specific and Tangible feedback, and the Rework that comes with it; don't try to avoid it. You would be working on it later anyway and you cannot escape it.
  • Also, we became conscious of the addendum information we had to send along with the incremental deliveries (a sample note is sketched after this list):
-Demo
-Exact list of Testcases (Acceptance Tests); scenarios that the customer can explore
-What's Pending (something for which you delivered a temporary or incomplete solution) and ‘Known Failures’
-'In-process’ user documentation, if any
-Any other useful contextual notes, the kind your Agile coach doesn't teach you
  • Welcome those short-term hits for the sake of the long-term Quality goals.
  • Communicate ‘What has to be Tested’ along with ‘What was Added and Fixed’, because ‘the later a bug is caught, the costlier it is’.
  • ‘You test it Now’ is more practical than ‘You should have reported it earlier’.
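
As an illustration, here is a minimal sketch of what such an addendum note could look like. The layout and the angle-bracket entries are placeholders of my own, not the contents of a real delivery:

    Delivery Note - Iteration N
    ---------------------------
    Implemented    : <features delivered in this drop>
    Fixed          : <bugs fixed in this drop>
    Demo           : <what was shown, or when it will be shown>
    Please test    : <acceptance tests and scenarios to explore>
    Pending        : <temporary or incomplete solutions>
    Known failures : <failures we already know about>
    User docs      : <in-process documentation, if any>
    Notes          : <anything contextual worth knowing>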

When your requirements are highly unstable and volatile, use the incremental deliveries as artifacts to decide the direction you need to travel, rather than just an instrument to validate what was done. This simple change of perception makes a lot of difference in the approach and methods you follow to receive feedback. The same applies, though to a lesser degree, where your requirements are fairly stable.

"Remember that the incremental deliveries are, in general, the production quality code drops and not miniature or prototype versions". So, even the validation has to be done on the similar lines. 

Sometimes, it is important to know who will be testing your deliveries on the customer side. It is very useful to get in direct contact with them, rather than getting second-hand information from someone who works with you on requirements, estimations, etc. This will help you process the information better and faster.




Verification Questions

Are you ensuring that your incremental code drops are being tested diligently on the customer side?

Are you talking to the right person on the customer side, the one who is validating your deliveries?

Is the feedback you are receiving ‘Specific and Constructive’?



(Attribution: Images on this post are downloaded from FreeDigitalPhotos.Net)

Friday, 10 August 2012

Is Agile limited to Development projects?

Can we practice Agile in a maintenance project?
Can we follow Agile in executing a migration project?
Can we use Agile in …?
...
Is Agile limited to only development projects?

What drives us to adopt a new process or alter an existing one? It is always the same golden objective: Quality. Different projects carry different Quality criteria. Before trying to answer “How do we deliver Quality in this project?”, we need to answer “What does Quality mean for this project?”

Let me try to illustrate this with an example of how we implemented Agile in a Production Support project.

*A quick background of the project:
  • The project was about handling business-critical Production Support for a telecom client. It was handled completely offshore.
  • The customer’s business was tied up with different Service Partners to provide end-to-end service to their customers. In technical terms, it was all integrated on an SOA-based architecture.
  • Often there were service failures due to various Technical and Operational reasons, and our job was to analyze those failures and handle them offline.

*So, what did Quality mean for this project?
  • Reliability - As we worked on the production issues, it was a MUST that our solutions were highly Reliable.
  • Better and Faster - It was DESIRED that we provide continuously better and faster solutions, considering our customer's huge customer base.
  • Timely & Consistent communication with the business users - We had to work with an onsite team on the other side of the globe, so our communication with them had to be -
    • Timely - Delays due to timezone differences should be minimized so that end-customer concerns are addressed as soon as possible. Not a typical requirement in Development projects.
    • Consistent - Our communication with the onsite team should always reflect that we know something, or don't know something, as a Team and not as Individuals. This is highly DESIRED from a business point of view.
  • Continuous Learning of the System: It was a complex system, so it was DESIRED that we learn it better and better as we dealt with various issues.
  • Customer satisfaction through End-Customer satisfaction: "Our customer will be happy, if their customers are happy". This is the simple business rule that we always respected.

*So, how did we implement Agile here?

The daily Scrum meetings played a major role here, and this is how we tuned our Scrum approach in dealing with the issues reported by the end-customers:
  • What is New (either we didn’t handle this before, or it is slightly different from what we handled before)?
Can we handle it ourselves? If so, who is the best person to handle it or to help? Or do we need to approach the onsite team? This is towards the “Reliability” criterion.
  • Are there any Bulk (high in number) or Repeated (seen more and more often) issues?
This is towards identifying the need to build new tools or modify existing ones (automated or semi-automated) to meet the “Better and Faster” criterion; a small sketch of such repeat detection follows this list.
  • Any escalations?
Business SLAs take priority over having a perfect solution (‘End-Customer satisfaction’).
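
To make the second point concrete, here is a minimal, hypothetical sketch in Java (the failure fields, the signature format and the threshold are my assumptions for illustration, not the actual tooling we built) of how Bulk and Repeated issues could be flagged as candidates for automation:

    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    // Hypothetical model of a logged service failure.
    record ServiceFailure(String errorCode, String servicePartner) {}

    public class RepeatDetector {

        // Count failures per signature and keep only those occurring at
        // least 'threshold' times: the Bulk/Repeated candidates for an
        // automated or semi-automated handling tool.
        static Map<String, Long> bulkIssues(List<ServiceFailure> failures, long threshold) {
            return failures.stream()
                    .collect(Collectors.groupingBy(
                            f -> f.errorCode() + "@" + f.servicePartner(),
                            Collectors.counting()))
                    .entrySet().stream()
                    .filter(e -> e.getValue() >= threshold)
                    .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
        }

        public static void main(String[] args) {
            List<ServiceFailure> failures = List.of(
                    new ServiceFailure("TIMEOUT", "PartnerA"),
                    new ServiceFailure("TIMEOUT", "PartnerA"),
                    new ServiceFailure("TIMEOUT", "PartnerA"),
                    new ServiceFailure("BAD_XML", "PartnerB"));
            // Prints {TIMEOUT@PartnerA=3}: only the repeated issue is flagged.
            System.out.println(bulkIssues(failures, 3));
        }
    }

Once a signature crosses the threshold in the daily numbers, it becomes an agenda item for tool-building rather than another round of manual handling.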





-- Timely communication with the onsite team and continuous learning of the customer’s business were the by-products of these discussions.

-- It also helped us address Single Points of Failure (a SPOF here being knowledge confined to individuals). A SPOF would, in general, have more impact in a Production environment than in a Development environment.

-- We never discussed routine issues; they were kept well away from the Scrum meetings. Our meetings were intended to be meaningful and action-oriented, not daily status meetings. Our team was enabled to take care of the routine issues by themselves.


What is meant to be conveyed through this example?

-- Challenges are different for different projects. Some projects are more about ‘What to Implement’, whereas others are more about ‘How to Implement’.

-- Our project was more about ‘What to Implement’ than ‘How to Implement’. I must say that, technologically, it was not so challenging, but we were constantly identifying and implementing tools (Java/J2EE and PL/SQL based, just to add) that create value for the customer's business. So it was more of a business challenge.

-- There was no concept of continuous delivery, iterative development, early feedback, etc. in this particular customer engagement. In a different sense, though, our solutions were reaching the customer on a daily basis (unlike in a typical Development project).

-- To state the fact, it took us some time to figure out exactly what the expectations and the Quality criteria were. Every project sees this phase, particularly when your team is handling a new type of project. Some conscious experiments and trial and error are required to get onto the right track.


The subtle message that is intended to be conveyed is - ‘Work on identifying your Quality goals and customize your practices to achieve those goals’.





What are the Quality objectives of your project? How are they different from those of the other projects you have executed?
How can you apply the tools (practices) available to the job in hand? What works for you, and what does not?


(Attribution: Images on this post are downloaded from FreeDigitalPhotos.Net)

Monday, 16 July 2012

TDD is a Control System

Apart from executing Test-Driven Development as a "Test First, Test Early" approach, it can be seen as a very basic Control System. So it is not just about the "when" part (developing the testcases before writing the code) but also about the "what" and "how" parts.

- What input are you considering for developing the testcases? Is it just the feature under development, or is there something else?
- How are you implementing the iterative model for progressive development towards Quality, constantly improving the quality of your testcases over the project cycle?

What is it about?

Let us say you are developing the testcases for your current iteration (or what you may formally call a sprint, in Agile terminology). Then, how about these feedback inputs going into your TDD model?

- First Level feedback from the internal testing done by the Team
- Customer Feedback for the Current Sprint
- Customer Feedback from the previous Sprint
- Customer Feedback from the Future Sprint


The first two items need no explanation; if I wrote anything about them here, it would just be to fill up space on this post. The last two items need a little mention, as they are sometimes missed out in project execution.


How about a small example here?

Let us say you delivered the Login feature to the customer in your previous iteration, and the customer reported a couple of errors around the error-handling mechanism in the Login module. Okay, no big deal; you fixed them even quicker than the customer reported them. :) Cool.

Then the customer kept reporting similar issues with the other features as well. Well, that is definitely not good news, for either the customer or you, right?


- Customer Feedback from the previous Sprint

--- How is the customer trying to use the application?
--- What is he expecting from the application: functional, non-functional, error handling, usability, etc.?
--- What lessons do you need to carry into future iterations that could have an impact on the design, testing coverage, etc.?

We used to call this "Bug Analysis" or "Generic Analysis". It doesn't matter what name you give it, or whether you do it in the sprint review meeting or in some other informal meeting. What matters is ensuring that the "generic lessons are communicated to the team". In the example above, if the error-handling issue is communicated only to the respective developer and another developer makes the same mistake in another piece of code, the purpose of TDD (and of Iterative and Incremental development) falls short.


- Customer Feedback from the Future Sprint

Assume, in the example above, that you had delivered a Registration feature before the Login feature, but the error-handling issue was reported on the Login module, which you delivered just now. The issue is valid for the Registration module as well; it is just that the customer has not caught it there. Does that mean you don't need to fix it in the Registration module because it was not reported there? Obviously not.

This is what I mean by "Feedback from the Future Cycle": it fixes errors before they are reported. 'The bug is not reported' doesn't mean 'the bug is not present'. Just like us, customers tend to report different types of errors at different times during the project cycle. So "Don't change the working code" cannot be applied literally here.
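
To make this concrete, here is a minimal sketch using JUnit 5. The modules are reduced to hypothetical in-file stubs of my own for illustration; the idea is that the lesson from the Login bug lives in one parameterized test that also runs against Registration, covering it before the customer reports the issue there:

    import static org.junit.jupiter.api.Assertions.*;

    import java.util.Map;
    import java.util.function.Function;
    import org.junit.jupiter.params.ParameterizedTest;
    import org.junit.jupiter.params.provider.ValueSource;

    class ErrorHandlingContractTest {

        // Hypothetical stand-ins for the real modules: each maps a raw
        // input to the message the user would see.
        static final Map<String, Function<String, String>> MODULES = Map.of(
                "login", input -> input.isBlank() ? "Please enter your user ID." : "OK",
                "registration", input -> input.isBlank() ? "Please fill in all the fields." : "OK");

        // The error-handling lesson, encoded once and applied to every
        // module in the list, old and new alike.
        @ParameterizedTest
        @ValueSource(strings = {"login", "registration"})
        void blankInputYieldsReadableMessage(String module) {
            String message = MODULES.get(module).apply("");
            assertNotEquals("OK", message, "blank input must be rejected with a message");
            assertFalse(message.contains("Exception"), "no raw exceptions shown to the user");
        }
    }

Adding the next delivered module to the list extends the coverage automatically, which is exactly the going-backward and going-forward movement described below.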



Are you ensuring that the team participates actively in the generic feedback analysis and shares that knowledge, rather than treating it as a mere 'Bug Fix and Verification' cycle?

How useful was the feedback on the current iteration for the Quality of the iteration and for the overall Quality of the project itself?

How much better is your TDD coverage in this iteration than in the last one (going forward)?

Are you ensuring that a quick revisit of the previous iterations is done based on the feedback from the current iteration (going backward)?


"An effective analysis of a bug reported can help you fix multiple bugs in your software before they are reported"

(Attribution: Images downloaded from FreeDigitalPhotos.Net)

Tuesday, 10 July 2012

Speaking the Customer Language

Well, before you misinterpret the subject line, this is not about whether to speak English or Hindi :). It is about the format of our communication in implementing the user requirements.

Let me try to illustrate this with an example from my own experience.

A few years back, I was part of a project in the Banking domain. The solution we were implementing dealt with different departments of a major bank (like the Call Center, Core Banking and Credit Card departments). I don’t know how many levels of communication and information transition happened between the stage where the customer provided his requirements and the stage where those requirements ended up as Java code, but at the level of module leaders and individual developers, we used to communicate in a language like this:

• There are n screens in the front-end module, representing different types of end-user requests.

• The X screen on the Call Center module will have such-and-such fields; some particular fields are numeric whereas some are alphanumeric, etc. (There were some fields we didn’t even know what they stood for.)

• Upon hitting the Submit button, the back-end will translate the data into a pre-defined XML format and send it to the Core Banking module or the Credit Card module. Some of the responses to these requests are synchronous and some are asynchronous. (We didn’t know why most of these responses had to be synchronous or asynchronous.)

• …

Well, our team really worked hard and implemented the solution ‘to the requirements that we understood’. The design was good ‘to the requirements that we were given’. The testing was carried out ‘to the technical facts interpreted’, and we evaluated the solution against the ‘technical language’ we had been speaking thus far.
Then came System Integration Testing (SIT), when we went to the customer site to integrate our solution with the customer’s existing IT infrastructure. Surprisingly, it was not just integration issues we dealt with there; a major portion were pure Business Functionality issues that we had never paid attention to understanding. We didn’t understand them right, and we didn’t speak them right when we implemented our solution. At the customer site, we worked alongside a couple of IT engineers and Business Analysts. The business analysts cared the least about our Java or XML, or how many modules our solution was made up of. They were worried about the provision to process the different business requests in the system (for them, each input form on our front-end was a business case, a real-world scenario), and, when the processing was not successful, about being informed through the system with proper user-readable messages/alerts. Business SLAs, if any, were to be considered too. These are just a few examples of what we had missed in our implementation.
Seriously, that SIT phase was when we worked closely with the customer and understood most of the requirements, the real business requirements this time around. We also understood what our Java Exceptions meant to their business and what to do with the different Exceptions, and we started speaking the customer’s language. We had to rewrite a good part of the business functionality, but at the end of the day, as an individual, I learnt the hard lesson forever: “speak the customer language” before you speak the “technical language”. Had we done this at the beginning, we would have saved a lot of time altogether.
Remember that every customer would love to hear more of their business terminology than your technical terminology. For example, ‘the Call Center department will send an x request to the Core Banking department’ sounds better to them than ‘the Call Center module will send an x message to the Core Banking module’.
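
The same holds inside the code. Here is a minimal, hypothetical sketch in Java (the types and names are illustrative, not from the actual project) contrasting the two vocabularies:

    // Technical language: modules exchanging XML messages.
    interface CallCenterModule {
        void sendMessage(String targetModule, String xmlPayload);
    }

    // Customer language: departments processing business requests.
    record CardBlockRequest(String cardNumber, String reason) {}

    record Acknowledgement(boolean accepted, String userReadableMessage) {}

    interface CreditCardDepartment {
        // Reads the way the business speaks: "the Call Center department
        // sends a card-block request to the Credit Card department".
        Acknowledgement handle(CardBlockRequest request);
    }

When the types, the tests and the conversations share the business names, a business analyst can review a scenario without translating from Java first.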



-------------------------------------------------------------------------------------------------------------------------------------
Has your team understood the business problem well before jumping into design and code? If you don’t know the customer’s business already, work with the customer to understand it. Openness is very important here.
Are you asking the customer the right questions to understand his business? This also helps the customer gauge what your team knows and what it doesn’t, and arrange for training material or take other appropriate steps to educate your team.
Is your team discussing the business-case scenarios in the design or review meetings, or are they always discussing technical problems?
Are your incremental deliveries being evaluated by the right team on the customer side?
Does your team know who the end users of the solution will be, and is it adding the "customer perspective" in the construction phase?
-------------------------------------------------------------------------------------------------------------------------------------

"Software contruction isn't just about solving a set of technical problems but it is about solving business problems through technology"

(Attribution: Images on this post are downloaded from FreeDigitalPhotos.Net)