Session title: Scrum introduction (and how it’s used by the Microsoft Dynamics NAV team)
Date: Thursday, 27 September 2012
Time: 16:00 - 17:30
Level: 200 - Intermediate
Speakers: Jens Møller-Pedersen, Gaurav Roy
Session description:
Over the last two years, the NAV product organization has transitioned from a classic staged development process to an agile software development methodology driven by Scrum.
In this session we will take a look at that change and share our view on the most rewarding and the most challenging parts of the transition.
- What is in a Sprint, and how long is it?
- How do you stay agile with 20 Scrum teams?
- Are you Done yet?
- Where is my spec?
- Is your backlog well groomed?
- Is your WIP too big?
Use this thread to post your questions, comments or things you would like to see in this session at the conference.
Comments
I had a few questions after the session on Thursday but wasn't able to get them answered before the Q&A session was over.
1st - Gaurav said that one principle was never to carry debt from one sprint to another. What we find in our team is that when our two-week sprint is over, we sometimes have bugs found in our development, or incomplete stories. Does this mean that these stories should be carried into the new sprint as top priority? In our Scrum training we were told that the priorities can change with every sprint, and therefore other stories may be of higher priority. This means that we sometimes end up with stories that have been started but not completed, added either to the end of a new sprint or not at all until later on, sometimes months later. By then the story is no longer fresh in our minds. Is this not recommended?
2nd - You mentioned that acceptance tests and acceptance criteria should be written before starting any development. I understood that the team would define the acceptance criteria, but who should be testing that the development meets them? Currently our PM does this. Which brings me to the next question.
3rd - What is the 'definition of done'? Is it when the story is 'accepted' by the PM as fulfilling the acceptance criteria? Or is it when the developers have completed the development and it has passed testing based on the acceptance criteria? In other words, is 'done' decided by the developers, or by the business approving that the work has met the acceptance criteria?
Thank you very much for an informative session. It was very well presented, and I took away a lot of useful information that can be used within our team.
Regards,
Sasha
Possibly due to its age and intended use, the NAV development environment is, IMHO, not particularly forgiving when it comes to refactoring an application or feature. What can be done confidently in a matter of seconds in, e.g., C# could easily take 15 minutes in C/AL. Without any form of reflection, finding and fixing scattered dependencies takes time.
Do you have any experiences or best practices to share regarding refactoring of C/SIDE applications?
Jan Hoek
Product Developer
Mprise Products B.V.
Answer to 1) Sharing my experience only.
Maybe your deliverables are not granular enough; think about whether you can break them down further. I know that in the ERP world it often wouldn't make sense to actually ship a small deliverable unless other related functionality is also developed, but we have still had better results by breaking deliverables into smaller pieces. That way they don't stay open forever, and the problem of keeping them half open and then de-prioritizing them never arises.
Ideally we should finish a deliverable in one go, but as you said, other higher-priority deliverables can come through and the current one can go on hold. This is undesirable, though.
Also, if you were using Kanban instead of Scrum (which is what we are moving to), then "have to finish in 2 weeks" doesn't come into play: you work on your most important deliverable until it is finished, and then move on to the next one.
Answer to 2)
Whether the deliverable meets the acceptance criteria should be determined by the automated acceptance tests, which you should write before you even start development. All the tests need to pass for the deliverable to be closed.
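To make that concrete, here is a minimal sketch of what such an automated acceptance test could look like in a NAV test codeunit (Subtype=Test, available since the NAV 2009 SP1 testability framework). The scenario and all helper functions are hypothetical, purely for illustration - these are not the actual NAV team tests:

[Test]
PROCEDURE ReleasedOrderCanBePosted();
VAR
  SalesHeader : Record "Sales Header";
BEGIN
  // [GIVEN] a released sales order for an unblocked customer
  CreateReleasedSalesOrder(SalesHeader);  // hypothetical setup helper
  // [WHEN] the order is posted
  PostSalesOrder(SalesHeader);  // hypothetical wrapper around the posting routine
  // [THEN] a customer ledger entry exists for the order's customer
  VerifyCustomerLedgerEntry(SalesHeader."Sell-to Customer No.");  // hypothetical assertion helper
END;

Writing the test first forces the acceptance criteria to be stated precisely, and a green run then gives an unambiguous answer to "does the deliverable meet the criteria".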
Definition of done:
a) Planned automation is completed
b) Acceptance tests are passing
c) Necessary documentation is in place
d) No open bugs related to the deliverable
When a group of deliverables is done that makes sense from a business point of view, we schedule a manual end-to-end (E2E) testing session within the team, involving the PM, Dev, and Test. Everyone tests the E2E cases manually, logs all bugs, and we close them before we ship. We also run these logged bugs through Kanban, so we are continuously Kanban-ing both deliverables and bugs.
a) Refactoring - tooling-wise there is not much I can say. But in my recent experience, as soon as you think you have enough automation, you are basically free to refactor. You can find a lot of obsolete code in the process - things that don't make sense anymore.
Your acceptance tests should aim to be comprehensive documentation of what you support. If your acceptance tests do not break when you remove a block of code, and no one knows why that block is there, then it can most probably be removed. Obviously, comprehensive automation of code whose functions are insanely long is very difficult.
So instead of looking into the function and trying to guess what automation should be written to cover it, it is sometimes better to let the PM decide "what do we intend to support" and automate only that before you refactor.
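On Jan's point: once the supported behaviour is pinned down by acceptance tests, even C/AL refactoring becomes much safer, because the tests tell you immediately when something breaks. A hypothetical extract-function sketch (the function names are invented for illustration, not real NAV code):

// Before: one long posting function with checks and posting logic inlined.
PROCEDURE PostOrder(VAR SalesHeader : Record "Sales Header");
BEGIN
  // ...dozens of validation and posting statements mixed together...
END;

// After: each supported rule extracted into a named function; the
// acceptance tests keep passing while the function shrinks.
PROCEDURE PostOrder(VAR SalesHeader : Record "Sales Header");
BEGIN
  CheckOrderIsReleased(SalesHeader);  // hypothetical extracted check
  CheckCustomerNotBlocked(SalesHeader);  // hypothetical extracted check
  PostOrderLines(SalesHeader);  // hypothetical extracted posting step
END;

Doing the extraction in small steps, and running the acceptance tests after each one, keeps every intermediate state shippable.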
b) "Never to carry debts" - this was basically aimed to reduce the "final end game time" which means whatever deliverables are closed can potentially be shipped. No bugs, documentation or automation backlog. But again as i mentioned before for an ERP application you might need to finish a group of delieverable and follow it up with an E2E testing phase before you ship.