
This blog lists the common things that developers and QAs should consider during a project planning exercise. Before jumping to the list, let me cover some background first.

At ThoughtWorks we follow XP practices. A typical project starts with an Inception: a workshop that begins with discovering what problem the client is trying to solve and ends with a rough plan of how that problem can be solved.

To come up with a plan or a project schedule, we:

  1. Prioritize the requirements and finalize the scope
  2. Break down the scope into features and user stories
  3. Estimate user stories
  4. Calculate raw velocity
  5. Plan iterations and milestones

Assuming the user stories are created, we use a complexity-based estimation technique to estimate them.

Complexity-based estimates

Some people use Fibonacci numbers while others use T-shirt sizes (small, medium, large) to estimate the stories. The important thing to remember is that this number denotes the complexity of a task, not the effort required to accomplish it. The team picks up the simplest story from the story deck and estimates it first; this story is then used as a baseline to relatively estimate the other stories.

The next step is the raw velocity exercise.

Raw velocity exercise

  • After estimating the entire scope, we need to come up with an effort-based plan, i.e. a plan in person-hours or person-months; this is what the client is actually looking for. To convert the estimates into a rough plan we do the raw velocity exercise as follows.
    • The BA/PM from the team picks a subset of the stories and hides their estimates.
    • The team is asked how many of those stories one dev pair can finish in one iteration (an iteration is typically 1, 2 or 3 weeks). This exercise is repeated for 3-4 subsets of stories.
    • The BA/PM averages these numbers; the result denotes the average velocity at which the team will churn through the scope.
  • Based on this velocity and the possible parallelization, the number of iterations required to complete the scope is calculated, and from that an approximate end date (a worked sketch follows below).
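
To make the arithmetic concrete, here is a minimal sketch of that calculation. All the numbers (60 stories in scope, a velocity of 4 stories per pair per iteration, 3 dev pairs, 2-week iterations and the start date) are hypothetical and only illustrate the mechanics.

```python
import math
from datetime import date, timedelta

# Hypothetical inputs from the raw velocity exercise
stories_in_scope = 60      # total number of estimated stories
velocity_per_pair = 4      # avg stories one dev pair finishes per iteration
dev_pairs = 3              # possible parallelization
iteration_weeks = 2        # iteration length

# Stories the whole team can churn through per iteration
team_velocity = velocity_per_pair * dev_pairs

# Iterations (rounded up) and a rough end date
iterations_needed = math.ceil(stories_in_scope / team_velocity)
start = date(2020, 1, 6)   # assumed project start date
end = start + timedelta(weeks=iterations_needed * iteration_weeks)

print(f"Team velocity: {team_velocity} stories/iteration")
print(f"Iterations needed: {iterations_needed}")
print(f"Approximate end date: {end}")
```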

Simple story so far, right? However, I have seen people forget things they should ideally consider while doing the raw velocity exercise, which leads to incorrect calculations and wrong expectations being set with clients.

Checklist

Following is the checklist I use. Some of these items will impact the velocity calculation, while others deserve a separate story of their own.

  • Existing code base understanding
  • Environment setup (Dev, QA, UAT, Prod)
  • API
    • Documentation e.g. Swagger
    • Monitoring (Kibana, Grafana, Prometheus etc.)
    • Liveness endpoint
    • Readiness endpoint (a minimal sketch of these health endpoints follows the checklist)
  • Loggers
  • Security
    • Threat modelling
    • CIA
    • PII
    • SSL
    • Vaults
  • Tech. analysis of next iteration stories
  • Addressing tech. debts
  • Setting up correct ‘Test pyramid’
    • Integration tests
    • Functional tests
    • Consumer driven contract tests
    • E2E tests (if functional tests mock other systems)
    • Setting up mock server for testing
  • Quality metrics
    • Code coverage
    • SonarQube
    • IDE setup, e.g. a style check plugin and a single, agreed-upon indentation scheme
    • Code reviews (minimal)
  • Git
    • Pre-commit, pre-push hooks to run the unit/integration tests (a sample pre-push hook follows the checklist)
    • Scan repo for vulnerabilities
    • Activating 2FA
  • CI Pipelines
    • Build, unit tests & integration tests
    • Functional tests
    • Smoke tests
    • Consumer driven contract test pipeline
    • Deployment pipelines
  • Exploratory testing
  • End to end integration stories if development is done against mocks
  • Automating release cuts
  • Documentation (design diagrams, architecture diagrams, workflow diagrams)
  • Automating dev machine setup
  • Ramping up client devs if they are part of the development team
  • Appropriate versioning for CI artifacts
  • Tech. showcases within the Team or with client
  • Crash reporting tool integration, e.g. Crashlytics
  • User action tracking tools like Omniture or Heap Analytics
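
To give a flavour of a couple of these items: the liveness and readiness endpoints are usually a small story of their own. Here is a minimal sketch assuming a Python/Flask service; the route paths and the database_is_reachable check are illustrative, not a prescription.

```python
from flask import Flask, jsonify

app = Flask(__name__)

def database_is_reachable() -> bool:
    # Hypothetical dependency check; replace with a real ping to your datastore
    return True

@app.route("/health/live")
def liveness():
    # Liveness: the process is up and able to serve requests
    return jsonify(status="UP"), 200

@app.route("/health/ready")
def readiness():
    # Readiness: downstream dependencies (DB, message broker, ...) are reachable
    ready = database_is_reachable()
    return jsonify(status="UP" if ready else "DOWN"), (200 if ready else 503)

if __name__ == "__main__":
    app.run(port=8080)
```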
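
Similarly, a pre-push Git hook that runs the test suite can be wired in on day one. A minimal sketch, saved as an executable .git/hooks/pre-push script (the pytest command is an assumption; substitute whatever your build tool provides):

```python
#!/usr/bin/env python3
# Hypothetical .git/hooks/pre-push hook: run the unit/integration tests
# and abort the push if they fail.
import subprocess
import sys

# Replace with your project's test command, e.g. ["./gradlew", "test"] or ["mvn", "verify"]
result = subprocess.run(["pytest", "-q"])

# A non-zero exit code makes git abort the push
sys.exit(result.returncode)
```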