Not all metrics are good, and using poor metrics can seriously harm your organization. Maximize the value of your software development lifecycle (SDLC) metrics by using this thoughtful and judicious approach to ranking and selecting them.
In a metric-fixated world, it is easy to fall prey to the siren song of "what gets measured gets done," but the truth is that blind faith in metrics is a recipe for disaster. So how does IT leadership select and manage SDLC metrics in a way that maximizes their value while minimizing the risks? Follow these simple steps:
This is the last in a series of three tech briefs on the topic of SDLC metrics. In the first brief (UNDERSTAND), we presented the dark side of metrics, and warned of the serious dangers associated with poor selection and management of metrics. In the second (KNOW), we presented a set of simple-to-follow rules to avoid common pitfalls associated with bad metrics. In this tech brief (RANK AND SELECT), we present a straightforward approach to helping your organization evaluate and select the most effective SDLC metrics for your needs.
The secret to effective selection of SDLC metrics is to choose the fewest metrics possible, favoring those that maximize value and minimize the serious risks associated with bad metrics. Never use them for reward or punishment; use them to help your development team(s) solve problems and achieve goals.
To do this, start by identifying your development team’s prioritized business-aligned goals (i.e., the goals the team must achieve to support the business’ goals, ranked in order of importance). Work collaboratively with the business to understand its needs and translate them into a prioritized list of development team goals. Keep this list short (two to four as a guide) because your team can only effectively focus on a few goals at once. As an example, below is a hypothetical set of prioritized team goals that have been aligned to the business:
With your prioritized goals in hand, it’s time to identify potential metrics for use by your team. Begin by compiling a list of potential metrics to be used (ignore any metrics that are not relevant to achieving your prioritized business-aligned goals). For each potential metric, capture a name and brief description, then identify how the metric would be gathered and reported (think carefully about how to gather each metric in a way that reduces the risk of things like gaming and unintended consequences). As an example, the table below captures five potential SDLC metrics (in practice, your team should identify 10-15 potential metrics to be evaluated):
Now it’s time to evaluate and rank your list of potential SDLC metrics so that you can then short list the best metrics to meet your team’s needs. To do this, your team will score each metric against a set of defined “pros” and “cons.” We recommend you use the following pro/con criteria (but feel free to adjust these to the individual needs of your organization):
Now, using a five-point scale from low to high (low, med-low, medium, med-high, and high), have your development team score each of the potential metrics against the pro/con criteria. In the example below, we have scored each of the five metrics relative to one another (we used our prioritized business-aligned goals from above to determine the Value to Team scores). Note that we have used traffic light color coding (i.e., green is “good,” red is “bad”) to help visually compare scores:
Once scoring is complete, have your development team roughly sort the list of potential metrics from best to worst (this sorting has already been carried out in the table above). You can “eyeball it” based on colors, or come up with some simple formula like a weighted average score of pros/cons for each metric (you don’t need to be precise here as long as they are roughly ordered from “best” to “worst”).
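If your team prefers a formula over "eyeballing it," the rough sort can be sketched in a few lines of code. The sketch below maps the five-point scale to numbers, averages the pro scores, subtracts the average of the con scores, and sorts best to worst. The metric names and criteria shown are hypothetical illustrations, not a recommended set.

```python
# Map the five-point scale to numbers for a simple weighted comparison.
SCALE = {"low": 1, "med-low": 2, "medium": 3, "med-high": 4, "high": 5}

def rank_metrics(scores):
    """Sort metric names best to worst by (avg of pros) - (avg of cons).

    scores: {metric_name: {"pros": {criterion: rating},
                           "cons": {criterion: rating}}}
    """
    def overall(entry):
        pros = [SCALE[r] for r in entry["pros"].values()]
        cons = [SCALE[r] for r in entry["cons"].values()]
        return sum(pros) / len(pros) - sum(cons) / len(cons)

    return sorted(scores, key=lambda name: overall(scores[name]), reverse=True)

# Hypothetical example scores (not a recommended metric set):
example = {
    "Defect escape rate": {
        "pros": {"value to team": "high", "ease of gathering": "medium"},
        "cons": {"gaming risk": "low"},
    },
    "Lines of code": {
        "pros": {"value to team": "low", "ease of gathering": "high"},
        "cons": {"gaming risk": "high"},
    },
}

print(rank_metrics(example))  # prints ['Defect escape rate', 'Lines of code']
```

As the article notes, precision is not the point here; any formula that produces a rough best-to-worst ordering is good enough to frame the team discussion that follows.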
Now, working from top to bottom, have your team openly discuss each metric, compare it to the others, and decide on which to include in your team’s new set of SDLC metrics. You won’t necessarily take the very top scoring metrics on the list because some of them could overlap in some way. We suggest reducing your list down to the 3-5 best SDLC metrics for the team (feel free to adjust this range for your organization’s specific needs, but remember that each new metric you add to the list will diminish the value of every other metric, because your team can only focus on a few things at a time).
At this point, your team will have collaboratively settled on the smallest set of SDLC metrics they believe will help them the most in achieving their prioritized business-aligned goals. This buy-in from the team is critical to effective adoption and use of SDLC metrics. Remember never to use these metrics for punishment or reward (which will only undermine their effectiveness). Your team can now put these metrics into place.
As one last step, schedule a follow-up SDLC metrics discussion with the team (three to six months later is suggested, but do what works best for your circumstances) to review the metrics for effectiveness and decide whether any changes are needed (whether it is changes to which metrics are gathered or how they are gathered). Remember that the team’s needs and goals can change over time, and so your SDLC metrics should be reviewed and updated regularly to maintain their effectiveness.
If you would like to know more about this approach to metrics, speak to your Info-Tech Engagement Representative about the soon-to-be-released SDLC Metrics blueprint.
Is it true that everything that can go wrong will go wrong? Don’t bet against it.
While Microsoft’s Power Automate is not yet as prominent in the RPA space as Blue Prism, UiPath, and Automation Anywhere, its recent acquisition of Softomotive, maker of WinAutomation, demonstrates Microsoft’s commitment to maturing and expanding its RPA offerings.
Test data management tools offer you the ability to provision, mask, and govern the access and use of your test data, relieving your testing, operations, and DBA teams of these manual, laborious, and error-prone tasks.
When organizations try to implement Agile as a defined process, Scrum often turns BAs or other roles into order takers with the title “product owner.” This undermines the entire value proposition of product management.
Agile systems delivery (implemented through Scrum) is quickly becoming an accepted norm in IT. But using Scrum successfully in an organization requires a deep understanding of how it works and why. For example, many of our members don’t understand the importance of selecting a Product Owner who has three ears.
Reeling from the pandemic response executed by governments all over the world, companies are accelerating their implementation of low-cost automation. That bodes well for UiPath – a leader in RPA aiming to go public this year.
Thor, the Norse God of Thunder, tells Jane Foster, the woman he’s trying to impress, that on his home world of Asgard, the realm eternal, science and magic are two sides of the same coin. Had Jane been a part of the operations teams at Google (or other mature online service providers), she would have immediately realized we have a similar technology right here on good old Earth. We call the science site reliability engineering (SRE), and service level objectives (SLOs) are the magic behind it. SRE is a powerful concept for organizations that are serious about keeping their customers happy. It is therefore important for them to develop well-thought-out SLOs and make certain that management is intellectually equipped to derive valuable business perspectives from them.
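To make "well-thought-out SLOs" concrete, here is a minimal sketch of the arithmetic behind an availability SLO: the error budget, i.e. how much downtime a target permits over a window. The 99.9% target and 30-day window below are illustrative assumptions, not recommendations.

```python
def error_budget_minutes(slo_target: float, window_days: int = 30) -> float:
    """Minutes of allowed downtime for a given availability SLO over a window.

    slo_target: availability as a fraction, e.g. 0.999 for "three nines".
    """
    total_minutes = window_days * 24 * 60
    # Everything above the target is budget the team may "spend" on
    # incidents, risky deploys, and planned maintenance.
    return total_minutes * (1 - slo_target)

# A 99.9% availability SLO over 30 days allows about 43.2 minutes of downtime.
print(round(error_budget_minutes(0.999, 30), 1))  # prints 43.2
```

This is the kind of business-facing perspective management can derive from an SLO: the budget frames how aggressively a team can ship versus how conservatively it must operate.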
Hell hath no fury like a customer who cannot access an online service when they want to. Customers expect online services to always be on, always be accessible, and always treat them like there’s no one else in the world who matters more. Thank heavens, then, that these online services can use site reliability engineering (SRE) to keep their customers happy, engaged, and, most importantly, feeling valued.
Info-Tech members moving to Agile are frequently unsure of the role of PMs and the PMO in an Agile environment. Any organization used to traditional (Waterfall) project management will need to make adjustments in support of Agile or risk losing the benefits.