In recent years the quantity and quality of tools that engineering leaders rely on to do their jobs have increased.
Every aspect of the product development lifecycle now has a category of tools dedicated to it, each one streamlining a process or providing visibility into that process or the software itself.
Taken together, these tools contain all of the data needed to understand in great detail the current status, progress and performance of an engineering department and the products it is building.
Data is spread too thinly
The problem is that this data is spread thinly across 6, 7, 8 or more tools, and it is often presented so opaquely that you would need a degree in each tool to fully comprehend it.
To get a full picture of where your team is, you would need to check your source control tool, your project tracking tool, your monitoring/observability tool, your user analytics tool, your code quality tool, your security analysis tool, your incident management tool and your customer insights tool.
Even once you have managed to build a readable dashboard in each tool, that still leaves 8 or so dashboards to check every time you want a complete overview of how your team is performing.
This is obviously a very inefficient way for engineering leaders to spend their time.
Reporting progress
An equally large problem arises when it’s time to share the insights and overview from all of these tools with other engineering leaders and with non-engineering stakeholders.
For the vast majority of engineering managers this involves some form of manual reporting.
Visiting each tool, navigating to the relevant dashboard, updating a few filters, cleaning the data, and then copying and pasting it into a doc or a spreadsheet.
This is, again, a very time-consuming activity, especially because the data is out of date the moment it lands in the report, and the whole process has to be repeated every time a stakeholder asks for an update.
Because of this, engineering leaders will typically only concentrate on a few of the tools in their toolset and ignore the rest.
Reports will consist of a Jira burndown chart, some engineering metrics and maybe some information on any incidents that occurred.
Nine times out of ten they will fail to mention the actual value delivered by the team, or any of the insights readily available in their other tools.
What good reporting looks like
Great reports include data from every relevant tool, and connections are made between the different insights to create a more holistic view of how the team is performing.
By combining the Jira burndown chart with data from your user analytics tool, you can get a good idea of whether this sprint’s work drove the desired user behaviour.
By combining engineering metrics with your monitoring tool data, you can tell whether that increase in deployments affected API reliability.
By combining API reliability metrics with customer insights data, you can see whether that increase in reliability improved customer NPS.
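To make the idea concrete, here is a minimal sketch of the kind of join these combinations involve, assuming you have already exported weekly deployment counts and API error rates from your CI and monitoring tools (all column names and figures here are illustrative, not from any specific tool’s API):

```python
import pandas as pd

# Hypothetical weekly exports: deployment counts (e.g. from CI)
# and API error rates (e.g. from a monitoring tool).
deploys = pd.DataFrame({
    "week": ["2024-W01", "2024-W02", "2024-W03"],
    "deployments": [4, 9, 12],
})
errors = pd.DataFrame({
    "week": ["2024-W01", "2024-W02", "2024-W03"],
    "api_error_rate_pct": [0.8, 0.7, 0.9],
})

# Join the two signals on the shared week key so they line up.
combined = deploys.merge(errors, on="week")

# A simple correlation hints at whether more deployments
# moved reliability in either direction.
corr = combined["deployments"].corr(combined["api_error_rate_pct"])
print(combined)
print(f"deployments vs error rate correlation: {corr:.2f}")
```

Even a toy example like this takes real effort to maintain by hand across eight tools, which is exactly the gap described below.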
Combining data in this way to produce valuable insights is time consuming and very difficult with simple documentation tools or spreadsheets. That, on top of the time wasted collecting the data in the first place, is why we are building Vizval.
Where Vizval helps
Vizval is your single source of truth for all product development data. It integrates with each of your existing product development tools, automates data collection, and gives you an overview of your team’s status and progress that is impossible with manual reporting.
It provides easily shareable dashboards and reports that make it simple to demonstrate to your boss and the wider company the value your team has delivered.
Book a demo to learn more with the button below!