Learn about bug count, including how to measure it and how to leverage it in dashboards and visualizations with Metabase.
Bug count measures the average number of bugs in a designated number of lines of code. Its purpose is to show how often end users are likely to encounter a bug and to highlight areas of instability. Bugs are inevitable, but it's important to have a realistic picture of how often they should be appearing. Bug count specifically focuses on how often bugs make it past testing and into deployment. The main goal is to find new ways to catch bugs before deployment more often, since fixing a bug after release is more time-consuming than spending a little extra time writing the code in the first place.
Get started
Bug count is expressed as a ratio based on an average: the number of bugs detected relative to the total lines of code. For example, say you have 12,000 lines of code and 135 bugs, and you want to know how many bugs are getting through per 1,000 lines of code. First, divide 12,000 by 1,000 to get 12. Then divide 135 by 12 to get 11.25. That means your bug count is 11.25:1,000. This is about average, if not a little below average, compared to what you might typically see: the average developer creates around 70 bugs per 1,000 lines of code, and on average about 15 per 1,000 make it to end users.
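To make the arithmetic concrete, here's a minimal sketch in Python of that same calculation. The function name and inputs are illustrative only, not part of any Metabase feature or API.

```python
def bug_count_per_thousand(total_lines: int, bugs_detected: int) -> float:
    """Return the average number of bugs per 1,000 lines of code."""
    if total_lines <= 0:
        raise ValueError("total_lines must be positive")
    # How many 1,000-line blocks the codebase contains.
    thousands_of_lines = total_lines / 1_000
    # Average bugs per 1,000 lines.
    return bugs_detected / thousands_of_lines

# The worked example above: 12,000 lines of code and 135 bugs.
print(bug_count_per_thousand(12_000, 135))  # 11.25, i.e. a bug count of 11.25:1,000
```

In practice you'd pull the line count from your repository and the bug count from your issue tracker for the same release window, then track the ratio over time.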