
Introducing QA-Board

Arthur Flam · 4 min read

We are happy to introduce QA-Board (source), a run-tracker with advanced visualizations for algorithm and software engineers.

QA-Board logo

Tracking quality is hard

Tests are not enough when the focus is quality and performance. Whether you need to improve algorithms or make performance-sensitive code more efficient, all sorts of metrics and visualizations are required. Engineers usually start this evaluation process by writing scripts or notebooks that test their solution on limited samples. They then look at the results and iterate.

While this is very convenient at first, keeping track of versions or comparing features soon becomes challenging. There are a lot of "logistics" to get right:

  • How to share results?
  • What about source control and CI integration?
  • How to start distributed tuning experiments?
  • How to identify regressions?

We wanted to solve these recurring issues with a simple solution adaptable to many projects.

QA-Board's story

Our business unit develops IP for image sensors. What was a closely-knit 15-person team has grown into an organization of over 300 people, and the complexity and pace of our projects kept growing with it. As you may know, Samsung is now working on image sensors with groundbreaking resolution (200MP and beyond), with AI capabilities and packed with innovative features, including cutting-edge image processing IPs.

We're hiring at Samsung's Israel R&D Center; our goal is to become the world's #1 image sensor manufacturer.

As we were experiencing growing pains in our development processes, we set up an infrastructure team to change the way we work. We emphasized software-engineering best practices, tooling, and reproducibility, with a mission to improve cross-team collaboration.

As part of our work on algorithms for our innovative DVS sensor, I created what became QA-Board. When I joined this new infrastructure team, we expanded QA-Board's scope.

Use-Cases

QA-Board has become a key collaborative tool. Our main use-cases are:

  • Sharing links with all the info (command, output files, logs...).
  • Work-from-home: engineers can share 108MP+ images thanks to the IIIF protocol.
  • Integration: links to and from git repositories and their Continuous Integration. From QA-Board, users can directly access build artifacts, trigger automated jobs, and, when needed, build dashboards or scripts that query QA-Board's API (see the first sketch after this list).
  • Visualizations: everything can be compared, and thanks to the many different types of visualizations (images/plots/text/html/video...), users can easily create the reports they need.
  • Tuning: QA-Board distributes runs to our cluster. Users can easily start tuning experiments that enable feature flags or tweak parameters. We've integrated scikit-optimize for black-box optimization (see the second sketch after this list).
  • Regression: users can track progress on various metrics and, when needed, identify which commit caused a regression.
  • Performance engineering: save rr/perf recordings, view flame graphs, benchmark drivers, and track metrics for regressions.
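
For scripting against QA-Board, querying its HTTP API can be as simple as a GET request. Below is a minimal sketch; the host, commit hash, route, and JSON fields are hypothetical placeholders rather than QA-Board's documented endpoints, so check your instance's API docs for the real ones:

```python
# Hypothetical sketch of a script querying a QA-Board instance over HTTP.
# The host, route, and response fields below are placeholder assumptions.
import requests

QA_BOARD_URL = "https://qa-board.example.com"  # placeholder host
COMMIT = "abc123"                              # placeholder commit hash

response = requests.get(f"{QA_BOARD_URL}/api/v1/commits/{COMMIT}/outputs")
response.raise_for_status()

# Print each run's test input and its metrics (placeholder field names).
for output in response.json().get("outputs", []):
    print(output.get("test_input"), output.get("metrics"))
```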
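
The tuning integration builds on scikit-optimize's Bayesian optimization. Here is a minimal sketch of such a black-box tuning loop using gp_minimize; the parameter names and the toy objective are hypothetical stand-ins for a real algorithm run that returns a quality metric:

```python
# Minimal black-box tuning loop with scikit-optimize (gp_minimize).
# The two parameters and the toy objective are hypothetical examples.
from skopt import gp_minimize
from skopt.space import Integer, Real

search_space = [
    Real(0.1, 10.0, name="denoise_strength"),  # hypothetical tuning knob
    Integer(1, 8, name="num_iterations"),      # hypothetical tuning knob
]

def objective(params):
    denoise_strength, num_iterations = params
    # A real experiment would run the algorithm on test inputs and
    # return a metric to minimize; this toy score has a known optimum.
    return (denoise_strength - 2.5) ** 2 + 0.1 * abs(num_iterations - 4)

# Model-guided sampling: 30 evaluations of the objective.
result = gp_minimize(objective, search_space, n_calls=30, random_state=0)
print("best parameters:", result.x)
print("best score:", result.fun)
```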

Here are some screenshots:

What's next?

Our goal is to make QA-Board the best general-purpose run-tracker. We want to see it used for performance optimization, algorithm development, model comparisons in operational research, web page performance tracking...

To achieve those goals, we'll need:

  • User feedback, issues and feature requests.
  • Community contributions, for instance integrating more file viewers (e.g. support for common plot formats like Vega or Highcharts).

How to get in touch?

Join our issue tracker to report bugs or suggest features, or feel free to start a chat with the maintainers.

How to get started using QA-Board?

Head over to the docs. If you run into issues, contact us and we'll help you.