Thanks to @baparham, we now have an MVP benchmark implementation from #51. This approach uses the `benchmark` package, so that executing `dart run benchmark` runs all the `*_benchmark.dart` files within the `benchmark` directory. This setup is great as a development tool: it allows us to quantify performance differences between implementations. Iterating on this further, we can invest in turning it into a validation system that runs in CI.
Work to be done:
- integrate running the benchmarks in the CI environment, and store the results as a CI run artifact
- from a benchmark run, create a baseline file that contains each benchmark's name, output value, and allowed deviation percentage
- integrate a validation step that verifies the output didn't breach the baseline value by more than the allowed deviation, and fail the run if it did
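The validation step in the list above could be sketched roughly as follows. This is a minimal illustration, not an agreed design: the baseline format (a mapping of benchmark name to expected value and allowed deviation percentage) is a hypothetical assumption for the sake of the example.

```python
def validate(baseline: dict, results: dict) -> list:
    """Compare measured benchmark results against a baseline.

    baseline: name -> {"value": float, "allowed_deviation_pct": float}
    results:  name -> measured output value (lower is assumed better)
    Returns a list of failure messages; an empty list means the run passed.
    """
    failures = []
    for name, entry in baseline.items():
        measured = results.get(name)
        if measured is None:
            failures.append(f"{name}: missing from results")
            continue
        # Breach if the measured value exceeds the baseline by more
        # than the allowed deviation percentage.
        limit = entry["value"] * (1 + entry["allowed_deviation_pct"] / 100)
        if measured > limit:
            failures.append(
                f"{name}: {measured} exceeds baseline {entry['value']} "
                f"by more than {entry['allowed_deviation_pct']}%"
            )
    return failures
```

In CI this would run after the benchmarks, exiting non-zero when the returned list is non-empty so the run fails.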
cc @lukas-h