v0.15.2
What's Changed
- Separate the LMM block into OpenAI and CogVLM by @EmilyGavrilenko in #549
- Do not log warning when usage payload is put back into the queue by @grzegorz-roboflow in #551
- Initialize usage_collector._async_lock only if an async loop can be obtained by @grzegorz-roboflow in #552
- Cache workflow specification for offline use by @sberan in #555
- Custom Metadata block for Model Monitoring users by @robiscoding in #553
- Allow a zone defined as an input parameter to be passed to perspective_correction by @grzegorz-roboflow in #559
- Increment num_errors only if pingback is initialized by @iurisilvio in #527
- SAM2 by @hansent and @probicheaux in #557
Full Changelog: v0.15.1...v0.15.2