Hello, I'm interested in this project and I'm wondering why the inference pool is specified at startup and only one pool can be defined. Is this intentional, or just a convenience for the PoC? Could we instead configure multiple inference pools by reconciling InferencePool resources?
The APIs don't enforce a 1:1 extension-to-InferencePool relationship. The current reference implementation of the extension does make that assumption, but mainly to start simple, and also with sharding and scalability in mind.
I don't believe the ext-proc fundamentally has to be 1:1, so please feel free to make a proposal to change the code to support more than one InferencePool. The key is to ensure that it scales well.
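For illustration only, here is a minimal sketch of what "reconciling InferencePool resources" into a multi-pool ext-proc could look like, assuming a controller-runtime based reconciler and an unstructured client. The GroupVersionKind (`inference.networking.x-k8s.io/v1alpha1`, `InferencePool`) and the `poolRegistry` type are assumptions for this sketch, not the project's actual package layout:

```go
// Sketch: track every InferencePool in the cluster in an in-memory registry
// instead of taking a single pool name on the command line.
package main

import (
	"context"
	"sync"

	"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/apimachinery/pkg/types"
	ctrl "sigs.k8s.io/controller-runtime"
	"sigs.k8s.io/controller-runtime/pkg/client"
	"sigs.k8s.io/controller-runtime/pkg/log/zap"
)

// inferencePoolGVK is an assumed GroupVersionKind for the InferencePool CRD.
var inferencePoolGVK = schema.GroupVersionKind{
	Group:   "inference.networking.x-k8s.io",
	Version: "v1alpha1",
	Kind:    "InferencePool",
}

// poolRegistry holds the live set of InferencePools so the ext-proc can pick
// the right pool per request rather than assuming there is exactly one.
type poolRegistry struct {
	mu    sync.RWMutex
	pools map[types.NamespacedName]*unstructured.Unstructured
}

func (r *poolRegistry) set(key types.NamespacedName, pool *unstructured.Unstructured) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.pools[key] = pool
}

func (r *poolRegistry) delete(key types.NamespacedName) {
	r.mu.Lock()
	defer r.mu.Unlock()
	delete(r.pools, key)
}

type poolReconciler struct {
	client.Client
	registry *poolRegistry
}

func (pr *poolReconciler) Reconcile(ctx context.Context, req ctrl.Request) (ctrl.Result, error) {
	pool := &unstructured.Unstructured{}
	pool.SetGroupVersionKind(inferencePoolGVK)
	if err := pr.Get(ctx, req.NamespacedName, pool); err != nil {
		// The pool was deleted (or is unreadable): drop it from the registry.
		pr.registry.delete(req.NamespacedName)
		return ctrl.Result{}, client.IgnoreNotFound(err)
	}
	pr.registry.set(req.NamespacedName, pool)
	return ctrl.Result{}, nil
}

func main() {
	ctrl.SetLogger(zap.New())
	mgr, err := ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{})
	if err != nil {
		panic(err)
	}
	registry := &poolRegistry{pools: map[types.NamespacedName]*unstructured.Unstructured{}}

	obj := &unstructured.Unstructured{}
	obj.SetGroupVersionKind(inferencePoolGVK)
	if err := ctrl.NewControllerManagedBy(mgr).
		For(obj).
		Complete(&poolReconciler{Client: mgr.GetClient(), registry: registry}); err != nil {
		panic(err)
	}
	if err := mgr.Start(ctrl.SetupSignalHandler()); err != nil {
		panic(err)
	}
}
```

The interesting part of a real proposal would be less the reconciler itself and more how request routing, sharding, and metrics scraping scale as the number of pools grows, which is the scalability concern mentioned above.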