What about API calls? Is that possible now, or will it be in the future?
Hi! You can run the model in Triton Inference Server as described in the manual and use its standard API.
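For reference, Triton exposes the KServe v2 inference protocol over HTTP. Below is a minimal sketch of building such a request; the model name `my_model`, input name `INPUT__0`, and the shape/datatype are placeholder assumptions and must match your deployed model's config.

```python
import json

def build_infer_request(model_name, input_name, data, shape, datatype="FP32"):
    """Build the URL path and JSON body for a Triton KServe-v2 inference call.

    model_name / input_name / shape are hypothetical here; use the values
    from your own model's configuration.
    """
    url = f"/v2/models/{model_name}/infer"
    body = {
        "inputs": [
            {"name": input_name, "shape": shape, "datatype": datatype, "data": data}
        ]
    }
    return url, json.dumps(body)

url, body = build_infer_request("my_model", "INPUT__0", [1.0, 2.0, 3.0], [1, 3])
print(url)  # -> /v2/models/my_model/infer
```

You would POST that body to `http://<host>:8000<url>` (8000 is Triton's default HTTP port), or skip the manual request building entirely and use the official `tritonclient` Python package.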