RabbitMQ is a widely used open-source message broker.

MLEM allows you to serve your model via RabbitMQ. The model runs as a service that consumes messages containing input data and publishes messages containing predictions.
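MLEM wires this up for you, but the underlying pattern is a classic RabbitMQ request/reply loop. The sketch below shows that pattern using `pika` directly; the queue name, the toy `predict` function, and the JSON message shape are illustrative assumptions, not MLEM's actual protocol.

```python
import json


def predict(data):
    # Toy "model" standing in for a real MLEM-served model.
    return [x * 2 for x in data]


def handle_message(body: bytes) -> bytes:
    """Decode an input message, run prediction, encode the reply."""
    request = json.loads(body)
    prediction = predict(request["data"])
    return json.dumps({"prediction": prediction}).encode()


def serve():
    """Consume requests from a queue and publish predictions as replies.

    Requires a running RabbitMQ broker and `pip install pika`.
    Queue name "requests" is a placeholder.
    """
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="requests")

    def on_request(ch, method, props, body):
        # Reply to the queue named in the request's reply_to property,
        # echoing its correlation_id so the caller can match the answer.
        ch.basic_publish(
            exchange="",
            routing_key=props.reply_to,
            properties=pika.BasicProperties(correlation_id=props.correlation_id),
            body=handle_message(body),
        )
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue="requests", on_message_callback=on_request)
    channel.start_consuming()
```

Calling `serve()` blocks and processes requests until interrupted; `handle_message` is kept separate so the prediction logic can be tested without a broker.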
```shell
$ pip install "mlem[rmq]"
# or, if MLEM is already installed:
$ pip install pika
```
Hi! We haven't gotten to writing the User Guide for RabbitMQ yet, but we'll be happy to help you get started. Just reach out to us on Discord or open a GitHub issue!