MLEM has a number of abstract base classes that anyone can implement to add new capabilities to MLEM.
Transient fields are used to hold operational objects and are not saved when an object is dumped. After opening an object with transient fields, those fields will be empty until you load the object.
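To illustrate the idea (this is a minimal sketch, not MLEM's actual implementation), a transient field can be modeled as one that is skipped on dump and stays empty after opening until an explicit load:

```python
import pickle


class ModelMeta:
    """Toy object with a transient field: `model` is dropped on dump."""

    TRANSIENT = {"model"}

    def __init__(self, path, model=None):
        self.path = path
        self.model = model  # operational object, never serialized

    def dump(self) -> bytes:
        # transient fields are excluded from the serialized state
        state = {k: v for k, v in self.__dict__.items() if k not in self.TRANSIENT}
        return pickle.dumps(state)

    @classmethod
    def open(cls, data: bytes):
        # after opening, transient fields are empty (None) ...
        return cls(**pickle.loads(data))

    def load(self):
        # ... until you explicitly load the underlying object
        self.model = f"loaded from {self.path}"  # placeholder for real loading


meta = ModelMeta("model.pkl", model=object())
restored = ModelMeta.open(meta.dump())
print(restored.model)  # None: transient field is empty after opening
restored.load()
print(restored.model)
```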
Here is the list of all MLEM ABCs.
Represents a MLEM Object
For more info and a list of subtypes, look here
Represents different types of requirements for MLEM Object.
- `installable` - a Python requirement, typically installed through pip. Can have a specific version and an alternative package name.
- `custom` - a Python requirement in the form of a local `.py` file or a Python package. Contains the name and source code of the module/package.
- `unix` - a Unix package, typically installed through a system package manager
Represents some file format that MLEM can try to import.
- `pickle` - simply unpickles the contents of the file and uses the default MLEM object analyzer. Works with pickled files.
- `pandas` - tries to read a file into a `pandas.DataFrame`. Works with files saved with Pandas in formats like csv, json, excel, parquet, feather, stata, and html. Some formats require additional dependencies.
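A hedged sketch of how this kind of format probing could work (not MLEM's real hook machinery; the hook names and `try_import` helper are illustrative): each registered hook tries to materialize the file, and the first one that succeeds wins.

```python
import csv
import io
import pickle


def pickle_hook(raw: bytes):
    # stand-in for the `pickle` hook: simply unpickle the contents
    return pickle.loads(raw)


def csv_hook(raw: bytes):
    # stand-in for the `pandas` hook: parses rows instead of a DataFrame
    rows = list(csv.reader(io.StringIO(raw.decode())))
    if not rows:
        raise ValueError("empty file")
    return rows


def try_import(raw: bytes, hooks):
    # probe each registered hook; the first that succeeds wins
    for name, hook in hooks:
        try:
            return name, hook(raw)
        except Exception:
            continue
    raise ValueError("no import hook matched this file")


hooks = [("pickle", pickle_hook), ("csv", csv_hook)]
name, obj = try_import(pickle.dumps({"a": 1}), hooks)
print(name, obj)  # pickle {'a': 1}
```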
This class is basically a wrapper for all Model classes of different libraries. Yes, yet another standard. If you want to add support for your ML Model in MLEM, this is what you implement!
This class is polymorphic, which means that it can have more fields depending on implementation.
- `io` - an instance of `ModelIO`, a way to save and load the model
- `methods` - a string-to-signature mapping which holds information about available model methods
- `model` (transient) - holds the actual model object, if it was loaded
There are implementations of this class for all supported libraries:
The one notable implementation is `callable`: it treats any Python callable object as a model with a single method, `__call__`. That means you can turn functions and class methods into MLEM Models as well!
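As a sketch of that idea (illustrative only; `CallableModel` and `call_method` are hypothetical names, not MLEM's classes), a wrapper can expose any callable through a string-to-signature `methods` mapping with `__call__` as the single method:

```python
import inspect


class CallableModel:
    """Sketch: wrap any Python callable as a 'model' whose single
    method is __call__."""

    def __init__(self, fn):
        self.model = fn  # the wrapped callable
        # string-to-signature mapping of available methods
        self.methods = {"__call__": inspect.signature(fn)}

    def call_method(self, name, *args, **kwargs):
        # execute one of the exposed methods by name
        if name not in self.methods:
            raise KeyError(name)
        return self.model(*args, **kwargs)


def double(x: int) -> int:
    return x * 2


mt = CallableModel(double)
print(list(mt.methods))                # ['__call__']
print(mt.call_method("__call__", 21))  # 42
```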
Represents a way that a model can be saved and loaded. A required field of the `ModelType` class. If an ML library has its own way to save and load models, it can be implemented as a `ModelIO` subclass. There are implementations for all supported libraries. For libraries without a native serialization mechanism (`sklearn`, for example), `simple_pickle` is available, which simply pickles the model.
There is also a separate `pickle` implementation, which can detect other model types inside your object and use their IOs for them. This is very handy when you, for example, wrap your `torch` NN with a Python function: the function part will be pickled, and the NN will be saved using the torch-specific IO.
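A minimal sketch of the `ModelIO` contract described above (the class and method names here are illustrative assumptions, not MLEM's real API): dump a model into artifact files, then load it back from them.

```python
import pickle
import tempfile
from pathlib import Path


class SimplePickleIO:
    """Sketch of a ModelIO-style class: dump a model to artifact
    files and load it back. Illustrative only."""

    FILENAME = "model.pkl"

    def dump(self, model, target_dir: str) -> list:
        # serialize the model into a single artifact file
        path = Path(target_dir) / self.FILENAME
        path.write_bytes(pickle.dumps(model))
        return [str(path)]  # list of produced artifact paths

    def load(self, artifacts: list):
        # restore the model from the first artifact
        return pickle.loads(Path(artifacts[0]).read_bytes())


with tempfile.TemporaryDirectory() as d:
    io_ = SimplePickleIO()
    arts = io_.dump({"weights": [1, 2, 3]}, d)
    restored = io_.load(arts)
    print(restored)  # {'weights': [1, 2, 3]}
```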
Holds metadata about data, like type, dimensions, column names etc.
- `data` (transient) - the underlying data object, if it was read
- `primitive` - any of the Python primitives
- `tuple` - a tuple of objects, each of which can have a different type
- `list` - a list of objects that should all have the same type
- `tuple_like_list` - a list of objects, each of which can have a different type
- `dict` - a dictionary in which each key can have a different value type
- `pd.DataFrame` - holds info about columns, their types and indexes
- `pd.Series` - holds info about columns, their types and indexes
- `np.ndarray` - holds info about type and dimensions
- `np.number` - holds info about type
- `xgboost.DMatrix` - holds info about feature names and their types
- `lightgbm.Dataset` - holds information about the inner data object (dataframe or ndarray)
- `torch.Tensor` - holds information about type and dimensions
- `unspecified` - special dataset type used when no dataset info was provided
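As an illustration of this kind of metadata (a toy sketch, not MLEM's analyzer; the `analyze` function and the metadata dict shape are assumptions), a few of the container types above can be distinguished like this:

```python
def analyze(obj):
    """Sketch: derive DataType-style metadata from a raw object."""
    if isinstance(obj, (bool, int, float, str, type(None))):
        return {"type": "primitive", "ptype": type(obj).__name__}
    if isinstance(obj, tuple):
        # each element can have a different type
        return {"type": "tuple", "items": [analyze(i) for i in obj]}
    if isinstance(obj, list):
        kinds = {type(i) for i in obj}
        if len(kinds) <= 1:
            # homogeneous list: a single element type is enough
            return {"type": "list", "item": analyze(obj[0]) if obj else None}
        # heterogeneous list of fixed layout
        return {"type": "tuple_like_list", "items": [analyze(i) for i in obj]}
    if isinstance(obj, dict):
        # each key can have a different value type
        return {"type": "dict", "items": {k: analyze(v) for k, v in obj.items()}}
    raise TypeError(f"unsupported: {type(obj)}")


print(analyze((1, "a")))
```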
Holds all the information needed to read a dataset.
- `data_type` - the resulting data type
Writes data to files, producing a list of `Artifact`s and a corresponding `DataReader`.
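The writer/reader pairing can be sketched like this (illustrative names and shapes, assuming a JSON-on-disk format; not MLEM's real classes): the writer produces artifact paths plus a reader that already knows the resulting data type.

```python
import json
import tempfile
from pathlib import Path


class JsonReader:
    """Sketch of a DataReader: holds the resulting data type and
    reads the data back from artifacts."""

    def __init__(self, data_type: str):
        self.data_type = data_type  # resulting data type

    def read(self, artifacts: list):
        return json.loads(Path(artifacts[0]).read_text())


def write_json(data, target_dir: str):
    """Sketch of a DataWriter: produces artifacts and a corresponding reader."""
    path = Path(target_dir) / "data.json"
    path.write_text(json.dumps(data))
    return [str(path)], JsonReader(type(data).__name__)


with tempfile.TemporaryDirectory() as d:
    artifacts, reader = write_json([1, 2, 3], d)
    result = reader.read(artifacts)
    print(reader.data_type, result)  # list [1, 2, 3]
```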
Represents a file saved in some storage.
- `local` - a local file
- `fsspec` - a file in a remote file system
- `dvc` - a file in the DVC cache
Defines where the artifacts will be written. Produces the corresponding `Artifact`s.
- `local` - store files on the local file system
- `fsspec` - store files in some remote file system
- `dvc` - store files locally, but try to read them from the DVC cache if they are absent
Represents an interface for a service runtime. Provides a mapping of method names to their signatures, as well as executor functions for those methods.
- `simple` - base class for interfaces created manually; exposes subclass methods marked with a special decorator
- `model` - dynamically creates an interface from a model
Runs configured interface, exposing its methods as endpoints.
- `rmq` - creates a queue in a RabbitMQ instance and a consumer for each interface method
Clients for the corresponding servers.
- `http` - makes requests to HTTP servers
- `rmq` - client for the `rmq` server
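To make the interface/client split concrete (a hedged in-process sketch; the `Interface` and `LocalClient` classes here are illustrative, and a real client would go over HTTP or RabbitMQ instead of a direct call):

```python
import inspect


class Interface:
    """Sketch: maps method names to signatures and executor functions."""

    def __init__(self, methods):
        # string-to-signature mapping, as described above
        self.methods = {name: inspect.signature(fn) for name, fn in methods.items()}
        self._executors = dict(methods)

    def execute(self, name, *args, **kwargs):
        return self._executors[name](*args, **kwargs)


class LocalClient:
    """Sketch of a client: sends method calls to a 'server' (here, in-process)."""

    def __init__(self, interface):
        self.interface = interface

    def call(self, method, *args, **kwargs):
        # a real client would serialize this into an HTTP request or RMQ message
        return self.interface.execute(method, *args, **kwargs)


iface = Interface({"predict": lambda xs: [x + 1 for x in xs]})
client = LocalClient(iface)
print(client.call("predict", [1, 2, 3]))  # [2, 3, 4]
```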
Declaration for creating a build from a model. You can learn more about building in this User Guide.
- `pip` - create a directory with a Python package from a model
- `whl` - create a `.whl` file with a Python package
- `docker_dir` - create a directory with the context for building a Docker image
- `docker` - build a Docker image from a model
Declaration of a target environment for deploying models.
- `heroku` - an account on the Heroku platform
Declaration and state of a deployed model.
- `env_link` - link to the target environment
- `env` (transient) - the loaded target environment
- `model_link` - link to the deployed model object
- `model` (transient) - the loaded model object
- `state` - deployment state
- `heroku` - an app deployed to the Heroku platform
Represents the state of a deployment.
- `heroku` - the state of a deployed Heroku app