ML Framework: Prerequisites; Model support; Model format; Model size; GPU acceleration; Upload model to OpenSearch; The model_config object; Example request; Example response; Load...
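The sections listed above cover uploading a model with a `model_config` object. A minimal sketch of such an upload request, assuming a TorchScript sentence-transformer model; the name, URL, and config values are illustrative placeholders, and the exact fields may vary by OpenSearch version:

```json
POST /_plugins/_ml/models/_upload
{
  "name": "all-MiniLM-L6-v2",
  "version": "1.0.0",
  "model_format": "TORCH_SCRIPT",
  "model_config": {
    "model_type": "bert",
    "embedding_dimension": 384,
    "framework_type": "sentence_transformers"
  },
  "url": "https://example.com/all-MiniLM-L6-v2.zip"
}
```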
ML inference processor: Syntax; Configuration parameters; Using the processor; Example: Externally hosted model; Response; Example: Local model; Response. ML inference processor ...
ML extensibility: Prerequisites; Adding trusted endpoints; Setting up connector access control; Node settings; Next steps; Related articles.
ML extensibility: Prerequisites; Adding trusted endpoints; Setting up connector access control; Node settings; Next steps; Related articles. Machine learning (ML...
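The "Adding trusted endpoints" step above registers the external ML service's URL pattern with the ML Commons plugin before a connector can reach it. A minimal sketch, assuming the `plugins.ml_commons.trusted_connector_endpoints_regex` cluster setting; the endpoint regex shown is an illustrative example, not a required value:

```json
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.trusted_connector_endpoints_regex": [
      "^https://api\\.openai\\.com/.*$"
    ]
  }
}
```

Once the endpoint is trusted, a connector to the externally hosted model can be created against it.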
MLOnnx: Example; Methods. Exports a traditional machine learning model (e.g. scikit-learn) to ONNX. Example: see the link below for a detailed example. Notebook Descr...
MLOnnx: Example; Methods. __call__(self, model, task='default', output=None, opset=12) (special method). Exports a traditional machine learning model (e.g. scikit-learn) to ON...
ML inference processor: Syntax; Configuration parameters; Using the processor; Response; Limitation. The ml_inference processor is used to generate infer...
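The entry above describes configuring the `ml_inference` processor in an ingest pipeline. A minimal sketch of such a pipeline, assuming the `input_map`/`output_map` configuration parameters; `<model_id>` is a placeholder for a deployed model's ID, the document field names are illustrative, and exact parameter shapes may differ across OpenSearch versions:

```json
PUT /_ingest/pipeline/ml_inference_pipeline
{
  "description": "Generate inferences from a deployed model at ingest time",
  "processors": [
    {
      "ml_inference": {
        "model_id": "<model_id>",
        "input_map": [
          { "input_text": "passage_text" }
        ],
        "output_map": [
          { "passage_embedding": "embedding" }
        ]
      }
    }
  ]
}
```

Documents indexed through this pipeline would then carry the model's output alongside the original field.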