书栈网 · BookStack This search took 0.035 seconds and found 3526 related results.
  • ML framework

    ML Framework Prerequisites Model support Model format Model size GPU acceleration Upload model to OpenSearch The model_config object Example request Example response Load...
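The ML framework entry above lists the steps for uploading a model to OpenSearch together with a model_config object. Below is a minimal sketch of such an upload request in Python, assuming the `/_plugins/_ml/models/_upload` endpoint and the field names suggested by the section headings (model format, model_config); the model name, dimensions, and URL are placeholders, and the exact API shape should be verified against your OpenSearch version.

```python
import requests

# Hypothetical sketch of the "Upload model to OpenSearch" step described above.
# Endpoint and field names are assumptions drawn from the entry's headings
# ("Model format", "The model_config object"); verify before use.
OPENSEARCH = "http://localhost:9200"

body = {
    "name": "all-MiniLM-L6-v2",           # example model name (assumption)
    "version": "1.0.0",
    "model_format": "TORCH_SCRIPT",        # "Model format" section above
    "model_config": {                      # "The model_config object" section above
        "model_type": "bert",
        "embedding_dimension": 384,
        "framework_type": "sentence_transformers",
    },
    "url": "https://example.com/all-MiniLM-L6-v2.zip",  # placeholder model archive
}

resp = requests.post(f"{OPENSEARCH}/_plugins/_ml/models/_upload", json=body)
print(resp.json())  # typically returns a task_id that can be polled for upload status
```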
  • ML inference

    ML inference processor Syntax Configuration parameters Using the processor Example: Externally hosted model Response Example: Local model Response ML inference processor ...
  • ML extensibility

    ML extensibility Prerequisites Adding trusted endpoints Setting up connector access control Node settings Next steps Related articles This version of the OpenSearch documen...
  • Label images with ML Kit on Android

    1553 2018-05-11 《ML Kit 中文文档》 (ML Kit Chinese Documentation)
    Label images with ML Kit on Android Before you begin Label images on-device Configure the image labeler Run the image labeler Get information about the labeled objects Cloud-based image labeling Configure the image labeler Run the image labeler Create a FirebaseVisionImage object from the image. The image labeler runs fastest if you use a Bitmap or, if you use the camera2 API, a media.I...
  • ML extensibility

    ML extensibility Prerequisites Adding trusted endpoints Setting up connector access control Node settings Next steps Related articles ML extensibility Machine learning (ML...
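The ML extensibility entries above mention adding trusted endpoints via node settings before connecting to externally hosted models. A rough sketch of that step follows, assuming the `plugins.ml_commons.trusted_connector_endpoints_regex` cluster setting and an example endpoint pattern; both are assumptions to confirm against the documentation for your OpenSearch version.

```python
import requests

# Rough sketch of the "Adding trusted endpoints" step listed above.
# The setting name and regex value are assumptions; confirm before use.
OPENSEARCH = "http://localhost:9200"

settings = {
    "persistent": {
        "plugins.ml_commons.trusted_connector_endpoints_regex": [
            "^https://api\\.openai\\.com/.*$"   # example external ML endpoint pattern
        ]
    }
}

resp = requests.put(f"{OPENSEARCH}/_cluster/settings", json=settings)
print(resp.status_code, resp.json())
```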
  • ML ONNX

    MLOnnx Example Methods MLOnnx Exports a traditional machine learning model (e.g. scikit-learn) to ONNX. Example See the link below for a detailed example. Notebook Descr...
  • ML ONNX

    MLOnnx Example Methods __call__(self, model, task='default', output=None, opset=12) special MLOnnx Exports a traditional machine learning model (e.g. scikit-learn) to ON...
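The snippet above shows the MLOnnx call signature `__call__(self, model, task='default', output=None, opset=12)`. A minimal sketch of using it to export a scikit-learn text classifier follows, assuming txtai's MLOnnx pipeline with its ONNX extras installed and toy training data; whether a particular estimator converts depends on skl2onnx support.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

from txtai.pipeline import MLOnnx

# Train a small scikit-learn text classifier (toy data, illustration only)
model = Pipeline([("tfidf", TfidfVectorizer()), ("lr", LogisticRegression())])
model.fit(["great movie", "awful movie", "loved it", "hated it"], [1, 0, 1, 0])

# Export to ONNX using the signature shown in the listing;
# the task name here is an assumed example
onnx = MLOnnx()
onnx_model = onnx(model, task="text-classification", output="model.onnx", opset=12)
```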
  • ML inference

    ML inference processor Syntax Configuration parameters Using the processor Response Limitation ML inference processor The ml_inference processor is used to generate infer...
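The ml_inference entry above is cut off before its configuration parameters. As a loose sketch only, creating an ingest pipeline with the processor might look like the following; the model_id is a placeholder, and the shape and orientation of input_map and output_map are assumptions that should be checked against the processor's documented parameters.

```python
import requests

# Loose sketch of "Using the processor" from the ml_inference entry above.
# model_id is a placeholder; input_map / output_map structure is an assumption.
OPENSEARCH = "http://localhost:9200"

pipeline = {
    "description": "Generate inferences while ingesting documents",
    "processors": [
        {
            "ml_inference": {
                "model_id": "<your-model-id>",
                "input_map": [{"input_text": "passage_text"}],    # model input <- doc field (assumed)
                "output_map": [{"passage_embedding": "output"}],  # doc field <- model output (assumed)
            }
        }
    ],
}

resp = requests.put(f"{OPENSEARCH}/_ingest/pipeline/ml_inference_pipeline", json=pipeline)
print(resp.json())
```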
  • Nuxt Kit

    Nuxt Kit Usage Install dependency Import kit utilities Nuxt Kit Nuxt Kit provides composable utilities to make interacting with Nuxt Hooks and Nuxt Builder Core and develo...