书栈网 · BookStack This search took 0.029 seconds and found 814 relevant results.
  • Run Executors on GPU

    Run Executors on GPU Prerequisites Setting up the executor Using GPU locally Using GPU in a container Using GPU with Hub Executors Conclusion Run Executors on GPU Tadej...
  • Using ML models within OpenSearch

    Using ML models within OpenSearch GPU acceleration Related articles Using ML models within OpenSearch Introduced 2.9 To integrate machine learning (ML) models into your Ope...
  • 3.15 Numerical Stability and Model Initialization

    3.15 Numerical Stability and Model Initialization 3.15.1 Vanishing and Exploding Gradients 3.15.2 Random Initialization of Model Parameters 3.15.2.1 PyTorch's Default Random Initialization 3.15.2.2 Xavier Random Initialization Summary References 3.15 Numerical Stability and Model Initialization Having understood forward and backward propagation, we can now discuss the numerical stability of deep learning models and methods for initializing model parameters. Deep...
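The snippet above lists Xavier initialization among the chapter's topics. A minimal sketch of what that looks like in PyTorch (the layer sizes here are illustrative assumptions, not taken from the chapter):

```python
import torch
import torch.nn as nn

# Illustrative layer sizes (not from the source chapter).
layer = nn.Linear(256, 128)

# Xavier (Glorot) uniform initialization: weights drawn from U(-a, a)
# with a = sqrt(6 / (fan_in + fan_out)), keeping activation variance
# roughly constant across layers.
nn.init.xavier_uniform_(layer.weight)
nn.init.zeros_(layer.bias)

# Verify the weights respect the Xavier bound.
bound = (6.0 / (256 + 128)) ** 0.5
assert layer.weight.abs().max().item() <= bound
```

The bound shrinks as layers get wider, which is what counters the vanishing/exploding behavior the section's first subsection describes.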
  • nn package

    nn package Example 1: ConvNet Forward and Backward Function Hooks Example 2: Recurrent Net nn package Translator: @unknown Proofreader: @bringtree We redesigned the nn package to integrate fully with autograd....
  • Sequence Models and LSTM Networks (Long Short-Term Memory)

    Sequence Models and LSTM Networks (Long Short-Term Memory) LSTMs in PyTorch Example: Using an LSTM for Part-of-Speech Tagging Exercise: Augmenting the LSTM Part-of-Speech Tagger with Character-Level Features Sequence Models and LSTM Networks (Long Short-Term Memory) Translators: @JingTao, @friedhelm739 We have already covered many feed-forward networks. A feed-forward network is one in which the network keeps no stat...
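The result above describes an LSTM part-of-speech tagging tutorial. A hypothetical minimal tagger in that spirit (all dimensions, vocabulary size, and tag count here are illustrative assumptions, not the tutorial's actual values):

```python
import torch
import torch.nn as nn

class LSTMTagger(nn.Module):
    """Toy sequence tagger: word indices -> per-word tag scores."""
    def __init__(self, vocab_size=10, embed_dim=6, hidden_dim=6, num_tags=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim)  # keeps state across the sequence
        self.out = nn.Linear(hidden_dim, num_tags)

    def forward(self, sentence):
        # sentence: LongTensor of word indices, shape (seq_len,)
        embeds = self.embed(sentence)                 # (seq_len, embed_dim)
        lstm_out, _ = self.lstm(embeds.unsqueeze(1))  # (seq_len, batch=1, hidden_dim)
        return self.out(lstm_out.squeeze(1))          # (seq_len, num_tags)

tagger = LSTMTagger()
scores = tagger(torch.tensor([0, 3, 5]))
assert scores.shape == (3, 3)  # one row of tag scores per word
```

Unlike a feed-forward network, the LSTM's hidden state carries information from earlier words forward, which is the point the snippet's opening sentence is making.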
  • Training Loop

    Learner, Metrics, and Basic Callbacks Utils function replacing_yield [source] mk_metric [source] save_model [source] load_model [source] class Learner [source] PyT...
  • Introduction

    Introduction Overview Origin Features Rich Scheduling Policies Enhanced Job Management Support for Heterogeneous Devices Performance Optimization Ecosystem Introduction Last updated Sep 27, 2020 Overview Volcano is the first and only Kubernetes-based container batch-computing platform under the CNCF, designed mainly for high-performance computing scenarios. It provides a set of mechanisms that Kubernetes currently lacks, mechanisms typically required by machine le...
  • Tensors

    Tensors Inplace / Out-of-place Zero Indexing No camel casing Numpy Bridge Converting a torch Tensor to a numpy Array Converting a numpy Array to a torch Tensor CUDA Tensors Tensors Translator: @u...
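The headings in this last snippet refer to the torch/numpy bridge: on CPU, the converted objects share the same underlying memory, so an in-place change on one side is visible on the other. A short sketch of both directions:

```python
import numpy as np
import torch

# torch Tensor -> numpy array: a shares t's memory.
t = torch.ones(3)
a = t.numpy()
t.add_(1)  # in-place update on the tensor is visible through the numpy view
assert a.tolist() == [2.0, 2.0, 2.0]

# numpy array -> torch Tensor: t2 shares b's memory.
b = np.zeros(3)
t2 = torch.from_numpy(b)
np.add(b, 1, out=b)  # in-place update on the array is visible in the tensor
assert t2.tolist() == [1.0, 1.0, 1.0]
```

This sharing only holds for CPU tensors; moving a tensor to CUDA (the snippet's "CUDA Tensors" heading) necessarily copies the data to device memory.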