I have found several documents and tools for converting an ONNX model to float16, but none of them supports converting to bfloat16.
The model was originally trained in TensorFlow and converted to ONNX. I do not have access to the original TensorFlow code/models.
Do you think I can convert the model back to TensorFlow and then re-export it to ONNX as bfloat16? Will this quantization "dilute" the trained model weights?
Please advise.
I have looked into tools for converting ONNX models, but I only found fp16 support, not bf16.
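To show the kind of "dilution" I mean: bfloat16 keeps float32's 8-bit exponent but only 7 explicit mantissa bits, so converting just truncates/rounds the low 16 bits of each float32 weight. A minimal NumPy sketch of that rounding (my own illustration of round-to-nearest-even, not any ONNX tool's API):

```python
import numpy as np

def float32_to_bfloat16_bits(x: np.ndarray) -> np.ndarray:
    """Round float32 values to bfloat16, returned as raw uint16 bit patterns.

    Uses round-to-nearest-even on the discarded low 16 bits.
    """
    bits = x.astype(np.float32).view(np.uint32)
    # Add 0x7FFF plus the LSB of the kept half, then drop the low 16 bits.
    rounded = bits + np.uint32(0x7FFF) + ((bits >> 16) & np.uint32(1))
    return (rounded >> 16).astype(np.uint16)

def bfloat16_bits_to_float32(b: np.ndarray) -> np.ndarray:
    """Widen bfloat16 bit patterns back to float32 (exact, no rounding)."""
    return (b.astype(np.uint32) << 16).view(np.float32)

w = np.array([0.1, 1.5, -3.14159], dtype=np.float32)
w_bf16 = bfloat16_bits_to_float32(float32_to_bfloat16_bits(w))
err = np.abs(w - w_bf16)
print(err)  # relative error stays within roughly 2**-8 per weight
```

So the round trip loses at most about half a bfloat16 ULP per weight (relative error around 2⁻⁸); whether that measurably hurts accuracy depends on the model, which is part of what I'm asking.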