Export PyTorch Model to ONNX – Convert a Custom Detection Model to ONNX
July 3, 2023, 12:30 a.m. | Sovit Ranjan Rath
DebuggerCafe debuggercafe.com
In this article, we train a custom PyTorch RetinaNet model and export it to the ONNX format. We then run inference with the exported model on a CUDA device.
article cuda deep learning detection export format inference object-detection onnx onnx-runtime pytorch retinanet