Supports Yolov5 (4.0/5.0), YoloR, YoloX, Yolov4, Yolov3, CenterNet, CenterFace, RetinaFace, classification, and Unet. Converts darknet/libtorch/pytorch/mxnet models to ONNX and then to TensorRT.
A TorchServe server running a YoloV5 model in Docker with GPU support, using static batch inference for production-ready, real-time inference.
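Static batch inference, as in the TorchServe setup above, means the engine only accepts batches of a fixed size; short batches are padded with dummy inputs and the padded outputs discarded. A minimal stdlib sketch of that padding logic (`run_model` is a hypothetical stand-in for the real fixed-batch model call):

```python
# Sketch of static-batch padding: the engine accepts exactly
# BATCH_SIZE inputs per call, so incomplete batches are padded
# and the extra outputs dropped afterwards.
BATCH_SIZE = 4

def run_model(batch):
    # Hypothetical placeholder "model": requires a full batch.
    assert len(batch) == BATCH_SIZE
    return [x * 2 for x in batch]

def infer_static(inputs):
    """Pad `inputs` up to BATCH_SIZE, run the model, drop padding."""
    n = len(inputs)
    padded = inputs + [0] * (BATCH_SIZE - n)  # dummy inputs as padding
    outputs = run_model(padded)
    return outputs[:n]  # discard outputs for the padded slots

print(infer_static([1, 2, 3]))  # → [2, 4, 6]
```

The padding trick is what lets an engine compiled for one batch shape serve a variable request load.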
Serves PyTorch inference requests with Redis-backed batching for higher throughput.
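The Redis-backed batching above follows a common dynamic-batching pattern: requests accumulate in a queue, and a worker drains up to a maximum batch size at a time so the model runs once per batch instead of once per request. A stdlib-only sketch of that pattern, with an in-process `queue.Queue` standing in for Redis and a trivial stub in place of the PyTorch model:

```python
import queue
import threading
import time

MAX_BATCH = 8

def model_stub(batch):
    # Hypothetical stand-in for a batched PyTorch forward pass.
    return [x + 1 for x in batch]

def worker(requests: queue.Queue, results: dict, stop: threading.Event):
    """Drain up to MAX_BATCH pending requests and run them together."""
    while not stop.is_set() or not requests.empty():
        batch = []
        try:
            batch.append(requests.get(timeout=0.05))
        except queue.Empty:
            continue
        while len(batch) < MAX_BATCH:  # opportunistically fill the batch
            try:
                batch.append(requests.get_nowait())
            except queue.Empty:
                break
        ids, inputs = zip(*batch)
        for rid, out in zip(ids, model_stub(list(inputs))):
            results[rid] = out

requests: queue.Queue = queue.Queue()
results: dict = {}
stop = threading.Event()
t = threading.Thread(target=worker, args=(requests, results, stop))
t.start()
for i in range(20):           # enqueue 20 requests with inputs i * 10
    requests.put((i, i * 10))
time.sleep(0.3)               # let the worker drain the queue
stop.set(); t.join()
print(sorted(results.items())[:3])  # → [(0, 1), (1, 11), (2, 21)]
```

In the real service, Redis replaces the in-process queue so multiple frontend processes can feed one batching worker.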
Ray Saturday Dec 2022 edition
Batch LLM Inference with Ray Data LLM: From Simple to Advanced
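The pattern behind offline batch LLM inference, which Ray Data LLM automates at cluster scale, is to stream a dataset of prompts through the model in fixed-size batches rather than one request at a time. A minimal single-process sketch, with a hypothetical `llm_generate` stub in place of the real engine:

```python
from itertools import islice

def llm_generate(prompts):
    # Hypothetical stand-in for one batched LLM call.
    return [f"echo: {p}" for p in prompts]

def batched(iterable, size):
    """Yield successive lists of up to `size` items from any iterable."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

def batch_infer(prompts, batch_size=4):
    outputs = []
    for batch in batched(prompts, batch_size):
        outputs.extend(llm_generate(batch))  # one model call per batch
    return outputs

outs = batch_infer([f"q{i}" for i in range(10)], batch_size=4)
print(len(outs), outs[0])  # → 10 echo: q0
```

Frameworks like Ray Data add distribution, backpressure, and GPU scheduling on top of this same chunk-and-map loop.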
Supports batch inference for Grounding DINO. "Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection"
Torchfusion is a very opinionated Torch inference engine built on DataFusion.
This repository provides sample code showing how to use AutoML image classification and object detection in the Azure ML (AML) environment.
LightGBM Inference on Datafusion
This repo simulates how an ML model moves to production in an industry setting. The goal is to build, deploy, monitor, and retrain a sentiment analysis model using Kubernetes (minikube) and FastAPI.
We perform batch inference on a lead-scoring task using PySpark.
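PySpark batch inference typically relies on `mapPartitions`, so the model is loaded once per partition and applied to rows in bulk rather than reloaded per row. A stdlib sketch of that amortization pattern (the `load_model` cost and toy score are hypothetical; in real PySpark, `score_partition` would be passed to `rdd.mapPartitions`):

```python
LOADS = {"count": 0}

def load_model():
    # Hypothetical expensive model load; counted to show it runs
    # once per partition, not once per row.
    LOADS["count"] += 1
    return lambda x: x % 2  # toy "lead score"

def score_partition(rows):
    """The function you would hand to rdd.mapPartitions(...)."""
    model = load_model()         # one load per partition
    for row in rows:
        yield (row, model(row))  # score every row with the same model

partitions = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]  # simulated RDD partitions
scored = [list(score_partition(p)) for p in partitions]
print(LOADS["count"], scored[0])  # → 3 [(1, 1), (2, 0), (3, 1)]
```

Loading once per partition instead of once per row is usually the difference between a batch-scoring job that finishes and one that spends all its time deserializing the model.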
MLOps project that recommends movies to watch implementing Data Engineering and MLOps best practices.