How to use zjunlp/Ocean-router with `ultralytics`:

```python
from ultralytics import YOLO

# Load a router checkpoint downloaded from the zjunlp/Ocean-router repo
model = YOLO("cls_bio_sonar/best.pt")

source = 'http://images.cocodataset.org/val2017/000000039769.jpg'
model.predict(source=source, save=True)
```

Two-stage lightweight classifiers dynamically route marine images to specialized detectors based on image modality and content.
| File | Task | Architecture | Input/Output |
|---|---|---|---|
| `cls_bio_sonar/best.pt` | Sonar vs. biological routing | YOLOv11-cls | Image → [sonar_prob, bio_prob] |
| `fish_coral_cls/best.pt` | Fish vs. coral routing | YOLOv5 | Image → [fish_prob, coral_prob] |
Stage 1 (sonar vs. biological, YOLOv11-cls):

```python
from ultralytics import YOLO

# Stage-1 router: classifies an image as sonar or biological
router = YOLO("cls_bio_sonar/best.pt")
results = router.predict("input.jpg")
```
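In `ultralytics`, classification results expose per-class probabilities via `results[0].probs` and the index-to-name mapping via `results[0].names`. A minimal sketch of turning those into a routing label, with hypothetical class names and probabilities standing in for a real prediction:

```python
def top_label(names, probs):
    """Return the class name with the highest probability.
    `names` mirrors results[0].names (a {index: name} dict); `probs`
    mirrors the per-class probability vector from results[0].probs."""
    best = max(range(len(probs)), key=lambda i: probs[i])
    return names[best]

# Hypothetical output for a sonar image (names/values are illustrative)
names = {0: "sonar", 1: "bio"}
print(top_label(names, [0.92, 0.08]))  # → sonar
```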
Stage 2 (fish vs. coral) requires the official YOLOv5 repo, loaded via `torch.hub`:

```python
import torch

# Stage-2 router: fish vs. coral (YOLOv5 custom classification weights)
model = torch.hub.load("ultralytics/yolov5", "custom", path="fish_coral_cls/best.pt", force_reload=True)
results = model("input.jpg")
```
`cls_bio_sonar` decides whether an input image is sonar or biological. If it is biological, `fish_coral_cls` routes it to the appropriate species detector (fish or coral).
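The two-stage routing logic can be sketched as plain Python over the classifiers' probability outputs. The function below is illustrative: the threshold, argument names, and detector labels are assumptions, not from the repo.

```python
def route(bio_sonar_probs, fish_coral_probs=None, threshold=0.5):
    """Two-stage router sketch.
    Stage 1 separates sonar from biological imagery; stage 2 (run only
    on biological images) separates fish from coral."""
    sonar_prob, bio_prob = bio_sonar_probs
    if sonar_prob >= threshold:
        return "sonar_detector"
    # Biological image: fall through to the fish/coral classifier
    fish_prob, coral_prob = fish_coral_probs
    return "fish_detector" if fish_prob >= coral_prob else "coral_detector"

print(route([0.9, 0.1]))               # → sonar_detector
print(route([0.2, 0.8], [0.3, 0.7]))   # → coral_detector
```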