About
Train RF-DETR models
Train RF-DETR object detection models.
🚀 Use with Ikomia API
1. Install Ikomia API
We strongly recommend using a virtual environment. If you're not sure where to start, we offer a tutorial here.
pip install ikomia
2. Create your workflow
```python
from ikomia.dataprocess.workflow import Workflow

# Init your workflow
wf = Workflow()

# Add dataset loader
coco = wf.add_task(name="dataset_coco")

coco.set_parameters({
    "json_file": "path/to/json/annotation/file",
    "image_folder": "path/to/image/folder",
    "task": "detection",
})

# Add training algorithm
train = wf.add_task(name="train_rf_detr", auto_connect=True)

# Launch your training on your data
wf.run()
```
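Before launching the workflow, it can help to check that the dataset locations passed to the COCO loader actually exist. The sketch below reuses the same placeholder paths as the example above; they are illustrative and should be replaced with your own annotation file and image folder.

```python
from pathlib import Path

# Placeholder dataset locations; substitute your own paths
json_file = Path("path/to/json/annotation/file")
image_folder = Path("path/to/image/folder")

# Fail early with a clear message if the dataset is not where we expect it
if not json_file.is_file():
    raise FileNotFoundError(f"COCO annotation file not found: {json_file}")
if not image_folder.is_dir():
    raise NotADirectoryError(f"Image folder not found: {image_folder}")

print(f"Found {sum(1 for _ in image_folder.iterdir())} files in {image_folder}")
```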
☀️ Use with Ikomia Studio
Ikomia Studio offers a friendly UI with the same features as the API.
- If you haven't started using Ikomia Studio yet, download and install it from this page.
- For additional guidance on getting started with Ikomia Studio, check out this blog post.
📝 Set algorithm parameters
- model_name (str) - default 'rf-detr-base': Name of the RF-DETR pre-trained model. Other available model: 'rf-detr-large'.
- batch_size (int) - default '8': Number of samples processed before the model is updated.
- epochs (int) - default '100': Number of complete passes through the training dataset.
- dataset_split_ratio (float) - default '0.9': Fraction of the dataset used for training; the remainder is used for evaluation. Must be in ]0, 1[.
- input_size (int) - default '560': Size of the input image.
- weight_decay (float) - default '0.000125': Amount of weight decay (regularization method).
- workers (int) - default '0': Number of worker threads for data loading (per RANK if DDP).
- lr (float) - default '0.00025': Initial learning rate. Adjusting this value is crucial for the optimization process, as it controls how quickly model weights are updated.
- lr_encoder (float) - default '1.5e-4': Separate learning rate for the encoder parameters, allowing the encoder to be fine-tuned at a different rate than the rest of the model.
- output_folder (str, optional): Path where the trained model will be saved.
- early_stopping (bool) - default 'False': Whether to enable early stopping during training. Training stops if performance stops improving after a certain number of epochs.
- early_stopping_patience (int) - default '10': Number of consecutive validation checks with no improvement before early stopping is triggered. Only applicable if early_stopping=True.
Parameters should be in string format when added to the dictionary.
```python
from ikomia.dataprocess.workflow import Workflow

# Init your workflow
wf = Workflow()

# Add dataset loader
coco = wf.add_task(name="dataset_coco")

coco.set_parameters({
    "json_file": "path/to/json/annotation/file",
    "image_folder": "path/to/image/folder",
    "task": "detection",
})

# Add training algorithm
train = wf.add_task(name="train_rf_detr", auto_connect=True)

train.set_parameters({
    "model_name": "rf-detr-base",
    "epochs": "100",
    "batch_size": "6",
    "input_size": "560",
    "dataset_split_ratio": "0.9",
    "workers": "0",  # Recommended to set to 0 if you are using Windows
    "weight_decay": "1e-4",
    "lr": "1e-4",
    "output_folder": "Path/To/Output/Folder",  # Default folder: runs
})

# Launch your training on your data
wf.run()
```
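To stop training automatically when performance plateaus, you can also pass the documented early_stopping and early_stopping_patience parameters. The snippet below is a minimal sketch that extends the configuration above; the values shown are illustrative, and, as noted earlier, all values are passed as strings.

```python
# Optional: enable early stopping on top of the configuration above.
# Values are illustrative; all parameters are passed as strings.
train.set_parameters({
    "early_stopping": "True",          # stop when performance stops improving
    "early_stopping_patience": "10",   # validation checks with no improvement before stopping
})
```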
Developer
Ikomia
License
Apache License 2.0
A permissive license whose main conditions require preservation of copyright and license notices. Contributors provide an express grant of patent rights. Licensed works, modifications, and larger works may be distributed under different terms and without source code.
| Permissions | Conditions | Limitations |
|---|---|---|
| Commercial use | License and copyright notice | Trademark use |
| Modification | State changes | Liability |
| Distribution | | Warranty |
| Patent use | | |
| Private use | | |
This is not legal advice: this description is for informational purposes only and does not constitute the license itself. Provided by choosealicense.com.