How to perform inference with a model I trained with docTR? #568

Answered by fg-mindee
kforcodeai asked this question in Q&A

Hello @K-for-Code 👋

The factory function ocr_predictor is a bit more high-level than that, but you can easily achieve what you want :)
Here is a short example of how to do this:

import os

os.environ["USE_TORCH"] = "1"

import torch
from doctr.models.predictor import OCRPredictor
from doctr.models.detection.predictor import DetectionPredictor
from doctr.models.recognition.predictor import RecognitionPredictor
from doctr.models.preprocessor import PreProcessor
# from doctr.models.utils import load_pretrained_params

# Instantiate your models here
det_model = ...
reco_model = ...

# Load the checkpoints you produced
# load_pretrained_params(det_model, "<URL_TO_DET_CHECKPOINT>")
# load_pretrained_params(reco_model, "<URL_TO_RECO_CHECKPOINT>")

# Compose both predictors into a full OCR pipeline
# (check the constructor signatures against your docTR version,
# and adjust input sizes / batch sizes to your training setup)
predictor = OCRPredictor(
    DetectionPredictor(PreProcessor((1024, 1024), batch_size=1), det_model),
    RecognitionPredictor(PreProcessor((32, 128), batch_size=32), reco_model),
)
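Conceptually, the predictor assembled above runs a two-stage pipeline: the detection model localizes words on the page, and the recognition model reads each cropped word. Here is a docTR-free sketch of that flow using illustrative stand-in functions (none of these names are docTR APIs):

```python
# Stand-in "models": detection returns relative word boxes,
# recognition maps a crop to a decoded string.
def detect_words(page):
    # (xmin, ymin, xmax, ymax) in relative page coordinates
    return [(0.10, 0.10, 0.40, 0.20), (0.50, 0.10, 0.90, 0.20)]

def recognize(crop):
    # A trained recognition model would decode characters here
    return "word"

def run_ocr(page):
    # Stage 1: detect word boxes, Stage 2: recognize each crop
    results = []
    for box in detect_words(page):
        crop = (page, box)  # stand-in for actual image cropping
        results.append({"box": box, "value": recognize(crop)})
    return results

print(run_ocr("page.png"))
```

The real predictor additionally handles resizing and normalization (via PreProcessor) and batching, but the control flow is the same: boxes first, then text per box.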

Answer selected by fg-mindee
frgfm Aug 5, 2022
Maintainer

Category
Q&A
Labels
topic: documentation (Improvements or additions to documentation)
module: models (Related to doctr.models)
ext: references (Related to references folder)
7 participants