Add support for remote GPUs (with async inference!) #2172

@MRiabov

Description

Hello,
I'm a student not in a first-world country, and unfortunately I don't own a PC with an NVIDIA GPU - a decent setup costs about $1,200. On the other hand, it costs only $0.12-0.24/hr to rent an RTX 4090 instance, so it's cheap to simply rent a machine whenever I need to collect data or train.

But to my knowledge, LeRobot - unlike e.g. most LLM or vision trainers - runs only locally. I haven't tried it, but given Async Inference it should be very feasible to stream from a remote instance to a local browser, in particular for data collection.
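To make the idea concrete, here is a minimal sketch of what remote async inference could look like: a policy server runs on the rented GPU instance, and the local machine streams observations to it and receives actions back over a socket. This is an illustration only - `dummy_policy`, the JSON-lines framing, and the port are assumptions, not the LeRobot async-inference API.

```python
# Hypothetical remote-inference round trip: local client streams an
# observation to a remote policy server and awaits the action.
import asyncio
import json

def dummy_policy(observation):
    # Stand-in for a real policy forward pass on the remote GPU.
    return {"action": [0.0] * len(observation.get("state", []))}

async def handle_client(reader, writer):
    # One JSON line per observation in, one JSON line per action out.
    while line := await reader.readline():
        obs = json.loads(line)
        action = dummy_policy(obs)
        writer.write(json.dumps(action).encode() + b"\n")
        await writer.drain()
    writer.close()

async def infer_remote(host, port, observation):
    # Local side: send an observation, await the corresponding action.
    reader, writer = await asyncio.open_connection(host, port)
    writer.write(json.dumps(observation).encode() + b"\n")
    await writer.drain()
    action = json.loads(await reader.readline())
    writer.close()
    return action

async def main():
    # For the sketch, server and client run in one process on localhost;
    # in practice the server would live on the rented GPU instance.
    server = await asyncio.start_server(handle_client, "127.0.0.1", 8765)
    action = await infer_remote("127.0.0.1", 8765, {"state": [0.1, 0.2, 0.3]})
    server.close()
    await server.wait_closed()
    return action

if __name__ == "__main__":
    print(asyncio.run(main()))
```

Because the client only awaits a reply rather than running the model itself, the same loop could keep recording teleoperation data locally while inference happens remotely.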

This will make robotics dataset generation (significantly) more accessible.

I may be able to PR this one; it should be straightforward.

Cheers.

Metadata

Assignees

No one assigned

    Labels

    enhancement - Suggestions for new features or improvements
    question - Requests for clarification or additional information
