FACSvatar
This guide explains how to use FACSvatar inside SEE to drive facial animation using FACS (Facial Action Coding System) data. FACSvatar allows manual or automated streaming of FACS values (for example from CSV files or OpenFace) into SEE.
For further details, see the official FACSvatar documentation.
FACSvatar in SEE consists of several modules located under Tools/FACSvatar/.
These modules allow you to:
- Stream FACS data via a bridge
- Convert FACS Action Units (AUs) to blendshape values
- Send data manually (CSV, custom scripts)
- Use a modified version of OpenFace for live FACS extraction
To function correctly, two FACSvatar modules must always be running:
- process_bridge
- process_facstoblend
FACSvatar requires a dedicated Python environment.
An environment.yml file is provided under Tools/FACSvatar and can be used with Anaconda; it contains all Python packages needed to run FACSvatar.
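As a minimal setup sketch (assuming Anaconda or Miniconda is installed; the environment name is defined in environment.yml and may differ from the one shown here):

```
cd Tools/FACSvatar

# Create the environment from the provided file
conda env create -f environment.yml

# Activate it; "facsvatar" is an assumed name -- check environment.yml for the actual one
conda activate facsvatar
```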
The following components are required at runtime:
- SEE running
- An active FACS input source
- process_bridge/main.py running
- process_facstoblend/main.py running
FACSvatar directly controls facial blendshapes. If FACSvatar is used, other systems that override facial animation should be disabled, for example:
- SALSA Lip Sync
- Other facial animation or blendshape controllers
Leaving these enabled may result in overridden, unstable, or reset facial animations.
1. Launch SEE as usual.
2. Prepare an Input Source:
- Option A: Modified OpenFace (Live Capture)
  - Use the FACSvatar-modified OpenFace version
  - Setup instructions are provided in the FACSvatar documentation
  - Enables real-time facial motion capture
- Option B: Offline CSV Files
  - Use pre-recorded CSV files containing FACS AU values compatible with FACSvatar
  - Example files can be found under Tools/FACSvatar/input_facsfromcsv
- Option C: Custom Script
  - Implement a custom sender using ZeroMQ (see the sketch after this list)
  - Stream FACS AU values directly to FACSvatar
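For Option C, the following pyzmq sketch shows the general shape of a custom sender. The socket address, topic string, and JSON layout used here are assumptions; compare them against the existing input modules (for example input_facsfromcsv) and process_bridge/main.py before relying on them.

```python
# Hypothetical minimal FACS sender -- port, topic, and message layout are
# assumptions; check the existing FACSvatar input modules for the real values.
import json
import time

import zmq

context = zmq.Context()
socket = context.socket(zmq.PUB)
socket.bind("tcp://127.0.0.1:5570")  # assumed address the bridge listens on

time.sleep(1)  # give subscribers a moment to connect before publishing

# One frame of Action Unit intensities (value range is an assumption)
frame = {
    "au_r": {"AU01": 0.2, "AU06": 0.8, "AU12": 0.9},  # smile-like expression
    "timestamp": time.time(),
}

# FACSvatar-style multipart message: topic, timestamp, JSON payload (assumed layout)
socket.send_multipart([
    b"openface",                                   # assumed topic
    str(frame["timestamp"]).encode("utf-8"),
    json.dumps(frame).encode("utf-8"),
])
```

Whether the sender should bind or connect depends on how process_bridge sets up its subscriber socket; adjust accordingly.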
3. Start the FACSvatar Bridge
- The bridge module handles communication between input sources and SEE.
- Path: Tools/FACSvatar/process_bridge/
- Run: main.py
4. Start FACS-to-Blendshape Conversion
- This module converts FACS Action Units into avatar blendshape values.
- Path: Tools/FACSvatar/process_facstoblend/
- Run: main.py
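Both modules are typically started in two separate terminals. A minimal sketch, assuming the Anaconda environment created above is active in each terminal and that the modules are launched with python:

```
# Terminal 1: start the bridge
cd Tools/FACSvatar/process_bridge
python main.py

# Terminal 2: start the FACS-to-blendshape converter
cd Tools/FACSvatar/process_facstoblend
python main.py
```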
5. Send FACS Data
Once both required modules are running, FACS data can be streamed:
- Live from modified OpenFace
- From CSV files
- From a custom ZeroMQ script