Phoenix LiveViews that can be used with Membrane components from `membrane_webrtc_plugin`. It is part of the Membrane Framework.
The package can be installed by adding `membrane_webrtc_live` to your list of dependencies in `mix.exs`:

```elixir
def deps do
  [
    {:membrane_webrtc_live, "~> 0.1.0"}
  ]
end
```
`Membrane.WebRTC.Live` comes with two `Phoenix.LiveView`s:

- `Membrane.WebRTC.Live.Capture` - exchanges WebRTC signaling messages between `Membrane.WebRTC.Source` and the browser. It expects the same `Membrane.WebRTC.Signaling` that has been passed to the related `Membrane.WebRTC.Source`. As a result, `Membrane.WebRTC.Source` will return the media stream captured from the browser where `Membrane.WebRTC.Live.Capture` has been rendered.
- `Membrane.WebRTC.Live.Player` - exchanges WebRTC signaling messages between `Membrane.WebRTC.Sink` and the browser. It expects the same `Membrane.WebRTC.Signaling` that has been passed to the related `Membrane.WebRTC.Sink`. As a result, `Membrane.WebRTC.Live.Player` will play the media streams passed to the related `Membrane.WebRTC.Sink`. It currently supports at most one video stream and one audio stream.
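The pairing described above can be sketched as an echo setup: one `Membrane.WebRTC.Signaling` is shared between `Membrane.WebRTC.Live.Capture` and `Membrane.WebRTC.Source`, and a second one between `Membrane.WebRTC.Live.Player` and `Membrane.WebRTC.Sink`. The snippet below is a rough sketch under those assumptions; the child-spec and pad options follow `membrane_webrtc_plugin` conventions and should be checked against its documentation:

```elixir
# Two signaling channels: one for ingress (browser -> Source) and one for
# egress (Sink -> browser). Each must be passed BOTH to the Membrane element
# and to the corresponding LiveView (Capture or Player).
ingress_signaling = Membrane.WebRTC.Signaling.new()
egress_signaling = Membrane.WebRTC.Signaling.new()

# Sketch of an echo pipeline spec: video captured in the browser is sent
# straight back to it. The pad options mirror membrane_webrtc_plugin usage.
spec =
  child(:webrtc_source, %Membrane.WebRTC.Source{signaling: ingress_signaling})
  |> via_out(:output, options: [kind: :video])
  |> via_in(:input, options: [kind: :video])
  |> child(:webrtc_sink, %Membrane.WebRTC.Sink{signaling: egress_signaling})
```

The same two signaling structs are then handed to the rendered `Capture` and `Player` LiveViews, so the browser and the pipeline negotiate their WebRTC connections over matching channels.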
To use the `Phoenix.LiveView`s from this repository, you have to use the related JS hooks. To do so, add the following code snippet to `assets/js/app.js`:

```javascript
import { createCaptureHook, createPlayerHook } from "membrane_webrtc_live";

let Hooks = {};
const iceServers = [{ urls: "stun:stun.l.google.com:19302" }];
Hooks.Capture = createCaptureHook(iceServers);
Hooks.Player = createPlayerHook(iceServers);
```
and add `Hooks` to the `LiveSocket` constructor. It can be done in the following way:

```javascript
new LiveSocket("/live", Socket, {
  params: SomeParams,
  hooks: Hooks,
});
```
To see a full usage example, go to the `example_project/` directory in this repository (take a look especially at `example_project/assets/js/app.js` and `example_project/lib/example_project_web/live_views/echo.ex`).
Copyright 2025, Software Mansion
Licensed under the Apache License, Version 2.0