
Phoenix LiveView + SolidJS PWA

An example of a Progressive Web App (PWA) that combines Phoenix LiveView's real-time collaborative capabilities with a reactive, offline-first UI built with SolidJS.

Table of Contents

What?

Context: we want to experiment with building PWA webapps using Phoenix LiveView.

What are we building? A two-page webapp. On the first page, we mimic a shopping cart where users can pick items until the stock is depleted, at which point the stock is replenished. On the second page, we propose an interactive map where two users can collaboratively edit a form to display markers on the map.

Why?

Traditional Phoenix LiveView applications face several challenges in offline scenarios:

  1. No offline interactivity: applications may need to remain interactive even when offline, to avoid a degraded user experience.

  2. No offline navigation: users may still need to navigate between pages while offline.

  3. WebSocket limitations: LiveView's WebSocket architecture isn't naturally suited to PWAs, as it requires a constant connection to function. When online, we use Phoenix.Channel for real-time collaboration.

  4. State management: it is challenging to maintain consistent state between the client and the server across network interruptions. We use different approaches based on the page requirements:

    • CRDT-based synchronization (Y.js with IndexedDB on the client and y_ex on the server) for the Stock Manager page, with SQLite for server-side persistence of the shared state
    • Local state management (Valtio) for the Flight Map page
  5. Build tool: we need a Service Worker that caches HTML pages and static assets, since WebSocket-rendered pages require special handling for offline access. We use Vite to bundle and optimize the application and enable PWA features seamlessly.

Key points for offline collaboration

Design goals

  • Collaborative (online): clients sync via PubSub updates when connected, ensuring real-time consistency.
  • Offline-first: the app remains functional offline (through reactive JS components), with clients converging to the correct state on reconnection.
  • Business rules for the stock page: when users resync, the server enforces a "lowest stock count" rule: if two clients pick items offline, the server keeps the lowest remaining stock after the merge rather than summing the reductions, for simplicity.

Architecture

You have both CRDT-based synchronization (for convergence) and server-enforced business rules (for consistency).

We have two Layers of Authority:

  1. CRDT Sync Layer (Collaborative). Clients and server synchronize using Yjs’s CRDTs to merge concurrent edits deterministically. Clients can modify their local Y-Doc freely (offline or online).

  2. Business Rules Layer (Authoritative). The server is authoritative: it validates updates upon the business logic (e.g., stock validation), and broadcasts the canonical state to all clients. Clients propose changes, but the server decides the final state (e.g., rejecting overflows, enforcing stock limits).

Implementation highlights (stock page)

  • Offline capabilities: edits are saved to y-indexeddb and sent later; the Service Worker caches assets via VitePWA for full offline functionality.

  • Synchronization flow: the client sends all pending Yjs updates on (re)connection and applies the server responses to its local Y-Doc. Y-Doc mutations trigger UI rendering and, reciprocally, UI modifications update the Y-Doc and propagate mutations to the server.

  • Server processing: the server merges updates into the SQLite3-stored Y-Doc (using y_ex), applies business rules (e.g., "stock cannot be negative"), and broadcasts the approved state. Clients reconcile their local state with the server's authoritative version.

  • Data transport: Phoenix.Channel transmits the Y-Doc state as binary. This minimizes bandwidth usage and decouples CRDT synchronization from the LiveSocket (a client-side sketch follows this list). The implementation is heavily inspired by https://github.com/satoren/y-phoenix-channel, written by the author of y_ex.

  • Component Rendering Strategy:

    • online: use LiveView hooks
    • offline: hydrate the cached HTML documents with reactive JavaScript components
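
To make the transport concrete, here is a hedged client-side sketch; the channel topic and event names ("yjs-state", "init-client", "yjs-update") follow the diagrams further down, and the exact payload handling in the repo may differ:

```js
// Hedged sketch: push Y-Doc updates as raw binary frames over a Phoenix channel.
import { Socket } from "phoenix";
import * as Y from "yjs";

const ydoc = new Y.Doc();
const socket = new Socket("/socket");
socket.connect();

const channel = socket.channel("yjs-state", {});
channel.join().receive("ok", () => channel.push("init-client", {}));

// The server answers with the canonical Y-Doc state as a binary payload.
channel.on("init", (payload) => {
  Y.applyUpdate(ydoc, new Uint8Array(payload), "init");
});

// Local CRDT updates are forwarded as ArrayBuffers (no Base64, no LiveSocket).
ydoc.on("update", (update, origin) => {
  if (origin === "local") channel.push("yjs-update", update.buffer);
});
```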

Usage

1/ IEx session: dev setup

# install all dependencies including Vite
mix deps.get
cd assets && pnpm install
# start Phoenix server, it will also compile the JS
cd .. && iex -S mix phx.server

2/ Docker container in local mode=prod

docker compose up --build

Tech overview

| Component | Role |
| --- | --- |
| Vite | Build tool |
| SQLite | Persistent storage of the latest Yjs document |
| Phoenix LiveView | UI rendering, including hooks |
| PubSub / Phoenix.Channel | Broadcasts/notifies other clients of updates / conveys CRDT binaries |
| Yjs / Y.Map | Holds the CRDT state client-side (shared) |
| y-indexeddb | Persists state locally for offline mode |
| SolidJS | Renders reactive UI using signals, driven by Yjs observers |
| Hooks | Inject communication primitives and control JavaScript code |
| Service Worker / Cache API | Enable offline UI rendering and navigation by caching HTML pages and static assets |
| Leaflet | Map rendering |
| MapTiler | Vector tiles |
| WebAssembly container | High-performance "great-circle" route calculations, written in Zig and compiled to WASM |

Diagrams

Architecture diagram
flowchart TD
    subgraph "Client"
        UI["UI Components\n(SolidJS)"]
        YDoc["Local Y-Doc\n(CRDT State)"]
        IndexedDB["IndexedDB\n(Offline Storage)"]
        ServiceWorker["Service Worker\n(Asset Caching)"]
        YObserver["Y.js Observer\n(State Change Listener)"]

        UI <--> YDoc
        YDoc <--> IndexedDB
        YObserver --> UI
        YDoc --> YObserver
    end

    subgraph "Communication Layer"
        PhoenixChannel["Phoenix Channel\n(Binary Updates Transport)"]
        ConnectionCheck["Connection Status\nMonitoring"]
    end

    subgraph "Server (Elixir)"
        YjsChannel["Yjs Channel\n(YjsChannel Module)"]
        DocHandler["DocHandler\n(Database Interface)"]
        YEx["Yex (y-crdt)\n(CRDT Processing)"]
        BusinessRules["Business Rules\n(apply_if_lower?)"]
        DB["SQLite\n(Persisted Y-Doc)"]

        YjsChannel <--> DocHandler
        YjsChannel <--> YEx
        YjsChannel <--> BusinessRules
        DocHandler <--> DB
    end

    YDoc <--> PhoenixChannel
    PhoenixChannel <--> YjsChannel
    ConnectionCheck --> PhoenixChannel

    class BusinessRules highlight
    class YDoc,YEx highlight

Server Authority in collaborative mode
sequenceDiagram
    participant ClientA
    participant ClientB
    participant Server
    participant DB

    Note over ClientA,ClientB: Online Scenario
    ClientA->>Server: "init-client" (join channel)
    Server->>DB: Fetch Y-Doc state
    DB-->>Server: Y-Doc (current counter)
    Server-->>ClientA: "init" (binary update)
    ClientA->>ClientA: Apply update (Yjs)
    ClientA->>Server: "yjs-update" (local edit, e.g., counter=5)
    Server->>DB: Load Y-Doc
    Server->>Server: apply_if_lower?(old, new)
    alt Business Rule Passes (new ≤ old)
        Server->>DB: Save merged Y-Doc
        Server->>ClientA: "pub-update" (ack)
        Server->>ClientB: "pub-update" (broadcast)
    else Reject (new > old)
        Server->>ClientA: "pub-update" (revert to server state)
    end

    Note over ClientA,ClientB: Offline Scenario
    ClientB->>ClientB: Local edit (counter=3, offline)
    ClientB->>ClientB: Save to y-indexeddb
    ClientB->>Server: Reconnect
    ClientB->>Server: "yjs-update" (queued offline edits)
    Server->>DB: Load Y-Doc
    Server->>Server: apply_if_lower?(old=5, new=3)
    Server->>DB: Save merged Y-Doc (counter=3)
    Server->>ClientA: "pub-update" (counter=3)
    Server->>ClientB: "pub-update" (ack)

Detailed sync flow sequence
sequenceDiagram
    participant Client
    participant Channel as Phoenix.Channel
    participant YEx as Yex (y-crdt)
    participant DocHandler as DocHandler
    participant DB as SQLite

    Note over Client,DB: Initial Connection Flow
    Client->>Channel: join("yjs-state", {userID, max})
    Channel->>Channel: handle_info(:after_join)
    opt If no doc exists
        Channel->>YEx: Yex.Doc.new()
        Channel->>YEx: Set initial counter value (max)
        Channel->>DocHandler: update_doc(update)
        DocHandler->>DB: Store initial Y-Doc
    end

    Client->>Channel: "init-client"
    Channel->>DocHandler: get_y_doc()
    DocHandler->>DB: Fetch stored Y-Doc
    DB-->>DocHandler: Binary Y-Doc state
    DocHandler-->>Channel: Binary Y-Doc state
    Channel->>YEx: Build ydoc from DB state
    YEx-->>Channel: Y-Doc object
    Channel->>YEx: encode_state_as_update(ydoc)
    YEx-->>Channel: Binary update
    Channel-->>Client: "init" with binary update
    Client->>Client: Y.applyUpdate(ydoc, update, "init")
    Client->>Client: Update UI via ymap.observe

    Note over Client,DB: Update Flow (Client to Server)
    Client->>Client: Local UI action
    Client->>Client: handleUpdate(newValue)
    Client->>Client: ydoc.transact()
    Client->>Client: ymap.set("counter", newValue)
    Client->>Client: Y-Doc generates update
    Client->>Client: handleYUpdate triggered (origin: local)
    alt If online
        Client->>Channel: "yjs-update" with binary update
        Channel->>DocHandler: get_y_doc()
        DocHandler->>DB: Fetch current state
        DB-->>DocHandler: Current binary state
        DocHandler-->>Channel: Current binary state
        Channel->>YEx: Build ydoc from DB state
        YEx-->>Channel: Y-Doc object
        Channel->>YEx: Get current counter value
        YEx-->>Channel: old_value
        Channel->>YEx: Apply client update
        YEx-->>Channel: Updated Y-Doc
        Channel->>YEx: Get new counter value
        YEx-->>Channel: new_value

        alt If apply_if_lower?(old_value, new_value)
            Channel->>YEx: encode_state_as_update(ydoc)
            YEx-->>Channel: Merged binary update
            Channel->>DocHandler: update_doc(merged_doc)
            DocHandler->>DB: Store updated Y-Doc
            Channel-->>Client: "ok" response
            Channel->>Channel: broadcast "pub-update" to other clients
        else Reject update (business rule)
            Channel-->>Client: "pub-update" with current server state
            Client->>Client: Y.applyUpdate(ydoc, serverState, "remote")
            Client->>Client: UI updates via ymap.observe
        end
    else If offline
        Client->>Client: Changes stored in IndexedDB
    end

    Note over Client,DB: Reconnection Flow
    Client->>Client: Detect reconnection
    Client->>Client: syncWithServer()
    Client->>Client: Y.encodeStateAsUpdate(ydoc)
    alt If empty state
        Client->>Channel: "init-client"
        Channel-->>Client: "init" with current state
    else Send local changes
        Client->>Channel: "yjs-update" with local changes
        Note right of Channel: Same flow as Update Flow
    end

Demo Pages

Stock Manager

Available at /.

Stock Manager Screenshot

Flight Map

Available at /map.

It displays an interactive and collaborative (two-user input) route planner with vector tiles.

Flight Map Screenshot

Key features:

  • Valtio-based local state management
  • WebAssembly-powered great circle calculations
  • Efficient map rendering with MapTiler and vector tiles
  • Works offline for CPU-intensive calculations

Mapping

  • Uses vector tiles instead of raster tiles for efficient caching
  • Significantly smaller cache size (vector data vs. image files)
  • Better offline performance with less storage usage
  • Smooth rendering at any zoom level without pixelation

Collaborative input

The UI displays a form with two inputs, which are pushed to Phoenix and broadcast via Phoenix PubSub. Leaflet draws a marker for the chosen airport on a vector-tiled map rendered with MapTiler.

Client state management

We use Valtio, a lightweight, proxy-based, browser-only state manager, for the geographical points. It is perfect for ephemeral UI state when complex conflict resolution isn't needed (see the sketch below).
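
As an illustration, here is a minimal Valtio sketch using its framework-agnostic "vanilla" API; the store shape is hypothetical, not the repo's actual schema:

```js
// Hedged sketch: a proxy store for the two route endpoints, observed to re-draw markers.
import { proxy, subscribe } from "valtio/vanilla";

const flightState = proxy({ departure: null, arrival: null });

// Re-draw markers whenever either endpoint changes.
subscribe(flightState, () => {
  console.log("route endpoints:", flightState.departure, flightState.arrival);
});

// A form handler simply mutates the proxy:
flightState.departure = { lat: 48.45, lng: -4.42 }; // illustrative coordinates
```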

Great circle computation with a WASM module

Zig is used to compute a "great circle" between two points, as a list of [lat, long] pairs spaced by 100 km. The Zig code is compiled to WASM and run by the client JavaScript. Once the list of successive coordinates is in JavaScript, Leaflet can use it to produce a polyline and draw it on a canvas. We added this WASM module as a showcase of WASM integration:

check the folder "/zig-wasm"
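
For orientation, a hedged sketch of how such a module could be consumed from JavaScript; the file path and exported function name are assumptions, and the real module's interface (including how the coordinate list is read back from WASM memory) may differ:

```js
// Hedged sketch: load the Zig-built WASM module and call a hypothetical export.
async function loadGreatCircle() {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch("/wasm/great_circle.wasm"), // assumed asset path
    {}
  );
  return instance.exports;
}

const wasm = await loadGreatCircle();
// Hypothetical export: compute points roughly every 100 km between two coordinates.
// In practice the results would be read out of wasm.memory as [lat, lng] pairs.
wasm.computeGreatCircle?.(48.45, -4.42, 40.64, -73.78);
```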

Airport dataset

We use a dataset from https://ourairports.com/.

We stream-download a CSV file, parse it (NimbleCSV), and bulk-insert the rows into an SQLite table. Check "/lib/solidyjs/db/Airports.ex".

When a user mounts, we read from the database and pass the data asynchronously to the client via the liveSocket on the first mount. We persist the data in localStorage for client-side search (see the sketch below). The "airports" socket assign is then pruned to lighten the server's socket.
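
Assuming the server pushes the list with push_event on first mount, the client side could look like this hedged sketch (hook and event names are hypothetical):

```js
// Hedged sketch: a LiveView hook that caches the pushed airport list for offline search.
export const AirportsHook = {
  mounted() {
    this.handleEvent("airports", ({ airports }) => {
      localStorage.setItem("airports", JSON.stringify(airports));
    });
  },
};

// Client-side search later reads from the cache instead of the socket:
const airports = JSON.parse(localStorage.getItem("airports") || "[]");
```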

Navigation

The user navigates between two pages which use the same live_session, with no full page reload.

When the user goes offline, we have the same smooth navigation thanks to the HTML and assets caching, as well as the usage of y-indexeddb.

The Full Lifecycle

  • Initial Load: App determines if online/offline and sets up accordingly
  • Going Offline: Triggers component initialization and navigation setup
  • Navigating Offline: Fetches cached pages, cleans up components, updates DOM, re-renders components
  • Going Online: the user gets a page refresh and Phoenix LiveView reinitializes.

The key points are:

  • [memory leaks] Since the reactive components are mounted with phx-update="ignore", they have their own lifecycle. Cleaning up these "external" components (subscriptions, listeners, disposal of the components) is essential to avoid memory leaks and component ghosting.
  • [smooth navigation] The navigation links call preventDefault(). We then fetch the corresponding cached page via fetch(path); the request is intercepted by the Service Worker, which serves the correct page.
  • [hydrate the DOM] We parse the received HTML text, inject the desired DOM container with the expected ids, and hydrate it with the reactive components (a sketch follows this list).
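
A minimal sketch of that navigation-and-hydration path, assuming the cached pages share a container with id "main" and reusing the renderCurrentView()/cleanupOfflineComponents() helpers described further down:

```js
// Hedged sketch: intercept a link click, serve the page from the SW cache, re-hydrate.
async function handleOfflineNavigation(event, path) {
  event.preventDefault();
  history.pushState({}, "", path); // update the URL without a full reload

  // The fetch is answered by the Service Worker from the page cache.
  const html = await fetch(path).then((res) => res.text());
  const nextDoc = new DOMParser().parseFromString(html, "text/html");

  cleanupOfflineComponents(); // dispose the previous page's components
  document.getElementById("main").replaceWith(nextDoc.getElementById("main")); // assumed container id
  renderCurrentView(); // mount the reactive components for the new page
}
```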

The diagrams below illustrate four key areas of the offline navigation system:

  1. Startup

The app starts by checking the connection status. Based on the status, it either initializes LiveView (online) or the offline components.

flowchart TD
  subgraph "Status Management"
        A[App Starts] --> B{Check Connection}
        B -->|Online| C[Initialize LiveSocket with Hooks]
        B -->|Offline| D[Run initOfflineComponents]
  end
  2. Status polling

A polling mechanism continuously checks server connectivity. When the status changes, custom events trigger the appropriate handlers (a minimal sketch follows).
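
A hedged sketch of such a polling loop; the probe request, interval, and event names are assumptions:

```js
// Hedged sketch: poll the server and dispatch custom events on status changes.
let online = navigator.onLine;

async function checkServer() {
  try {
    const res = await fetch("/", { method: "HEAD", cache: "no-store" });
    return res.ok;
  } catch {
    return false;
  }
}

setInterval(async () => {
  const nowOnline = await checkServer();
  if (nowOnline !== online) {
    online = nowOnline;
    // Custom events decouple the status check from the handlers (reload / offline init).
    window.dispatchEvent(new CustomEvent(online ? "app-online" : "app-offline"));
  }
}, 5000);
```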

flowchart TD
    subgraph "Polling"
        P[Polling checkServer] --> Q{Connection Status Changed?}
        Q -->|Dispatch Event| R[To Offline]
        Q -->|Dispatch Event| S[To Online]
        R --> T[Run initOfflineComponents]
        S --> V[Page Reload]
    end
  3. Offline rendering

The renderCurrentView() function is the central function that manages the components.

The cleanupOfflineComponents() function calls the stored cleanup functions for each component type. Each cleanup function properly disposes of its resources to avoid memory leaks.

renderCurrentView() then renders the appropriate components (Stock, Map, Form) based on the current page. Each component's cleanup function is stored for later use, and navigation listeners are attached to handle client-side routing without a page reload (a minimal sketch follows).
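
A minimal sketch of this render/cleanup cycle, assuming each component renderer returns its own dispose function (the renderer and helper names are illustrative placeholders):

```js
// Hedged sketch: central render/cleanup cycle for offline mode.
const cleanups = [];

function cleanupOfflineComponents() {
  // Dispose listeners/observers of the previous page to avoid leaks and ghosting.
  while (cleanups.length) cleanups.pop()();
}

function renderCurrentView() {
  cleanupOfflineComponents();
  const path = window.location.pathname;
  if (path === "/") {
    cleanups.push(renderStockComponent()); // hypothetical renderer returning a dispose fn
  } else if (path === "/map") {
    cleanups.push(renderMapComponent(), renderFormComponent()); // hypothetical renderers
  }
  attachNavigationListeners(); // re-bind link interception for offline navigation
}
```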

flowchart TD
    subgraph "Offline Rendering"
        D[Run initOfflineComponents] --> F[Clean Up Existing Components]
        F -->E[renderCurrentView]
        E --> G1[Render Stock Component]
        E --> H1[Render Map Component]
        E --> I1[Render Form Component]
        G1 --> J[Store Cleanup Functions]
        H1 --> J
        I1 --> J
        J --> M[Attach Navigation Listeners]
    end
  4. Offline navigation

When a user clicks a navigation link, handleOfflineNavigation() intercepts the click. It prevents the default page load, updates the URL using the History API (no page reload), fetches the cached HTML for the target page from the Service Worker cache, updates the DOM structure with the new page's HTML (the critical step), re-renders the components for the new page context, and re-attaches the navigation listeners.

flowchart TD
    subgraph "Offline Navigation"
        N[User Clicks Link] --> O[handleOfflineNavigation]
        O --> AA[Prevent Default]
        AA --> BB[Update URL with History API]
        BB --> CC[Fetch Cached HTML]
        CC --> DD[Parse HTML]
        DD --> FF[Update DOM Structure]
        FF --> GG[Render New Components]
        GG --> HH[Reattach Navigation Listeners]
    end

[Optional] Page Caching

Direct usage of Cache API instead of Workbox

We can use the Cache API as an alternative to Workbox to cache pages. The important part is to compute the "Content-Length" header so the page can be cached correctly.

Note: we cache a page only once by using a Set.

// Cache current page if it's in the configured routes
async function addCurrentPageToCache({ current, routes }) {
  await navigator.serviceWorker.ready;
  const newPath = new URL(current).pathname;

  // Only cache configured routes once
  if (!routes.includes(newPath) || AppState.paths.has(newPath)) return;

  if (newPath === window.location.pathname) {
    AppState.paths.add(newPath);
    const htmlContent = document.documentElement.outerHTML;
    const contentLength = new TextEncoder().encode(htmlContent).length;

    const response = new Response(htmlContent, {
      headers: {
        "Content-Type": "text/html",
        "Content-Length": contentLength,
      },
      status: 200,
    });

    const cache = await caches.open(CONFIG.CACHE_NAME);
    return cache.put(current, response);
  }
}

// Monitor navigation events
navigation.addEventListener("navigate", async ({ destination: { url } }) => {
  return addCurrentPageToCache({ current: url, routes: CONFIG.ROUTES });
});
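
On the Service Worker side, a matching fetch handler can answer offline navigations from that cache; in this project most of this is delegated to Workbox/VitePWA, so the snippet below is only a hedged, hand-rolled sketch:

```js
// Hedged sketch: serve cached pages for navigation requests, fall back to the network.
self.addEventListener("fetch", (event) => {
  if (event.request.mode !== "navigate") return;

  event.respondWith(
    caches.match(event.request).then((cached) => cached || fetch(event.request))
  );
});
```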

Configuration and settings

PWA and Workbox Caching Strategies

Vite generates the Service Worker - based on the Workbox config - and the manifest, as defined in the "vite.config.js" file.

Server configuration

Phoenix settings: dev build and watcher
# endpoint.ex
def static_paths do
  ~w(assets fonts images favicon.ico robots.txt sw.js manifest.webmanifest)
end

The watcher config is:

# config/dev.exs
config :solidyjs, SolidyjsWeb.Endpoint,
  watchers: [
    npx: [
      "vite",
      "build",
      "--mode",
      "development",
      "--watch",
      "--config",
      "vite.config.js",
      cd: Path.expand("../assets", __DIR__)
    ]
  ]
CSP rules and evaluation

The application sets security CSP headers via a plug: BrowserCSP.

We mainly protect the "main.js" file - run as a script in the "root.html" template - with a dynamic nonce.

Detail of dynamic nonce
defmodule SolidyjsWeb.BrowserCSP do
  @behaviour Plug

  def init(opts), do: opts

  def call(conn, _opts) do
    nonce = :crypto.strong_rand_bytes(16) |> Base.encode16(case: :lower)
    Plug.Conn.assign(conn, :csp_nonce, nonce)
  end
end

<!-- root.html.heex -->
<script nonce={assigns[:csp_nonce]}>
  // Your inline script here
</script>

```elixir
defp put_csp_headers(conn) do
  nonce = conn.assigns[:csp_nonce] || ""

  csp_policy = """
  script-src 'self' 'nonce-#{nonce}' 'wasm-unsafe-eval' https://cdn.maptiler.com;
  object-src 'none';
  connect-src 'self' http://localhost:* ws://localhost:* https://api.maptiler.com https://*.maptiler.com;
  img-src 'self' data: https://*.maptiler.com https://api.maptiler.com;
  worker-src 'self' blob:;
  style-src 'self' 'unsafe-inline';
  default-src 'self';
  frame-ancestors 'self' http://localhost:*;
  base-uri 'self'
  """
  |> String.replace("\n", " ")

  put_resp_header(conn, "content-security-policy", csp_policy)
end
```

The 'nonce-...' value in the script-src directive comes from the :csp_nonce assign populated by the BrowserCSP plug. Indeed, the "root" template is rendered on the first mount and has access to the conn assigns.

➡️ Link to check the endpoint: https://csp-evaluator.withgoogle.com/


WASM CSP rule

The WASM module needs 'wasm-unsafe-eval' because the browser compiles and instantiates WebAssembly, which CSP treats like eval.

User token

We also protect the custom socket with a "user_token", generated by the server with Phoenix.Token.

If you inject an inline script <script>window.userToken=<%= @user_token %></script>, you would need another nonce or "unsafe-inline".

Instead, we pass it as a data attribute and save it in sessionStorage.

  • In the router, we populate the session with a Phoenix "user_token".
  • On the first live-mount (in the shared on_mount of the live_session), we can access the session.
  • We can then assign the socket.
  • In the "app.html.heex" layout, we use a simple <div> and set data-user-token={@user_token}. Indeed this template can access the liveSocket assigns.
  • in the JavaScript, we access this div and read the data-attribute.
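
A hedged sketch of that last step; the element id, dataset key, and socket params are assumptions:

```js
// Hedged sketch: read the token from the data attribute, keep it in sessionStorage,
// and send it as a connect param so the server can verify it with Phoenix.Token.
import { Socket } from "phoenix";

const el = document.getElementById("user-token"); // assumed element id
if (el?.dataset.userToken) {
  sessionStorage.setItem("userToken", el.dataset.userToken);
}

const socket = new Socket("/socket", {
  params: { userToken: sessionStorage.getItem("userToken") },
});
socket.connect();
```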

Yjs

Documentation source

Yjs initialization

On each connection, a client starts a new local Y-Doc instance backed by IndexedDB persistence.

Yjs initialization
// Initialize Y.js with IndexedDB persistence
async function initYJS() {
  const Y = await import("yjs");
  const { IndexeddbPersistence } = await import("y-indexeddb");

  // Create a new Y.js document with IndexedDB storage
  const storeName = "app-store";
  const ydoc = new Y.Doc();
  const provider = new IndexeddbPersistence(storeName, ydoc);

  // Wait for initial sync from IndexedDB
  await provider.whenSynced;

  return ydoc;
}

UI layer: a SolidJS component

It basically renders the counter and contains:

  • a local "onClick" handler that mutates the Y.Map type of the Y-Doc,
  • an "observer" on the Y.Map type of the Y-Doc, which updates the "signal" whether the Y-Doc mutation is local (from the onClick) or remote (from the hook).
SolidJS Stock component
import { createSignal } from "solid-js";

// ❗️ do not destructure the "props" argument
(props) => {
  const ymap = props.ydoc.getMap("data");
  const [localStock, setLocalStock] = createSignal(
    ymap.get("counter") || defaultValue
  );

  // Handle local updates in a transaction
  const handleUpdate = (newValue) => {
    props.ydoc.transact(() => {
      ymap.set("counter", newValue);
    }, "local");
    setLocalStock(newValue);
  };

  // Listen to any change (local above or remote)
  ymap.observe((event) => {
    if (event.changes.keys.has("counter")) {
      setLocalStock(ymap.get("counter"));
    }
  });

  render(...)
}

Server-Side Implementation

The server-side implementation uses:

  • the Elixir bindings y_ex to y-crdt (the Rust implementation of Yjs) on the server side,
  • an SQLite3 database to persist the Yjs document.

It uses Phoenix.Channel to convey data between the server and the client. Because we pass binary data directly, this removes the need to Base64-encode/decode over the LiveSocket, which lowers the data flow and also decouples UI-related work from the data flow.

State Synchronization Flow

sequenceDiagram
    participant User

    participant SolidJS
    participant Yjs
    participant Hook
    participant Channel
    participant OtherClients
    participant Db

    User->>SolidJS: Interact with UI
    SolidJS->>Yjs: Update local state
    Yjs->>Hook: Trigger update event
    Hook->>Channel: Send state to server
    Channel->>Yjs: merge db state with update
    Channel->>Db: persist merged state
    Channel->>PubSub: Broadcast update
    PubSub->>OtherClients: Distribute changes
    OtherClients->>Yjs: Apply update
    Yjs->>SolidJS: Update UI

Misc

Common pitfall of combining LiveView with CSR components

The client-side rendered components are manually mounted via hooks. They will leak or stack duplicate components if you don't clean them up and unmount them. You can use the hook's destroyed callback for this; SolidJS makes it easy with a cleanupSolid callback, where you keep a reference to the SolidJS dispose function in the hook (a sketch follows).
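
A hedged sketch of such a hook; the component name and the ydoc reference are illustrative:

```js
// Hedged sketch: mount a SolidJS component in a hook and dispose it in destroyed().
import { render } from "solid-js/web";

export const SolidStockHook = {
  mounted() {
    // render() returns a dispose function; keep it for cleanup.
    this.cleanupSolid = render(
      () => StockComponent({ ydoc: window.ydoc }), // hypothetical component and ydoc handle
      this.el
    );
  },
  destroyed() {
    // Without this, re-mounting stacks duplicate components and leaks listeners.
    this.cleanupSolid?.();
  },
};
```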

"Important" Workbox settings

navigateFallbackDenylist excludes LiveView critical path

injectManifest: {
  injectionPoint: undefined,
},

Set:

clientsClaim: true,
skipWaiting: true,

With clientsClaim: true, you take control of all open pages as soon as the service worker activates.

With skipWaiting: true, new service worker versions activate immediately.

Icons

You will need at least two low-resolution icons of sizes 192 and 512, one extra of 180 for OSX and one of 62 for Microsoft, all placed in "/priv/static/images".

Check Resources

Manifest

The "manifest.webmanifest" file will be generated from "vite.config.js".

{
  "name": "ExLivePWA",
  "short_name": "ExLivePWA",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "lang": "en",
  "scope": "/",
  "description": "A Phoenix LiveView PWA demo webapp",
  "theme_color": "#ffffff",
  "icons": [
    { "src": "/images/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/images/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}

✅ Insert the links to the icons in the (root layout) HTML:

<!-- root.html.heex -->
<head>
  [...]
  <link rel="icon-192" href={~p"/images/icon-192.png"} />
  <link rel="icon-512" href={~p"/images/icon-512.png"} />
  <link rel="icon" href="/favicon.ico" sizes="48x48" />
  <link rel="manifest" href="/manifest.webmanifest" />
  [...]
</head>

Performance

Through aggressive caching, code splitting strategies and compression (to limit MapTiler and Leaflet sizes), we get:

  • First Contentful Paint (FCP): 0.3s
  • Full Page Render (with map and WASM): 0.5s

These metrics are achieved through:

  • Efficient WASM module loading and integration via Vite
  • Vector tiles for minimal map cache size
  • Y.js CRDT for conflict-free state sync
  • Strategic asset caching with workbox
  • Code splitting with dynamic imports
  • Optimized bundling with Vite

Resources

Besides Phoenix LiveView:

License

MIT License