Storage Service

The storage service directory contains an Express REST API that provides endpoints to hash, optionally encrypt, and store documents.

Overview

The service offers the following functionality:

  • Hash Computation: Computes the SHA-256 hash of a given document to ensure data integrity.
  • Encryption: Encrypts the document using AES-256-GCM for enhanced security.
  • Storage: Stores the encrypted document using the specified storage adapter (local file system, AWS, Digital Ocean or Google Cloud Storage).
  • Data Retrieval: Upon successful storage, the service returns:
    • The hash of the original document.
    • A decryption key for the encrypted document (if applicable).
    • The URI of the stored encrypted document.
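The storage adapters listed above suggest a common interface behind the endpoints. The sketch below is hypothetical (the names StorageAdapter and MemoryAdapter are assumptions, not the repository's actual types) and shows the shape such an adapter might take, with an in-memory backend for illustration:

```typescript
// Hypothetical storage-adapter interface unifying the local, AWS,
// Digital Ocean and GCP backends (illustrative only; the repository's
// actual types may differ).
interface StorageAdapter {
  // Store `data` under `key` in `bucket` and return the stored object's URI.
  put(bucket: string, key: string, data: Buffer): Promise<string>;
}

// A minimal in-memory adapter, useful for tests.
class MemoryAdapter implements StorageAdapter {
  private objects = new Map<string, Buffer>();

  async put(bucket: string, key: string, data: Buffer): Promise<string> {
    this.objects.set(`${bucket}/${key}`, data);
    return `memory://${bucket}/${key}`;
  }

  get(bucket: string, key: string): Buffer | undefined {
    return this.objects.get(`${bucket}/${key}`);
  }
}

const adapter = new MemoryAdapter();
adapter.put('verifiable-credentials', 'doc.json', Buffer.from('{}')).then((uri) => {
  console.log(uri); // "memory://verifiable-credentials/doc.json"
});
```

Swapping the adapter (via STORAGE_TYPE, described under Configuration) leaves the hash/encrypt pipeline unchanged, which is the usual motivation for this pattern.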

Choosing Your Storage Endpoint

This service offers two ways to store data, depending on whether your data is public or private.

Public Data → /documents

For data that doesn't require protection. The service stores it as-is and returns:

  • A URI (the location of your stored data)
  • A hash (a fingerprint to verify the data hasn't changed)

Private Data → /credentials

For sensitive data that needs protection. The service automatically encrypts your data before storage — you don't need to encrypt it yourself.

The response includes:

  • A URI (the location of your stored data)
  • A hash (a fingerprint to verify the data hasn't changed)
  • A key (your unique decryption key)

Save this key securely — it's the only way to decrypt your data later.
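For illustration, a /credentials response might look like the following (the field names match the list above, but the exact schema is an assumption; consult the API docs for the authoritative shape):

```
{
    "uri": "<location of the stored encrypted document>",
    "hash": "<SHA-256 hash of the original document>",
    "key": "<AES-256-GCM decryption key>"
}
```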

Learn more about storage options

Prerequisites

Environment Variables

An example environment file .env.example is provided in the storage service directory. Copy and rename it to .env:

cp .env.example .env

Then modify the variables as required. The default values should be sufficient for local development.
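As a sketch, a minimal .env for local development might contain the following (variable names are taken from the Configuration section below; check .env.example for the authoritative list and defaults):

```
API_KEY=your-api-key-here
STORAGE_TYPE=local
LOCAL_DIRECTORY=uploads
```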

Usage

# Install dependencies
yarn install

# Build the app
yarn build

# Run the app and watch for changes
yarn dev

# Start the server once built
yarn start

# Run linter
yarn lint

# Run unit tests
yarn test

# Run e2e tests
yarn test:e2e

# Run all tests (unit and e2e)
yarn test:all

Configuration

Configure the storage service using the following environment variables:

Authentication

  • API_KEY: Required. The API key used to authenticate upload requests to /credentials and /documents endpoints. The service will not start without this variable set.

Storage Configuration

  • STORAGE_TYPE: The type of storage to use (local, gcp, aws, or digital_ocean).
  • LOCAL_DIRECTORY: The directory for local storage (default: uploads in the current directory).
  • GOOGLE_APPLICATION_CREDENTIALS: The path to the GCP service account file (if using GCP).
  • REGION: The region to use (if using AWS or Digital Ocean).
  • AWS_ACCESS_KEY_ID: The access key to use (if using AWS or Digital Ocean).
  • AWS_SECRET_ACCESS_KEY: The secret access key to use (if using AWS or Digital Ocean).

Storage Types

Local Storage

For development purposes, use the local storage service, which stores files in the local file system.

Example:

# Set the storage type to local
export STORAGE_TYPE=local

# Run the app
yarn dev

The Swagger UI is available at http://localhost:3333/api-docs.

Google Cloud Storage

For production environments, use Google Cloud Storage to store files in a GCP bucket.

Example:

# Set the storage type to gcp
export STORAGE_TYPE=gcp

# Set the path to the GCP service account file
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-file.json

# Build the app
yarn build

# Run the app
yarn start

Amazon Web Services (AWS)

Use Amazon Web Services to store files in an S3 bucket. For production environments, we recommend using IAM roles to enhance security and avoid hardcoding AWS credentials; for more details about using IAM roles, refer to the AWS documentation.

Example:

# Set the storage type to aws
export STORAGE_TYPE=aws

# Set the AWS region
export REGION=ap-southeast-2
export AWS_ACCESS_KEY_ID=AWS_ACCESS_KEY_ID # Local development only
export AWS_SECRET_ACCESS_KEY=AWS_SECRET_ACCESS_KEY # Local development only

# Build the app
yarn build

# Run the app
yarn start

Digital Ocean (DO)

Example:

# Set the storage type to digital_ocean
export STORAGE_TYPE=digital_ocean

# Set the DO configuration
export REGION=syd1
export AWS_ACCESS_KEY_ID=DO_ACCESS_KEY_ID
export AWS_SECRET_ACCESS_KEY=DO_SECRET_ACCESS_KEY

# Build the app
yarn build

# Run the app
yarn start

Cryptography

The cryptography service uses the following algorithms:

  • Hash Algorithm: SHA-256
  • Encryption Algorithm: AES-256-GCM
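Both primitives are available in Node's built-in crypto module. The following is a minimal sketch of the hash-and-encrypt flow (not the service's actual implementation; function names are illustrative):

```typescript
import { createHash, randomBytes, createCipheriv, createDecipheriv } from 'node:crypto';

// Compute the SHA-256 hash of a document (hex-encoded, 64 characters).
function computeHash(document: string): string {
  return createHash('sha256').update(document).digest('hex');
}

// Encrypt with AES-256-GCM; returns the key, IV, auth tag and ciphertext.
function encryptDocument(document: string) {
  const key = randomBytes(32); // 256-bit key
  const iv = randomBytes(12);  // 96-bit IV, the recommended size for GCM
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(document, 'utf8'), cipher.final()]);
  return { key, iv, tag: cipher.getAuthTag(), ciphertext };
}

// Decrypt and verify the GCM auth tag; throws if the data was tampered with.
function decryptDocument(e: ReturnType<typeof encryptDocument>): string {
  const decipher = createDecipheriv('aes-256-gcm', e.key, e.iv);
  decipher.setAuthTag(e.tag);
  return Buffer.concat([decipher.update(e.ciphertext), decipher.final()]).toString('utf8');
}

const doc = JSON.stringify({ field1: 'value1' });
console.log(computeHash(doc).length);            // 64
console.log(decryptDocument(encryptDocument(doc)) === doc); // true
```

Note that GCM provides authenticated encryption: decryption fails loudly if either the ciphertext or the tag has been modified.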

Authentication

All upload endpoints (POST /credentials and POST /documents) require API key authentication via the X-API-Key header.

Example:

curl -X POST http://localhost:3333/api/1.0.0/credentials \
-H "Content-Type: application/json" \
-H "X-API-Key: your-api-key-here" \
-d '{"bucket": "verifiable-credentials", "data": {"field1": "value1"}}'

If the API key is missing or invalid, the service will return a 401 Unauthorized response.
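The check itself amounts to comparing the X-API-Key header against the configured API_KEY. A framework-free sketch of that decision (the names checkApiKey and AuthResult are illustrative, not from the repository):

```typescript
interface AuthResult {
  authorized: boolean;
  status: number;
}

// Compare the request's X-API-Key header value against the configured key.
// A missing or mismatched key yields 401 Unauthorized.
function checkApiKey(headerValue: string | undefined, configuredKey: string): AuthResult {
  if (!headerValue || headerValue !== configuredKey) {
    return { authorized: false, status: 401 };
  }
  return { authorized: true, status: 200 };
}

console.log(checkApiKey('secret', 'secret')); // { authorized: true, status: 200 }
console.log(checkApiKey(undefined, 'secret')); // { authorized: false, status: 401 }
```

A production implementation would typically use a constant-time comparison (e.g. crypto.timingSafeEqual) rather than `!==` to avoid timing side channels.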

Docker

To run the storage service using Docker:

# Build the image
docker build -t storage-service:latest .

# Start the container using local storage
# Configure your .env file first with API_KEY and other required variables
docker run -d --env-file .env -p 3333:3333 \
storage-service:latest

# Start the container using Google Cloud Storage
# Update STORAGE_TYPE=gcp in your .env file and mount the service account file
docker run -d --env-file .env -p 3333:3333 \
-v /path/to/local/gcp/service-account-file.json:/tmp/service-account-file.json \
storage-service:latest

# Start the container using Amazon Web Services (AWS)
# Update STORAGE_TYPE=aws and AWS credentials in your .env file
docker run -d --env-file .env -p 3333:3333 \
storage-service:latest

# Start the container using Digital Ocean (DO)
# Update STORAGE_TYPE=digital_ocean and credentials in your .env file
docker run -d --env-file .env -p 3333:3333 \
storage-service:latest

Documentation

The project uses Docusaurus for documentation management. Documentation versions are managed through a release script and automated pipeline.

Release Script

The scripts/release-doc.js script automates the process of creating new documentation versions:

  • Reads the documentation version from version.json
  • Creates a Docusaurus version using the docVersion value from the version.json file
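The script's core step can be sketched as follows (the version.json shape is taken from the Release Guide below; the helper name is illustrative):

```typescript
// Assumed shape of version.json, per the Release Guide section.
interface VersionFile {
  version: string;
  apiVersion: string;
  docVersion: string;
}

// Extract the docVersion that the Docusaurus versioning command is run with.
function readDocVersion(json: string): string {
  const parsed = JSON.parse(json) as VersionFile;
  return parsed.docVersion;
}

const example = '{"version":"1.0.0","apiVersion":"1.0.0","docVersion":"1.0.0","dependencies":{}}';
console.log(readDocVersion(example)); // "1.0.0"
```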

To manually create a new documentation version:

# Run the release script
yarn release:doc

Documentation Pipeline

The documentation is automatically built and deployed using GitHub Actions through the build_publish_docs.yml pipeline. This pipeline:

  1. Triggers on:
  • Manual workflow dispatch
  • (TODO) Push to main branch once enabled
  2. Performs the following steps:
  • Checks out the repository
  • Sets up Node.js 18 with Yarn cache
  • Installs documentation dependencies
  • Builds the static documentation site
  • Deploys to GitHub Pages using gh-pages branch

The pipeline uses environment variables for configuration:

  • DOCS_BASE_URL: Base URL for documentation hosting
  • DOCS_URL: Documentation site URL

The built documentation is published to the gh-pages branch using the GitHub Actions bot.

Release Guide

To release a new version, ensure the version.json file is updated with the new version number. Then create a new release tag with the following steps:

  1. Create a new release branch from next with the version number as the branch name.
  2. Update the version.json file with the new version number.
  3. Generate new documentation version using the release script yarn release:doc.
  4. Check API documentation and update if necessary.
  5. Commit the changes and push the branch.
  6. Create a pull request from the release branch to main.
  7. Merge the pull request.
  8. Create a new release tag with the version number.
  9. Push the tag to the repository.

The version.json file contains the version numbers in the following format:

{
    "version": "MAJOR.MINOR.PATCH",
    "apiVersion": "MAJOR.MINOR.PATCH",
    "docVersion": "MAJOR.MINOR.PATCH",
    "dependencies": {}
}

The version, apiVersion, and docVersion fields must be updated manually.
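Since these fields are edited by hand, a simple format check can catch typos before release (an illustrative helper, not part of the repository):

```typescript
// Validate that a version string follows the MAJOR.MINOR.PATCH format.
function isValidVersion(v: string): boolean {
  return /^\d+\.\d+\.\d+$/.test(v);
}

console.log(isValidVersion('1.2.3')); // true
console.log(isValidVersion('1.2'));   // false
```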

About

Reference implementation of UNTP link resolver and credential store
