For local development using a remote API server, configure the proxy settings in your local environment:
- Create or edit code/client/.env.development.local
- Set the following variables:
  - VITE_PROXY_API_TARGET: points to your API server
  - VITE_GRAPHQL_SCHEMA_URL: points to the /graphql endpoint
```
cd code/client
npm install
npm run dev-client
```

Default values are:

```
VITE_PROXY_API_TARGET=http://localhost:3000
VITE_GRAPHQL_SCHEMA_URL=http://localhost:4000/graphql
```

This setup allows you to develop the client while connecting to a remote API instance.
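For example, to point the dev client at a remote instance (here `api.example.org`, a hypothetical host standing in for your real API server), code/client/.env.development.local could contain:

```
VITE_PROXY_API_TARGET=https://api.example.org
VITE_GRAPHQL_SCHEMA_URL=https://api.example.org/graphql
```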
To deploy for production, you can use Docker in production mode.
First, take a close look at the docker/.env file, which contains all configuration variables.
Here is a selection of the most important variables to review:
```
# Memory allocation
NEO4J_HEAP=4096m
NEO4J_PAGECACHE=2g
ELASTICSEARCH_JVM_OPTS="-Xms4g -Xmx4g"

# Network configuration
ELASTICSEARCH_PORT=9200
ELASTICSEARCH_URL=http://elasticsearch:${ELASTICSEARCH_PORT}
NEO4J_PORT_BOLT=7687
NEO4J_URL=bolt://neo4j:${NEO4J_PORT_BOLT}
VITE_PDF_URL=http://localhost/pdf/
```

By modifying those variables you can spread the database services across multiple machines. The PDF repository can also be set up on a remote machine. In that case, make sure that the web server serving the PDF files allows cross-origin (CORS) requests from the client domain.
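If the remote PDF repository is served by nginx, the CORS requirement can be met with a header along these lines. This is only a sketch: the origin `https://letterbox.example.org` and the `/var/www` root are hypothetical and must be replaced with your actual client domain and PDF location.

```nginx
# Hypothetical nginx snippet on the remote PDF host.
# Allow the client domain to fetch PDFs cross-origin.
location /pdf/ {
    root /var/www;
    add_header Access-Control-Allow-Origin "https://letterbox.example.org";
}
```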
```
# Auth
NEO4J_LOGIN=neo4j
NEO4J_PASSWORD=l3tm31n!
```

To start all the services at once on the same machine:
```
docker compose -f docker-compose.yml -f docker-compose.prod.yml -p letterbox up
```

To start only neo4j:

```
docker compose -f docker-compose.yml -f docker-compose.prod.yml -p letterbox up neo4j
```

To start only elasticsearch:

```
docker compose -f docker-compose.yml -f docker-compose.prod.yml -p letterbox up elasticsearch
```

To start server and client:

```
docker compose -f docker-compose.yml -f docker-compose.prod.yml -p letterbox up project letterbox
```

First copy the CSV files into the docker project data folder:
Add the dataset txt files into docker/project/data/messages/.
Add the tags csv files into docker/project/data/tags/.
Tag files must be named after the entity type and contain two columns: name and tags.
Tags must be separated by a | separator.
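For example, a tags file for the person entity type (with a hypothetical entry) could look like:

```
name,tags
Joe Itch,lawyer|notary
```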
To import all available data:

```
cd code/server
npm run dataset:import
npm run init
```

To import only a subset, use a regexp pattern on PDF filenames:

```
cd code/server
npm run dataset:import -- 196.*
npm run init
```

If the PDF files need to be served directly by the docker client nginx, you have to place those files into docker/nginx/data/pdf without any subfolders.
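A nested PDF tree can be flattened into that folder with `find` and `cp`; a sketch, where the `scans` source folder and the file name are hypothetical stand-ins for your real PDF repository:

```shell
# Demo setup: a hypothetical nested source tree with one stand-in PDF.
mkdir -p scans/1960s docker/nginx/data/pdf
touch scans/1960s/1961-01-05.pdf

# Copy every PDF into docker/nginx/data/pdf, dropping the subfolder structure.
find scans -name '*.pdf' -exec cp {} docker/nginx/data/pdf/ \;

# The flattened folder now contains the file directly.
ls docker/nginx/data/pdf
```

Note that flattening assumes PDF filenames are unique across subfolders; duplicates would overwrite each other.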
We recommend using Docker for the local dev environment.

```
cd docker
docker compose -p letterbox up
```

Add the dataset txt files into docker/project/data/messages/.
Add the tags csv files into docker/project/data/tags/.
Tag files must be named after the entity type and contain two columns: name and tags.
Tags must be separated by a | separator.
```
cat docker/project/data/tags/person.csv
name,tags
Joe Itch,lawyer|notary
```

Run the import/index script:

```
cd code/server
npm run dataset:import
```

This script imports the data into Neo4j and indexes it into Elasticsearch.

```
cd code/server
npm run init
```

See docker/.env for configuration details.
The web application is served at http://localhost. The GraphQL API is served at http://localhost:4000. The Apollo web admin is served at http://localhost/graphql.
If you change the GraphQL schema on the server, you need to update the types on client side by running:
```
cd code/client
npm run generate
```