This repo shows a tiny producer and consumer using Avro with Confluent Schema Registry.
- Start infra (PostgreSQL, Kafka, Schema Registry): use the Dockerfile and the Docker Compose file.

```bash
docker compose up -d
# Kafka at localhost:29092, Schema Registry at http://localhost:8081
```

  This also creates the database schema with a record.
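Once the stack is up, you can optionally verify the Schema Registry from Java. This is a minimal sketch (not part of the repo), assuming the Confluent `kafka-schema-registry-client` dependency is on the classpath:

```java
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class RegistryCheck {
    public static void main(String[] args) throws Exception {
        // Points at the local Schema Registry started by docker compose.
        SchemaRegistryClient client =
                new CachedSchemaRegistryClient("http://localhost:8081", 16);
        // Lists registered subjects; empty until a schema is registered.
        System.out.println("Subjects: " + client.getAllSubjects());
    }
}
```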
- Build all:

```bash
./mvnw -q -DskipTests package || mvn -q -DskipTests package
```
- Run consumer:

```bash
cd consumer-app
../mvnw spring-boot:run || mvn spring-boot:run
```
- Run producer (new terminal):

```bash
cd producer-app
../mvnw spring-boot:run || mvn spring-boot:run
```
- Send a message:

```bash
curl -X POST http://localhost:8080/users \
  -H "Content-Type: application/json" \
  -d '{"id":"u-20","email":"[email protected]","phone":"2109456738","firstName":"Manthos","lastName":"Staurou","isActive":true,"age":35}'
```

  You should see the consumer log the received User record.
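That log line comes from a Kafka listener in the consumer app. Here is a minimal sketch of what such a listener looks like, assuming a topic named `users` and the Avro-generated `User` class from `common-schemas` (the topic name and package below are illustrative, not taken from the repo):

```java
import com.example.common.User; // hypothetical package; use the namespace from User.avsc

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class UserListener {

    private static final Logger log = LoggerFactory.getLogger(UserListener.class);

    // Topic and group id are illustrative; the real names live in the consumer-app config.
    @KafkaListener(topics = "users", groupId = "consumer-app")
    public void onUser(User user) {
        // The Avro deserializer (with specific records enabled) hands us the generated User record.
        log.info("Received User record: {}", user);
    }
}
```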
- Create an account at https://confluent.cloud/
- Update `common-schemas/src/main/avro/User.avsc` (e.g., add an optional field with a default).
- Use BACKWARD compatibility in Confluent Cloud and register the new version (see the compatibility sketch after this list).
- Apps have `auto.register.schemas=false` and `use.latest.version=true` to force CI-driven registration (see the config sketch below).
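Before registering a new version, you can sanity-check the evolution locally with Avro's own compatibility checker. This is a hedged sketch: the field names and namespace are illustrative, the real schema lives in `common-schemas/src/main/avro/User.avsc`.

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;

public class BackwardCheck {
    public static void main(String[] args) {
        // Old (already registered) version, trimmed to two fields for illustration.
        Schema oldSchema = new Schema.Parser().parse("""
                {"type":"record","name":"User","namespace":"com.example","fields":[
                  {"name":"id","type":"string"},
                  {"name":"email","type":"string"}
                ]}""");

        // New version: adds an optional field with a default, which BACKWARD compatibility allows.
        Schema newSchema = new Schema.Parser().parse("""
                {"type":"record","name":"User","namespace":"com.example","fields":[
                  {"name":"id","type":"string"},
                  {"name":"email","type":"string"},
                  {"name":"nickname","type":["null","string"],"default":null}
                ]}""");

        // BACKWARD: consumers on the new schema (reader) must still read data written with the old one (writer).
        SchemaCompatibility.SchemaPairCompatibility result =
                SchemaCompatibility.checkReaderWriterCompatibility(newSchema, oldSchema);
        System.out.println("Backward compatible? " + result.getType());
    }
}
```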
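In Spring these flags end up as serializer properties on the producer. The following is a minimal plain-client sketch of the equivalent configuration; the property keys are the standard Confluent ones, and the broker/registry addresses are the local defaults from above, not necessarily what the apps use under the hood:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;

public class ProducerProps {
    public static Properties localAvroProducerProps() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:29092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");
        // Do not auto-register schemas from the app; CI registers them against the registry.
        props.put("auto.register.schemas", "false");
        // Serialize with the latest registered version of the subject's schema.
        props.put("use.latest.version", "true");
        return props;
    }
}
```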
Create a `.env` file or export env vars (the cloud-config sketch after the run commands shows how these typically map to client properties):

```bash
export CLOUD_BOOTSTRAP_SERVERS="pkc-xxxxx.us-central1.gcp.confluent.cloud:9092"
export CLOUD_API_KEY="******"
export CLOUD_API_SECRET="******"
export SR_URL="https://xxxxx.us-central1.gcp.confluent.cloud"
export SR_API_KEY="******"
export SR_API_SECRET="******"
```

Then run with the cloud profile:
```bash
# Consumer
cd consumer-app
../mvnw spring-boot:run -Dspring-boot.run.profiles=cloud

# Producer
cd ../producer-app
../mvnw spring-boot:run -Dspring-boot.run.profiles=cloud
```
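Under the cloud profile, the env vars above typically map to SASL and Schema Registry auth properties. This is a hedged sketch of that mapping for a plain client; the exact Spring property names in the repo's cloud profile config may differ:

```java
import java.util.Properties;

public class CloudProps {
    public static Properties cloudClientProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", System.getenv("CLOUD_BOOTSTRAP_SERVERS"));

        // Confluent Cloud brokers require SASL_SSL with the PLAIN mechanism,
        // authenticated with the cluster API key/secret.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"" + System.getenv("CLOUD_API_KEY") + "\" "
                        + "password=\"" + System.getenv("CLOUD_API_SECRET") + "\";");

        // Schema Registry endpoint plus its own API key/secret for basic auth.
        props.put("schema.registry.url", System.getenv("SR_URL"));
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info",
                System.getenv("SR_API_KEY") + ":" + System.getenv("SR_API_SECRET"));
        return props;
    }
}
```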