
Commit 137ae4b

Committed May 2, 2024

docs: Added quick start section

1 parent 43d6987 commit 137ae4b

8 files changed: +230 -73 lines

‎docs/guide/src/docs/asciidoc/_links.adoc

-1
@@ -1,4 +1,3 @@
-:link_releases: link:https://github.com/redis-field-engineering/redis-kafka-connect/releases[releases page]
 :link_redis_enterprise: link:https://redis.com/redis-enterprise-software/overview/[Redis Enterprise]
 :link_lettuce_uri: link:https://github.com/lettuce-io/lettuce-core/wiki/Redis-URI-and-connection-details#uri-syntax[Redis URI Syntax]
 :link_redis_notif: link:https://redis.io/docs/manual/keyspace-notifications[Redis Keyspace Notifications]

‎docs/guide/src/docs/asciidoc/docker.adoc

-55
This file was deleted.

‎docs/guide/src/docs/asciidoc/index.adoc

+3 -4

@@ -6,12 +6,11 @@

 include::{includedir}/_links.adoc[]

-:leveloffset: +1
-include::{includedir}/introduction.adoc[]
+:leveloffset: 1
+include::{includedir}/overview.adoc[]
+include::{includedir}/quickstart.adoc[]
 include::{includedir}/install.adoc[]
 include::{includedir}/connect.adoc[]
 include::{includedir}/sink.adoc[]
 include::{includedir}/source.adoc[]
-include::{includedir}/docker.adoc[]
 include::{includedir}/resources.adoc[]
-:leveloffset: -1

‎docs/guide/src/docs/asciidoc/install.adoc

+2 -2

@@ -5,7 +5,7 @@ Select one of the methods below to install {project-title}.

 == Download

-Download the latest release archive from the link:{project-url}/releases[releases page].
+Download the latest release archive from https://github.com/{github-owner}/{github-repo}/releases[here].

 == Confluent Hub

@@ -14,4 +14,4 @@ Download the latest release archive from the link:{project-url}/releases[release

 == Manually

-Follow the instructions in {link_manual_install}.
+Follow the instructions in {link_manual_install}

docs/guide/src/docs/asciidoc/introduction.adoc → docs/guide/src/docs/asciidoc/overview.adoc

+3 -3

@@ -1,7 +1,7 @@
-[[_introduction]]
-= Introduction
+[[_overview]]
+= Overview

-{project-title} is used to import and export data between Apache Kafka and Redis.
+{project-title} is a Confluent-verified connector that stores data from Kafka topics into Redis and pushes data from Redis into Kafka topics.

 image:redis-kafka-connector.svg[]
docs/guide/src/docs/asciidoc/quickstart.adoc

+212

@@ -0,0 +1,212 @@
[[_quick_start]]
= Quick Start

This section shows how to configure the {project-title} to import/export data between Redis and Apache Kafka, and provides a hands-on look at the functionality of the source and sink connectors.

== Requirements

Download and install the following software:

* https://docs.docker.com/get-docker/[Docker]
* https://git-scm.com/book/en/v2/Getting-Started-Installing-Git[Git]
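
After installing, you can optionally verify both tools from a terminal (a minimal check; version numbers will differ on your machine):

[source,console]
----
docker compose version
git --version
----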

== Start the Sandbox

The sandbox starts the following Docker services:

* Redis Stack
* Apache Kafka
* Kafka Connect with the {project-title} installed

To start the sandbox, run the following command:

`docker compose up`

After Docker downloads and starts the services, you should see the following output:

[source,console]
-----
[+] Running 8/0
✔ Container redis Created
✔ Container zookeeper Created
✔ Container broker Created
✔ Container schema-registry Created
✔ Container rest-proxy Created
✔ Container connect Created
✔ Container ksqldb-server Created
✔ Container control-center Created
-----
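
If your output differs, you can list the sandbox services and their current state with a standard Compose command (an optional check):

[source,console]
----
docker compose ps
----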

== Add Connectors

Now that the required services are up and running, we can add connectors to Kafka Connect to transfer data between Redis and Kafka:

* Add a sink connector to transfer data from Kafka to Redis
* Add a source connector to transfer data from Redis to Kafka

=== Add a Datagen Connector

https://github.com/confluentinc/kafka-connect-datagen/[Kafka Connect Datagen] is a Kafka Connect source connector for generating mock data.

Create the Datagen connector with the following command:

[source,console]
-----
curl -X POST -H "Content-Type: application/json" --data '
  { "name": "datagen-pageviews",
    "config": {
      "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
      "kafka.topic": "pageviews",
      "quickstart": "pageviews",
      "key.converter": "org.apache.kafka.connect.json.JsonConverter",
      "value.converter": "org.apache.kafka.connect.json.JsonConverter",
      "value.converter.schemas.enable": "false",
      "producer.interceptor.classes": "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor",
      "max.interval": 200,
      "iterations": 10000000,
      "tasks.max": "1"
  }}' http://localhost:8083/connectors -w "\n"
-----

This automatically creates the Kafka topic `pageviews` and produces data with a schema configuration from https://github.com/confluentinc/kafka-connect-datagen/blob/master/src/main/resources/pageviews_schema.avro

[NOTE]
====
Why do I see the message 'Failed to connect'?

It takes up to three minutes for the Kafka Connect REST API to start.
If you receive the following error, wait three minutes and run the preceding command again.

`curl: (7) Failed to connect to connect port 8083: Connection refused`
====

To confirm that you added the Datagen connector, run the following command:

`curl -X GET http://localhost:8083/connectors`
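
The Kafka Connect REST API also exposes a per-connector status endpoint, so as an optional check you can confirm that the Datagen connector and its task are `RUNNING`:

[source,console]
----
curl -s http://localhost:8083/connectors/datagen-pageviews/status
----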

=== Add a Sink Connector

The command below adds a {project-title} sink connector configured with these properties:

* The class Kafka Connect uses to instantiate the connector
* The Kafka topic from which the connector reads data
* The connection URI of the Redis database to which the connector writes data
* The Redis command to use for writing data (`JSONSET`)
* Key and value converters to correctly handle incoming `pageviews` data
* A https://docs.confluent.io/platform/current/connect/transforms/overview.html[Single Message Transform] to extract a key from `pageviews` messages

[source,console]
-----
curl -X POST -H "Content-Type: application/json" --data '
  {"name": "redis-sink-json",
   "config": {
     "connector.class":"com.redis.kafka.connect.RedisSinkConnector",
     "tasks.max":"1",
     "topics":"pageviews",
     "redis.uri":"redis://redis:6379",
     "redis.command":"JSONSET",
     "key.converter": "org.apache.kafka.connect.json.JsonConverter",
     "value.converter": "org.apache.kafka.connect.storage.StringConverter",
     "value.converter.schemas.enable": "false",
     "transforms": "Cast",
     "transforms.Cast.type": "org.apache.kafka.connect.transforms.Cast$Key",
     "transforms.Cast.spec": "string"
  }}' http://localhost:8083/connectors -w "\n"
-----
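
Before looking at Redis, you can optionally confirm that the sink connector is running, using the same status endpoint as before:

[source,console]
----
curl -s http://localhost:8083/connectors/redis-sink-json/status
----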

You can check that Kafka messages are being written to Redis with this command:

`docker compose exec redis /opt/redis-stack/bin/redis-cli "keys" "*"`

You should see output similar to the following:

[source,console]
-----
1) "pageviews:6021"
2) "pageviews:211"
3) "pageviews:281"
...
-----

To retrieve the contents of a specific key, use this command:

`docker compose exec redis /opt/redis-stack/bin/redis-cli "JSON.GET" "pageviews:1451"`

=> `"{\"viewtime\":1451,\"userid\":\"User_6\",\"pageid\":\"Page_35\"}"`
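
Because the Datagen connector keeps producing records, the number of keys grows over time; as a rough check you can ask Redis how many keys it currently holds (the exact count will differ on your machine):

[source,console]
----
docker compose exec redis /opt/redis-stack/bin/redis-cli DBSIZE
----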

=== Add a Source Connector

The following command adds a source connector configured with these properties:

* The class Kafka Connect uses to instantiate the connector
* The connection URI of the Redis database the connector connects to
* The name of the Redis stream from which the connector reads messages
* The Kafka topic to which the connector writes data

[source,console]
-----
curl -X POST -H "Content-Type: application/json" --data '
  { "name": "redis-source",
    "config": {
      "tasks.max":"1",
      "connector.class":"com.redis.kafka.connect.RedisStreamSourceConnector",
      "redis.uri":"redis://redis:6379",
      "redis.stream.name":"mystream",
      "topic": "mystream"
    }
  }' http://localhost:8083/connectors -w "\n"
-----

Now add a message to the `mystream` Redis stream:

`docker compose exec redis /opt/redis-stack/bin/redis-cli "xadd" "mystream" "*" "field1" "value11" "field2" "value21"`

Examine the topics in the Kafka UI at http://localhost:9021 or http://localhost:8000/.
The `mystream` topic should contain the previously sent stream message.
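
You can also read the stream entry back on the Redis side to confirm what the source connector picked up:

[source,console]
----
docker compose exec redis /opt/redis-stack/bin/redis-cli XRANGE mystream - +
----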

== End-to-end Example

The project {project-scm}[repository] contains a script that runs all the steps shown previously.

Clone the {project-scm}[{project-name}] repository and execute `run.sh` in the `docker` directory:

[source,console,subs="attributes"]
----
git clone {project-scm}
cd {project-name}
./run.sh
----

This will:

* Run `docker compose up`
* Wait for Redis, Kafka, and Kafka Connect to be ready
* Register the Confluent Datagen Connector
* Register the Redis Kafka Sink Connector
* Register the Redis Kafka Source Connector
* Publish some events to Kafka via the Datagen connector
* Write the events to Redis
* Send messages to a Redis stream
* Write the Redis stream messages back into Kafka

Once running, examine the topics in the Kafka http://localhost:9021/[control center]:

The `pageviews` topic should contain the 10 simple documents added, each similar to:

[source,json]
----
include::{includedir}/../resources/pageviews.json[]
----

The `pageviews` stream should contain the 10 change events.

Examine the stream in Redis:

[source,console]
----
docker compose exec redis /usr/local/bin/redis-cli
xread COUNT 10 STREAMS pageviews 0
----

Messages added to the `mystream` stream will show up in the `mystream` topic.
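
When you are done experimenting, you can stop the sandbox and remove its containers (add `-v` to also discard the data volumes):

[source,console]
----
docker compose down
----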

‎docs/guide/src/docs/asciidoc/sink.adoc

+7 -6

@@ -1,7 +1,8 @@
 [[_sink]]
 = Sink Connector Guide
+:name: Redis Kafka Sink Connector

-The sink connector consumes records from a Kafka topic and writes the data to Redis.
+The {name} consumes records from a Kafka topic and writes the data to Redis.
 It includes the following features:

 * <<_sink_at_least_once_delivery,At least once delivery>>
@@ -11,17 +12,17 @@ It includes the following features:

 [[_sink_at_least_once_delivery]]
 == At least once delivery
-The sink connector guarantees that records from the Kafka topic are delivered at least once.
+The {name} guarantees that records from the Kafka topic are delivered at least once.

 [[_sink_tasks]]
 == Multiple tasks

-The sink connector supports running one or more tasks.
+The {name} supports running one or more tasks.
 You can specify the number of tasks with the `tasks.max` configuration property.

 [[_sink_data_structures]]
 == Redis Data Structures
-The sink connector supports the following Redis data-structure types as targets:
+The {name} supports the following Redis data-structure types as targets:

 [[_collection_key]]
 * Collections: <<_sink_stream,stream>>, <<_sink_list,list>>, <<_sink_set,set>>, <<_sink_zset,sorted set>>, <<_sink_timeseries,time series>>
@@ -167,10 +168,10 @@ The Kafka record value must be a number (e.g. `float64`) as it is used as the sa
 [[_sink_data_formats]]
 == Data Formats

-The sink connector supports different data formats for record keys and values depending on the target Redis data structure.
+The {name} supports different data formats for record keys and values depending on the target Redis data structure.

 === Kafka Record Keys
-The sink connector expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:
+The {name} expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:

 [options="header",cols="h,1,1"]
 |====

‎docs/guide/src/docs/asciidoc/source.adoc

+3 -2

@@ -1,7 +1,8 @@
 [[_source]]
 = Source Connector Guide
+:name: Redis Kafka Source Connector

-{project-title} includes 2 source connectors:
+The {name} includes 2 source connectors:

 * <<_stream_source,Stream>>
 * <<_keys_source,Keys>>
@@ -20,7 +21,7 @@ It includes the following features:

 === Delivery Guarantees

-The stream source connector can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
+The {name} can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
 The default is at-least-once delivery.

 [[_stream_source_at_least_once_delivery]]
