Commit 7eb6d57

docs: Added Confluent Cloud documentation
1 parent 137ae4b commit 7eb6d57

File tree

3 files changed, +53 -1 lines changed

docs/guide/src/docs/asciidoc/quickstart.adoc (+19 -1)

@@ -209,4 +209,22 @@ docker compose exec redis /usr/local/bin/redis-cli
 xread COUNT 10 STREAMS pageviews 0
 ----
 
-Messages added to the `mystream` stream will show up in the `mystream` topic
+Messages added to the `mystream` stream will show up in the `mystream` topic.
+
+== Confluent Cloud
+
+This section describes configuration aspects that are specific to running {project-title} in Confluent Cloud.
+
+=== Egress Endpoints
+
+You must specify https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#cc-byoc-endpoints[egress endpoints] so that the connector can reach the Redis database.
+
+=== Sensitive Properties
+
+The following https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#sensitive[sensitive properties] must be marked as such in the Confluent Cloud UI:
+
+* `redis.uri`: URI of the Redis database to connect to, e.g. `redis://redis-12000.redis.com:12000`
+* `redis.username`: Username used to connect to Redis
+* `redis.password`: Password used to connect to Redis
+* `redis.key.password`: Password of the private key file
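When deploying as a custom connector, these properties might be set as in the following sketch. All values below are placeholders for illustration, not real credentials or hosts:

[source,properties]
----
# Placeholder values for illustration only
redis.uri = redis://redis-12000.redis.com:12000
redis.username = app-user
redis.password = <redis-password>
redis.key.password = <private-key-password>
----

Each of these four keys would then be flagged as sensitive in the Confluent Cloud UI so its value is stored securely and hidden from logs.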

docs/guide/src/docs/asciidoc/sink.adoc (+12 -0)

@@ -10,6 +10,17 @@ It includes the following features:
 * <<_sink_data_structures,Redis Data Structures>>
 * <<_sink_data_formats,Supported Data Formats>>
 
+== Class Name
+
+The sink connector class name is `com.redis.kafka.connect.RedisSinkConnector`.
+
+The corresponding configuration property is:
+
+[source,properties]
+----
+connector.class = com.redis.kafka.connect.RedisSinkConnector
+----
+
 [[_sink_at_least_once_delivery]]
 == At least once delivery
 The {name} guarantees that records from the Kafka topic are delivered at least once.

@@ -20,6 +31,7 @@ The {name} guarantees that records from the Kafka topic are delivered at least once.
 The {name} supports running one or more tasks.
 You can specify the number of tasks with the `tasks.max` configuration property.
 
+
 [[_sink_data_structures]]
 == Redis Data Structures
 The {name} supports the following Redis data-structure types as targets:
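Putting the sink class name and task count together, a minimal configuration might look like the following sketch. The topic name and Redis URI are illustrative assumptions; only `connector.class` and `tasks.max` come from the documentation above:

[source,properties]
----
# Sketch only: topic and URI values are placeholders
connector.class = com.redis.kafka.connect.RedisSinkConnector
tasks.max = 2
topics = pageviews
redis.uri = redis://localhost:6379
----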

docs/guide/src/docs/asciidoc/source.adoc (+22 -0)

@@ -19,6 +19,17 @@ It includes the following features:
 * <<_stream_source_schema,Schema>>
 * <<_stream_source_config,Configuration>>
 
+=== Class Name
+
+The stream source connector class name is `com.redis.kafka.connect.RedisStreamSourceConnector`.
+
+The corresponding configuration property is:
+
+[source,properties]
+----
+connector.class = com.redis.kafka.connect.RedisStreamSourceConnector
+----
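A minimal stream source configuration sketch could combine this class name with a task count. The Redis URI below is a placeholder; stream-specific properties (such as which stream to read) are covered in the Configuration section and are not guessed at here:

[source,properties]
----
# Sketch only: the URI is a placeholder value
connector.class = com.redis.kafka.connect.RedisStreamSourceConnector
tasks.max = 1
redis.uri = redis://localhost:6379
----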
+
 === Delivery Guarantees
 
 The {name} can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).

@@ -115,6 +126,17 @@ Some preliminary sizing using Redis statistics and `bigkeys`/`memkeys` is recommended.
 If you need assistance please contact your Redis account team.
 ====
 
+=== Class Name
+
+The keys source connector class name is `com.redis.kafka.connect.RedisKeysSourceConnector`.
+
+The corresponding configuration property is:
+
+[source,properties]
+----
+connector.class = com.redis.kafka.connect.RedisKeysSourceConnector
+----
+
 [[_keys_source_config]]
 === Configuration
 [source,properties]
