Please follow the [Code of Conduct](CODE_OF_CONDUCT.md).

## Code Contributions

### Finding or Reporting Issues

- **Find an existing issue:** Look through the [existing issues](https://github.com/AutoMQ/automq/issues). Issues open for contributions are often tagged with `good first issue`. To claim one, simply reply with 'pick up' in the issue and the AutoMQ maintainers will assign the issue to you. Start with this [tagged good first issue](https://github.com/AutoMQ/automq-for-kafka/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22).
- **Report a new issue:** If you've found a bug or have a feature request, please [create a new issue](https://github.com/AutoMQ/automq/issues/new/choose). Select the appropriate template (Bug Report or Feature Request) and fill out the form provided.

If you have any questions about an issue, please feel free to ask in the issue comments. We will do our best to clarify any doubts you may have.

### Submitting Pull Requests
The usual workflow of code contribution is:
5. Push your local branch to your fork.
6. Submit a Pull Request so that we can review your changes.
7. [Link an existing Issue](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue) (one created via the steps above or an existing one you claimed) that does not have the `needs triage` label to your Pull Request. A pull request without a linked issue will be closed.
8. Write a PR title and description that follow the [Pull Request Template](PULL_REQUEST_TEMPLATE.md).
9. An AutoMQ maintainer will trigger the CI tests for you and review the code.
Pull Request reviews are done on a regular basis.
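To make steps 5 and 6 above concrete, pushing your branch and opening the Pull Request usually looks like the following (a minimal sketch; the remote name `origin` for your fork and the branch name are assumptions, not something this guide prescribes):

```
git checkout -b fix-some-issue        # work on a local branch (hypothetical name)
git commit -am "Describe your change"
git push origin fix-some-issue        # step 5: push the branch to your fork
# step 6: open a Pull Request from the pushed branch on GitHub
```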
> [!NOTE]
> Please make sure you respond to our feedback/questions and sign our CLA.
>
> Pull Requests without updates will be closed due to inactivity.

## Requirements

| Requirement            | Version    |
|------------------------|------------|
| Compiling requirements | JDK 17     |
| Compiling requirements | Scala 2.13 |
| Running requirements   | JDK 17     |
Building AutoMQ is the same as building Apache Kafka. Kafka uses Gradle as its project management tool.
It is not recommended to install Gradle manually. The gradlew script in the root directory will automatically download Gradle for you, using the version it specifies.
### Build
```
./gradlew jar -x test
```
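If you also want to run tests locally, the Gradle tasks from upstream Apache Kafka's build should apply here as well (the task names below are assumptions based on Kafka's build, not something this guide specifies):

```
./gradlew unitTest          # run unit tests only
./gradlew integrationTest   # run integration tests only
```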
### Prepare S3 service
Refer to this [documentation](https://docs.localstack.cloud/getting-started/installation/) to install `localstack` to mock a local S3 service, or use the AWS S3 service directly.
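Once installed, localstack can usually be started in the background with its CLI (a sketch assuming a default localstack CLI installation):

```
localstack start -d   # start localstack detached; S3 is exposed on http://127.0.0.1:4566
```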
If you are using localstack, create a bucket with the following command:
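For example, with the AWS CLI pointed at localstack (a sketch; the endpoint and the bucket name `ko3` follow the configuration below):

```
aws s3 mb s3://ko3 --endpoint-url=http://127.0.0.1:4566   # create the bucket in localstack
```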
Modify the `config/kraft/server.properties` file. The following settings need to be changed:
```
s3.region=us-east-1
# The bucket of S3 service to store data
s3.bucket=ko3
```
> Tips: If you're using localstack, make sure to set `s3.endpoint` to http://127.0.0.1:4566, not localhost. Set the region to us-east-1. The bucket should match the one created earlier.

> Tips: If you are using localstack, just use any value for the access key and secret key. If you are using a real S3 service, set `KAFKA_S3_ACCESS_KEY` and `KAFKA_S3_SECRET_KEY` to a real access key and secret key that have read/write permission on the S3 service.
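For example, assuming the keys are read from environment variables as the tip suggests (the values shown are arbitrary placeholders):

```
export KAFKA_S3_ACCESS_KEY=test   # any value works for localstack
export KAFKA_S3_SECRET_KEY=test   # use real credentials for a real S3 service
```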