
Commit 10fce90

Docs: Update contributing guide regarding issue templates
1 parent 5d2d2a5 commit 10fce90


CONTRIBUTING_GUIDE.md (1 file changed: +31 −15)
@@ -11,9 +11,15 @@ follow [Code of Conduct](CODE_OF_CONDUCT.md).
 
 ## Code Contributions
 
-Most of the issues open for contributions are tagged with 'good first issue.' To claim one, simply reply with 'pick up' in the issue and the AutoMQ maintainers will assign the issue to you. If you have any questions about the 'good first issue' please feel free to ask. We will do our best to clarify any doubts you may have.
-Start with
-this [tagged good first issue](https://github.com/AutoMQ/automq-for-kafka/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22)
+### Finding or Reporting Issues
+
+- **Find an existing issue:** Look through the [existing issues](https://github.com/AutoMQ/automq/issues). Issues open for contributions are often tagged with `good first issue`. To claim one, simply reply with 'pick up' in the issue and the AutoMQ maintainers will assign the issue to you. Start with
+this [tagged good first issue](https://github.com/AutoMQ/automq-for-kafka/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22).
+- **Report a new issue:** If you've found a bug or have a feature request, please [create a new issue](https://github.com/AutoMQ/automq/issues/new/choose). Select the appropriate template (Bug Report or Feature Request) and fill out the form provided.
+
+If you have any questions about an issue, please feel free to ask in the issue comments. We will do our best to clarify any doubts you may have.
+
+### Submitting Pull Requests
 
 The usual workflow of code contribution is:
 
@@ -25,7 +31,7 @@ The usual workflow of code contribution is:
 5. Push your local branch to your fork.
 6. Submit a Pull Request so that we can review your changes.
 7. [Link an existing Issue](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue)
-that does not include the `needs triage` label to your Pull Request. A pull request without a linked issue will be
+(created via the steps above or an existing one you claimed) that does not include the `needs triage` label to your Pull Request. A pull request without a linked issue will be
 closed, otherwise.
 8. Write a PR title and description that follows the [Pull Request Template](PULL_REQUEST_TEMPLATE.md).
 9. An AutoMQ maintainer will trigger the CI tests for you and review the code.
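
As a side note on step 7, linking can also happen automatically through closing keywords in the PR description. A minimal sketch using the GitHub CLI (this assumes `gh` is installed and authenticated; the issue number `#123` is purely hypothetical):

```
# Open a PR from the current branch; "Fixes #123" in the body links the
# (hypothetical) issue #123 and will auto-close it when the PR merges.
gh pr create \
  --title "fix: short description of the change" \
  --body "Fixes #123. Explain what changed and why."
```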
@@ -34,15 +40,15 @@ The usual workflow of code contribution is:
 
 Pull Request reviews are done on a regular basis.
 
-> [!NOTE]
+> [!NOTE]
 > Please make sure you respond to our feedback/questions and sign our CLA.
 >
 > Pull Requests without updates will be closed due to inactivity.
 
 ## Requirement
 
 | Requirement | Version |
-|------------------------|------------|
+| ---------------------- | ---------- |
 | Compiling requirements | JDK 17 |
 | Compiling requirements | Scala 2.13 |
 | Running requirements | JDK 17 |
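
A quick sanity check that your toolchain matches the table above (a sketch; exact version strings vary by JDK vendor):

```
# Both compiling and running require JDK 17.
java -version

# The Gradle wrapper bootstraps the pinned Gradle version and prints the JVM it
# uses; Scala 2.13 is pulled in by the build itself, so no separate install is needed.
./gradlew -version
```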
@@ -58,17 +64,21 @@ Building AutoMQ is the same as Apache Kafka. Kafka uses Gradle as its project ma
 It is not recommended to manually install Gradle. The gradlew script in the root directory will automatically download Gradle for you, and the version is also specified by the gradlew script.
 
 ### Build
+
 ```
 ./gradlew jar -x test
 ```
 
 ### Prepare S3 service
-Refer to this [documentation](https://docs.localstack.cloud/getting-started/installation/) to install `localstack` to mock a local s3 service or use AWS S3 service directly.
+
+Refer to this [documentation](https://docs.localstack.cloud/getting-started/installation/) to install `localstack` to mock a local S3 service, or use the AWS S3 service directly.
 
 If you are using localstack then create a bucket with the following command:
+
 ```
 aws s3api create-bucket --bucket ko3 --endpoint=http://127.0.0.1:4566
 ```
+
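To confirm the bucket is actually reachable before continuing, a minimal check against the localstack endpoint might look like this (assuming the `ko3` bucket created above):

```
# head-bucket exits 0 when the bucket exists and is accessible.
aws s3api head-bucket --bucket ko3 --endpoint=http://127.0.0.1:4566 \
  && echo "bucket ko3 is ready"
```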
 ### Modify Configuration
 
 Modify the `config/kraft/server.properties` file. The following settings need to be changed:
@@ -83,28 +93,34 @@ s3.region=us-east-1
 # The bucket of S3 service to store data
 s3.bucket=ko3
 ```
+
 > Tips: If you're using localstack, make sure to set the s3.endpoint to http://127.0.0.1:4566, not localhost. Set the region to us-east-1. The bucket should match the one created earlier.
 
 ### Format
+
 Generate the Cluster UUID:
+
 ```
 KAFKA_CLUSTER_ID="$(bin/kafka-storage.sh random-uuid)"
 ```
+
 Format the Metadata Catalog:
+
 ```
 bin/kafka-storage.sh format -t $KAFKA_CLUSTER_ID -c config/kraft/server.properties
 ```
+
 ### IDE Start Configuration
-| Item | Value |
-|------------------------|------------|
-| Main | core/src/main/scala/kafka/Kafka.scala |
-| ClassPath | -cp kafka.core.main |
-| VM Options | -Xmx1G -Xms1G -server -XX:+UseZGC -XX:MaxDirectMemorySize=2G -Dkafka.logs.dir=logs/ -Dlog4j.configuration=file:config/log4j.properties -Dio.netty.leakDetection.level=paranoid |
-| CLI Arguments | config/kraft/server.properties|
-| Environment | KAFKA_S3_ACCESS_KEY=test;KAFKA_S3_SECRET_KEY=test |
 
-> tips: If you are using localstack, just use any value of access key and secret key. If you are using real S3 service, set `KAFKA_S3_ACCESS_KEY` and `KAFKA_S3_SECRET_KEY` to the real access key and secret key that have read/write permission of S3 service.
+| Item          | Value |
+| ------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
+| Main          | core/src/main/scala/kafka/Kafka.scala |
+| ClassPath     | -cp kafka.core.main |
+| VM Options    | -Xmx1G -Xms1G -server -XX:+UseZGC -XX:MaxDirectMemorySize=2G -Dkafka.logs.dir=logs/ -Dlog4j.configuration=file:config/log4j.properties -Dio.netty.leakDetection.level=paranoid |
+| CLI Arguments | config/kraft/server.properties |
+| Environment   | KAFKA_S3_ACCESS_KEY=test;KAFKA_S3_SECRET_KEY=test |
 
+> Tips: If you are using localstack, any value will do for the access key and secret key. If you are using a real S3 service, set `KAFKA_S3_ACCESS_KEY` and `KAFKA_S3_SECRET_KEY` to real credentials that have read/write permission on the S3 service.
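
For contributors who would rather start the broker from a terminal than the IDE, the table above translates roughly into the shell sketch below. This assumes AutoMQ's launcher scripts behave like upstream Kafka's, where `KAFKA_HEAP_OPTS` and `KAFKA_OPTS` are picked up by `kafka-run-class.sh`:

```
# Rough command-line equivalent of the IDE start configuration (an assumption,
# not a documented launch path). Localstack accepts any credentials.
export KAFKA_S3_ACCESS_KEY=test
export KAFKA_S3_SECRET_KEY=test
export KAFKA_HEAP_OPTS="-Xmx1G -Xms1G"
export KAFKA_OPTS="-XX:+UseZGC -XX:MaxDirectMemorySize=2G -Dkafka.logs.dir=logs/ -Dlog4j.configuration=file:config/log4j.properties"
bin/kafka-server-start.sh config/kraft/server.properties
```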
 
 ## Documentation
 