Removing an optional field in the consumer causes issues in decryption. #1397

@ulkar-sephora

Description

Schema compatibility is set to BACKWARD, which allows us to remove fields and add optional fields as the schema evolves. We currently have a Golang consumer that uses confluent-kafka-go to decrypt encrypted fields. The implementation follows the example provided in:
Confluent Kafka Avro v2 Consumer Encryption Example.
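For context, our setup boils down to registering a KMS driver and the field-level encryption rule executor before creating the Avro deserializer. A minimal sketch along the lines of that example follows; the broker address, registry URL, topic, and group id are placeholders, and the exact package paths are assumptions based on the v2 schemaregistry layout, not copied from our code:

```go
package main

import (
	"fmt"

	"github.com/confluentinc/confluent-kafka-go/v2/kafka"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/rules/encryption"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/rules/encryption/awskms"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde"
	"github.com/confluentinc/confluent-kafka-go/v2/schemaregistry/serde/avrov2"
)

func main() {
	// Register the KMS driver and the encryption rule executor before
	// any deserialization happens, so tagged fields can be decrypted.
	awskms.Register()
	encryption.Register()

	srClient, err := schemaregistry.NewClient(schemaregistry.NewConfig("http://localhost:8081"))
	if err != nil {
		panic(err)
	}

	deser, err := avrov2.NewDeserializer(srClient, serde.ValueSerde, avrov2.NewDeserializerConfig())
	if err != nil {
		panic(err)
	}

	consumer, err := kafka.NewConsumer(&kafka.ConfigMap{
		"bootstrap.servers": "localhost:9092",
		"group.id":          "test-group",
		"auto.offset.reset": "earliest",
	})
	if err != nil {
		panic(err)
	}
	_ = consumer.SubscribeTopics([]string{"test-topic"}, nil)

	for {
		msg, err := consumer.ReadMessage(-1)
		if err != nil {
			break
		}
		// Deserialize runs the registered encryption rule, decrypting
		// fields tagged "Encrypt" in the schema.
		value, err := deser.Deserialize(*msg.TopicPartition.Topic, msg.Value)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%v\n", value)
	}
}
```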

According to the compatibility guidelines, under backward compatibility schema changes should be applied to the consumer first. For example, consider the following latest schema with five fields:

```json
{
  "namespace": "testing namespace",
  "type": "record",
  "name": "Test",
  "fields": [
    {"name": "testA", "type": "string"},
    {"name": "testB", "type": {"type": "array", "items": "string"}},
    {"name": "testC", "type": ["null", "string"], "default": null, "confluent:tags": ["Encrypt"]},
    {"name": "testD", "type": "string"},
    {"name": "testE", "type": ["null", "string"], "default": null, "confluent:tags": ["Encrypt"]}
  ]
}
```

Now, if we remove the field testD from the consumer first, the decryption works as expected for testC (which appears before the removed field). However, testE (which comes after the removed field) does not get decrypted.
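Concretely, the consumer's reader schema after this step would look as follows (a sketch: the five-field schema above minus testD, while producers keep writing with the full schema), and during Avro schema resolution the decoder skips testD in the writer's data:

```json
{
  "namespace": "testing namespace",
  "type": "record",
  "name": "Test",
  "fields": [
    {"name": "testA", "type": "string"},
    {"name": "testB", "type": {"type": "array", "items": "string"}},
    {"name": "testC", "type": ["null", "string"], "default": null, "confluent:tags": ["Encrypt"]},
    {"name": "testE", "type": ["null", "string"], "default": null, "confluent:tags": ["Encrypt"]}
  ]
}
```

With this reader schema, the tagged field before the removed one (testC) is decrypted correctly, while the tagged field after it (testE) is not.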
