Releases · fd4s/fs2-kafka
fs2-kafka v0.19.6
Changes
- Fix a race condition which could result in duplicate records. Thanks @backuitist! (#105, #106)
Released on 2019-03-28.
fs2-kafka v0.19.5
Changes
- Fix `Acks#toString` and `AutoOffsetReset#toString`. (#103)
Updates
Miscellaneous
- There is now a Gitter room for the library.
Released on 2019-03-27.
fs2-kafka v0.19.4
Additions
- Add improved support for unkeyed records. Thanks @ranjanibrickx! (#96, #97)
  - Add `Deserializer#option`, and `Deserializer.option` and `unit`.
  - Add `HeaderDeserializer#option`, and `HeaderDeserializer.option` and `unit`.
  - Add `Serializer#option`, and `Serializer.option`, `asNull`, `empty` and `unit`.
  - Add `HeaderSerializer#option`, and `HeaderSerializer.option`, `asNull`, `empty` and `unit`.
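A minimal sketch of how the optional (de)serializers can be used for unkeyed records, assuming the 0.19.4 API where `Deserializer.option[A]` and `Serializer.option[A]` resolve an implicit instance for the underlying type `A`; the `String` key type and the implicit instances are assumptions for illustration.

```scala
import fs2.kafka._

// Sketch (assumed 0.19.4 API): treat record keys as optional, so unkeyed
// records deserialize to None instead of failing on a null key.
val optionalKeyDeserializer: Deserializer[Option[String]] =
  Deserializer.option[String] // resolves the implicit Deserializer[String]

// The serializer side mirrors this: None is written as a null key.
val optionalKeySerializer: Serializer[Option[String]] =
  Serializer.option[String] // resolves the implicit Serializer[String]
```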
Released on 2019-03-01.
fs2-kafka v0.19.3
Additions
- Add functions for working with consumer offsets. Thanks @backuitist! (#92, #93)
  - Add `KafkaConsumer#assignment`.
  - Add `KafkaConsumer#position`.
  - Add `KafkaConsumer#seekToBeginning`.
  - Add `KafkaConsumer#seekToEnd`.
- Add `Attempt[A]` aliases for deserializers. (#95)
  - Add `Deserializer.Attempt[A] = Deserializer[Either[Throwable, A]]`.
  - Add `HeaderDeserializer.Attempt[A] = HeaderDeserializer[Either[Throwable, A]]`.
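A minimal sketch of the new offset functions, assuming the 0.19.3 API where `assignment` returns the currently assigned partitions and `position` takes a single `TopicPartition`; the consumer is assumed to already exist and be subscribed.

```scala
import cats.effect.IO
import cats.implicits._
import fs2.kafka._
import org.apache.kafka.common.TopicPartition

// Sketch (assumed 0.19.3 API): log the next offset to be fetched for every
// partition currently assigned to the consumer.
def logPositions(consumer: KafkaConsumer[IO, String, String]): IO[Unit] =
  consumer.assignment.flatMap { assigned =>
    assigned.toList.traverse_ { (tp: TopicPartition) =>
      consumer.position(tp).flatMap(offset => IO(println(s"$tp is at offset $offset")))
    }
  }
```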
Released on 2019-02-27.
fs2-kafka v0.19.2
Additions
- Add `describeCluster` and `createTopics` to `KafkaAdminClient`. Thanks @danxmoran! (#88)
- Add `maxPrefetchBatches` to `ConsumerSettings`. (#83)
  - Controls prefetching behaviour before backpressure kicks in.
  - Use `withMaxPrefetchBatches` to change the default setting; see the sketch after this list.
- Add several constructs for working with record headers. (#85)
  - Add `HeaderDeserializer` for deserialization of record header values.
  - Add `HeaderSerializer` for serializing values to use as header values.
  - Add `Header.serialize` for serializing a value and creating a `Header`.
  - Add `Header#headers` for creating a `Headers` with a single `Header`.
  - Add `Header#as` and `attemptAs` for deserializing header values.
  - Add `Headers#withKey` and alias `apply` for extracting a single `Header`.
  - Add `Headers#concat` for concatenating another `Headers` instance.
  - Add `Headers#asJava` for converting to Java Kafka-compatible headers.
  - Add `Headers.fromIterable` to create `Headers` from `Iterable[Header]`.
  - Add `Headers.fromSeq` to create `Headers` from `Seq[Header]`.
- Add several constructs for working with record serialization. (#85)
  - Add a custom `Serializer` to make it easier to create and compose serializers.
  - Add a custom `Deserializer` to make it easier to create and compose deserializers.
  - Add `ProducerSettings.apply` for using implicit `Serializer`s for the key and value.
  - Add `ConsumerSettings.apply` for using implicit `Deserializer`s for the key and value; see the sketch after this list.
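A minimal sketch of the header constructs, assuming the 0.19.2 API; the exact signatures of `Header.serialize`, `Headers#apply` and `Header#as` used here, and the implicit `String` instances they rely on, are assumptions.

```scala
import fs2.kafka._

// Sketch (assumed 0.19.2 API): create a header from a value, wrap it in a
// Headers collection, then look it up by key and deserialize its value.
val headers: Headers =
  Header.serialize("correlation-id", "abc-123").headers // Header.serialize + Header#headers

val correlationId: Option[String] =
  headers("correlation-id").map(_.as[String]) // Headers#apply (alias of withKey) + Header#as
```

And a sketch of the implicit-based settings together with the new prefetch setting, again assuming the 0.19.2 API; the `withBootstrapServers` and `withGroupId` setters are assumed to behave as in later versions, and the server and group values are placeholders.

```scala
import fs2.kafka._

// Sketch (assumed 0.19.2 API): ConsumerSettings.apply and ProducerSettings.apply
// resolve the implicit Deserializer/Serializer instances for the key and value,
// and withMaxPrefetchBatches tunes prefetching before backpressure kicks in.
val consumerSettings: ConsumerSettings[String, String] =
  ConsumerSettings[String, String]
    .withBootstrapServers("localhost:9092")
    .withGroupId("example-group")
    .withMaxPrefetchBatches(4)

val producerSettings: ProducerSettings[String, String] =
  ProducerSettings[String, String]
    .withBootstrapServers("localhost:9092")
```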
Changes
- Change to make `fs2.kafka.Id` public. Thanks @chenharryhua! (#86, #87)
Updates
- Update Kafka to 2.1.1. Thanks @sebastianvoss! (#90, #91)
Documentation
- Add a technical details section explaining backpressure. Thanks @backuitist! (#82, #84)
Released on 2019-02-22.
fs2-kafka v0.19.1
fs2-kafka v0.19.0
Changes
- Add `KafkaProducer#producePassthrough` for only keeping the passthrough after producing; see the sketch after this list. (#74)
- Change `KafkaConsumer#stream` to be an alias for `partitionedStream.parJoinUnbounded`. (#78)
  - This also removes `ConsumerSettings#fetchTimeout` as it is now unused.
- Change to improve type inference of `ProducerMessage`. (#74, #76)
  - To support better type inference, a custom `fs2.kafka.ProducerRecord` has been added.
  - If you were using the Java `ProducerRecord`, change to `fs2.kafka.ProducerRecord`.
- Change to replace `Sink`s with `Pipe`s, and usage of `Stream#to` with `Stream#through`. (#73)
- Remove `ProducerMessage#single`, `multiple`, and `passthrough`. (#74)
  - They have been replaced with `ProducerMessage#apply` and `ProducerMessage#one`.
  - If you were previously using `single` in isolation, then you can now use `one`.
  - For all other cases, you can now use `ProducerMessage#apply` instead.
- Rename `KafkaProducer#produceBatched` to `produce`. (#74)
- Remove the previous `KafkaProducer#produce`.
  - For previous behavior, `flatten` the result from `produce`. (#74)
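A rough sketch of the reworked producer API, assuming the 0.19.0 signatures; the `ProducerRecord` and `ProducerMessage.one` constructors and the two-layer effect returned by `producePassthrough` are assumptions based on the notes above.

```scala
import cats.effect.IO
import fs2.kafka._

// Sketch (assumed 0.19.0 API): build a single-record message carrying an Int
// passthrough, produce it, and keep only the passthrough. The outer effect
// enqueues the send, the inner effect waits for the acknowledgement, hence
// the flatten (the same pattern recovers the old produce behaviour).
def send(producer: KafkaProducer[IO, String, String]): IO[Int] = {
  val record  = ProducerRecord("topic", "key", "value") // the custom fs2.kafka.ProducerRecord
  val message = ProducerMessage.one(record, 42)         // replaces the removed ProducerMessage#single
  producer.producePassthrough(message).flatten
}
```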
Miscellaneous
- Change to include current year in license notices. (#72)
Released on 2019-01-18.
fs2-kafka v0.18.1
fs2-kafka v0.18.0
Additions
- Add support for default `ExecutionContext` for `KafkaConsumer`s. (#60)
  - If you've been using the `consumerExecutionContextResource` context, or `consumerExecutionContextStream`, then not providing a context when creating `ConsumerSettings` now yields the same result.
- Add `KafkaConsumer#subscribeTo` for subscribing to topics with varargs. (#62)
- Add `KafkaConsumer#seek` for setting starting offsets. Thanks @danielkarch. (#64)
Changes
- Change `KafkaConsumer#subscribe` to work for any `Reducible`. (#62)
- Change `KafkaConsumer#subscribe` to return `F[Unit]` instead of `Stream[F, Unit]`; see the sketch after this list. (#62)
- Change `KafkaConsumer` requests to be attempted and errors returned directly. (#66)
- Change to use an internal singleton for `KafkaConsumer` poll requests. (#69)
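A brief sketch of the updated subscription API, assuming the 0.18.0 signatures; the topic names are placeholders and the consumer is assumed to come from the usual consumer stream.

```scala
import cats.data.NonEmptyList
import cats.effect.IO
import fs2.kafka._

// Sketch (assumed 0.18.0 API): both calls now return F[Unit] rather than
// Stream[F, Unit]. A later subscribe replaces the earlier subscription; the
// two calls here only demonstrate the two shapes.
def subscribeExample(consumer: KafkaConsumer[IO, String, String]): IO[Unit] =
  for {
    _ <- consumer.subscribeTo("orders", "payments")   // varargs subscription (#62)
    _ <- consumer.subscribe(NonEmptyList.of("audit")) // subscribe accepts any Reducible (#62)
  } yield ()
```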
Fixes
- Fix `toString` for custom exceptions. (#61)
- Fix to always create new instances of `NotSubscribedException`. (#65)
- Fix `KafkaConsumer` requests to check the consumer has not shut down. (#66)
- Fix `Show[ProducerRecord[K, V]]` when the partition is `null`. (#68)
Documentation
- Change to simplify the 'quick example' in the documentation. (#63)
Miscellaneous
- Change `OVO Energy Ltd` to `OVO Energy Limited` in license texts. (#67)
Released on 2018-12-16.