Commit edfb9be

Update to FsKafka/FsKafka0 1.4.2 (#67)
1 parent 17e3857 commit edfb9be

23 files changed: +60 -1024 lines

CHANGELOG.md (+7)

```diff
@@ -10,7 +10,14 @@ The `Unreleased` section name is replaced by the expected version of next releas
 
 ### Added
 ### Changed
+
+- `Kafka`: Targets [`FsKafka`/`FsKafka0` v `1.4.2`](https://github.com/jet/FsKafka/blob/master/CHANGELOG.md#1.4.2) [#64](https://github.com/jet/propulsion/pull/64)
+
 ### Removed
+
+- `Propulsion.Kafka0`: Some `Propulsion.Kafka0`-namespaced shimming elements are now found in the `FsKafka` namespace in `FsKafka0` [#64](https://github.com/jet/propulsion/pull/64)
+- `Propulsion.Kafka`: `KafkaMonitor` is now found in the `FsKafka` namespace in `FsKafka`/`FsKafka0` (NOTE: integration tests continue to live in this repo) [#64](https://github.com/jet/propulsion/pull/64)
+
 ### Fixed
 
 - `Kafka`: Change buffer grouping to include `Topic` alongside `PartitionId` - existing implementation did not guarantee marking progress where consuming from more than one Topic concurrently [#63](https://github.com/jet/propulsion/pull/63)
```
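For code consuming these packages, the two `Removed` entries amount to an `open`-statement migration rather than a behavioural change. A minimal sketch of the adjustment, assuming application code referenced the relocated types (such as `KafkaMonitor`) through the Propulsion namespaces before this update; the exact pre-existing opens will vary by project:

```fsharp
// Before: shim types (e.g. KafkaMonitor) resolved via the Propulsion namespaces
// open Propulsion.Kafka
// open Propulsion.Kafka0

// After: the relocated types resolve from the shared FsKafka namespace,
// which both FsKafka and FsKafka0 1.4.2 provide
open FsKafka
```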

Directory.Build.props (+1 -1)

```diff
@@ -1,4 +1,4 @@
-<Project ToolsVersion="15.0">
+<Project>
   <PropertyGroup>
     <Authors>@jet @bartelink @eiriktsarpalis and contributors</Authors>
     <Company>Jet.com</Company>
```

Directory.Build.targets (+1 -1)

```diff
@@ -1,4 +1,4 @@
-<Project ToolsVersion="15.0">
+<Project>
   <Target Name="ComputePackageVersion" AfterTargets="MinVer" Condition=" '$(BUILD_PR)' != '' AND '$(BUILD_PR)' != '%24(SYSTEM.PULLREQUEST.PULLREQUESTNUMBER)' ">
     <PropertyGroup>
       <PackageVersion>$(MinVerMajor).$(MinVerMinor).$(MinVerPatch)-pr.$(BUILD_PR)</PackageVersion>
```

README.md (+2 -2)

```diff
@@ -11,8 +11,8 @@ The components within this repository are delivered as a multi-targeted Nuget pa
 - `Propulsion` [![NuGet](https://img.shields.io/nuget/v/Propulsion.svg)](https://www.nuget.org/packages/Propulsion/) Implements core functionality in a channel-independent fashion including `ParallelProjector`, `StreamsProjector`. [Depends](https://www.fuget.org/packages/Propulsion) on `MathNet.Numerics`, `Serilog`
 - `Propulsion.Cosmos` [![NuGet](https://img.shields.io/nuget/v/Propulsion.Cosmos.svg)](https://www.nuget.org/packages/Propulsion.Cosmos/) Provides bindings to Azure CosmosDb a) writing to `Equinox.Cosmos` :- `CosmosSink` b) reading from CosmosDb's changefeed by wrapping the [`dotnet-changefeedprocessor` library](https://github.com/Azure/azure-documentdb-changefeedprocessor-dotnet) :- `CosmosSource`. [Depends](https://www.fuget.org/packages/Propulsion.Cosmos) on `Equinox.Cosmos`, `Microsoft.Azure.DocumentDB.ChangeFeedProcessor`, `Serilog`
 - `Propulsion.EventStore` [![NuGet](https://img.shields.io/nuget/v/Propulsion.EventStore.svg)](https://www.nuget.org/packages/Propulsion.EventStore/). Provides bindings to [EventStore](https://www.eventstore.org), writing via `Propulsion.EventStore.EventStoreSink` [Depends](https://www.fuget.org/packages/Propulsion.EventStore) on `Equinox.EventStore`, `Serilog`
-- `Propulsion.Kafka` [![NuGet](https://img.shields.io/nuget/v/Propulsion.Kafka.svg)](https://www.nuget.org/packages/Propulsion.Kafka/) Provides bindings for producing and consuming both streamwise and in parallel. Includes a standard codec for use with streamwise projection and consumption, `Propulsion.Kafka.Codec.NewtonsoftJson.RenderedSpan`. Implements a `KafkaMonitor` that can log status information based on [Burrow](https://github.com/linkedin/Burrow). [Depends](https://www.fuget.org/packages/Propulsion.Kafka) on `FsKafka` v ` = 1.4.1`, `Serilog`
-- `Propulsion.Kafka0` [![NuGet](https://img.shields.io/nuget/v/Propulsion.Kafka0.svg)](https://www.nuget.org/packages/Propulsion.Kafka0/). Same functionality/purpose as `Propulsion.Kafka` but targets older `Confluent.Kafka`/`librdkafka` version for for interoperability with systems that have a hard dependency on that. [Depends](https://www.fuget.org/packages/Propulsion.Kafka0) on `Confluent.Kafka [0.11.3]`, `librdkafka.redist [0.11.4]`, `Serilog`
+- `Propulsion.Kafka` [![NuGet](https://img.shields.io/nuget/v/Propulsion.Kafka.svg)](https://www.nuget.org/packages/Propulsion.Kafka/) Provides bindings for producing and consuming both streamwise and in parallel. Includes a standard codec for use with streamwise projection and consumption, `Propulsion.Kafka.Codec.NewtonsoftJson.RenderedSpan`. [Depends](https://www.fuget.org/packages/Propulsion.Kafka) on `FsKafka` v ` = 1.4.2`, `Serilog`
+- `Propulsion.Kafka0` [![NuGet](https://img.shields.io/nuget/v/Propulsion.Kafka0.svg)](https://www.nuget.org/packages/Propulsion.Kafka0/). Same functionality/purpose as `Propulsion.Kafka` but uses `FsKafka0` instead of `FsKafka` in order to target an older `Confluent.Kafka`/`librdkafka` version pairing for interoperability with systems that have a hard dependency on that. [Depends](https://www.fuget.org/packages/Propulsion.Kafka0) on `Confluent.Kafka [0.11.3]`, `librdkafka.redist [0.11.4]`, `Serilog`
 
 The ubiquitous `Serilog` dependency is solely on the core module, not any sinks, i.e. you configure to emit to `NLog` etc.
 
```
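The `Serilog` note in that last context line has a direct usage implication: Propulsion components accept a Serilog `ILogger`, and any sink can back it. A minimal sketch, assuming the `Serilog.Sinks.Console` package is referenced (this snippet is illustrative, not taken from the repo):

```fsharp
open Serilog

// Build the ILogger that Propulsion components accept; swap Console for any
// other Serilog sink (e.g. one forwarding to NLog) without touching Propulsion code
let log : ILogger =
    LoggerConfiguration().WriteTo.Console().CreateLogger() :> ILogger
```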

azure-pipelines.yml (+2 -15)

```diff
@@ -1,15 +1,14 @@
 name: $(Rev:r)
 
 # Summary:
-# Linux: Tests with netcoreapp3.1 using docker Kafka instance
-# net461 on Mono is not working, no investigation why as yet, but local run validates it
+# Linux: Tests using docker Kafka instance (note test suite does not run net461 as CK on mono is not supported)
 # Windows: Builds and emits nupkg as artifacts
 # MacOS: Builds only
 
 jobs:
 - job: Windows
   pool:
-    vmImage: 'vs2017-win2016'
+    vmImage: 'windows-latest'
   steps:
   - script: dotnet pack build.proj
     displayName: dotnet pack build.proj
@@ -24,12 +23,6 @@ jobs:
   pool:
     vmImage: 'ubuntu-latest'
   steps:
-  - task: UseDotNet@2
-    displayName: 'Install .NET Core sdk'
-    inputs:
-      packageType: sdk
-      version: 3.1.101
-      installationPath: $(Agent.ToolsDirectory)/dotnet
   - script: |
       docker network create kafka-net
      docker run -d --name zookeeper --network kafka-net --publish 2181:2181 zookeeper:3.4
@@ -52,11 +45,5 @@ jobs:
   pool:
     vmImage: 'macOS-latest'
   steps:
-  - task: UseDotNet@2
-    displayName: 'Install .NET Core sdk'
-    inputs:
-      packageType: sdk
-      version: 3.1.101
-      installationPath: $(Agent.ToolsDirectory)/dotnet
   - script: dotnet pack build.proj
     displayName: dotnet pack build.proj
```

build.proj (+1 -1)

```diff
@@ -1,4 +1,4 @@
-<Project ToolsVersion="15.0">
+<Project>
 
   <Import Project="Directory.Build.props" />
 
```

src/Propulsion.Cosmos/Propulsion.Cosmos.fsproj (+1 -1)

```diff
@@ -6,7 +6,7 @@
     <IsTestProject>false</IsTestProject>
     <DisableImplicitFSharpCoreReference>true</DisableImplicitFSharpCoreReference>
     <DisableImplicitSystemValueTupleReference>true</DisableImplicitSystemValueTupleReference>
-    <GenerateDocumentationFile Condition=" '$(Configuration)' == 'Release' ">true</GenerateDocumentationFile>
+    <GenerateDocumentationFile>true</GenerateDocumentationFile>
   </PropertyGroup>
 
   <ItemGroup>
```

src/Propulsion.EventStore/Propulsion.EventStore.fsproj (+1 -1)

```diff
@@ -6,7 +6,7 @@
     <IsTestProject>false</IsTestProject>
     <DisableImplicitFSharpCoreReference>true</DisableImplicitFSharpCoreReference>
     <DisableImplicitSystemValueTupleReference>true</DisableImplicitSystemValueTupleReference>
-    <GenerateDocumentationFile Condition=" '$(Configuration)' == 'Release' ">true</GenerateDocumentationFile>
+    <GenerateDocumentationFile>true</GenerateDocumentationFile>
   </PropertyGroup>
 
   <ItemGroup>
```

src/Propulsion.Kafka/Bindings.fs renamed to src/Propulsion.Kafka/Binding.fs (+2 -6)

```diff
@@ -6,13 +6,9 @@ open Serilog
 open System
 open System.Collections.Generic
 
-module Bindings =
-    let mapMessage (x : ConsumeResult<string,string>) = x.Message
+module Binding =
     let mapConsumeResult (x : ConsumeResult<string,string>) = KeyValuePair(x.Message.Key,x.Message.Value)
-    let inline partitionId (x : ConsumeResult<_,_>) = let p = x.Partition in p.Value
-    let topicPartition (topic : string) (partition : int) = TopicPartition(topic, Partition partition)
-    let partitionValue (partition : Partition) = let p = partition in p.Value
-    let offsetUnset = Offset.Unset
+    let makeTopicPartition (topic : string) (partition : int) = TopicPartition(topic, Partition partition)
     let createConsumer log config : IConsumer<string,string> * (unit -> unit) =
         let consumer = ConsumerBuilder.WithLogging(log, config)
         consumer, consumer.Close
```
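The rename is mechanical but has a call-site impact: the local `mapMessage` shim is gone, and the `Consumers.fs` hunks below switch to `Binding.message`, which those call sites imply is now supplied by `FsKafka`'s own `Binding` module. A minimal before/after sketch of the substitution, assuming a Confluent.Kafka 1.x `ConsumeResult<string, string>` as used by `Propulsion.Kafka` (names beyond those visible in this diff are assumptions):

```fsharp
open Confluent.Kafka

// Sketch of the call-site change; `result` stands in for a ConsumeResult
// obtained from a poll loop
let extractMessage (result : ConsumeResult<string, string>) =
    // Before (pre-1.4.2): let message = Bindings.mapMessage result
    // After  (1.4.2):     let message = Binding.message result
    result.Message // what both forms yield: the underlying Message<string, string>
```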

src/Propulsion.Kafka/Consumers.fs (+9 -9)

```diff
@@ -70,10 +70,10 @@ type KafkaIngestionEngine<'Info>
     let mkSubmission topicPartition span : Submission.SubmissionBatch<'S, 'M> =
         let checkpoint () =
             counter.Delta(-span.reservation) // counterbalance Delta(+) per ingest, below
-            Bindings.storeOffset log consumer span.highWaterMark
+            Binding.storeOffset log consumer span.highWaterMark
         { source = topicPartition; onCompletion = checkpoint; messages = span.messages.ToArray() }
     let ingest result =
-        let message = Bindings.mapMessage result
+        let message = Binding.message result
         let sz = approximateMessageBytes message
         counter.Delta(+sz) // counterbalanced by Delta(-) in checkpoint(), below
         intervalMsgs <- intervalMsgs + 1L
@@ -111,7 +111,7 @@ type KafkaIngestionEngine<'Info>
                 submit()
                 maybeLogStats()
             | false, Some intervalRemainder ->
-                Bindings.tryConsume log consumer intervalRemainder ingest
+                Binding.tryConsume log consumer intervalRemainder ingest
         finally
             submit () // We don't want to leak our reservations against the counter and want to pass of messages we ingested
             dumpStats () // Unconditional logging when completing
@@ -134,7 +134,7 @@ type ConsumerPipeline private (inner : IConsumer<string, string>, task : Task<un
             float config.Buffering.maxInFlightBytes / 1024. / 1024. / 1024., maxDelay.TotalSeconds, maxItems)
         let limiterLog = log.ForContext(Serilog.Core.Constants.SourceContextPropertyName, Core.Constants.messageCounterSourceContext)
         let limiter = Core.InFlightMessageCounter(limiterLog, config.Buffering.minInFlightBytes, config.Buffering.maxInFlightBytes)
-        let consumer, closeConsumer = Bindings.createConsumer log config.Inner // teardown is managed by ingester.Pump()
+        let consumer, closeConsumer = Binding.createConsumer log config.Inner // teardown is managed by ingester.Pump()
         consumer.Subscribe config.Topics
         let ingester = KafkaIngestionEngine<'M>(log, limiter, consumer, closeConsumer, mapResult, submit, maxItems, maxDelay, statsInterval=statsInterval)
         let cts = new CancellationTokenSource()
@@ -203,7 +203,7 @@ type ParallelConsumer private () =
     static member Start
         (   log : ILogger, config : KafkaConsumerConfig, maxDop, handle : KeyValuePair<string, string> -> Async<unit>,
             ?maxSubmissionsPerPartition, ?pumpInterval, ?statsInterval, ?logExternalStats) =
-        ParallelConsumer.Start<KeyValuePair<string, string>>(log, config, maxDop, Bindings.mapConsumeResult, handle >> Async.Catch,
+        ParallelConsumer.Start<KeyValuePair<string, string>>(log, config, maxDop, Binding.mapConsumeResult, handle >> Async.Catch,
             ?maxSubmissionsPerPartition=maxSubmissionsPerPartition, ?pumpInterval=pumpInterval, ?statsInterval=statsInterval, ?logExternalStats=logExternalStats)
 
 type EventMetrics = Streams.EventMetrics
@@ -299,7 +299,7 @@ module Core =
             stats : Streams.Scheduling.Stats<EventMetrics * 'Outcome, EventMetrics * exn>, statsInterval,
             ?maximizeOffsetWriting, ?maxSubmissionsPerPartition, ?pumpInterval, ?logExternalState, ?idleDelay)=
         StreamsConsumer.Start<KeyValuePair<string, string>, 'Outcome>(
-            log, config, Bindings.mapConsumeResult, keyValueToStreamEvents, prepare, handle, maxDop,
+            log, config, Binding.mapConsumeResult, keyValueToStreamEvents, prepare, handle, maxDop,
             stats, statsInterval=statsInterval,
             ?maxSubmissionsPerPartition=maxSubmissionsPerPartition,
             ?pumpInterval=pumpInterval,
@@ -316,7 +316,7 @@ module Core =
             stats : Streams.Scheduling.Stats<EventMetrics * 'Outcome, EventMetrics * exn>, statsInterval,
             ?maximizeOffsetWriting, ?maxSubmissionsPerPartition, ?pumpInterval, ?logExternalState, ?idleDelay, ?maxBatches) =
         StreamsConsumer.Start<KeyValuePair<string, string>, 'Outcome>(
-            log, config, Bindings.mapConsumeResult, keyValueToStreamEvents, handle, maxDop,
+            log, config, Binding.mapConsumeResult, keyValueToStreamEvents, handle, maxDop,
             stats, statsInterval,
             ?maxSubmissionsPerPartition=maxSubmissionsPerPartition,
             ?pumpInterval=pumpInterval,
@@ -359,7 +359,7 @@ type StreamNameSequenceGenerator()
     member __.ConsumeResultToStreamEvent(toStreamName : ConsumeResult<_, _> -> StreamName)
         : ConsumeResult<string, string> -> Propulsion.Streams.StreamEvent<byte[]> seq =
         let toDataAndContext (result : ConsumeResult<string, string>) =
-            let message = Bindings.mapMessage result
+            let message = Binding.message result
             System.Text.Encoding.UTF8.GetBytes message.Value, null
         __.ConsumeResultToStreamEvent(toStreamName, toDataAndContext)
 
@@ -382,7 +382,7 @@ type StreamNameSequenceGenerator()
     member __.ConsumeResultToStreamEvent(toDataAndContext : ConsumeResult<_, _> -> byte[] * obj, ?defaultCategory)
         : ConsumeResult<string, string> -> Propulsion.Streams.StreamEvent<byte[]> seq =
         let toStreamName (result : ConsumeResult<string, string>) =
-            let message = Bindings.mapMessage result
+            let message = Binding.message result
             Core.parseMessageKey (defaultArg defaultCategory "") message.Key
         let toTimelineEvent (result : ConsumeResult<string, string>, index) =
             let data, context = toDataAndContext result
```
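The `ParallelConsumer.Start` overload in the first `@@ -203` hunk pins down the public shape: a Serilog `ILogger`, a `KafkaConsumerConfig`, a degree of parallelism, and a `KeyValuePair<string, string> -> Async<unit>` handler. A minimal wiring sketch against that signature; only the handler shape is taken from this diff, while the `KafkaConsumerConfig.Create` arguments are assumptions about FsKafka's factory and are therefore left commented out:

```fsharp
open System.Collections.Generic
open Serilog

// Handler matching the signature visible in the hunk above:
// KeyValuePair<string, string> -> Async<unit>
let handle (msg : KeyValuePair<string, string>) : Async<unit> = async {
    // A real handler would decode and apply msg.Value; here we only log it
    Log.Information("Consumed key {Key} ({Length} chars)", msg.Key, msg.Value.Length) }

// Wiring sketch (hypothetical arguments - consult FsKafka's KafkaConsumerConfig):
// let config = KafkaConsumerConfig.Create("MyApp", "localhost:9092", ["my-topic"], "my-group")
// use pipeline = ParallelConsumer.Start(Log.Logger, config, maxDop = 8, handle = handle)
// pipeline.AwaitCompletion() |> Async.RunSynchronously
```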
