How We Built an Event-Time Merge of Two Kafka-Streams with Spark Streaming

Transcript
Page 1

How We Built an Event-Time Merge of Two Kafka-Streams with Spark Streaming

Ralf Sigmund, Sebastian Schröder
Otto GmbH & Co. KG

Page 2

Agenda
• Who are we?
• Our Setup
• The Problem
• Our Approach
• Requirements
• Advanced Requirements
• Lessons learned

Page 3: How we built an event-time merge of two kafka-streams with spark-streaming

Who are we?• Ralf

–Technical Designer Team Tracking–Cycling, Surfing–Twitter: @sistar_hh

• Sebastian–Developer Team Tracking–Taekwondo, Cycling–Twitter: @Sebasti0n

Page 4

Who are we working for?
• 2nd largest e-commerce company in Germany
• more than 1 million visitors every day
• our blog: https://dev.otto.de
• our jobs: www.otto.de/jobs

Page 5

Our Setup
[Architecture diagram: otto.de requests pass through a Varnish Reverse Proxy to Vertical 0, Vertical 1, … Vertical n, each composed of Service 0, Service 1, … Service n; tracking data goes to a Tracking-Server.]

Page 6

The Problem
[Same architecture diagram, annotated: the Varnish Reverse Proxy has a limit on the size of messages, but the Tracking-Server needs to send large tracking messages.]

Page 7

Our Approach
[Architecture diagram as before, with the verticals' services abbreviated S0, S1, … Sn and a new Spark Service added alongside the Tracking-Server.]

Page 8

Requirements

Page 9

Messages with the same key are merged
[Diagram: two source streams of messages, each carrying an event time t and a key id. One stream holds t:1/id:1, t:3/id:2, t:2/id:3; the other holds t:0/id:2, t:4/id:0, t:2/id:3. In the merged output, messages sharing an id (e.g. id:2, id:3) are combined into a single message.]

Page 10

Messages have to be timed out
[Same merge diagram, annotated: the clock is max(business event timestamp), and merged messages are timed out after 3 sec.]

Page 11

Event Streams might get stuck
[Same merge diagram, annotated: each source stream has its own clock (clock A, clock B) derived from the event times seen on it; the merge uses clock = min(clock A, clock B), so a stuck stream holds the clock back instead of having its messages timed out prematurely.]

Page 12

There might be no messages
[Diagram: one stream carries t:1/id:1, t:4/id:2, t:2/id:3 while the other is idle except for a heart beat with t:5. The heart beat advances the idle stream's clock, so clock = min(clock A, clock B) = 4 and the 3 sec time-out can still fire.]
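The transcript only hints at how this clock is computed, so here is a minimal sketch of the idea; the names (Message, topicClock, globalClock, isTimedOut) are illustrative assumptions, not the talk's code.

```scala
// Sketch: a per-topic business clock is the max event time seen on that
// topic; heartbeats carry event times too, so an idle topic still advances.
// The global clock is the min of both topic clocks, so it never runs ahead
// of a stuck or lagging stream.
case class Message(eventTime: Long, id: Option[String]) // id = None for heartbeats

def topicClock(previousClock: Long, batch: Seq[Message]): Long =
  (previousClock +: batch.map(_.eventTime)).max

def globalClock(clockA: Long, clockB: Long): Long =
  math.min(clockA, clockB)

// A merged message times out once the global clock has moved 3 sec past it.
def isTimedOut(eventTime: Long, clock: Long, timeoutSec: Long = 3): Boolean =
  clock - eventTime >= timeoutSec
```

With the example on this slide, clock A is 4 and the heart beat sets clock B to 5, so the global clock is 4 and messages with event time 1 or older time out.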

Page 13

Requirements Summary
• Messages with the same key are merged
• Messages are timed out by the combined event time of both source topics
• Timed out messages are sorted by event time

Page 14

Solution

Page 15

Merge messages with the same Id
• updateStateByKey, which is defined on key-value (pair) DStreams
• It applies a function to all new messages and the existing state for a given key
• It returns a StateDStream with the resulting messages (see the sketch below)
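A minimal sketch of that wiring, assuming hypothetical InputMessage / MergedMessage types standing in for the talk's tracking messages:

```scala
import org.apache.spark.streaming.dstream.DStream

case class InputMessage(id: String, eventTime: Long, payload: String)
case class MergedMessage(id: String, eventTime: Long, payloads: Seq[String])

// updateStateByKey keeps per-key state across micro batches: for every key it
// is handed the new messages of the current batch plus the previous state,
// and whatever it returns becomes the new state.
def merge(stream: DStream[(String, InputMessage)],
          updateFn: (Seq[InputMessage], Option[MergedMessage]) => Option[MergedMessage])
    : DStream[(String, MergedMessage)] =
  stream.updateStateByKey(updateFn)
```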

Page 16

UpdateStateByKey
[Diagram: a DStream[InputMessage] with keys 1, 2, 3 is combined with the history, an RDD[MergedMessage] holding state for keys 1, 2, 3, yielding a DStream[MergedMessage].]

UpdateFunction: (Seq[InputMessage], Option[MergedMessage]) => Option[MergedMessage]

Pages 17-18

Merge messages with the same Id
[Code walkthrough slides; the code itself is not captured in the transcript. A sketch of the idea follows.]
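The following is only a sketch of such an update function, reusing the hypothetical types from above; the merge rule shown (concatenating payloads and taking the max event time) is an assumption, not the talk's actual implementation:

```scala
// Merge the batch's new messages for a key into the existing state, if any.
// The merged event time is the max seen so far, i.e. the business event
// timestamp that the time-out later works against.
val updateFn: (Seq[InputMessage], Option[MergedMessage]) => Option[MergedMessage] =
  (newMessages, state) =>
    if (newMessages.isEmpty) state
    else {
      val id       = newMessages.head.id
      val previous = state.getOrElse(MergedMessage(id, 0L, Seq.empty))
      Some(MergedMessage(
        id,
        (previous.eventTime +: newMessages.map(_.eventTime)).max,
        previous.payloads ++ newMessages.map(_.payload)))
    }
```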

Page 19

Flag And Remove Timed Out Msgs
[Diagram: the UpdateFunction combines the DStream[InputMessage] with the history RDD[MergedMessage] and flags newly timed out merged messages (MM -> MM/TO), which needs a global business clock. A filter on (timed out) emits the flagged messages; a filter on !(timed out) keeps the remaining ones in the history.]

Page 20

Messages are timed out by event time
• We need access to all messages from the current micro batch and the history (state) => custom StateDStream
• Compute the maximum event time of both topics and supply it to the updateStateByKey function

Pages 21-24

Messages are timed out by event time
[Step-by-step code walkthrough slides; the code itself is not captured in the transcript. A sketch of the idea follows.]
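Again the code is not in the transcript; this sketch only illustrates the flagging step, with TimedOutFlagged and updateWithTimeout as hypothetical names. Removing emitted state afterwards is exactly why the talk needed a custom StateDStream; plain updateStateByKey is shown here only for shape:

```scala
case class TimedOutFlagged(msg: MergedMessage, timedOut: Boolean)

// Wrap the merge with a time-out check against the global business clock
// (min over both topics of the max event time, as on pages 11 and 12).
def updateWithTimeout(globalClock: Long, timeoutSec: Long = 3)(
    newMessages: Seq[InputMessage],
    state: Option[TimedOutFlagged]): Option[TimedOutFlagged] =
  updateFn(newMessages, state.map(_.msg))
    .map(mm => TimedOutFlagged(mm, timedOut = globalClock - mm.eventTime >= timeoutSec))

// Downstream, as on page 19: timed-out messages are emitted (sorted by event
// time), the rest stay in the history state.
// val flagged  = keyedStream.updateStateByKey(updateWithTimeout(clock))
// val emitted  = flagged.filter { case (_, f) => f.timedOut }
// val retained = flagged.filter { case (_, f) => !f.timedOut }
```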

Page 25

Advanced Requirements

Page 26

Effect of different request rates in source topics
[Plot: messages per micro batch over time for topic A and topic B, each on a 0-5K-10K scale. One topic is catching up with 5K per micro batch while data from the other is read too early.]

Page 27

Handle different request rates in topics
• Stop reading from one topic if the event time of the other topic is too far behind
• Store the latest event time of each topic and partition on the driver
• Use a custom KafkaDStream with a clamp method which uses the event time (see the sketch below)

Pages 28-31

Handle different request rates in topics
[Step-by-step code walkthrough slides; the code itself is not captured in the transcript. A sketch of the clamp decision follows.]
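The custom KafkaDStream is not shown in the transcript, so the sketch below captures only the clamp decision itself; PartitionState, clampOffset and maxLagSec are hypothetical names, and Spark's real DirectKafka internals look different:

```scala
// Latest event time and next offset per source topic/partition, kept on the
// driver as described above.
case class PartitionState(topic: String, partition: Int,
                          latestEventTime: Long, nextOffset: Long)

// Event-time clamp: if this topic has run more than maxLagSec ahead of the
// other topic's clock, read nothing from it in this micro batch, so the
// slower topic can catch up instead of its data being read too early.
def clampOffset(self: PartitionState,
                otherTopicClock: Long,
                untilOffset: Long,
                maxLagSec: Long = 3): Long =
  if (self.latestEventTime - otherTopicClock > maxLagSec) self.nextOffset
  else untilOffset
```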

Page 32

Guarantee at least once semantics
[Diagram: the Spark Merge Service reads messages tagged with source offsets (e.g. o:1/t:a, o:7/t:a, o:2/t:a from topic a; o:5/t:b, o:9/t:b, o:6/t:b from topic b). Each message written to the sink topic on time-out carries the current state's source offsets (a-offset: 1, b-offset: 5). On restart/recovery, the last message is retrieved from the sink topic to resume from those offsets.]

Pages 33-34

Guarantee at least once semantics
[Code walkthrough slides; the code itself is not captured in the transcript. A sketch of the recovery idea follows.]
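Once more a sketch rather than the talk's code (SinkMessage and resumeFrom are hypothetical): every message written to the sink topic embeds the current source offsets, and recovery reads them back from the last sink message.

```scala
// Source offsets embedded in every merged message written to the sink topic.
case class SourceOffsets(aOffset: Long, bOffset: Long)
case class SinkMessage(merged: MergedMessage, offsets: SourceOffsets)

// On restart/recovery: resume consuming both topics at the offsets stored in
// the last sink message. Everything after it may be processed again, which
// yields at-least-once semantics: duplicates are possible, loss is not.
def resumeFrom(lastSinkMessage: Option[SinkMessage]): SourceOffsets =
  lastSinkMessage.map(_.offsets).getOrElse(SourceOffsets(0L, 0L))
```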

Page 35

Everything put together

Page 36

Lessons learned
• Excellent performance and scalability
• Extensible via custom RDDs and an extension to DirectKafka
• No event-time windows in Spark Streaming; see https://github.com/apache/spark/pull/2633
• Checkpointing cannot be used when deploying new artifact versions
• Driver/executor model is powerful, but also complex

Page 37

THANK YOU.
Twitter: @sistar_hh, @Sebasti0n
Blog: dev.otto.de
Jobs: otto.de/jobs