One year of measuring WebRTC service quality


Transcript of One year of measuring WebRTC service quality

Page 1: One year of measuring WebRTC service quality

Performance in the Wild:

Varun Singh, CEO

18.12.2015 Upperside Paris

One year of measuring WebRTC service quality

http://geek-and-poke.com/

Page 2: One year of measuring WebRTC service quality

Analytics for WebRTC

• The first cloud-based monitoring and management service for WebRTC (audio and video).

• The team contributes to the core standards that enable WebRTC.


Page 3: One year of measuring WebRTC service quality

Monitoring What?

• Annoyances

• Transport quality

• Per-stream media quality
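
For context on the last item, here is a minimal sketch of sampling per-stream media quality in the browser with the standard RTCPeerConnection.getStats() API. The stat field names follow the current W3C spec (the 2015-era Chrome API was callback-based), and this is illustrative rather than the measurement pipeline behind these results.

```javascript
// Illustrative sketch: sample per-stream receive quality from a live
// RTCPeerConnection. Assumes `pc` is an already-established connection;
// field names follow the modern W3C stats spec and vary across browsers.
function samplePerStreamQuality(pc) {
  pc.getStats().then(function (report) {
    report.forEach(function (stat) {
      if (stat.type === 'inbound-rtp') {
        console.log(stat.kind, 'packetsLost:', stat.packetsLost,
                    'jitter (s):', stat.jitter);
      } else if (stat.type === 'remote-inbound-rtp') {
        // Round-trip time as reported back by the remote receiver via RTCP.
        console.log(stat.kind, 'roundTripTime (s):', stat.roundTripTime);
      }
    });
  });
}

// e.g. sample every 10 seconds while the call is up:
// setInterval(function () { samplePerStreamQuality(pc); }, 10000);
```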


Page 4: One year of measuring WebRTC service quality

Overall Metrics


Page 5: One year of measuring WebRTC service quality

Conference Timeline


Page 6: One year of measuring WebRTC service quality

Disruptions

Disruption: loss of connectivity when network interfaces change, low available capacity, or high delay

The light grey vertical lines show disruptions, highlighted by the red bounding boxes.
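
A minimal, illustrative way to detect such disruptions on the client is to watch the ICE connection state; treating 'disconnected'/'failed' as a disruption is an assumption here, not necessarily the definition used for these measurements.

```javascript
// Sketch: mark a disruption when the ICE connection leaves the
// 'connected'/'completed' states and record how long it lasted.
// Assumes `pc` is the RTCPeerConnection for the ongoing call.
function watchDisruptions(pc, onDisruption) {
  var disruptionStart = null;
  pc.addEventListener('iceconnectionstatechange', function () {
    var state = pc.iceConnectionState;
    if (state === 'disconnected' || state === 'failed') {
      if (disruptionStart === null) disruptionStart = Date.now();
    } else if (state === 'connected' || state === 'completed') {
      if (disruptionStart !== null) {
        onDisruption({ durationMs: Date.now() - disruptionStart });
        disruptionStart = null;
      }
    }
  });
}

// Usage: watchDisruptions(pc, function (d) { console.log('disruption', d); });
```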

Page 7: One year of measuring WebRTC service quality

Disruptions and user behaviour

User Behaviour: The user tries to correct for the disruption by turning video off and on
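
That on/off toggling corresponds to flipping the enabled flag on the local video track; a tiny sketch (variable and function names are illustrative, not from the talk):

```javascript
// Sketch: toggle the local video track and log the user action so it can be
// correlated with the disruption timeline. `localStream` is the MediaStream
// obtained from getUserMedia().
function setVideoEnabled(localStream, enabled) {
  localStream.getVideoTracks().forEach(function (track) {
    track.enabled = enabled;   // false = video paused, true = video resumed
  });
  console.log('user event:', enabled ? 'video resumed' : 'video paused');
}
```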


Page 8: One year of measuring WebRTC service quality

Growth

[Chart: growth over Jan-Dec]

• purely web endpoints

• support: TURN, VideoBridge

• no mobile, yet

Page 9: One year of measuring WebRTC service quality

Browsers

[Pie chart: browser share]

Chrome: 95%, Firefox: 5%, Opera/Temasys: 0.05%


Page 11: One year of measuring WebRTC service quality

OSes

[Bar chart: OS distribution across Android, Windows, Mac, and Linux; values 49%, 34%, 13%, 4% (using a browser)]

Page 12: One year of measuring WebRTC service quality

Number of Participants

[Bar chart: number of participants per conference; buckets 2, 3, 4, 5, 6-8, >8; values 64%, 30%, 0.3%, 3%]

Page 13: One year of measuring WebRTC service quality

Types of Relays

[Bar chart: relay usage]

No Relay: 78%, TURN/UDP: 13%, TURN/TCP: 7%, TURN/TLS: 2%
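
Whether a call ended up relayed, and over which TURN transport, can be recovered from the selected ICE candidate pair in getStats(); below is a sketch assuming the modern promise-based stats API and the spec's candidateType/relayProtocol fields, not the exact classification used for these numbers.

```javascript
// Sketch: classify a session as "No Relay" or TURN/<transport> by inspecting
// the nominated candidate pair. Assumes `pc` is a connected RTCPeerConnection.
function classifyRelay(pc) {
  return pc.getStats().then(function (report) {
    var result = 'unknown';
    report.forEach(function (stat) {
      if (stat.type === 'candidate-pair' && stat.nominated && stat.state === 'succeeded') {
        var local = report.get(stat.localCandidateId);
        if (!local) return;
        if (local.candidateType === 'relay') {
          // relayProtocol is 'udp', 'tcp' or 'tls' when a TURN server is in the path.
          result = 'TURN/' + (local.relayProtocol || 'unknown').toUpperCase();
        } else {
          result = 'No Relay';
        }
      }
    });
    return result;
  });
}

// Usage: classifyRelay(pc).then(function (r) { console.log('relay:', r); });
```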

Page 14: One year of measuring WebRTC service quality

IPv6?

[Bar chart: IPv4 vs IPv6, overall: IPv4 97%]

Not all is lost; for Europe alone:

[Bar chart: IPv4 83%, IPv6 17%]

Page 15: One year of measuring WebRTC service quality

How many ICE candidates?

[Bar chart: ICE candidates gathered per session; buckets 0-4, 5-8, 8-20, >20; values 68%, 8%, 17%, 7%]

Candidate types: host, STUN, TURN; IPv4 and IPv6; multi-homed endpoints

24 candidates: 8%
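
Counting gathered local candidates per session is easy to reproduce on the client; here is a purely illustrative sketch (the bucket boundaries above come from the talk's data, not from this code):

```javascript
// Sketch: count local ICE candidates by type ('host', 'srflx', 'relay', ...)
// as they are gathered. `pc` is an RTCPeerConnection that is being negotiated.
function countCandidates(pc) {
  var counts = { host: 0, srflx: 0, prflx: 0, relay: 0, total: 0 };
  pc.addEventListener('icecandidate', function (event) {
    if (!event.candidate) {                 // null candidate = gathering finished
      console.log('ICE gathering done:', counts);
      return;
    }
    counts.total += 1;
    var m = / typ (\w+)/.exec(event.candidate.candidate);  // parse "typ host" etc.
    if (m && counts.hasOwnProperty(m[1])) counts[m[1]] += 1;
  });
  return counts;
}
```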

Page 16: One year of measuring WebRTC service quality

Setup times

[Bar chart: call setup times; buckets <1s, 1-2s, 2-5s, 5-10s, >10s; values 47%, 23%, 14%, 7%, 9%]
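
One plausible client-side definition of setup time, sketched below, is the interval from starting the offer to the ICE connection first becoming usable; this is an assumption for illustration, not necessarily the metric behind the chart above.

```javascript
// Sketch: measure setup time as the interval between starting signalling and
// the ICE connection first reaching 'connected'/'completed'.
function measureSetupTime(pc) {
  var start = performance.now();
  pc.addEventListener('iceconnectionstatechange', function handler() {
    var state = pc.iceConnectionState;
    if (state === 'connected' || state === 'completed') {
      console.log('setup time:', Math.round(performance.now() - start), 'ms');
      pc.removeEventListener('iceconnectionstatechange', handler);
    }
  });
}

// Call measureSetupTime(pc) right before pc.createOffer() / signalling starts.
```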

Page 17: One year of measuring WebRTC service quality

Failure Reasons

~9% of calls fail

[Pie chart of failure reasons: 95%, 5%, <1%]

~20% of the setup calls have issues

Page 18: One year of measuring WebRTC service quality

Time to Failure

[Bar chart: time to failure; buckets <10s, 10-30s, 30-60s, >60s; values 50%, 33%, 7%, 10%]

People are very patient!

Perhaps because there is no mobile yet.

Page 19: One year of measuring WebRTC service quality

Churn*

*a participant rejoins the same call repeatedly

[Bar chart: average joins per participant per conference; buckets: no rejoins, once, twice, thrice]

No rejoins: 76%

Page 20: One year of measuring WebRTC service quality

Distribution of RTTs

[Bar chart: distribution of RTTs; buckets <20ms, 20-50ms, 50-150ms, 150-400ms, 0.4-1s, 1-3s, >3s; values 40%, 21%, 9%, 11%, 4%, 4%, 1%]
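
These are transport-level round-trip times; a sketch of reading the current RTT from the active candidate pair follows (it assumes the spec's currentRoundTripTime field, which may be unavailable in older browsers).

```javascript
// Sketch: read the current transport round-trip time from the nominated ICE
// candidate pair. Assumes `pc` is a connected RTCPeerConnection.
function currentRtt(pc) {
  return pc.getStats().then(function (report) {
    var rtt = null;
    report.forEach(function (stat) {
      if (stat.type === 'candidate-pair' && stat.nominated && stat.state === 'succeeded') {
        rtt = stat.currentRoundTripTime;   // seconds, from STUN connectivity checks
      }
    });
    return rtt;
  });
}

// Usage: currentRtt(pc).then(function (s) {
//   if (s !== null) console.log('RTT:', s * 1000, 'ms');
// });
```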

Page 21: One year of measuring WebRTC service quality

Summary

• Browsers: Chrome dominates WebRTC

• OSes: Windows and Mac are pretty even

• Participants: ~3 participants in a call on average

• Relays: ~20% sessions need a TURN server

• Failures: NAT/FW still causes most failures

• Churn: 25% of sessions have a participant rejoining


Page 22: One year of measuring WebRTC service quality

A Very Simple API

• 3 lines of code -> 5 minutes.

1. include <script>

2. initialize() -> needs registration keys

3. addNewFabric(pc,…) -> call started

• optionally: send user events: audio muted/unmuted, video paused/resumed, call terminated, call held.

• optionally: reportError() -> call failed to be set up due to SDP or firewall issues.

• optionally: associateMstWithUserID() -> bridges using multiple media streams in a single PeerConnection may want to correlate MediaStream quality across endpoints.

• optionally: collect and send user feedback

http://www.callstats.io/api/
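
Putting the three steps together, a hedged sketch of what the integration might look like: the function names come from this slide, but the parameter lists, the global constructor name, and the fabric-usage value are simplified assumptions, so consult http://www.callstats.io/api/ for the authoritative signatures.

```javascript
// Sketch of the 3-step integration described above. Function names are from
// the slide; parameter details are assumed/simplified.

// 1. include the callstats.io <script> tag (assumed here to expose a global
//    `callstats` constructor).
var callStats = new callstats();

// Placeholder values for this sketch:
var AppID = 'YOUR_APP_ID', AppSecret = 'YOUR_APP_SECRET';
var localUserID = 'alice', remoteUserID = 'bob', conferenceID = 'demo-call-1';
var pc = new RTCPeerConnection();

// 2. initialize() -> needs the registration keys obtained when signing up.
callStats.initialize(AppID, AppSecret, localUserID);

// 3. addNewFabric(pc, ...) -> tells the service a call has started on this
//    RTCPeerConnection ('multiplex' is an assumed fabric-usage label).
callStats.addNewFabric(pc, remoteUserID, 'multiplex', conferenceID);

// Optionally, report a failed setup (e.g. SDP or firewall problems):
// callStats.reportError(pc, conferenceID, 'createOffer');
```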

Page 23: One year of measuring WebRTC service quality

Vision

[Diagram: Gather -> Report -> Analyze -> FIX -> Deploy?]

Annotations: webrtc-internals? via email? Has this been reported before? Did it solve the issue? Until next report?

[Diagram: Collect -> Diagnose -> Fix -> Deploy]

“A world where real-time communication is both frictionless and effortless to set up, operate, and scale.”
