Slide 1: SIP End-to-End Performance Metrics
70th IETF Conference, PMOL
Daryl Malas
Slide 2: Problem Statement
With the widespread implementation of SIP, the following problems have surfaced:
– No standard method for measuring SIP performance
– Enterprise and SIP Service Provider confusion on "How" to measure
– Enterprise and SIP Service Provider confusion on "Where" to measure
– Current reliance on PSTN (and other) metrics
The draft defines the "How" and the "Where" for common metrics applicable to ALL SIP applications.
Slide 3: Relevant Standards Development
IETF
– PMOL WG to develop a standard to benchmark end-to-end SIP application performance
– BMWG developing a standard to benchmark SIP networking device performance
Other
– SPEC developing industry-available test code for SIP benchmarking in accordance with the IETF's BMWG and PMOL standards
– ITU-T (Study Group 12) focused on detailed failure/success scenarios and end-to-end benchmark guidance
Slide 4: History
Introduced in the SIPPING WG at the 66th IETF conference
Confusion around scope relative to BMWG SIP tests
Confusion around the draft's home (e.g. IPPM, SIPPING, or BMWG)
– The draft did not fit within the charter of these groups
Unanimous SIPPING WG consensus of interest and to continue development of the draft
Updates and continual progress on the draft requested by the WG and ADs
Further updates and progress discussed in the SIPPING WG at the 67th and 68th IETF conferences
Draft still in search of a home at the 68th conference
Draft introduced in the APM BOF at the 69th IETF conference
Slide 5: Draft Discussion
Consensus on purpose and scope
Framework concept – LCR (Least Cost Routing) and QBR (Quality Based Routing)
Slide 6: Example Metric – SER
Session Establishment Rate (SER)
– Detects the ability of a terminating UA or downstream proxy to successfully establish new sessions
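A minimal sketch of how SER could be computed at a measurement point, assuming the common definition of session establishment: new-session setups (INVITEs answered with a 200 OK) divided by attempted setups, with redirected (3XX) attempts excluded from the denominator. The function and counter names below are illustrative, not taken from the draft.

# Illustrative sketch (not from the draft): Session Establishment Rate from
# INVITE transaction counts observed at an originating UA or proxy.
def session_establishment_rate(invites_sent: int,
                               invites_200_ok: int,
                               invites_3xx: int) -> float:
    """SER as a percentage of new-session attempts that succeeded."""
    attempts = invites_sent - invites_3xx  # redirected attempts excluded (assumption)
    if attempts <= 0:
        return 0.0
    return 100.0 * invites_200_ok / attempts

# Example: 870 of 1000 INVITEs answered with 200 OK, none redirected -> 87.0
print(session_establishment_rate(1000, 870, 0))

An operator measuring SER separately toward each upstream provider, as on the next slide, would feed per-provider counters through the same calculation.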
Slide 7: "Real World" Application
Enterprise with multiple upstream SIP Service Providers (SSPs)
– SER for SSP A: 68%
– SER for SSP B: 87%
What can the Enterprise do with this data?
Slide 8: Currently Defined Metrics
Registration Request Delay (RRD)
– Detect failures or impairments causing delays in REGISTER requests
Session Request Delay (SRD)
– Detect failures or impairments causing delays in new session requests
Session Disconnect Delay (SDD)
– Detect failures or impairments causing delays in ending a session
Session Duration Time (SDT)
– Detect problems (e.g. poor audio quality) causing short sessions
Average Hops per Request (AHR)
– Detect inefficient routing and failure conditions due to the number of elements traversed
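The three delay metrics on this slide (RRD, SRD, SDD) are all request-to-response timing measurements, so they can be sketched with one helper. The sketch below assumes timestamps captured at a single measurement point (e.g. the originating UA); names and structure are illustrative, not defined in the draft.

from statistics import mean

def transaction_delay(request_time: float, response_time: float) -> float:
    """Delay for a single transaction, in seconds."""
    return response_time - request_time

def average_delay(samples: list[tuple[float, float]]) -> float:
    """Average delay (RRD, SRD, or SDD) over (request, response) timestamp pairs."""
    return mean(transaction_delay(req, rsp) for req, rsp in samples)

# Example: three REGISTER transactions -> average Registration Request Delay
register_samples = [(0.00, 0.12), (5.00, 5.09), (9.50, 9.71)]
print(average_delay(register_samples))  # ~0.14 s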
Slide 9: Currently Defined Metrics (continued)
Session Establishment Efficiency Rate (SEER)
– Complements SER, but excludes potential effects of the terminating UAS
Session Defects (SD)
– Detect consistent failures in dialog processing
Ineffective Session Attempts (ISA)
– Detect proxy or UA failures or congestion conditions causing setup requests to fail
Session Disconnect Failures (SDF)
– Detects when an active session is terminated due to a failure condition
Session Completion Rate (SCR)
– Detects failures caused by downstream proxies not responding to new session setups
Session Success Rate (SSR)
– Ratio of successful sessions compared with sessions failing due to ISA or SDF
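One possible reading of the SSR bullet above, sketched here as an assumption only: treat any attempt counted as an ISA or SDF as a failed session and express the remainder as a percentage of all attempts. The exact counting rules would come from the draft; the names and formula are illustrative.

def session_success_rate(total_attempts: int, isa: int, sdf: int) -> float:
    """Assumed SSR: sessions with neither a setup failure (ISA) nor an
    abnormal disconnect (SDF), as a percentage of all attempts."""
    if total_attempts == 0:
        return 0.0
    return 100.0 * (total_attempts - isa - sdf) / total_attempts

# Example: 1000 attempts, 50 ISAs, 20 SDFs -> 93.0
print(session_success_rate(1000, 50, 20))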
Slide 10: Next Steps
Agreed-upon set of metrics:
– Too many?
– Too few?
– Different metrics?
Concerns or questions
Working group item?