Issues in Evaluation Criteria Document November 15, 2006.

Traffic Models
- At the May '06 Interim meeting, the issue of the traffic model in the current evaluation criteria document was discussed [1].
- The traffic model mix used in proposal evaluation was not able to test reverse-link performance sufficiently.
- An appropriate traffic model mix needs to be investigated that can help the WG evaluate performance on the reverse link.
- For example, modeling file uploads from the mobile user to the base station can be used to simulate one of the popular applications, i.e., photo or video upload from camera phones and laptops.
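As an illustration of the kind of reverse-link traffic source suggested above, the sketch below generates file-upload sessions with lognormally distributed file sizes separated by exponentially distributed "reading" gaps. All parameter values (mean file size, sigma, reading time) are hypothetical placeholders, not figures from the evaluation criteria document.

```python
import math
import random

def upload_sessions(n_files, mean_file_kb=1024.0, sigma=0.8,
                    mean_reading_time_s=30.0, seed=None):
    """Sketch of a reverse-link file-upload traffic source.

    Returns a list of (start_offset_s, file_size_bytes) tuples: file
    sizes are lognormal, and gaps between uploads ("reading time") are
    exponential. Parameter values are illustrative only.
    """
    rng = random.Random(seed)
    # choose mu so the lognormal mean equals the requested mean size
    mu = math.log(mean_file_kb * 1024) - sigma ** 2 / 2
    t = 0.0
    sessions = []
    for _ in range(n_files):
        size = rng.lognormvariate(mu, sigma)
        sessions.append((t, int(size)))
        t += rng.expovariate(1.0 / mean_reading_time_s)  # reading time
    return sessions
```

Such a generator could be plugged into a system simulator to load the reverse link with upload traffic instead of (or in addition to) the existing downlink-heavy mix.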

Spectrum block size for comparison
- Evaluation of performance in a fixed spectrum block should be specified, as in the pre-approved version of the evaluation criteria document [2].
- Because proposals may be designed for operation at different basic bandwidths, evaluation in a typical spectrum block enables a fair comparison between proposals.
- Consideration of the actual deployment scenario should be included, e.g., the amount of guard band required between adjacent carriers.
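The deployment-scenario comparison above reduces to simple arithmetic: how many carriers, each with its guard band, fit in a given spectrum block, and how much of the block actually carries traffic. The sketch below is purely illustrative; the bandwidth figures in the example are assumptions, not values from the evaluation criteria document.

```python
def carriers_in_block(block_mhz, carrier_mhz, guard_mhz):
    """Count the carriers that fit in a spectrum block when a guard band
    is required between adjacent carriers, and return the resulting
    usable bandwidth. Illustrative arithmetic only."""
    n = 0
    # each additional carrier costs one carrier bandwidth plus one
    # guard band separating it from the previous carrier
    while (n + 1) * carrier_mhz + n * guard_mhz <= block_mhz:
        n += 1
    return n, n * carrier_mhz

# Example: in a 10 MHz block with 1.25 MHz carriers and 0.25 MHz guard
# bands, only 6 carriers fit, so 7.5 MHz of the block carries traffic.
```

This is why a proposal's nominal per-carrier efficiency can differ noticeably from its efficiency measured over a fixed spectrum block.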

System Performance at 250 km/h
- The channel model mix has not included higher-mobility users in several scenarios.
- Despite the practical reasons given in the evaluation criteria document [3], some evaluation of system-level performance for the higher-mobility cases should be included.
- For example, handoff performance for mobile users at the highest mobility of 250 km/h should be evaluated.
- The Systems Requirements section states that the system shall support vehicular speeds up to 250 km/h.
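One reason 250 km/h stresses system-level performance is the Doppler spread it induces, which shortens the channel coherence time and makes channel estimation and handoff timing harder. A rough back-of-the-envelope calculation (the 2 GHz carrier frequency below is an assumed example, not a value from the document):

```python
def max_doppler_hz(speed_kmh, carrier_hz):
    """Maximum Doppler shift f_d = v * f_c / c for a mobile moving at
    the given speed toward the base station."""
    c = 299_792_458.0        # speed of light, m/s
    v = speed_kmh / 3.6      # km/h -> m/s
    return v * carrier_hz / c

# At 250 km/h on a 2 GHz carrier the maximum Doppler shift is roughly
# 463 Hz, implying a channel coherence time on the order of 1 ms --
# far more demanding than the pedestrian and 120 km/h vehicular cases.
```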

Evaluation Stages
- A phased approach to technology evaluation provides an opportunity for the system simulators from various proponents to be calibrated, so that simulation results can be compared fairly.
- Thus, the phased approach to proposal evaluation may be worth considering again, as discussed in Section 6 of [4].

Others
- Conduct a thorough comparison between the Systems Requirements and Evaluation Criteria documents to ensure consistency.
- Also ensure consistency between the Evaluation Criteria and Channel Models documents.

References
1. 'Proposed traffic modeling ad hoc', C /18, Mar 7
2. 'Draft Evaluation Criteria document - Version 17r1', September 14, 2005
3. 'Evaluation Criteria document - Ver. 1.0', IEEE PD-09, September 22, 2005
4. C /10r2, Section 6, Jan 18, 2006

Appendix – Section 6 of [4]
Major changes were made to the Evaluation Criteria document at the September 2005 Interim meeting. Substantial differences have been identified in a comparison between the two versions of the evaluation criteria document:
- IEEE Evaluation Criteria Document V.17r1, September 14, 2005 [5], which is an "Updated Version of Evaluation Criteria document based upon Editor's clean up of the document and agreements from Session #14, May 17-19, 2005; plus additional Editorial cleanups per notes from Members; and changes agreed at Session #15 plus inputs from Two Conference Calls", as quoted from the cover page of the document.
- IEEE Evaluation Criteria Document V1.0, PD-09, September 23, 2005 [3], which is the final version approved at the September 2005 Interim meeting, Session #16.

Appendix – Section 6: (Phased) Approach for Technology Evaluation
Until Version 17r1, this section clearly described the two-phase approach to proposal evaluation, as quoted below:
"The evaluation will be structured in two phases with each phase progressively adding more complexity. The evaluation work for each proposal may then be compared at each phase to ensure a progressive 'apples to apples' comparison of proposals. This structured approach will also provide performance metrics for the physical and link layer performance early rather than later in the evaluation process."
For Phase 1: "The goals at the end of phase 1 are, first, to achieve confidence that different simulation models are calibrated and, second, to present fundamental performance metrics for the physical and link layer of various proposals."
The following has been specified for each phase of the evaluation:
Phase 1:
- System-level calibration
- Channel models: Pedestrian B, 3 km/h; Vehicular B, 120 km/h
- Full-buffer traffic model
Phase 2:
- Additional traffic models
- Additional channel models/channel model mix
- TCP model, etc.
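For context, the full-buffer model listed for Phase 1 is the simplest traffic assumption: every user always has data queued, so per-user throughput depends only on the scheduler and the link rates, which is exactly what makes it suitable for calibrating simulators. A minimal round-robin sketch (the link rates and slot length are hypothetical, not values from the document):

```python
def full_buffer_throughput(link_rates_bps, slot_s=0.001, n_slots=1000):
    """Average per-user throughput under a full-buffer model with a
    round-robin scheduler: every user always has data, so each user
    simply receives its share of the time slots."""
    n = len(link_rates_bps)
    bits = [0.0] * n
    for t in range(n_slots):
        u = t % n                          # round-robin user selection
        bits[u] += link_rates_bps[u] * slot_s
    total_time = n_slots * slot_s
    return [b / total_time for b in bits]
```

With two users at 1 Mbps and 2 Mbps link rates, each is scheduled half the time, so they average roughly 0.5 Mbps and 1 Mbps; because every simulator should reproduce this number exactly, full-buffer runs make calibration discrepancies easy to spot.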

Appendix – Approved Version
The two-phase approach has been replaced by two reports, with the following description: "The goals of the first report are, first, to achieve confidence that different simulation models are calibrated and, second, to present fundamental performance metrics for the physical and link layer of various proposals."
- Important information on the performance characteristics of the proposed technology that should have been obtained in the Phase 2 evaluation is not available.
- Throughput performance as affected by TCP flow control algorithms has not been included in the evaluation.