Slide 1: Hybrid Bit-stream Models

Slide 2: Hybrid bit-stream model: Type 1
- Pros: Simple. All we need are open-source codecs.
- Cons: May lose some information that is available at the receiver.

Slide 3: Hybrid bit-stream model: Type 2
- Pros: Still simple. All we need are open-source codecs and an open-source packetizer/depacketizer. Some codecs already incorporate a packetizer/depacketizer.
- Cons: May lose some information that is available at the receiver.
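The slides do not define a concrete interface for the Type 1 and Type 2 models, so the following is only a minimal sketch of the idea: decode the received elementary stream with an open-source codec and hand both the decoded frames and whatever coding parameters can be parsed from the bit-stream to the quality model. Every name here (HybridModel, decode_with_open_source_codec, extract_bitstream_features, predict_mos) is a hypothetical placeholder, not part of the proposal.

```python
# Minimal sketch only; the testplan, not this code, defines the real model interface.
from typing import List

def decode_with_open_source_codec(elementary_stream: bytes) -> List[bytes]:
    """Stand-in for an open-source decoder (e.g. a reference H.264 decoder);
    returns decoded frames as raw YUV buffers."""
    raise NotImplementedError  # placeholder

def extract_bitstream_features(elementary_stream: bytes) -> dict:
    """Stand-in for parsing coding parameters (QP, frame types, bit-rate, ...)."""
    raise NotImplementedError  # placeholder

class HybridModel:
    """Type 1 idea: the model only sees the elementary stream handed to the decoder,
    so information available earlier in the receiver chain (packet headers, loss
    patterns) is not visible to it; that is the 'cons' noted on this slide."""

    def predict_mos(self, elementary_stream: bytes) -> float:
        frames = decode_with_open_source_codec(elementary_stream)
        features = extract_bitstream_features(elementary_stream)
        return self._combine(frames, features)

    def _combine(self, frames, features) -> float:
        raise NotImplementedError  # proponent-specific quality estimation
```

A Type 2 model would differ only in that an open-source depacketizer is also available, so the same kind of parsing could be applied to the packetized stream before decoding.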

Slide 4: Hybrid bit-stream model: Type 3
- Proposed by Mr. Osamu Sugimoto.
- The model can use all available information, because it is allowed to use any bit-stream data.
- We need open source for all components.
- [Diagram: source contents (video, audio, data) pass through the video/audio encoders (ES), the packetizer (PES), and the MUX into multiplexed 204-byte TS packets, then through error correction (FEC) and packet reordering/interleaving to the IP packetizer (or modulation for air transmission) and the channel; several multiplexed A/V packets fit in one 1500-byte IP packet.]
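The diagram refers to 188/204-byte TS packets carried several at a time inside roughly 1500-byte IP packets. As an illustration of the kind of open-source component a Type 3 model would rely on, here is a minimal sketch that splits an RTP payload back into TS packets and reads the TS header fields a bit-stream model might use. It assumes TS-over-RTP carriage with an integer number of TS packets per payload (as in RFC 2250) and treats the 204-byte variant as a 188-byte packet followed by 16 bytes of Reed-Solomon parity; the function names are mine.

```python
# Sketch under the assumptions stated above; not the component proposed on the slide.
TS_SYNC = 0x47  # MPEG-2 TS sync byte at the start of every TS packet

def split_ts_packets(rtp_payload: bytes) -> list:
    """Split one RTP payload into TS packets, guessing 188- vs 204-byte packets."""
    for size in (188, 204):
        aligned = len(rtp_payload) > 0 and len(rtp_payload) % size == 0
        if aligned and all(rtp_payload[i] == TS_SYNC
                           for i in range(0, len(rtp_payload), size)):
            return [rtp_payload[i:i + size]
                    for i in range(0, len(rtp_payload), size)]
    raise ValueError("payload is not aligned to 188- or 204-byte TS packets")

def ts_header_fields(ts_packet: bytes) -> dict:
    """Decode the 4-byte TS header fields a bit-stream model might look at."""
    b1, b2, b3 = ts_packet[1], ts_packet[2], ts_packet[3]
    return {
        "transport_error": bool(b1 & 0x80),
        "payload_unit_start": bool(b1 & 0x40),
        "pid": ((b1 & 0x1F) << 8) | b2,
        "continuity_counter": b3 & 0x0F,  # gaps between packets of a PID reveal loss
    }
```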

Slide 5: Bottom lines
- All components should have open-source programs.
- If proponents or ILGs propose to use particular bit-stream data, they should provide source code to process that bit-stream data.
- If proponents or ILGs propose to use a particular codec, they should provide source code for the codec.
- A model will be given bit-stream data and the raw PVS data that will be used for subjective testing. FR and RR models will be given additional data.
- The project should be done in a timely manner.

Slide 6: Things to be determined
- Codec types: should be open source.
  - MPEG-4, H.264, etc.
  - Which codec do we want to do first?
- Test conditions
  - Are we going to use the test conditions of the multimedia testplan with minor changes (as few as possible)?
  - All test conditions in the multimedia testplan would be admissible if the corresponding bit-stream data is available.
- Subjective testing
  - Option 1: proponents and ILGs.
    - Pros: more test conditions.
    - Cons: difficult to coordinate; can delay the whole process significantly. We cannot afford to spend years developing Recommendations.
  - Option 2: ILGs only.
    - Pros: can be done in a timely and professional manner.
    - Cons: fewer test conditions.

Slide 7: Time Schedule
- We cannot afford to spend years finishing the project. Industry needs good objective models immediately, and other companies are already introducing their products to the market.
- A proposed schedule (ambitious?):
  - Approval of the testplan (by the next VQEG meeting): most of the work should be done through the reflector; editorial work can be shared by the co-chairs and other volunteers.
  - Submission of models: six months after approval of the testplan.
- This technology improves rapidly and does not require backward compatibility. We might have to be prepared to revise the standards when more promising technologies become available.

Slide 8: Decisions to be made

Slide 9: Scope of hybrid bit-stream models
- Video formats: (CIF, VGA), (SD, HDTV). AGREED
- Model types: FR, RR, NR. AGREED. Parametric bit-stream (without PVS). AGREED
- Codecs, etc. Provisionally AGREED:
  - MP4 (multiplexing method)
  - MPEG-2 in MPEG-2 TS, with/without RTP
  - H.264 in MPEG-2 TS, with/without RTP
  - VC-1 in native TS, with/without RTP
  - (further details to be provided, e.g. coding profiles)
- Decisions: see above.

Slide 10: Bit-stream data to be used
- Basic requirement: there should be open source code to process the bit-stream data.
- Bit-stream data to the decoder
- Bit-stream data to the depacketizer
- Any other bit-stream data
- Decision:
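To make the tap points listed above concrete, the sketch below shows one hypothetical container for the data a model could receive; the actual input format would be defined by the testplan, and the field names here are illustrative only.

```python
# Hypothetical input container, not a format defined by the slides.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ModelInput:
    # Bit-stream data to the depacketizer: captured IP/RTP (or TS) packets,
    # each with a capture time-stamp so loss and jitter remain observable.
    packets: List[bytes] = field(default_factory=list)
    packet_timestamps: List[float] = field(default_factory=list)

    # Bit-stream data to the decoder: the demultiplexed elementary stream.
    elementary_stream: bytes = b""

    # Decoded PVS frames for hybrid models; reference frames for FR/RR models only.
    pvs_frames: Optional[List[bytes]] = None
    reference_frames: Optional[List[bytes]] = None
```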

Slide 11: Subjective testing
- Subjective testing can be done by proponents and ILGs, or by ILGs only.
  - Option 1: proponents and ILGs.
    - Pros: more test conditions.
    - Cons: difficult to coordinate; can delay the whole process significantly. We cannot afford to spend years developing Recommendations.
  - Option 2: ILGs only.
    - Pros: can be done in a timely and professional manner.
    - Cons: fewer test conditions.
- Decision: The ILG does as little as possible, only what is necessary to ensure fairness for all proponents.

Slide 12: Subjective testing method
- ACR, ACR-HRR, DSCQS, etc.
- Decision (provisional):
  - ACR-HRR for CIF and VGA
  - ACR-HRR for SD and HD
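For ACR with hidden reference removal, differential scores are commonly formed per viewer as DMOS = rating(PVS) - rating(hidden reference) + 5 on the 5-point ACR scale. The sketch below assumes that convention; the exact formula would need to be confirmed against the final testplan.

```python
# Assumes the common ACR hidden-reference-removal convention described above.
from statistics import mean

def dmos_per_viewer(pvs_rating: int, hidden_ref_rating: int) -> int:
    # A PVS rated equal to its hidden reference yields DMOS = 5 (no visible degradation).
    return pvs_rating - hidden_ref_rating + 5

def dmos(pvs_ratings: list, hidden_ref_ratings: list) -> float:
    """Average the per-viewer differential scores for one PVS."""
    return mean(dmos_per_viewer(p, r)
                for p, r in zip(pvs_ratings, hidden_ref_ratings))

# Example: a viewer rates the PVS 3 and the hidden reference 5,
# contributing 3 - 5 + 5 = 3 to that PVS's DMOS.
```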

Slide 13: Test conditions and source pool
- The test conditions of the multimedia testplan can be used for the hybrid testplan, with minor changes if necessary.
- The source materials currently available will be used. Additional materials will also be used if they are submitted in a timely manner under a similar license condition.
- Decision: Yes. We also need new source content that does not expire. (Content NDAs will need to be distributed to new proponents.)

Slide 14: Capturing bit-stream data / raw data
- Raw data: the current methods of the multimedia testplan can be used.
- Software for capturing bit-stream data
- Time-stamps for the capture time
- Use of an error simulator for packet loss
- Decision: We need software tools to capture bit-stream data.
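For the error-simulator bullet, the following is a minimal sketch of a packet-loss tool: it drops packets independently (Bernoulli losses) and records a capture time-stamp for each surviving packet. Real test conditions would more likely need a bursty loss model (e.g. Gilbert-Elliott), and all names here are illustrative.

```python
# Illustrative loss-simulator sketch, not a tool referenced by the slides.
import random
import time
from typing import Iterable, List, Tuple

def simulate_loss(packets: Iterable[bytes],
                  loss_rate: float,
                  seed: int = 0) -> List[Tuple[float, bytes]]:
    """Drop each packet independently with probability loss_rate and return
    (capture_timestamp, packet) pairs for the packets that survive."""
    rng = random.Random(seed)  # fixed seed makes a test condition reproducible
    captured = []
    for pkt in packets:
        if rng.random() >= loss_rate:
            captured.append((time.time(), pkt))
    return captured
```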

Slide 15: Schedule
- Step 1. Decision on codecs, video formats, and bit-stream data: May 9, 2007.
- Step 2. Distribution of open source code for the codecs and the bit-stream data: two months after Step 1 (proponents and ILGs; July 15, 2007).
- Step 3. Design of test scenarios: two months after Step 2 (editorial group, with approval of proponents and ILGs; September 15, 2007).
- Step 4. First draft of the testplan: one month before the next VQEG meeting (editorial group, with approval of proponents and ILGs; September 15, 2007).
- Step 5. Approval of the testplan: one month after Step 4; October 15, 2007.
- Step 6. Model submission: six months after approval of the testplan; April 15, 2008.

Slide 16: Potential proponents for hybrid models
- Survey: BT, KDDI, NEC, NTIA (with collaborating labs), NTT, Opticom, Psytechnics, Yonsei, Qualideo, Ghent University
- Others (?): Genista, TDF

Slide 17: Table of contents of the multimedia testplan
 1. Introduction
 2. List of Definitions
 3. List of Acronyms
 4. Subjective Evaluation Procedure
 5. Test Laboratories and Schedule
 6. Sequence Processing and Data Formats
 7. Objective Quality Models
 8. Objective Quality Model Evaluation Criteria
 9. Recommendation
 10. Bibliography

Slide 18: 1. Introduction
- By editorial group

Slide 19: 2. List of Definitions
- By editorial group

Slide 20: 3. List of Acronyms
- By editorial group

Slide 21: 4. Subjective Evaluation Procedure
 4.1 The ACR Method with Hidden Reference Removal
   4.1.1 General Description
   4.1.2 Application across Different Video Formats and Displays
   4.1.3 Display Specification and Set-up
   4.1.4 Test Method
   4.1.5 Subjects
   4.1.6 Viewing Conditions
   4.1.7 Experiment Design
   4.1.8 Randomization
   4.1.9 Test Data Collection
 4.2 Data Format
   4.2.1 Results Data Format
   4.2.2 Subjective Data Analysis
- Which subjective testing method will be used? (ACR, ACR-HRR, DSCQS)
- By editorial group
- Decision:

Slide 22: 5. Test Laboratories and Schedule
 5.1 Independent Laboratory Group (ILG)
 5.2 Proponent Laboratories
 5.3 Test Procedure and Schedule
- Who is going to do the subjective testing: proponents and ILGs, or ILGs only?
- By editorial group
- Decision:

Slide 23: 6. Sequence Processing and Data Formats
 6.1 Sequence Processing Overview
   6.1.1 Camera and Source Test Material Requirements
   6.1.2 Software Tools
   6.1.3 Colour Space Conversion
   6.1.4 De-Interlacing
   6.1.5 Cropping & Rescaling
   6.1.6 Rescaling
   6.1.7 File Format
   6.1.8 Source Test Video Sequence Documentation
 6.2 Test Materials
   6.2.1 Selection of Test Material (SRC)
 6.3 Hypothetical Reference Circuits (HRC)
   6.3.1 Video Bit-rates
   6.3.2 Simulated Transmission Errors
   6.3.3 Live Network Conditions
   6.3.4 Pausing with Skipping and Pausing without Skipping
   6.3.5 Frame Rates
   6.3.6 Pre-Processing
   6.3.7 Post-Processing
   6.3.8 Coding Schemes
   6.3.9 Processing and Editing Sequences
- By editorial group
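As an illustration of the kind of step section 6.1.3 (Colour Space Conversion) covers, the sketch below converts full-range 8-bit RGB to studio-range Y'CbCr using the Rec. 601 coefficients commonly applied to SD material. The testplan's approved software tools, not this sketch, define the actual processing chain.

```python
# Rec. 601 conversion from full-range 8-bit R'G'B' to studio-range Y'CbCr;
# shown only to illustrate section 6.1.3, not prescribed by the testplan.

def rgb_to_ycbcr_601(r: float, g: float, b: float):
    """r, g, b in [0, 255]; returns (Y, Cb, Cr) with studio-range offsets."""
    y  =  0.257 * r + 0.504 * g + 0.098 * b + 16.0
    cb = -0.148 * r - 0.291 * g + 0.439 * b + 128.0
    cr =  0.439 * r - 0.368 * g - 0.071 * b + 128.0
    return y, cb, cr
```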

Slide 24: 7. Objective Quality Models
 7.1 Model Type
 7.2 Model Input and Output Data Format
 7.3 Submission of Executable Model
 7.4 Registration
- Open source code to read bit-stream data
- By editorial group

Slide 25: 8. Objective Quality Model Evaluation Criteria
 8.1 Evaluation Procedure
 8.2 Data Processing
   8.2.1 Calculating DMOS Values
   8.2.2 Mapping to the Subjective Scale
   8.2.3 Averaging Process
   8.2.4 Aggregation Procedure
 8.3 Evaluation Metrics
   8.3.1 Pearson Correlation Coefficient
   8.3.2 Root Mean Square Error
 8.4 Statistical Significance of the Results
   8.4.1 Significance of the Difference between the Correlation Coefficients
   8.4.2 Significance of the Difference between the Root Mean Square Errors
   8.4.3 Significance of the Difference between the Outlier Ratios
- By editorial group
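Sections 8.3.1 and 8.3.2 name the two headline metrics. The sketch below computes both between DMOS values and model scores that have already been mapped to the subjective scale (8.2.2); the mapping function itself, and any degrees-of-freedom correction applied to the RMSE, are defined by the testplan, not here.

```python
# Pearson correlation (8.3.1) and RMSE (8.3.2) between DMOS and mapped model scores.
import math

def pearson(x: list, y: list) -> float:
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rmse(dmos: list, mapped_scores: list) -> float:
    # Note: the testplan may divide by n minus the fit's degrees of freedom instead of n.
    n = len(dmos)
    return math.sqrt(sum((d - m) ** 2 for d, m in zip(dmos, mapped_scores)) / n)
```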

Slide 26: 7. Objective Quality Models
- By editorial group

Slide 27: 2. List of Definitions
- By editorial group

