
Generation of the Orchestral Media for the Realistic Multimedia Representation

Jae-Kwan Yun1, Mi-Kyung Han1, Jong-Hyun Jang1, and Kwang-Roh Park2

1 Meta-Verse & Emotion Technology Research Team, 218 Gajeong-no, Yuseong-gu, Daejeon, Republic of Korea
2 Green Computing Research Department, 218 Gajeong-no, Yuseong-gu, Daejeon, Republic of Korea
{jkyun, mkhan, jangjh,

Abstract. The future digital home is evolving into an intelligent ubiquitous home, and future media will evolve into orchestral media, which includes not only visual, audio, and text components but also effect information related to specific scenes. Orchestral media can be interlocked with multiple devices in the user's surroundings. To play orchestral media, the effect information must be defined, along with a method for inserting that effect information into the media. Therefore, in this paper, we explain the SEM (Sensory Effect Metadata), the creation of orchestral media, and an orchestral media authoring tool.

Keywords: Orchestral media, realistic multimedia, SEM.

1 Introduction

Until now, conventional media consisting of visual and sound components has been presented through display devices and speakers. Nowadays, however, users want more realistic, high-fidelity multimedia experiences. For example, stereoscopic video, virtual reality, 3DTV, and multi-channel audio are typical media types for realistic experiences. However, these kinds of applications are limited to the visual and audio senses. Special effects, in contrast, can be authored as a separate track synchronized with audiovisual content. While the audiovisual content is being played back, a series of special effects can be produced: shaking curtains for a sensation of fear, turning on a flashbulb for a lightning flash, and so on. Furthermore, fragrance, flame, fog, and scare effects can be produced by a scent device, a flame-thrower, a fog generator, and a shaking chair, respectively.
Orchestral media is media that includes not only visual, audio, and text components but also device control information, synchronization information, and descriptions of other effects for the various useful devices around the user [1]. Therefore, orchestral media can be played on multiple peripheral devices around the user at the same time, giving the user realistic sensations [2], [3].

Fig. 1 illustrates the concept of the orchestral media service, from media generation to presentation. The media generation process shown in this figure depicts a service architecture in which a user films a video in a garden, obtains

Session 5D 687

environmental information (sensory effects) such as temperature, humidity, and wind strength to create the orchestral media, and plays it on a user terminal. That is, with orchestral media it is possible to reproduce the situation at filming time, even several years later. For orchestral media that has already been created, the service defines the devices to be connected according to the contents of the media and connects the various devices, allowing the effects to be played more realistically than with existing media. To do this, we defined the SEM (Sensory Effect Metadata), the orchestral media format that contains the audiovisual content, and the orchestral media service that presents the orchestral media.

This paper is organized as follows. The SEM schema is described in chapter 2. The creation of the orchestral media is proposed in chapter 3. Finally, the conclusion and future work are described in chapter 4.

Fig. 1. Concept of the Orchestral Media: from orchestral media generation to presentation.

2 SEM Schema

In this chapter, we explain the schema design and the syntax & semantics of the SEM.

2.1 Schema Design

The SEM has two main parts: Effect Property and Effect Variables. Effect Property contains the definition of each Sensory Effect applied to the contents. By analyzing Effect Property, the orchestral media service system can match each Sensory Effect to the proper User Device in the user's environment, and prepare and initialize each User Device before processing the media scene. Effect Variables contains the control variables for the Sensory Effects, synchronized with the media stream. Fig. 2 shows the entire process for the SEM.

688 Computers, Networks, Systems, and Industrial Applications
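To make the Effect Property matching step concrete, here is a minimal sketch in Python. The device registry, device names, and the `match_effects_to_devices` helper are all hypothetical illustrations, not part of the paper; only the attribute names (EffectID, Type, AltEffectID) come from the SEM described below.

```python
# Hypothetical sketch of Effect Property matching: the service system inspects
# each Sensory Effect's Type and binds it to a suitable User Device before
# playback. The registry and device names are invented for illustration.
DEVICE_REGISTRY = {
    "WindEffect": ["fan"],
    "DimmingEffect": ["color_led", "dimmer"],
    "VibrationEffect": ["trembling_chair"],
}

def match_effects_to_devices(effect_properties):
    """Bind each declared Sensory Effect to a User Device, falling back to the
    effect named by AltEffectID when no device supports the primary Type."""
    by_id = {e["EffectID"]: e for e in effect_properties}
    bindings = {}
    for effect in effect_properties:
        devices = DEVICE_REGISTRY.get(effect["Type"])
        if not devices and effect.get("AltEffectID"):
            alt = by_id.get(effect["AltEffectID"])
            if alt:
                devices = DEVICE_REGISTRY.get(alt["Type"])
        bindings[effect["EffectID"]] = devices[0] if devices else None
    return bindings

effects = [
    {"EffectID": "e1", "Type": "WindEffect"},
    {"EffectID": "e2", "Type": "FlashEffect", "AltEffectID": "e3"},
    {"EffectID": "e3", "Type": "DimmingEffect"},
]
print(match_effects_to_devices(effects))
```

Here the FlashEffect has no matching device, so the AltEffectID fallback binds it to the alternate DimmingEffect's device instead.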

Fig. 2. The entire process for the SEM, from the Orchestral Media Service System to the User Devices.

2.2 Syntax & Semantics

The SEM element is the root of the SEM schema. This element has three sub-elements, which contain the General Information, the Effect Property, and the Effect Variables. The syntax and semantics of the SEM are shown in Fig. 3 and Fig. 4.

Diagram: SEM : SEMType, with sub-elements SEM:GeneralInfo, SEM:EffectProperty, and SEM:EffectVariable.

Source:
<element name="GeneralInfo" type="mpeg7:DescriptionMetadataType" minOccurs="0"/>

Fig. 3. The XML Schema syntax of the SEM element.

GeneralInfo: An element containing the information on the metadata creation.
EffectProperty: An element containing a list of Sensory Effects and the property of each Sensory Effect applied to the contents.
EffectVariable: An element containing a set of Sensory Effect control variables and time information for synchronization with the media scene.

Fig. 4. The XML Schema semantics of the SEM element.

EffectProperty contains the information about the overall Sensory Effects applied to the contents. An EffectID and Type are defined for each Sensory Effect (the Effect element in the schema) to identify it and to indicate the category of the Sensory Effect. Under the Effect element, there is a set of property elements describing Sensory Effect capabilities, through which the orchestral media service system matches each Sensory Effect to a User Device. The semantics of the attributes are shown in Fig. 5.

Session 5D 689
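The three-part structure above can be illustrated with a small, hypothetical SEM instance. The element names (SEM, GeneralInfo, EffectProperty, Effect, EffectVariable) and attribute names follow the text; the exact serialization, attribute values, and time format are assumptions for illustration only.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical SEM instance following the structure in the text:
# an SEM root with GeneralInfo, EffectProperty (a list of Effect elements
# carrying EffectID/Type), and EffectVariable. Values are invented.
SEM_XML = """
<SEM>
  <GeneralInfo creator="authoring-tool" created="2009-01-01"/>
  <EffectProperty>
    <Effect EffectID="e1" Type="WindEffect" Priority="1"/>
    <Effect EffectID="e2" Type="VibrationEffect" AltEffectID="e1"/>
  </EffectProperty>
  <EffectVariable SEFragmentID="f1" start="00:00:05" duration="00:00:02"/>
</SEM>
"""

root = ET.fromstring(SEM_XML)
# Walk the sub-elements the schema defines.
effect_types = [e.get("Type") for e in root.find("EffectProperty")]
print(root.tag, effect_types)
```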

Name: Definition
EffectID: An attribute containing the ID of an individual Sensory Effect.
Type: An attribute containing the enumeration set of Sensory Effect types:
- "VisualEffect": Sensory Effect for visual display, such as a monitor, TV, wall screen, etc.
- "SoundEffect": Sensory Effect for sound, such as a speaker, music instrument, bell, etc.
- "WindEffect": Sensory Effect for wind, such as a fan, wind injector, etc.
- "CoolingEffect": Sensory Effect for temperature, such as an air conditioner.
- "HeatingEffect": Sensory Effect for temperature, such as a heater, fire, etc.
- "DimmingEffect": Sensory Effect for light, such as a bulb, dimmer, color LED, flash, etc.
- "FlashEffect": Sensory Effect for flash.
- "ShadingEffect": Sensory Effect for curtain open/close, roll screen up/down, door open/close, etc.
- "VibrationEffect": Sensory Effect for vibration, such as a trembling chair, joystick, tickler, etc.
- "DiffusionEffect": Sensory Effect for scent, smog, spray, water fountain, etc.
- "OtherEffect": a Sensory Effect that is not defined above, or a combination of the above effect types.
Priority: An optional attribute defining the priority among a number of Sensory Effects.
AltEffectID: An optional attribute containing the ID of an alternate Sensory Effect that can replace the current Sensory Effect.

Fig. 5. The semantics of the attributes of EffectProperty.

EffectVariable is the container for the various Sensory Effect variables that control a Sensory Effect in detail. Fig. 6 shows the attributes of EffectVariable.

Name: Definition
SEFragmentID: An attribute defining the ID of the fragment of a Sensory Effect.
start: An attribute defining the start time at which the Sensory Effect will be activated.
duration: An attribute defining the duration after which the Sensory Effect will be deactivated.

Fig. 6. The semantics of the attributes of EffectVariable.

3 Generation of the Orchestral Media

The format of the orchestral media is based on the ISO base media file format. To support streaming services, we used the Timed Metadata Format (TeM), a standard method for sending metadata in a streaming manner.
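Stepping back to the start/duration semantics of Fig. 6, a minimal sketch shows how a player could decide which Sensory Effects are active at a given media time. The fragment dictionaries, field names beyond those in Fig. 6, and the `active_effects` helper are invented for illustration.

```python
# Hypothetical EffectVariable fragments: each activates its effect at `start`
# seconds of media time and deactivates it after `duration` seconds (Fig. 6).
fragments = [
    {"SEFragmentID": "f1", "effect": "WindEffect", "start": 5.0, "duration": 2.0},
    {"SEFragmentID": "f2", "effect": "DimmingEffect", "start": 6.0, "duration": 0.5},
]

def active_effects(fragments, media_time):
    """Return the effects that should be running at the given media time (s)."""
    return sorted(
        f["effect"]
        for f in fragments
        if f["start"] <= media_time < f["start"] + f["duration"]
    )

print(active_effects(fragments, 6.2))
```

At 6.2 s both fragments overlap, so both effects are reported as active; at 8.0 s neither is.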
Fig. 7. Fragmentation strategy of the SEM: GeneralInfo and EffectProperty go into the DecoderInit message, and each EffectVariable fragment becomes an Access Unit.

The first step is to fragment the SEM and encapsulate each fragment in a TeM message. Fig. 7 shows the fragmentation strategy of the SEM. All sub-elements of General Information and Effect Property, together with the attributes of Effect Variable, are encapsulated in a DecoderInit message, which must be transported first. Fig. 8 shows an example of the DecoderInit message.

Fig. 8. An example of the DecoderInit message.

Then each fragment, containing start time and duration information along with the actual commands to control user devices, is encapsulated in AccessUnit messages, which must be transported in synchronization with the audio/visual packets. Fig. 9 shows an example of AccessUnit messages.

addNode /EffectVariable...
addNode /EffectVariable...

Fig. 9. An example of AccessUnit messages.

The second step is to embed the TeM access units into the ISO media file format as a metadata track. Embedding the SEM in the ISO base media file format follows the standard specification of MPEG-4 Part 12 Amendment 1. Since the SEM is a kind of timed metadata, we inserted the SEM into the 'meta' box defined in the ISO file format. The handler_type of the Handler Reference Box is 'meta'. Fig. 10 shows the sample description box for the SEM, and Fig. 11 shows the authoring tool, which can insert, delete, edit, and publish the orchestral media.

class MetaDataSampleEntry(codingname) extends SampleEntry(codingname) {
}
class XMLMetaDataSampleEntry() extends MetaDataSampleEntry('meta') {
    string content_encoding;  // optional
    string namespace;
    string schema_location;   // optional
    BitRateBox();             // optional
}

Fig. 10. The sample description box for the SEM.

Session 5D 691
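The two-message fragmentation described above can be sketched as follows. The dictionary message shapes, field names, and the `fragment_sem` helper are invented for illustration; real TeM messages follow MPEG-7 Systems, and the SEM instance here is hypothetical.

```python
import xml.etree.ElementTree as ET

# Sketch of the fragmentation step: GeneralInfo and EffectProperty go into one
# DecoderInit-like message (transported first), and every EffectVariable becomes
# its own AccessUnit-like message carrying its start time for synchronization.
SEM_XML = """
<SEM>
  <GeneralInfo/>
  <EffectProperty><Effect EffectID="e1" Type="WindEffect"/></EffectProperty>
  <EffectVariable SEFragmentID="f1" start="5.0" duration="2.0"/>
  <EffectVariable SEFragmentID="f2" start="6.0" duration="0.5"/>
</SEM>
"""

def fragment_sem(xml_text):
    root = ET.fromstring(xml_text)
    decoder_init = {
        "type": "DecoderInit",
        "payload": [ET.tostring(root.find(tag), encoding="unicode").strip()
                    for tag in ("GeneralInfo", "EffectProperty")],
    }
    access_units = [
        {"type": "AccessUnit",
         "time": float(ev.get("start")),
         "command": "addNode /EffectVariable",
         "payload": ET.tostring(ev, encoding="unicode").strip()}
        for ev in root.findall("EffectVariable")
    ]
    return decoder_init, access_units

init, aus = fragment_sem(SEM_XML)
print(init["type"], [au["time"] for au in aus])
```

The access units keep their start times so the player can interleave them with the audio/visual packets.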

Fig. 11. The authoring tool for creating the orchestral media. (The figure labels the main UI components: a window for effect types, a window for the media time line, media player buttons and a progress bar, buttons for selecting effects, a window for media playback, a tool bar for commonly used effect types, the main menu, a simulator button, an encoder button, an edit box for inserting effect information, General Info / Property Info / Variable Info tabs, a window for the audio waveform, and effect fragments.)

4 Conclusion

The orchestral media service can maximize reproduction effects on any device, at any time and in any place, and can restructure the media so that it is customized for and by users, interlocking devices according to the environment of the user's peripheral devices. The orchestral media service can be used in the fields of realistic digital cinema, device-cooperative education, and user-created contents (UCC). In this paper, we described the SEM, the concept of the orchestral media service, the creation of the orchestral media, and the authoring tool. In future work, we will study binary encoding of the sensory information to reduce the size of the SEM.

Acknowledgments. This work was supported by the Industrial Strategic Technology Development Program funded by the Ministry of Knowledge Economy (MKE, Korea). [ , Development of Convergence Media Service Platform for Realistic Experience in Smart Home Environment]

References

1. Yun, J.K., Shin, H.S., Lee, H.R., Park, K.R.: Development of the Multiple Devices Control Technology for Ubiquitous Home Media Service System. In: International Conference on Ubiquitous Information Technology & Applications, pp. (2007)
2. Choi, B.S., Joo, S.H., Lee, H.R., Park, K.R.: Metadata Structure for Media and Device Interlocking, and the Method for Mapping It in the Media Format.
In: Advances in Information Sciences and Services, pp. (2007)
3. Elting, C.: Orchestrating Output Devices - Planning Multimedia Presentations for Home Entertainment with Ambient Intelligence. In: Joint sOc-EUSAI Conference (2005)