WO2005052936A2 - System and method for arranging and playing a media presentation - Google Patents

System and method for arranging and playing a media presentation

Info

Publication number
WO2005052936A2
Authority
WO
WIPO (PCT)
Prior art keywords
media
package
media objects
trigger event
user
Prior art date
Application number
PCT/US2004/038725
Other languages
French (fr)
Other versions
WO2005052936A3 (en)
Inventor
Sumita Rao
Original Assignee
Kyocera Wireless Corp.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Wireless Corp. filed Critical Kyocera Wireless Corp.
Priority to EP04811438A priority Critical patent/EP1685709B9/en
Priority to DE602004025137T priority patent/DE602004025137D1/en
Priority to AT04811438T priority patent/ATE455435T1/en
Priority to JP2006541378A priority patent/JP4550068B2/en
Publication of WO2005052936A2 publication Critical patent/WO2005052936A2/en
Publication of WO2005052936A3 publication Critical patent/WO2005052936A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44012 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40 Circuits
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages

Definitions

  • The field of the present invention is the presentation of media objects, for example, images or sounds. More particularly, the present invention relates to presenting media objects using an embedded processor system.
  • Many electronic devices use embedded processors.
  • Mobile electronic devices often include embedded processors, microprocessors, or other controllers for controlling the device and providing an interface to a user.
  • Devices such as mobile wireless phones, personal data assistants, MP3 players, and cameras generally include embedded processors for monitoring, operating, and using these devices.
  • Many consumer devices such as DVD players, CD players, stereo equipment, appliances, and motor vehicles include embedded operational controllers.
  • These embedded controllers typically have limited processing capability, and their processing capability is preferably prioritized towards operation and monitoring functions, instead of using excessive processing power and memory to provide a complex user interface.
  • These devices also may have limited memory, such as RAM memory, to keep costs down.
  • The embedded processor's limited memory, limited processor power, and simple structure cooperate to make cost-sensitive and reliable devices.
  • These embedded systems often require or benefit from a visual display to a user, and often have other presentation devices such as a speaker, LED panels, or other media presentation components.
  • A mobile phone may have a graphical user interface displayed on an LCD screen for providing a man-machine interface. The mobile phone may also enhance the user experience by permitting the user to view an image, listen to a favorite song, or watch a movie trailer.
  • The processor in the mobile phone is responsible for call processing, diagnostics, and support applications, so only limited processor power is generally available to operate and manage the user interface or other graphical processes. Consumers, however, are demanding more interesting and more useful interaction with their electronic devices. In one example, consumers desire a media aspect to the user interface by using sound, images, graphics, animations, or movies.
  • A typical device that uses an embedded system has limited RAM memory and a relatively simple processor structure. Accordingly, the device may provide only a limited media experience, for example, by allowing for the display of only short animation segments or simple screensavers. Longer media presentations may consume too much processing power and memory, and divert a substantial amount of the device's limited resources into managing and playing a media presentation.
  • The device may fail to respond to a time-critical event, such as receiving a wireless telephone call, because the device has dedicated too much memory or processor time to the media presentation.
  • An increase in power or memory would increase the complexity and cost for the embedded system and the device.
  • Consumers are demanding more interesting, active, and helpful user interfaces, and longer media presentations could assist in making more useful and aesthetically pleasing displays.
  • Consumers desire electronic devices that can be customized and tailored to a user's particular preferences. For example, mobile phones often provide for changeable faceplates that allow a user to select a housing color, aesthetic style, or message. In another example, many portable devices allow the user to specify the "wake-up" screen to the device.
  • The device "greets" the user with a message particular to that user.
  • A typical customized screen may show the user the local weather, or may present the latest box scores for the user's favorite team. Accordingly, there is a need for providing a customizable system and method that enables the sequencing and presentation of media objects on embedded systems, particularly where the embedded system has limited memory and processor capability.
  • The present invention provides a method and system for arranging and playing media objects in a media presentation.
  • The system enables a user to select and order media objects, such as sound files, image files, animations, and text into a media presentation.
  • The media presentation is then associated with a trigger or other interrupt event.
  • When the trigger event occurs, the system plays the media presentation on the system's output devices.
  • The selected media files, ordering information, and other properties are assembled into a media package.
  • The media package may be published to a remote device so that the remote device may play the media presentation.
  • In one example, the method is operated on a mobile wireless phone. A user selects a sequence of images from an image file stored on the phone.
  • The user places the images into a desired order, and in some cases may be enabled to specify, for example, durations, timings, and transitions for the selected images. Depending on specific configurations, the user may also specify and sequence other media objects, such as sound files, text, or animations.
  • The selected and ordered images (and other media objects if selected) are stored as a Screensaver file.
  • The Screensaver file is associated with a Screensaver event on the phone, which typically is set to trigger after a predetermined duration of inactivity. The phone then monitors for the Screensaver event, and upon its occurrence, plays the customized Screensaver.
  • The described system and method enables a user to dynamically arrange customized media presentations on relatively simple devices, such as wireless phones.
  • Fig. 1 is a block diagram of a wireless device arranging and playing a media presentation in accordance with the present invention.
  • Fig. 2 is a flowchart of a method of arranging and playing a media presentation in accordance with the present invention.
  • Fig. 3 is a flowchart of a method for generating and playing presentation segment files in accordance with the present invention.
  • Fig. 4 is a block diagram of example file formats for a media package in accordance with the present invention.
  • Fig. 5 is a flowchart of a method for arranging and playing a media presentation in accordance with the present invention.
  • Fig. 6 is a flowchart of a method for arranging and playing a media presentation in accordance with the present invention.
  • Fig. 7 is a block diagram of a device arranging and playing a media presentation in accordance with the present invention.
  • System 10 illustrates an example construction using a wireless communication device 12.
  • Fig. 1 illustrates a wireless device 12.
  • The method of system 10 may be advantageously used on many other types of devices.
  • The method of system 10 may be used in devices having embedded controllers, such as personal data assistants, MP3 players, DVD/CD players, appliances, cars, cameras, or other consumer devices.
  • System 10 enables a consumer to use the simple wireless device 12 to dynamically arrange or configure a custom media presentation.
  • The custom media presentation is associated with some event trigger on the device 12, and when that event occurs, the device plays the media presentation.
  • A user may define a custom screensaver that includes personal image files, favorite sounds, and custom text.
  • The media presentation may be associated with a screensaver trigger on the wireless device 12.
  • When the wireless device indicates the screensaver should be played, the custom media presentation is played as the screensaver.
  • Wireless device 12 includes input devices 17 such as a keypad 46 and a microphone 48.
  • The keypad may include a ten-key input for numbers, toggle switches, rotary knobs, and other buttons and input components.
  • The wireless device 12 may also include other input devices such as bar code readers and portable keyboards.
  • The wireless device also has output devices 16.
  • Typical output devices for the wireless device include a graphical display 41, which may be in the form of a black-and-white or color LCD display.
  • The wireless device 12 also typically has one or more speakers 42.
  • The speakers are constructed to enable a user to hear a telephone call, hear a ring tone, and possibly hear communications to facilitate use as a speakerphone. It will be appreciated that one speaker may be arranged for all these functions, or the wireless device may have multiple speakers.
  • The wireless device 12 may also have other lights 43 useful for indicating status of the wireless device and for illumination purposes.
  • The wireless device may also have a vibrator 44 for shaking the wireless device to notify a user of an incoming call when a ring tone is not desired.
  • The wireless device also has an RF section 19 that includes an antenna 51 and a transceiver 53, and may include a GPS receiver 55 for determining position.
  • The RF section 19, input devices 17, and output devices 16 all couple to a processor 14.
  • The processor 14 may be in the form of a single processor, or may be constructed as multiple interconnected processors.
  • The processor may be in the form of a microprocessor, computer chip, gate array, PLD, or other logic device.
  • Processor 14 may be, in one example, an embedded processor having a relatively simple structure and limited RAM memory.
  • The processor 14 implements several functions for the wireless device, and those functions may be performed in hardware, in firmware, in application software, or a combination of the above.
  • Wireless device 12 also includes other components which are not shown, such as a battery power source and a housing.
  • The processor 14 functions to provide a user interface for the user of the wireless device. This user interface generally accepts inputs from the keypad 46 and other input devices 17, and displays instructions, status, and other information to the user on the output devices 16 such as display 41.
  • The processor 14 implements a configuration utility 27.
  • Configuration utility 27 enables a user to select and order multimedia files into a presentation. Media files 21 may be accessed by the configuration utility 27 and a list of available files is displayed to the user.
  • Media files 21 may be stored locally on the wireless device, on an expansion memory card on the wireless device, or may be on a server which can be accessed wirelessly.
  • The media files 21 may include sound files 58, image files 59, animation files 60, synthesized speech files 61, and MIDI files 62.
  • The media files may also include links 64 which could provide links to remote servers for identifying additional remote media files 28.
  • Media files 21 may also include transitions 63.
  • A transition is used to provide a smooth change when initiating a media file, ending a media file, or changing media files. For example, a transition may allow an image to slowly fade onto the screen or fade gently from the user's view. Such transitions give a more finished and professional appearance to media presentations on a device.
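The fade described above can be sketched as a simple per-frame alpha ramp. This is a minimal illustration only; the frame count, the linear ramp, and the function name are assumptions, not details from the disclosure.

```python
def fade_alphas(num_frames, fade_in=True):
    """Alpha values (0.0 to 1.0) for a linear fade across num_frames.

    A fade-in ramps from transparent to opaque; a fade-out reverses it.
    The linear ramp and frame count are illustrative assumptions.
    """
    ramp = [i / (num_frames - 1) for i in range(num_frames)]
    return ramp if fade_in else list(reversed(ramp))

# A 5-frame fade-in: the image blends from 0% to 100% opacity.
print(fade_alphas(5))             # [0.0, 0.25, 0.5, 0.75, 1.0]
print(fade_alphas(5, fade_in=False))  # [1.0, 0.75, 0.5, 0.25, 0.0]
```

A device would apply each alpha to the outgoing or incoming image for one frame interval to produce the fade.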
  • The user selects a series of media files for a media presentation. For example, the user may select a series of personal images and a favorite music clip. The user would then use the configuration utility 27 to order the images into a proper sequence, and set the start event and duration for the music clip.
  • The configuration utility 27 then generates a media package 23.
  • The media package 23 has information sufficient to enable the wireless device to play the media presentation, including media file information, ordering information, and timing information.
  • An association utility 26 may be used to associate the media package with a trigger event on the wireless device.
  • The association utility is also part of the user interface for the wireless device 12, thereby enabling a user to associate a specific media package with a particular trigger event.
  • The user may associate a particular media package with a screensaver event, and may associate another media package with a "call received" trigger event.
  • The configuration utility 27 may be automated by the wireless device, such that the wireless device automatically detects and associates a media package with a particular trigger event.
  • When the association utility 26 makes an association, the association between an event and a media file is stored in an association list 25. It will be appreciated that other methods may be used for storing and tracking associations.
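A minimal sketch of the association list 25, assuming a simple key-value mapping from trigger events to media packages; the function names, event names, and package names are illustrative, not from the disclosure.

```python
# Association list (25): maps a trigger event to the media package that
# should play when that event occurs. Names here are illustrative.
association_list = {}

def associate(trigger_event, media_package):
    """Record that media_package should play when trigger_event occurs."""
    association_list[trigger_event] = media_package

def lookup(trigger_event):
    """Return the associated media package, or None if unassociated."""
    return association_list.get(trigger_event)

associate("screensaver", "vacation_slideshow.pkg")
associate("call_received", "ringtone_animation.pkg")
print(lookup("screensaver"))   # vacation_slideshow.pkg
```

Any other storage scheme (a table, a file) would serve the same role, as the disclosure notes.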
  • The association utility 26 may be able to associate media packages with several different types of trigger events. For example, a wireless device may have a call processor event 34 that provides triggers for when a call is received, a call is disconnected, or a call is dropped.
  • The call processor may provide indications or triggers regarding whether a voice call, paging call, SMS message, or text message is received.
  • A timer 35 may also generate a trigger. For example, after a period of no user activity, a wireless device often provides a screensaver trigger to cause a screensaver to be shown on the LCD screen. The screensaver not only provides an aesthetic appeal to the wireless device, it also protects the LCD from suffering a burnout condition. Pressing a key 36 may also generate a trigger event that may be used to load and play a media package.
  • Caller ID function 37 is also available on many wireless devices for providing an indication of the specific caller initiating a voice call.
  • A device monitor 38 may be used to set a device trigger.
  • A device trigger may be set for a low-battery condition, a no-carrier-received condition, or other status of the wireless device.
  • Many wireless devices have an ability to obtain position location information 39. This position location information may be useful for setting a trigger event. For example, a user may desire that a particular song be played when the user is about to arrive at home. In another example, the location trigger could be set to play another song whenever the wireless device receives an indication that a coffee shop is close by. It will be appreciated that many other event triggers may be generated consistent with this disclosure.
  • The event processor 33 is used to monitor for an occurrence of that event or events.
  • The event processor, upon detecting the associated occurrence, provides an interrupt into the media engine 29.
  • The media engine 29 may recall the association list 25, which includes an identification of which media package should be recalled upon the happening of a particular trigger event. For example, if the timer event 35 provides an interrupt to the media engine 29, the media engine 29 can extract information from the association list 25 regarding which media file is associated with the timer event.
  • The association list 25 may indicate that a particular media package is to be played as a screensaver upon the timer event trigger.
  • The media engine 29 then extracts the media package from the media package file 23, and presents the media presentation on the display 41, speaker 42, or other output devices used by the media package.
  • In one case, the media package 23 includes the media object data, so the media engine is able to play the media presentation without accessing the media file 21.
  • Alternatively, the media package provides a reference link to the media objects, and the media engine accesses the needed media objects from media file 21.
  • The media package and media objects may be stored locally on the phone in a format that can be immediately used by the media engine 29, or a media package processor 31 may be used to further process the media package and media objects for use by the media engine 29. For example, one or more of the individual media objects may be too large to be efficiently used by the media engine.
  • Method 80 begins by allowing a user to set configuration 81 of a media presentation.
  • The user accesses files 87 either on a local device or on a remote server.
  • The local files may be stored in local memory, or may be stored on a removable memory card, for example.
  • The media objects may be image files, animations, sound files, MIDI music files, text files, artificial speech files, or other types of media objects.
  • The user selects 88 a set of sound, image, animation, or other media objects to be used as part of the media presentation.
  • The user may then specify an order 89 for the selected media objects.
  • Each of the media objects may be played sequentially.
  • Certain of the media objects may be played concurrently with another, such as when the user desires to play a music clip at the same time an image is being displayed.
  • The user may then be able to set certain specific properties 90 for each of the media files.
  • Each type of media object may have different properties that may be set. For example, a sound file may have a duration property and a volume property, while an image file may have a duration property and a color depth property.
  • The user may also be able to add transitions 91 to the media objects. Transitions may be added at the start of a media object, at the end of a media object, or between media objects. Typical transitions may include zooms, fades, dissolves, louvers, and spins. It will be understood that many other types of transitions may be used.
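The selection, ordering, property, and transition steps above can be sketched as a small data model. The field names, property keys, and file names are illustrative assumptions, not structures from the disclosure.

```python
# Sketch of the configuration step (81): the user selects media objects,
# orders them, sets per-type properties (90), and attaches transitions (91).
from dataclasses import dataclass, field

@dataclass
class MediaObject:
    name: str                    # e.g. an image or sound file name
    kind: str                    # "image", "sound", "animation", ...
    properties: dict = field(default_factory=dict)   # duration, volume, ...
    transition_in: str = None    # e.g. "fade", "zoom", "dissolve"
    transition_out: str = None

# The list order is the playback order chosen in step 89; the sound's
# duration overlaps the image, since objects may play concurrently.
presentation = [
    MediaObject("beach.jpg", "image", {"duration_ms": 5000}, "fade", "dissolve"),
    MediaObject("song.mp3", "sound", {"duration_ms": 7000, "volume": 8}),
]
print([m.name for m in presentation])
```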
  • A media package is generated 82.
  • The media package may be arranged in optional forms. For example, the media package may be encapsulated 94.
  • An encapsulated media package includes data information for each of the media objects selected, plus the necessary ordering, sequencing, and timing information. In this way, an encapsulated media package is self-contained with the information necessary for the media engine to present the media presentation.
  • The encapsulated media package may be published 86.
  • The encapsulated media package may be published to a server where other users may access and download the media package, or it may be transmitted directly to another user.
  • The remote user may be enabled to associate the encapsulated media package with a particular trigger event, and have the media presentation played on the remote user's device.
  • The media package may also be referenced 95.
  • A referenced media package does not contain all the media data information, but instead provides directory or file links to where the media files may be located.
  • A referenced media package may include a directory name, a file name, or a server name where a specific image object could be located. In this way, media files may be reused by many media packages, thereby saving memory space.
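A referenced media package of this kind can be sketched as a table of links that the media engine resolves at play time. All names, paths, and the resolution scheme here are illustrative assumptions.

```python
# Sketch of a referenced media package (95): instead of embedding media
# data, the package stores links (directory, file, or server names) that
# the media engine resolves when it plays the presentation.
referenced_package = {
    "order": ["img1", "snd1"],          # playback order of the objects
    "links": {
        "img1": {"directory": "/media/images", "file": "img1.jpg"},
        "snd1": {"server": "media.example.com", "file": "snd1.mp3"},
    },
}

def resolve(package, obj_id):
    """Return the storage location for one referenced media object."""
    link = package["links"][obj_id]
    base = link.get("directory") or link.get("server")
    return f"{base}/{link['file']}"

print(resolve(referenced_package, "img1"))   # /media/images/img1.jpg
```

Because packages hold only links, many packages can share one copy of each media file, which is the memory saving the disclosure describes.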
  • Media packages may also automatically link media objects, or, if individual media objects are too large, link smaller segments together to form the whole media object.
  • Each media object or segment is associated with sufficient information to cause the media engine to load the next sequential media object. In this way, each of the media objects or segments is linked or chained together.
  • A linked media package provides for a simpler processor structure and memory management.
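The chaining described above can be sketched as segments that each name their successor, with "END" terminating the chain as in the disclosure's later examples; the payload strings are illustrative assumptions.

```python
# Sketch of linked/chained segments: each segment names the next segment
# to load, so the engine needs only one segment in memory at a time.
segments = {
    "A":  {"payload": "frames 1-3", "next": "A2"},
    "A2": {"payload": "frames 4-5", "next": "A3"},
    "A3": {"payload": "frames 6-8", "next": "END"},
}

def play_chain(segments, start):
    """Follow the chain from `start`, returning payloads in play order."""
    played, seg_id = [], start
    while seg_id != "END":              # "END" means no further segment
        seg = segments[seg_id]
        played.append(seg["payload"])   # load-and-play would happen here
        seg_id = seg["next"]
    return played

print(play_chain(segments, "A"))
```

Because the chain is followed one link at a time, the engine never needs a global index of the whole presentation, which keeps the processor structure and memory management simple.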
  • The media package may be associated with a particular trigger event 83.
  • The trigger event may be an incoming call 96, a particular caller ID 97, a particular time or duration 98, or a status of the device 99. It will be appreciated that many other trigger events may be used.
  • Method 120 is particularly useful for embedded systems having a simple structure with limited RAM. Such embedded systems are most efficiently utilized when an entire media segment can be loaded into local RAM prior to presenting the media object. Because embedded processors often have limited RAM, the size and number of sequential media objects may be limited. Therefore, method 120 enables a large media presentation to be segmented into individual segments, with each segment sized to efficiently be loaded into available RAM. Advantageously, long media presentations may then be presented responsive to trigger events.
  • Method 120 starts by having a user select and order media objects to define a media presentation as shown in block 122.
  • The user desires to start playing the sound "mpl", and then start playing the animation "anl".
  • The user desires to display the image "jpl".
  • The selection and ordering is done as part of a device's user interface, with the information being forwarded into a batch process 137. It will be understood that many alternatives exist for providing an interface to enable a user to select and order media objects.
  • The batch process 137 has access to media files 124, where each of the selected media objects can be found.
  • Each of the media objects, such as the sound 126 "mpl", is in the media file 124 and includes an indication of file format, the file name, and size information.
  • Sound 126 is in an MP3 format.
  • The sound data file can be found in a file named "mpl".
  • The file is 150 units in size.
  • The size units may be, for example, in bytes or kilobytes, or may be a relative size indication.
  • The animation file 128 is shown to include eight sequential image files, each in a bitmap format, and between 200 and 400 units in size.
  • The image 129 is in a JPEG format, and is 600 units in size.
  • The batch process also has access to configuration information 131, which may include a maximum size for individual presentation segments.
  • The size may be predefined to allow each animation segment to be loaded into memory, or may be determined dynamically dependent on specific device status. Since the batch process 137 knows the number and size of all of the files in the media presentation, and also knows the maximum size from the configuration file 131, the batch process 137 can segment the presentation into a set of sequential segments 140. For example, a very large media object may be divided into smaller sequential subsets, while a sequence of small media objects may be combined into a single segment. In this way, segments are provided that are sized to be particularly efficient for loading into available RAM memory. Each of the segments may also include one or more action commands that provide sequencing, timing, or other presentation information.
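The segmentation performed by the batch process can be sketched as a greedy packing of the ordered media objects into segments no larger than the configured maximum. This sketch is simplified: the object sizes below are assumed, splitting a single oversized object is omitted, and the disclosure's accounting for a concurrently playing sound file is not modeled.

```python
# Sketch of the batch process (137): pack ordered (name, size) objects
# into sequential segments whose totals stay within max_size.
def segment(objects, max_size):
    """objects: list of (name, size) pairs in playback order.
    Returns a list of segments, each a list of names, with no segment's
    total size exceeding max_size."""
    segments, current, used = [], [], 0
    for name, size in objects:
        if used + size > max_size and current:
            segments.append(current)      # close the full segment
            current, used = [], 0
        current.append(name)
        used += size
    if current:
        segments.append(current)          # flush the last partial segment
    return segments

# Eight bitmaps of an assumed 300 units each, with a 1000-unit limit,
# pack three to a segment (the last segment holds the remaining two).
frames = [(f"bm{i}", 300) for i in range(1, 9)]
print(segment(frames, 1000))
```

In the disclosure's own example the sound file occupies memory alongside the first image segment, which is why its first animation segment holds three bitmaps and the second only two.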
  • The maximum size may be set at 1000.
  • The batch process therefore makes a first presentation segment 141 that includes only the sound file.
  • The sound file is separately segmented since the sound file is going to be played concurrently with showing several image frames.
  • The sound segment 141 includes a file identifier 151, an action command showing that the sound file should be played for 7 seconds (7000 ms) 152, and another action command 153 showing that segment "A" should be immediately loaded and played. Sound segment 141 also shows that the file "mpl" should be loaded. After "mpl" has played, the final action instruction 154 "END" instructs that no additional files should be loaded.
  • The batch processor 137 segments the animation into three sequential segments 142, 143, and 144.
  • The first segment 142 includes the first three image bitmaps. Since the first three images and the sound file will be in memory at the same time, the fourth bitmap cannot be loaded without exceeding the 1000 size limit. In this way, segment 142 is limited to the first three image files.
  • The second image segment 143 contains only bitmaps 4 and 5.
  • The third segment 144 includes the final three bitmap images.
  • Each of the three animation segments includes a file identifier and at least one action instruction. For example, segment 142 is identified as "A", and has an action identifier of "A2".
  • Segment 142 includes an action instruction for identifying segment "A2" as the next sequential segment.
  • Segment 143 includes an action instruction "A3" for providing a callback to "A3" as the next animation segment.
  • Segment 144 includes a callback instruction of "A4" for calling the "A4" segment 145 when segment "A3" has played.
  • The "A4" segment also includes a first action instruction to show that the image "jpl" should be displayed for 5 seconds, and a final action instruction "END" showing that no more segments are included in the media presentation. Together, the media segments 140 are combined into a media package 156.
  • The media package includes the media object data, and therefore becomes an encapsulated media package 175.
  • Such an encapsulated media package 175 could therefore be published 177 to a server or transmitted directly to another user. The other user would thereby be able to play the encapsulated media presentation without having to acquire the media objects from other locations.
  • Alternatively, the media package may include references to the file names, and the media engine 160 can retrieve data for the media objects from a media file or files 158.
  • Media packages may be associated with specific event triggers for a device. In one example, the association is made dynamically by a user to facilitate further customization options, and in another example, the device may define the association.
  • An association list 133 is used to track the associations between media packages and trigger events.
  • The device monitors for trigger events 135, and when a trigger event occurs, uses the association list 133 to determine which media package to access and play.
  • The media engine thereby retrieves the correct media package for the associated trigger event and accesses media objects from the media file 158.
  • The media engine proceeds to play the media presentation on an output device 162 such as the display and the speaker.
  • The presentation will be shown to the user as a multimedia presentation 166. More particularly, the user will first hear a sound 172 which will continue for 7 seconds. The user then sees animation segments 168, 169, and 170. At the completion of the eight frames of animation, a static image 171 will be shown for 5 seconds. Shortly after the image is complete, the sound 172 will end.
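The timing in this example can be checked with simple arithmetic, assuming the 200 ms per-frame duration given for the packaged animation segments: eight frames plus the 5-second image end shortly before the 7-second sound clip.

```python
# Arithmetic check of the example timeline: the 7 s sound runs
# concurrently with eight 200 ms animation frames followed by a 5 s image.
sound_ms = 7000
animation_ms = 8 * 200      # eight frames at 200 ms each
image_ms = 5000

visuals_ms = animation_ms + image_ms
print(visuals_ms)           # 6600: the visuals end 400 ms before the sound
assert visuals_ms < sound_ms   # "shortly after the image is complete, the sound will end"
```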
  • Method 120 enables a user to dynamically configure and arrange media objects into a custom media presentation, and the batch processor divides the media presentation into segments sized to be conveniently loaded into available RAM memory.
  • The media presentation is then associated with a particular trigger event, and upon an occurrence of that event, the media presentation is played.
  • The media presentation may also be provided into a package that can be published for use by a remote user.
  • Fig. 4 example file formats for media packages are illustrated.
  • A media package generally consists of information sufficient for a media engine to play several media objects, and includes ordering and sequencing information so the media engine knows how to order, sequence, and time the media objects.
  • File formats 201 and 203 use media segments like the media segments generated with reference to Fig. 3.
  • File format 201 shows a media package 209 that references presentation segments stored in a directory.
  • Package 209 includes directory information so that the media engine knows what directory, server, or other location to find the individual presentation segments.
  • The package 209 shows that the first sound file "S" plays for 7 seconds and the "A" segment immediately loads and plays. After "S" completes, no more files are loaded responsive to "S". After "S" has been loaded, "A" is loaded and each of its frames is displayed for 200 milliseconds, and at the completion of the last image, segment "A2" is loaded. In a similar manner, all the images associated with "A2" are played for 200 milliseconds and, at the completion of the last image, segment "A3" is loaded.
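The chaining that package 209 describes can be pictured as a small data structure in which each segment names the segment to load when it completes. The field names, frame counts, and representation below are assumptions for illustration; the patent does not prescribe a concrete encoding.

```python
# Illustrative sketch of package 209: each segment carries timing
# information and a "next" link that chains the segments together.
PACKAGE_209 = [
    {"name": "S",  "kind": "sound",     "duration_ms": 7000, "next": None},
    {"name": "A",  "kind": "animation", "frame_ms": 200, "frames": 3, "next": "A2"},
    {"name": "A2", "kind": "animation", "frame_ms": 200, "frames": 3, "next": "A3"},
    {"name": "A3", "kind": "animation", "frame_ms": 200, "frames": 2, "next": None},
]

def chain_order(package, start):
    """Follow the 'next' links from a starting segment, returning the
    ordered list of segment names the media engine would load."""
    by_name = {seg["name"]: seg for seg in package}
    order, current = [], start
    while current is not None:
        order.append(current)
        current = by_name[current]["next"]
    return order
```

Following the links from "A" yields the animation segments in sequence, while the sound "S" stands alone since no further files load responsive to it.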
  • File format 203 shows an example of an encapsulated segmented media package.
  • The media package includes a sound segment 214 that includes sequencing and timing information along with sound data.
  • Three animation segments 215, 216, and 217 are also provided, with each having sequencing information and respective image data.
  • The media package also includes an image segment 218 having timing information, sequencing information, and image data.
  • The encapsulated segmented file format 203 may be transmitted to a server for download to other users, or may be transferred directly to another remote device for presentation. Since the encapsulated file includes all data and information necessary for the presentation, any compatible media engine may play the media presentation. For example, a user may develop a particularly interesting screensaver, and may encapsulate that screensaver in a media package. The user may transmit the screensaver package to a central server, where another user could download that screensaver package and have that screensaver operate on the remote user's device.
  • A referenced format 205 is also shown. The referenced format 205 includes a media package 220 that contains the name of the screensaver and a directory where the data files can be found.
  • The media package simply identifies the order of the media objects, and alternatively may include additional sequencing and timing information.
  • The media package 220 identifies the directory where all the media files 222 are located.
  • In an encapsulated format, the media package 225 includes an identification of the order of media objects, and also includes all of the media data in a single file.
  • The encapsulated media package 225 may be packaged for transmission and publication. It will be appreciated that many alternatives exist for formatting and arranging the media package. Referring now to Fig. 5, a flowchart of a method for arranging and playing a media presentation is illustrated.
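The contrast between the referenced format (a package naming a directory and an ordered list of files) and the encapsulated format (all object data carried inline in one file) can be sketched as follows. The dictionary fields, directory path, and file names are hypothetical, chosen only to mirror the description above.

```python
# Sketch of a referenced package (cf. package 220): it names a directory
# and the ordered media files, but carries no media data itself.
referenced_package = {
    "name": "my_screensaver",
    "directory": "/media/screensavers",
    "order": ["sun.jpg", "beach.jpg", "waves.mid"],
}

def encapsulate(referenced, read_file):
    """Build an encapsulated package (cf. package 225) by pulling each
    referenced object's data inline, so the result is self-contained
    and suitable for transmission and publication."""
    return {
        "name": referenced["name"],
        "order": list(referenced["order"]),
        "data": {f: read_file(referenced["directory"] + "/" + f)
                 for f in referenced["order"]},
    }
```

The referenced form lets many packages reuse the same media files, while the encapsulated form trades size for portability: any compatible media engine can play it without access to the original directory.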
  • Method 250 illustrates a method for defining a long media presentation, dividing the long media presentation into a series of presentation segments, and then sequentially playing each of the presentation segments responsive to an event trigger.
  • Method 250 is generally divided into a user configuration section 251, a background monitoring section 253, a segment generation section 255, and a segment presentation section 257.
  • A user defines the presentation 261 by selecting and ordering media objects. The user may also be able to add and change particular characteristics and properties for each of the media objects.
  • The media presentation is associated with a particular trigger event 263, such as with an elapsed timer or an interrupt action.
  • The device then enters a monitoring phase 253 where it monitors for the occurrence of the target event 265.
  • The long media presentation is divided into segments that can be more easily loaded into the limited RAM of, for example, an embedded processor system.
  • The dividing and segmentation process 255 may occur responsive to the occurrence of the trigger event, or may occur at a different time.
  • For example, the media presentation may be segmented during a time when the processor has additional processing capability, and the prepared segments would therefore be ready for immediate use when the interrupt occurs.
  • Alternatively, the segmentation process may occur after the event trigger has been received.
  • A batch processor is used to determine presentation segments that are smaller than a maximum limit 266. The maximum limit is typically set at a size smaller than the amount of available RAM. In this way, an entire media presentation segment can be loaded into RAM at one time.
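The batch segmentation step can be sketched as splitting a media object's raw bytes into chunks no larger than a maximum limit, with each chunk receiving a segment identification and sequence information. The 1 KB limit, naming scheme, and action-command strings below are illustrative assumptions, not the patent's actual format.

```python
# Assumed illustrative limit, chosen below the available RAM.
MAX_SEGMENT_BYTES = 1024

def make_segments(data, limit=MAX_SEGMENT_BYTES):
    """Divide a media object's raw bytes into RAM-sized segments. Each
    segment gets an identification and sequence info: a callback naming
    the next segment, or END for the final one."""
    chunks = [data[i:i + limit] for i in range(0, len(data), limit)]
    segments = []
    for n, chunk in enumerate(chunks):
        is_last = (n == len(chunks) - 1)
        segments.append({
            "id": "seg_%d" % n,
            "data": chunk,
            "action": "END" if is_last else "CALLBACK:seg_%d" % (n + 1),
        })
    return segments
```

Because every segment fits under the limit, the media engine can load one entire segment into RAM at a time, and the chained action commands tell it which segment to fetch next.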
  • A segment identification 268 is added to each presentation segment.
  • The segment identification may be the file name for holding that presentation segment.
  • Segment sequence information 269 is also added to the presentation segment.
  • The segment sequence information may be in the form of an action command that provides a callback to the next segment in sequence. In this way, each segment links to the next segment in sequence so that the presentation segments are chained together.
  • For the final segment, an action command of "END" may be added as segment sequence information so that the media engine knows that no additional media segments should be loaded.
  • The segmenting process is continued 271 until all segments have been generated.
  • When the trigger event is detected, the media presentation is then presented or played.
  • The trigger event has been associated with a particular media presentation, and that identification is used to recall the first segment identification in block 273.
  • The first segment is loaded into memory 275 and presented to the user as shown in block 276. For example, if the media segment is a sound file, the sound would play through the speaker, and if the media object is an image, then the image would be displayed on the display screen.
  • The media engine checks the sequence information 277 in the segment, and if an action command is a callback 279, then the media engine has the file name of the next segment to load into memory. Depending on other action commands and sequencing information contained in the segment, the media engine may immediately load the file and begin playing it concurrently, or may wait until the current media object has finished playing before loading and playing the next object. If the action command is the END command, then the media engine knows that no more media segments are to be played.
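The playback loop described above can be sketched as follows: load a segment, play it, then follow its action command — a callback names the next segment, while END stops playback. The segment store, field names, and callback syntax are stand-ins assumed for illustration; a real media engine would also honor timing and concurrency commands.

```python
def play_chain(segments, first_id, play):
    """Play linked presentation segments starting from first_id,
    following each segment's callback until an END action command."""
    current = first_id
    while True:
        seg = segments[current]            # load segment into memory
        play(seg)                          # present the segment's media
        action = seg["action"]
        if action == "END":                # no more segments to load
            break
        current = action.split(":", 1)[1]  # callback names the next segment
```

Since only one segment is resident at a time, the loop matches the limited-RAM constraint: the chain, not the engine, carries the knowledge of what comes next.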
  • Method 250 thereby enables the playing of a long media presentation on an embedded system having simple structure and limited RAM.
  • Each media segment includes sequencing information so that all the segments are easily linked or chained together. In some constructions, this enables a particularly efficient and simple processor and memory structure.
  • Referring now to Fig. 6, another flowchart for arranging and playing a media presentation is illustrated.
  • Method 300 allows a user to define a presentation 302 by selecting and ordering media objects. Once the objects have been selected and ordered, a media package can be generated as shown in block 304. The media package then may be used locally 305 or may be published 306 to be used remotely 307. If used locally 305, the local user associates the media presentation with a trigger event as shown in block 308.
  • The local device then monitors for that event 310, and upon occurrence of that event 312, the media presentation is played.
  • The media presentation generated by the local user may thereby be used and presented by the local device.
  • Alternatively, the media package may be published and used by a remote user and device.
  • The remote user 307 would receive the media package, either by downloading it from a service or by receiving it as, for example, an attachment to an SMS message.
  • The remote user associates the media package with a trigger event 314, and the remote device monitors for that trigger event as shown in block 315.
  • Upon occurrence of the trigger event, the remote device plays and presents the media presentation.
  • In this way, the media presentation generated on a local device may be published to a remote device for use and presentation.
  • System 325 includes a device 327, which may be, for example, a portable battery-powered device for use by a consumer.
  • The device 327 includes input devices 333 allowing the user to provide inputs to the device. Typical input devices may be, for example, keyboards, keypads, microphones, and Graffiti stylus devices.
  • The device 327 also includes output devices 331. For example, output devices may include color or black-and-white screens, speakers, vibrators, lights, and other indicators.
  • The device 327 has a processor 329, which may be in the form of a single- or multi-processor configuration. The processor operates a user interface using the input devices 333 and the output devices 331.
  • The user interface may be a graphical user interface allowing the user to make choices graphically. In another example, the user interface may allow the user to interact with the device by selecting numbers or letters on a keypad, or by toggling various switches.
  • The device 327 also includes media files 335. The media files 335 may be stored in local memory, or may be included on removable memory cards. Also, the device 327 may have communication links via wireless connection or Internet connection for accessing servers to find and access additional media files.
  • The user interface includes a configuration utility 340 enabling the user to select and order a set of media files. For example, the user may select a series of images to be displayed concurrently with a sound.
  • The configuration utility may also enable the setting of certain properties and characteristics for each of the individual media objects, or for the media presentation as a whole.
  • Alternatively, the configuration utility may be a process operating on a different device.
  • For example, the configuration utility may be operated on a computer system having access to a large number of media objects. The configuration utility is used to select and order these objects, and generates a media package that is arranged for transmission to the device 327. If the computer system has particular information regarding the device 327, the computer may also provide an association list to the device 327 that identifies which event trigger should be associated with the media package. An association 343 can be made on the local device between the media presentation and a particular trigger event.
  • An event processor 341 monitors for the particular event, and when that event occurs, notifies the media engine 338 that the event has occurred and provides an identification for the media presentation to be played.
  • The media engine 338 recalls the media presentation and, if necessary, recalls the individual media object files, and presents the media presentation using one or more of the output devices 331.
  • A user is thereby able to dynamically construct and configure a media presentation, and associate that media presentation with a particular trigger event.
  • The media presentation may be displayed on the local device.
  • If the device 327 includes communication abilities, the media presentation may be published wirelessly or through a network connection to a remote device. Provided the remote device has a properly configured media engine, the media presentation may be played remotely.

Abstract

The system enables a user to select and order media objects, such as sound files, image files, animations, and text into a media presentation. The media presentation is then associated with a trigger or other interrupt event. Upon an occurrence of the event, the system plays the media presentation on system output devices. In one example of the system, the selected media files, ordering information, and other properties are assembled into a media package. The media package may be published to a remote device so that a remote device may play the media presentation.

Description

SYSTEM AND METHOD FOR ARRANGING AND PLAYING A MEDIA PRESENTATION
BACKGROUND

This application is related to U.S. patent application number 10/719,317, filed
November 14, 2003, and entitled "System and Method for Arranging and Playing a Media Presentation", which is incorporated herein by reference. The field of the present invention is the presentation of media objects, for example, images or sounds. More particularly, the present invention relates to presenting media objects using an embedded processor system. Many electronic devices use embedded processors. For example mobile electronic devices often include embedded processors, microprocessors, or other controllers for controlling the device and providing an interface to a user. More specifically, devices such as mobile wireless phones, personal data assistants, MP3 players, and cameras generally include embedded processors for monitoring, operating, and using these devices. Also, many consumer devices such as DVD players, CD players, stereo equipment, appliances, and motor vehicles include embedded operational controllers. These embedded controllers typically have limited processing capability, and their processing capability is preferably prioritized towards operation and monitoring functions, instead of using excessive processing power and memory to provide a complex user interface. These devices also may have limited memory, such as RAM memory, to keep costs down. In this way, the embedded processor's limited memory, limited processor power, and simple structure cooperate to make cost sensitive and reliable devices. These embedded systems often require or benefit from a visual display to a user, and often have other presentation devices such as a speaker, LED panels, or other media presentation components. For example, a mobile phone may have a graphical user interface displayed on an LCD screen for providing a man-machine interface. The mobile phone may also enhance the user experience by permitting the user to view an image, listen to a favorite song, or watch a movie trailer. 
The processor in the mobile phone is responsible for call processing, diagnostics, and support applications, so only limited processor power is generally available to operate and manage the user interface or other graphical processes. Consumers, however, are demanding more interesting and more useful interaction with their electronic devices. In one example, consumers desire a media aspect to the user interface by using sound, images, graphics, animations, or movies. A typical device that uses an embedded system has limited RAM memory and a relatively simple processor structure. Accordingly, the device may provide only a limited media experience, for example, by allowing for the display of only short animation segments or simple screensavers. Longer media presentations may consume too much processing power and memory, and divert a substantial amount of the device's limited resources into managing and playing a media presentation. In such a case, the device may fail to respond to a time-critical event, such as receiving a wireless telephone call, because the device has dedicated too much memory or processor time to the media presentation. But, an increase in power or memory would increase the complexity and cost for the embedded system and the device. Despite these limitations, consumers are demanding more interesting, active, and helpful user interfaces, and longer media presentations could assist in making more useful and aesthetically pleasing displays. Also, consumers desire electronic devices that can be customized and tailored to a user's particular preferences. For example, mobile phones often provide for changeable faceplates that allow a user to select a housing color, aesthetic style, or message. In another example, many portable devices allow the user to specify the "wake-up" screen to the device. In this way, the device "greets" the user with a message particular to that user. 
A typical customized screen may show the user the local weather, or may present the latest box scores for the user's favorite team. Accordingly, there is a need for providing a customizable system and method that enables the sequencing and presentation of media objects on embedded systems, particularly where the embedded system has limited memory and processor capability.
SUMMARY

Briefly, the present invention provides a method and system for arranging and playing media objects in a media presentation. The system enables a user to select and order media objects, such as sound files, image files, animations, and text into a media presentation. The media presentation is then associated with a trigger or other interrupt event. Upon an occurrence of the event, the system plays the media presentation on the system's output devices. In one example of the system, the selected media files, ordering information, and other properties are assembled into a media package. The media package may be published to a remote device so that a remote device may play the media presentation. In a preferred example, the method is operated on a mobile wireless phone. A user selects a sequence of images from an image file stored on the phone. The user places the images into a desired order, and in some cases may be enabled to specify, for example, durations, timings, and transitions for the selected images. Depending on specific configurations, the user may also specify and sequence other media objects, such as sound files, text, or animations. The selected and ordered images (and other media objects if selected) are stored as a Screensaver file. The Screensaver file is associated with a Screensaver event on the phone, which typically is set to trigger after a predetermined duration of inactivity. The phone then monitors for the Screensaver event, and upon its occurrence, plays the customized Screensaver. Advantageously, the described system and method enables a user to dynamically arrange customized media presentations on relatively simple devices, such as wireless phones. In this way, the user is able to customize the device according to the user's personal tastes. For example, the user may configure custom multimedia presentations as personalized screensavers or ring notifications. 
These custom presentations provide a level of sophistication and professionalism not available on typical known portable devices, and may even combine different types of media into a dramatic multimedia presentation. The system and method operates on relatively simple processor structures and in devices with limited memory resources. In this way, the present system and method may be implemented without substantial added expense or complexity.

BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 is a block diagram of a wireless device arranging and playing a media presentation in accordance with the present invention; Fig. 2 is a flowchart of a method of arranging and playing a media presentation in accordance with the present invention; Fig. 3 is a flowchart of a method for generating and playing presentation segment files in accordance with the present invention; Fig. 4 is a block diagram of example file formats for a media package in accordance with the present invention; Fig. 5 is a flowchart of a method for arranging and playing a media presentation in accordance with the present invention; Fig. 6 is a flowchart of a method for arranging and playing a media presentation in accordance with the present invention; and Fig. 7 is a block diagram of a device arranging and playing a media presentation in accordance with the present invention.
DETAILED DESCRIPTION

Referring now to Fig. 1, a system and method for arranging and playing a media presentation is illustrated. System 10 illustrates an example construction using a wireless communication device 12. Although Fig. 1 illustrates a wireless device 12, it will be understood that the method of system 10 may be advantageously used on many other types of devices. For example, the method of system 10 may be used in devices having embedded controllers, such as personal data assistants, MP3 players, DVD/CD players, appliances, cars, cameras, or other consumer devices. Generally, system 10 enables a consumer to use the simple wireless device 12 to dynamically arrange or configure a custom media presentation. The custom media presentation is associated with some event trigger on the device 12, and when that event occurs, the device plays the media presentation. For example, a user may define a custom screensaver that includes personal image files, favorite sounds, and custom text. The media presentation may be associated with a screensaver trigger on the wireless device 12. When the wireless device indicates the screensaver should be played, the custom media presentation is played as the screensaver. In this way, users may personalize and customize their wireless device to make the device easier to use, more interesting, and more aesthetically pleasing. Wireless device 12 includes input devices 17 such as a keypad 46 and a microphone 48. The keypad may include a ten-key input for numbers, toggle switches, rotary knobs, and other buttons and input components. The wireless device 12 may also include other input devices such as bar code readers and portable keyboards. The wireless device also has output devices 16. Typical output devices for a wireless device include a graphical display 41, which may be in the form of a black and white or color LCD display. The wireless device 12 also typically has one or more speakers 42. 
The speakers are constructed to enable a user to hear a telephone call, hear a ring tone, and possibly hear communications to facilitate use as a speaker phone. It will be appreciated that one speaker may be arranged for all these functions, or the wireless device may have multiple speakers. The wireless device 12 also may have other lights 43 useful for indicating status of the wireless device and for illumination purposes. The wireless device may also have a vibrator 44 for shaking the wireless device to notify a user of an incoming call when a ring tone is not desired. The wireless device also has an RF section 19 that would include an antenna 51, a transceiver 53, and may include a GPS receiver 55 for determining location position. It will be appreciated that the general construction of a wireless device, including its RF section, input devices, and output devices, is well known and therefore will not be described in detail. The RF section 19, input devices 17, and output devices 16 all couple to a processor 14. It will be understood that the processor 14 may be in the form of a single processor, or may be constructed as multiple interconnected processors. It will also be understood that the processor may be in the form of a microprocessor, computer chip, gate array, PLD, or other logic device. Processor 14 may be, in one example, an embedded processor having a relatively simple structure and limited RAM memory. The processor 14 implements several functions for the wireless device, and those functions may be performed in hardware, in firmware, in application software, or a combination of the above. Wireless device 12 also includes other components which are not shown, such as a battery power source and a housing. The processor 14 functions to provide a user interface for the user of the wireless device. 
This user interface generally accepts inputs from the keypad 46 and other input devices 17, and displays instructions, status, and other information to the user on the output devices 16 such as display 41. As part of the user interface, the processor 14 implements a configuration utility 27. Configuration utility 27 enables a user to select and order multimedia files into a presentation. Media files 21 may be accessed by the configuration utility 27 and a list of available files is displayed to the user. Media files 21 may be stored locally on the wireless device, on an expansion memory card on the wireless device, or may be on a server which can be accessed wirelessly. The media files 21 may include sound files 58, image files 59, animation files 60, synthesized speech files 61, and Midi files 62. The media files may also include links 64 which could provide links to remote servers for identifying additional remote media files 28. Media files 21 may also include transitions 63. A transition is used to provide a smooth change when initiating a media file, ending a media file, or changing media files. For example, a transition may allow an image to slowly fade onto the screen or fade gently from the user's view. Such transitions give a more finished and professional appearance to media presentations on a device. Using the configuration utility 27, the user selects a series of media files for a media presentation. For example, the user may select a series of personal images and a favorite music clip. The user would then use the configuration utility 27 to order the images into a proper sequence, and set the start event and duration for the music clip. The configuration utility 27 then generates a media package 23. The media package 23 has information sufficient to enable the wireless device to play the media presentation, including media file information, ordering information, and timing information. 
After the media package 23 has been generated, an association utility 26 may be used to associate the media package with a trigger event on the wireless device. In one example, the association utility is also part of the user interface for the wireless device 12, thereby enabling a user to associate a specific media package with a particular trigger event. For example, the user may associate a particular media package with a screensaver event, and may associate another media package with a "call received" trigger event. It will also be understood that the configuration utility 27 may be automated by the wireless device, such that the wireless device automatically detects and associates a media package with a particular trigger event. After the association utility 26 makes an association, the association between an event and a media file is stored in an association list 25. It will be appreciated that other methods may be used for storing and tracking associations. The association utility 26 may be able to associate media packages with several different types of trigger events. For example, a wireless device may have a call processor event 34 that provides triggers for when a call is received, a call is disconnected, or a call is dropped. Further, the call processor may provide indications or triggers regarding whether a voice call, paging call, SMS message, or text message is received. A timer 35 may also generate a trigger. For example, after a period of no user activity, a wireless device often provides a screensaver trigger to cause a screensaver to be shown on the LCD screen. The screensaver not only provides an aesthetic appeal to the wireless device, it also protects the LCD from suffering a burnout condition. Pressing a key 36 may also generate a trigger event that may be used to load and play a media package. Caller ID function 37 is also available on many wireless devices for providing an indication of the specific caller initiating a voice call. 
In another example, a device monitor 38 may be used to set a device trigger. For example, a device trigger may be set for a low battery condition, no carrier received condition, or other status of the wireless device. Finally, many wireless devices have an ability to obtain position location information 39. This position location information may be useful for setting a trigger event. For example, a user may desire that a particular song be played when the user is about to arrive at home. In another example, the location trigger could be set to play another song whenever the wireless device receives an indication that a coffee shop is close by. It will be appreciated that many other event triggers may be generated consistent with this disclosure. After the association utility 26 has been used to associate a particular event or set of trigger events with a media package, the event processor 33 is used to monitor for an occurrence of that event or events. The event processor, upon detecting the associated occurrence, provides an interrupt into the media engine 29. The media engine 29 may recall the association list 25, which includes an identification of which media package should be recalled upon the happening of a particular trigger event. For example, if the timer event 35 provides an interrupt to the media engine 29, the media engine 29 can extract information from the association list 25 regarding which media file is associated with the timer event. The association list 25 may indicate that a particular media package is to be played as a screensaver upon the timer event trigger. The media engine 29 then extracts the media package from the media package file 23, and presents the media presentation on the display 41, speaker 42, or other output devices used by the media package. In the illustrated example, the media package 23 includes the media object data, so the media engine is able to play the media presentation without accessing the media file 21. 
In an alternative implementation, the media package provides a reference link to the media objects, and the media engine accesses the needed media objects from media file 21. It will be understood that the media package and media objects may be stored locally on the phone in a format that can be immediately used by the media engine 29, or a media package processor 31 may be used to further process the media package and media objects for use by the media engine 29. For example, one or more of the individual media objects may be too large to be efficiently used by the media engine.
Accordingly, the media package processor 31 may be used to divide the large media object into a series of sequential segments or subsets that can be easily accommodated by the media engine 29. Referring now to Fig. 2, a flowchart of a method of arranging and playing a media presentation is illustrated. Method 80 begins by allowing a user to set configuration 81 of a media presentation. In setting a configuration, the user accesses files 87 either on a local device or on a remote server. The local files may be stored in local memory, or may be stored on a removable memory card, for example. The media objects may be image files, animations, sound files, Midi music files, text files, artificial speech files, or other types of media objects. The user then selects 88 a set of sound, image, animation, or other media objects to be used as part of the media presentation. The user may then specify an order 89 for the selected media objects. In one example, each of the media objects may be played sequentially. In another example, certain of the media objects may be played concurrently with another, such as when the user desires to play a music clip at the same time an image is being displayed. The user then may be able to set certain specific properties 90 for each of the media files. It will be appreciated that each type of media object may have different properties that may be set. For example, a sound file may have a duration property and a volume property, while an image file may have a duration property and a color depth property. The user may also be able to add transitions 91 to the media objects. Transitions may be added at the start of a media object, at the end of a media object, or between media objects. Typical transitions may include zooms, fades, dissolves, louvers, and spins. It will be understood that many other types of transitions may be used. After the user has set the configuration, a media package is generated 82. 
The media package may be arranged in optional forms. For example, the media package may be encapsulated 94. An encapsulated media package includes data information for each of the media objects selected, plus the necessary ordering, sequencing, and timing information. In this way, an encapsulated media package is self-contained with the information necessary for the media engine to present the media presentation. In one example, the encapsulated media package may be published 86. The encapsulated media package may be published to a server where other users may access and download the media package, or it may be transmitted directly to another user. In this way, the remote user may be enabled to associate the encapsulated media package with a particular trigger event, and have the media presentation played on the remote user's device. The media package may also be referenced 95. A referenced media package does not contain all the media data information, but instead provides directory or file links to where the media files may be located. For example, a referenced media package may include a directory name, a file name, or a server name where a specific image object could be located. In this way, media files may be reused by many media packages, thereby saving memory space. A media package may also link media objects automatically or, if individual media objects are too large, link smaller segments together to form the whole media object. Using a linked media package 92, each media object or segment is associated with sufficient information to cause the media engine to load the next sequential media object. In this way, each of the media objects or segments is linked or chained together. In some configurations, a linked media package provides for a simpler processor structure and memory management. Once the media package has been generated, the media package may be associated with a particular trigger event 83.
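The distinction between an encapsulated package (94) and a referenced package (95) can be sketched as two builders over the same ordered selection. This is a minimal sketch under assumed names; the dictionary layout and byte contents are illustrative, not the patent's actual format.

```python
def encapsulate(sequence, media_data):
    """Encapsulated package (94): self-contained -- carries the ordering
    plus the media data itself, so it can be published as one unit."""
    return {"kind": "encapsulated",
            "sequence": list(sequence),
            "data": {name: media_data[name] for name in sequence}}

def reference(sequence, directory):
    """Referenced package (95): carries only the ordering and a link to
    where the media files live, so files can be shared among packages."""
    return {"kind": "referenced",
            "sequence": list(sequence),
            "directory": directory}

# Illustrative file data and directory path.
files = {"song": b"mp3-bytes", "photo": b"jpeg-bytes"}
enc = encapsulate(["song", "photo"], files)
ref = reference(["song", "photo"], "/media/presentations")
```

The encapsulated form trades memory for portability: any compatible engine can play `enc` directly, while `ref` requires the files to be reachable at the named directory.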
For example, the trigger event may be an incoming call 96, a particular caller ID 97, a particular time or duration 98, or a status of the device 99. It will be appreciated that many other trigger events may be used. The device then monitors for the event 84. When the event occurs, the device activates the media engine to play 85 and present the media package. In this way, the presentation is played for the user responsive to a particular trigger event. Referring now to Fig. 3, a flowchart of a method for generating and playing presentation segment files is shown. Method 120 is particularly useful for embedded systems having a simple structure with limited RAM. Such embedded systems are most efficiently utilized when an entire media segment can be loaded into local RAM prior to presenting the media object. Because embedded processors often have limited RAM, the size and number of sequential media objects may be limited. Therefore, method 120 enables a large media presentation to be segmented into individual segments, with each segment sized to be efficiently loaded into available RAM. Advantageously, long media presentations may then be presented responsive to trigger events. In this way, devices become friendlier, more interesting, and more fun. Further description of generating and sequencing segments is provided in related U.S. patent application number 10/719,317, which has been incorporated herein by reference. Method 120 starts by having a user select and order media objects to define a media presentation as shown in block 122. In this example, the user desires to start playing the sound "mp1", and then start playing the animation "an1". After the animation is completed, the user then desires to display the image "jp1". The selection and ordering is done as part of a device's user interface, with the information being forwarded into a batch process 137.
It will be understood that many alternatives exist for providing an interface to enable a user to select and order media objects. The batch process 137 has access to media files 124, where each of the selected media objects can be found. Each of the media objects, such as the sound 126 "mp1", is in the media file 124 and includes an indication of file format, the file name, and size information. For example, sound 126 is in an MP3 format, the sound data file can be found in a file named "mp1", and the file is 150 units in size. It will be appreciated that the size units may be, for example, in bytes or kilobytes, or may be a relative size indication. In another example, the animation file 128 is shown to include eight sequential image files, each in a bitmap format, and between 200 and 400 units in size. Finally, the image 129 is in a JPEG format, and is 600 units in size. The batch process also has access to configuration information 131, which may include a maximum size for individual presentation segments. The size may be predefined to allow each animation segment to be loaded into memory, or may be determined dynamically dependent on specific device status. Since the batch process 137 knows the number and size of all of the files in the media presentation, and also knows the maximum size from the configuration file 131, the batch process 137 can segment the presentation into a set of sequential segments 140. For example, a very large media object may be divided into smaller sequential subsets, while a sequence of small media objects may be combined into a single segment. In this way, segments are provided that are sized to be particularly efficient for loading into available RAM memory. Each of the segments may also include one or more action commands that provide sequencing, timing, or other presentation information. In Fig. 3, the maximum size may be set at 1000. The batch process therefore makes a first presentation segment 141 that includes only the sound file.
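The size-driven grouping performed by the batch process can be sketched as a greedy packer under a maximum segment size. This is a sketch under assumptions: the function name is invented, and a real implementation would also subdivide the data of a single object that alone exceeds the limit, as the patent describes for large media objects.

```python
def pack_segments(objects, max_size):
    """Greedily group ordered (name, size) media objects into segments
    whose aggregate size stays within max_size.  An object that alone
    exceeds the limit still gets its own segment here; splitting its
    data into subsets would be a separate step."""
    segments, current, total = [], [], 0
    for name, size in objects:
        if current and total + size > max_size:
            segments.append(current)   # close the full segment
            current, total = [], 0
        current.append(name)
        total += size
    if current:
        segments.append(current)
    return segments

# Eight 300-unit animation frames under a 1000-unit limit pack as 3 + 3 + 2.
frames = [("bm%d" % i, 300) for i in range(1, 9)]
groups = pack_segments(frames, 1000)
```

Note that in the Fig. 3 example the first animation segment is also limited to three frames because the concurrently resident sound file counts against the limit; that refinement is omitted here for brevity.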
The sound file is separately segmented since it is going to be played concurrently with the showing of several image frames. The sound segment 141 includes a file identifier 151, an action command showing that the sound file should be played for 7 seconds (7000 ms) 152, and another action command 153 showing that segment "A" should be immediately loaded and played. Sound segment 141 also shows that the file "mp1" should be loaded. After "mp1" has played, the final action instruction 154, "END", instructs that no additional files should be loaded. Since the images in the animation file 128 aggregate to a size much larger than the 1000 maximum size, the batch processor 137 segments the animation into three sequential segments 142, 143, and 144. The first segment 142 includes the first three image bitmaps. Since the first three images and the sound file will be in memory at the same time, the fourth bitmap cannot be loaded without exceeding the 1000 size limit. In this way, segment 142 is limited to the first three image files. In a similar manner, the second image segment 143 contains only bitmaps 4 and 5. And finally, the third segment 144 includes the final three bitmap images. Each of the three animation segments includes a file identifier and at least one action instruction. For example, segment 142 is identified as "A", and has an action identifier of "A2". In this way, the segment 142 includes an action instruction identifying segment "A2" as the next sequential segment. In this way, when segment "A" has played, the media engine is instructed to load and play the segment "A2". In a similar manner, segment 143 includes an action instruction "A3" for providing a callback to "A3" as the next animation segment, and segment 144 includes a callback instruction of "A4" for calling the "A4" segment 145 when segment "A3" has played.
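The chaining of segments via action commands can be sketched as a small playback loop that follows each segment's callback until "END". The segment contents below mirror the Fig. 3 example (three frames, then two, then three, then the image), but the dictionary layout and the `play_chain` helper are illustrative assumptions, not the disclosed format.

```python
# Each segment carries an action command naming the next segment, or "END".
segments = {
    "A":  {"frames": ["bm1", "bm2", "bm3"], "next": "A2"},
    "A2": {"frames": ["bm4", "bm5"],        "next": "A3"},
    "A3": {"frames": ["bm6", "bm7", "bm8"], "next": "A4"},
    "A4": {"frames": ["jp1"],               "next": "END"},
}

def play_chain(segments, start):
    """Load and play one segment at a time, following each segment's
    callback action command to the next segment until END is reached."""
    played, current = [], start
    while current != "END":
        seg = segments[current]
        played.extend(seg["frames"])   # stand-in for load-into-RAM and display
        current = seg["next"]
    return played
```

Because each segment names its successor, the media engine never needs a global playlist in memory; only the current segment need be resident, which suits the limited-RAM embedded systems the method targets.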
The "A4" segment also includes a first action instruction to show that the image "jp1" should be displayed for 5 seconds, and a final action instruction "END" showing that no more segments are included in the media presentation. Together, the media segments 140 are combined into a media package 156. In one example, the media package includes the media object data, and therefore becomes an encapsulated media package 175. Such an encapsulated media package 175 could therefore be published 177 to a server or transmitted directly to another user. The other user would thereby be able to play the encapsulated media presentation without having to acquire the media objects from other locations. Alternatively, the media package may include references to the file names, and the media engine 160 can retrieve data for the media objects from a media file or files 158. Media packages may be associated with specific event triggers for a device. In one example, the association is made dynamically by a user to facilitate further customization options, and in another example, the device may define the association. An association list 133 is used to track the associations between media packages and trigger events. The device monitors for trigger events 135, and when a trigger event occurs, uses the association list 133 to determine which media package to access and play. The media engine thereby retrieves the correct media package for the associated trigger event and accesses media objects from the media file 158. The media engine proceeds to play the media presentation on an output device 162 such as the display and the speaker. When the media engine plays the presentation, the presentation will be shown to the user as a multimedia presentation 166. More particularly, the user will first hear a sound 172, which will continue for 7 seconds. The user then sees animation segments 168, 169, and 170.
At the completion of the eight frames of animation, a static image 171 will be shown for 5 seconds. Shortly after the image is complete, the sound 172 will end. Method 120 enables a user to dynamically configure and arrange media objects into a custom media presentation, and the batch processor divides the media presentation into segments sized to be conveniently loaded into available RAM memory. The media presentation is then associated with a particular trigger event, and upon an occurrence of that event, the media presentation is played. The media presentation may also be provided in a package that can be published for use by a remote user. Referring now to Fig. 4, example file formats for media packages are illustrated. A media package generally consists of information sufficient so that a media engine may play several media objects, and includes ordering and sequencing information so the media engine knows how to order, sequence, and time the media objects. File formats 201 and 203 use media segments like those generated with reference to Fig. 3. In file format 201, a media package 209 is illustrated that references presentation segments stored in a directory. Package 209 includes directory information so that the media engine knows in what directory, server, or other location to find the individual presentation segments. The package 209 shows that the first sound file "S" plays for 7 seconds and the "A" segment immediately loads and plays. After "S" completes, no more files are loaded responsive to "S". After "S" has been loaded, "A" is loaded and each of its frames displayed for 200 milliseconds, and at the completion of the last image, segment "A2" is loaded. In a similar manner, all the images associated with "A2" are played for 200 milliseconds and at the completion of the last image, segment "A3" is loaded.
Also, "A3" plays each of its images for 200 milliseconds and at the completion of the last image, presentation segment "A4" is loaded. Segment "A4" plays for 5 seconds, and the media presentation ends. Segment files 211, as well as the media file data 213, are located in the directory identified in the package 209. File format 203 shows an example of an encapsulated segmented media package. In file format 203, the media package includes a sound segment 214 that includes sequencing and timing information along with sound data. Three animation segments 215, 216, and 217 are also provided, each having sequencing information and respective image data. The media package also includes an image segment 218 having timing information, sequencing information, and image data. In this way, the encapsulated segmented file format 203 may be transmitted to a server for download to other users, or may be transferred directly to another remote device for presentation. Since the encapsulated file includes all data and information necessary for the presentation, any compatible media engine may play the media presentation. For example, a user may develop a particularly interesting screensaver, and may encapsulate that screensaver in a media package. The user may transmit the screensaver package to a central server, where another user could download that screensaver package and have that screensaver operate on the remote user's device. In another example of the media package format, a referenced format 205 is shown. The referenced format 205 includes a media package 220 that includes the name of the screensaver and a directory where the data files can be found. In this way, the media package simply identifies the order of the media objects, and alternatively may include additional sequencing and timing information. The media package 220 identifies the directory where all the media files 222 are located.
In another example, file format 207 shows a media package 225 that includes an identification of the order of media objects, and also includes all of the media data in a single file. Again, the encapsulated media package 225 may be packaged for transmission and publication. It will be appreciated that many alternatives exist for formatting and arranging the media package. Referring now to Fig. 5, a flowchart of a method for arranging and playing a media presentation is illustrated. More particularly, method 250 illustrates a method for defining a long media presentation, dividing the long media presentation into a series of presentation segments, and then sequentially playing each of the presentation segments responsive to an event trigger. Method 250 is generally divided into a user configuration section 251, a background monitoring section 253, a segment generation section 255, and a segment presentation section 257. In the configuration section 251, a user defines the presentation 261 by ordering and selecting media objects. The user may also be able to add and change particular characteristics and properties for each of the media objects. After the media presentation has been generally defined, the media presentation is associated with a particular trigger event 263, such as with an elapsed timer or an interrupt action. The device then enters a monitoring phase 253 where it monitors for the occurrence of the target event 265. In section 255, the long media presentation is divided into segments that can be more easily loaded into the limited RAM of, for example, an embedded processor system. The dividing and segmentation process 255 may occur responsive to the occurrence of the trigger event, or may occur at a different time. For example, the media presentation may be segmented during a time when the processor has additional processing capability, and the prepared segments would therefore be ready for immediate use when the interrupt occurs.
In another example, the segmentation process may occur after the event trigger has been received. In segmenting the long media presentation, a batch processor is used to determine presentation segments that are smaller than a maximum limit 266. The maximum limit is typically set at a size smaller than the amount of available RAM. In this way, an entire media presentation segment can be loaded into RAM at one time. A segment identification 268 is added to each presentation segment. For example, the segment identification may be the file name for holding that presentation segment. Segment sequence information 269 is also added to the presentation segment. For example, the segment sequence information may be in the form of an action command that provides a callback to the next segment in sequence. In this way, each segment links to the next segment in sequence so that the presentation segments are chained together. For the last segment, an action command of "END" may be added as segment sequence information so that the media engine knows that no additional media segments should be loaded. The segmenting process is continued 271 until all segments have been generated. When the trigger event is detected, the media presentation is then presented or played. The trigger event has been associated with a particular media presentation, and that identification is used to recall the first segment identification in block 273. The first segment is loaded into memory 275 and presented to the user as shown in block 276. For example, if the media segment is a sound file, the sound would play through the speaker, and if the media object is an image, then the image would be displayed on the display screen. The media engine then checks the sequence information 277 in the segment, and if an action command is a callback 279, then the media engine has the file name of the segment to load into memory next.
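Attaching segment identification (268) and sequence information (269), with "END" on the last segment, can be sketched as a labelling pass over the grouped segments. The "A", "A2", ... naming mirrors the Fig. 3 example, but the function and field names are illustrative assumptions.

```python
def label_segments(groups):
    """Give each segment an identification and an action command naming
    the next segment in the chain; the last segment gets "END" so the
    media engine knows no further segments should be loaded."""
    ids = ["A"] + ["A%d" % i for i in range(2, len(groups) + 1)]
    labelled = []
    for i, content in enumerate(groups):
        nxt = ids[i + 1] if i + 1 < len(ids) else "END"
        labelled.append({"id": ids[i], "content": content, "action": nxt})
    return labelled

segs = label_segments([["bm1", "bm2", "bm3"], ["bm4", "bm5"], ["jp1"]])
```

Each labelled segment could then be written to a file named after its identification, so the callback in one segment is simply the file name of the next.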
Depending on other action commands and sequencing information contained in the segment, the media engine may immediately load the file and begin playing it concurrently, or may wait until the current media object has finished playing before loading and playing the next object. If the action command is the END command, then the media engine knows that no more media segments are to be played. Method 250 thereby enables the playing of a long media presentation on an embedded system having a simple structure and limited RAM. Advantageously, each media segment includes sequencing information so that all the segments are easily linked or chained together. In some constructions, this enables a particularly efficient and simple processor and memory structure. Referring now to Fig. 6, another flowchart for arranging and playing a media presentation is illustrated. Method 300 allows a user to define a presentation 302 by selecting and ordering media objects. Once the objects have been selected and ordered, a media package can be generated as shown in block 304. The media package then may be used locally 305 or may be published 306 to be used remotely 307. If used locally 305, the local user associates the media presentation with a trigger event as shown in block 308. The local device then monitors for that event 310 and upon occurrence of that event 312 the media presentation is played. In this way, the media presentation generated by the local user may be used and presented by the local device. Alternatively, the media package may be published and used by a remote user and device. The remote user 307 would receive the media package, either by downloading it from a service or by receiving the media package as, for example, an attachment to an SMS message. Once the media package has been received at the remote device, the remote user associates the media package with a trigger event 314, and the remote device monitors for that trigger event as shown in block 315.
Upon occurrence of that trigger event 317, the remote device plays and presents the media presentation. In this way, the media presentation generated on a local device may be published to a remote device for use and presentation. Referring now to Fig. 7, a block diagram of another device for arranging and playing a media presentation is illustrated. System 325 includes a device 327, which may be, for example, a portable battery-powered device for use by a consumer. The device 327 includes input devices 333 allowing the user to provide inputs to the device. Typical input devices may be, for example, keyboards, keypads, microphones, and graffiti stylus devices. The device 327 also includes output devices 331. For example, output devices may include color or black-and-white screens, speakers, vibrators, lights, and other indicators. The device 327 has a processor 329 which may be in the form of a single- or multi-processor configuration. The processor operates a user interface using the input devices 333 and the output devices 331. In one example, the user interface may be a graphical user interface allowing the user to make choices graphically. In another example, the user interface may allow the user to interact with the device by selecting numbers or letters on a keypad, or by toggling various switches. The device 327 also includes media files 335. The media files 335 may be included on local memory, or may be included on removable memory cards. Also, the device 327 may have communication links via a wireless connection or Internet connection for accessing servers to find and access additional media files. The user interface includes a configuration utility 340 enabling the user to select and order a set of media files. For example, the user may select a series of images to be displayed concurrently with a sound.
The configuration utility may also enable setting of certain properties and characteristics for each of the individual media objects, or for the media presentation as a whole. In an alternative arrangement, the configuration utility may be a process operating on a different device. For example, the configuration utility may be operated on a computer system having access to a large number of media objects. The configuration utility is used to select and order these objects, and generates a media package that is arranged for transmission to the device 327. If the computer system has particular information regarding the device 327, the computer may also provide an association list to the device 327 that identifies which event trigger should be associated with the media package. An association 343 can be made on the local device between the media presentation and a particular trigger event. An event processor 341 monitors for the particular event, and when that event occurs, notifies the media engine 338 that the event has occurred and provides an identification for the media presentation to be played. The media engine 338 recalls the media presentation, and if necessary, recalls the individual media object files, and presents the media presentation using one or more of the output devices 331. Using system 325, a user is able to dynamically construct and configure a media presentation, and associate that media presentation with a particular trigger event. Upon occurrence of the trigger event, the media presentation may be displayed on the local device. If the device 327 includes communication abilities, the media presentation may be published wirelessly or through a network connection to a remote device. Provided the remote device has a properly configured media engine, the media presentation may be played remotely.
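The cooperation between the association list (133, 343) and the event processor (341) can be sketched as a mapping from trigger events to package identifiers, consulted when an event fires. The event names and the `play` callable are hypothetical placeholders for the device's actual trigger sources and media engine.

```python
associations = {}   # trigger event -> media package identifier (list 133)

def associate(event, package_id):
    """Record that the named package should play when the event occurs."""
    associations[event] = package_id

def on_event(event, play):
    """Event-processor hook: if the event has an associated package,
    hand its identifier to the media engine's play callable."""
    package_id = associations.get(event)
    if package_id is None:
        return False        # no association; event is ignored
    play(package_id)
    return True

# Illustrative use: an incoming call triggers a custom presentation.
associate("incoming_call", "greeting_show")
played = []
handled = on_event("incoming_call", played.append)
ignored = on_event("low_battery", played.append)
```

Keeping the association list separate from the packages themselves is what lets the same package be re-bound to a different trigger, locally or on a remote device, without regenerating it.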
While particular preferred and alternative embodiments of the present invention have been disclosed, it will be appreciated that various modifications and extensions of the above described technology may be implemented using the teachings of this invention. All such modifications and extensions are intended to be included within the true spirit and scope of the appended claims.

Claims

What is claimed is:
1. A method for arranging and playing a media presentation, comprising: providing a media file having a plurality of media objects; receiving configuration instructions from a user; arranging the plurality of media objects into an ordered sequence responsive to the configuration instructions; associating the ordered sequence of media objects with a trigger event; monitoring for the trigger event; detecting the trigger event; and playing, responsive to the trigger event, the ordered sequence of media objects.
2. The method according to claim 1, further including the step of packaging the plurality of media objects into an encapsulated media package, the media package further including sequencing information for the media objects.
3. The method according to claim 2, further including the steps of: publishing the media package to a remote user device; associating, on the remote device, the media package with a trigger event; monitoring for the trigger event on the remote device; detecting the trigger event on the remote device; and playing on the remote device, responsive to the trigger event, the ordered sequence of media objects.
4. The method according to claim 1, further including: dividing at least one of the media objects into a set of sequential subsets so that each subset is smaller than a maximum size; and wherein the divided media object is played by loading and playing each of its respective subsets in sequential order.
5. The method according to claim 1 where at least one of the media objects is a sound file and at least another one of the media objects is an image file.
6. The method according to claim 1 where the presentation is a screensaver for a display device, and the ordered sequence of media objects is played responsive to a timed trigger event.
7. A wireless device, comprising: an embedded processor; a keypad input device coupled to the embedded processor; a display screen coupled to the embedded processor; wherein the embedded processor implements a method comprising: displaying to a user a plurality of available media objects; receiving the user's configuration instructions from the keypad; selecting and ordering a set of media objects responsive to the configuration instructions; associating the set of media objects with a trigger event, the trigger event occurring at the wireless device; monitoring for the trigger event; and presenting, responsive to the trigger event, the ordered sequence of media objects.
8. The wireless device according to claim 7, further including a position location receiver coupled to the embedded processor, and wherein the embedded processor presents the media objects responsive to a trigger event generated by the position location receiver.
9. The wireless device according to claim 7, further including a timer coupled to the embedded processor, and wherein the embedded processor presents the media objects responsive to a trigger event generated by the timer.
10. The wireless device according to claim 7, further including a call processor coupled to the embedded processor, and wherein the embedded processor presents the media objects responsive to a trigger event generated by the call processor.
11. The wireless device according to claim 7, wherein the embedded processor further receives caller identification information, and wherein the embedded processor presents the media objects responsive to a trigger event generated according to the content of the caller identification information.
12. A method of arranging a screensaver and playing the screensaver on the display of a portable, battery powered device, comprising: providing a plurality of image files; receiving selection commands, the selection commands selecting a set of image files to use in the screensaver; ordering the selected files into a sequence; associating the sequence with a screensaver event; monitoring for an occurrence of the screensaver event; detecting the screensaver event; and playing the sequence on the display as a screensaver.
13. The method according to claim 12, wherein the receiving step further includes accepting commands entered by a user.
14. The method according to claim 12, wherein the receiving step further includes accepting commands generated responsive to the portable device receiving a wireless communication.
15. The method according to claim 12, further providing a sound file, and ordering the sound file into the sequence so that the sound plays on a speaker device.
16. The method according to claim 12 further including the step of packaging the selected media files and sequencing information into a media package.
17. The method according to claim 16 further including the step of transmitting the media package to a remote device, and playing the sequence on the display of the remote device.
18. A method of playing a media presentation using a device, comprising: providing a media package, the media package including sequence information for ordering a plurality of media objects in the media presentation; associating the media package with an event trigger; monitoring the device for an occurrence of the event trigger; and playing, responsive to the event trigger, the media presentation.
19. The method according to claim 18, wherein the providing step includes receiving the media package through a network connection.
20. The method according to claim 18, wherein the providing step includes receiving the media package through a wireless connection.
21. The method according to claim 18, wherein the providing step includes: receiving configuration instructions from a user of the device; selecting the media objects according to the configuration instructions; ordering the media objects according to the configuration instructions; and generating the media package at the device.
22. The method according to claim 18, wherein the providing step includes: receiving configuration instructions; selecting the media objects according to the configuration instructions; ordering the media objects according to the configuration instructions; and generating the media package.
23. The method according to claim 18, wherein the media package is an encapsulated media package including data for the media objects.
24. The method according to claim 18, wherein the media package is a referenced media package including a reference to a file location to access data for the media objects.
US8199904B2 (en) * 2008-04-07 2012-06-12 Sony Ericsson Mobile Communications Ab Method and device for creating a media signal
TWI428814B (en) * 2008-04-15 2014-03-01 Htc Corp Method for switching wallpaper in screen lock state, mobile electronic device thereof, and recording medium thereof
US20090262257A1 (en) * 2008-04-18 2009-10-22 Sony Corporation Background tv
US9225817B2 (en) * 2008-06-16 2015-12-29 Sony Corporation Method and apparatus for providing motion activated updating of weather information
US7941458B2 (en) * 2008-06-26 2011-05-10 Microsoft Corporation Abstraction layer for online/offline resource access
US9176962B2 (en) 2009-09-07 2015-11-03 Apple Inc. Digital media asset browsing with audio cues
US20120151341A1 (en) * 2010-12-10 2012-06-14 Ko Steve S Interactive Screen Saver Method and Apparatus
US20130166998A1 (en) * 2011-12-23 2013-06-27 Patrick Sutherland Geographically-referenced Video Asset Mapping
US9922439B2 (en) * 2014-07-25 2018-03-20 Samsung Electronics Co., Ltd. Displaying method, animation image generating method, and electronic device configured to execute the same
WO2016013893A1 (en) 2014-07-25 2016-01-28 Samsung Electronics Co., Ltd. Displaying method, animation image generating method, and electronic device configured to execute the same
US20160357364A1 (en) * 2015-06-07 2016-12-08 Apple Inc. Graphical User Interface for a Document Viewing Application
KR102448340B1 (en) * 2017-12-20 2022-09-28 삼성전자주식회사 Electronic device and method for controlling display location of content based on coordinate information stored in display driving integrated circuit

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020023274A1 (en) * 2000-04-07 2002-02-21 Giacalone, Louis D. Method and system for electronically distributing, displaying and controlling advertising and other communicative media
US20020055992A1 (en) * 2000-11-08 2002-05-09 Lavaflow, Llp Method of providing a screen saver on a cellular telephone
EP1424839A1 (en) * 2002-11-27 2004-06-02 Nec Corporation Method for generating graphics animations from still pictures in a cellular telephone and associated cellular telephone

Family Cites Families (86)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4722005A (en) * 1986-09-12 1988-01-26 Intel Corporation Software controllable hardware CRT dimmer
US5861881A (en) * 1991-11-25 1999-01-19 Actv, Inc. Interactive computer system for providing an interactive presentation with personalized video, audio and graphics responses for multiple viewers
US5479656A (en) * 1992-05-13 1995-12-26 Rawlings, III; Joseph H. Method and system for maximizing data files stored in a random access memory of a computer file system and optimization therefor
FI92782C (en) * 1993-02-09 1994-12-27 Nokia Mobile Phones Ltd Grouping mobile phone settings
US5812937B1 (en) * 1993-04-08 2000-09-19 Digital Dj Inc Broadcast data system with multiple-tuner receiver
US5452277A (en) * 1993-12-30 1995-09-19 International Business Machines Corporation Adaptive system for optimizing disk drive power consumption
US6292181B1 (en) * 1994-09-02 2001-09-18 Nec Corporation Structure and method for controlling a host computer using a remote hand-held interface device
US5974558A (en) * 1994-09-02 1999-10-26 Packard Bell Nec Resume on pen contact
US5740435A (en) * 1994-10-31 1998-04-14 Sony Corporation Data management apparatus and method for managing data of variable lengths recorded on a record medium
US5819284A (en) * 1995-03-24 1998-10-06 At&T Corp. Personalized real time information display as a portion of a screen saver
US5819290A (en) * 1995-04-10 1998-10-06 Sony Corporation Data recording and management system and method for detecting data file division based on quantitative number of blocks
US5680535A (en) * 1995-06-06 1997-10-21 Galerie 500 Screen saver for exhibiting artists and artworks
US5913040A (en) * 1995-08-22 1999-06-15 Backweb Ltd. Method and apparatus for transmitting and displaying information between a remote network and a local computer
US5748190A (en) * 1995-09-05 1998-05-05 Wisevision As Presentation system for individual personal computers in a personal computer network
US5978566A (en) * 1996-07-12 1999-11-02 Microsoft Corporation Client side deferred actions within multiple MAPI profiles
US6317593B1 (en) * 1996-08-12 2001-11-13 Gateway, Inc. Intelligent cellular telephone function
US5870683A (en) * 1996-09-18 1999-02-09 Nokia Mobile Phones Limited Mobile station having method and apparatus for displaying user-selectable animation sequence
US5930501A (en) * 1996-09-20 1999-07-27 Neil; John M. Pictorial user interface for establishing time of day and geographical or environmental context on a computer display or other monitor
EP1004082A2 (en) * 1996-10-09 2000-05-31 Starguide Digital Networks Aggregate information production and display system
US5905988A (en) * 1996-11-13 1999-05-18 Imaginon Method and apparatus for database transformation and adaptive playback
US5905492A (en) * 1996-12-06 1999-05-18 Microsoft Corporation Dynamically updating themes for an operating system shell
TW409245B (en) * 1996-12-11 2000-10-21 Koninkl Philips Electronics Nv A method and device for user-presentation of a compilation system
DE69840846D1 (en) * 1997-01-28 2009-07-09 Intellectual Ventures Holding DATA PROCESSING NETWORK FOR A COMMUNICATION NETWORK
US5907604A (en) * 1997-03-25 1999-05-25 Sony Corporation Image icon associated with caller ID
US5983073A (en) * 1997-04-04 1999-11-09 Ditzik; Richard J. Modular notebook and PDA computer systems for personal computing and wireless communications
US6209011B1 (en) * 1997-05-08 2001-03-27 Microsoft Corporation Handheld computing device with external notification system
US6243725B1 (en) * 1997-05-21 2001-06-05 Premier International, Ltd. List building system
US6009333A (en) * 1997-08-14 1999-12-28 Executone Information Systems, Inc. Telephone communication system having a locator and a scheduling facility
JPH1198020A (en) * 1997-09-24 1999-04-09 Sony Corp Method and device for analyzing bit stream
JPH11187439A (en) * 1997-12-22 1999-07-09 Sony Corp Portable information terminal equipment, screen display method, record medium, and microcomputer
US6112225A (en) * 1998-03-30 2000-08-29 International Business Machines Corporation Task distribution processing system and the method for subscribing computers to perform computing tasks during idle time
US6219694B1 (en) * 1998-05-29 2001-04-17 Research In Motion Limited System and method for pushing information from a host system to a mobile data communication device having a shared electronic address
US6438585B2 (en) * 1998-05-29 2002-08-20 Research In Motion Limited System and method for redirecting message attachments between a host system and a mobile data communication device
US6507351B1 (en) * 1998-12-09 2003-01-14 Donald Brinton Bixler System for managing personal and group networked information
US6353449B1 (en) * 1998-12-10 2002-03-05 International Business Machines Corporation Communicating screen saver
US6360101B1 (en) * 1998-12-31 2002-03-19 Ericsson Inc. Cellular phone that displays or sends messages upon its arrival at a predetermined location
US6920606B1 (en) * 1999-02-22 2005-07-19 Extended Digital, Llc Custom computer wallpaper and marketing system and method
US6769120B1 (en) * 1999-06-30 2004-07-27 International Business Machines Corporation Calendar-induced program execution
US6323775B1 (en) * 1999-08-10 2001-11-27 Telefonaktiebolaget LM Ericsson (Publ) Method, system and apparatus for proximity-based recharge notification
US6957398B1 (en) * 1999-12-22 2005-10-18 Farshad Nayeri Collaborative screensaver
JP2001292199A (en) 2000-01-31 2001-10-19 Denso Corp Telephone set and reading object-writing matter
US6640098B1 (en) * 2000-02-14 2003-10-28 Action Engine Corporation System for obtaining service-related information for local interactive wireless devices
JP2003526861A (en) * 2000-03-13 2003-09-09 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Storage of compressed data items
US6590590B1 (en) * 2000-06-06 2003-07-08 Mustek Systems, Inc. System and method for updating a graphic representation of a window item using an image information reading apparatus
GB2366697A (en) * 2000-08-31 2002-03-13 Nokia Mobile Phones Ltd Transmission of user profile via messaging service
ATE288096T1 (en) * 2000-09-13 2005-02-15 Siemens Ag SYSTEM HAVING A PROCESS ELEMENT WITH A SCREEN WITH AN ACTIVATION ELEMENT FOR REMOTE-CONTROLLED CANCELING OF THE SCREEN SAVER FUNCTION AND ACTIVATION ELEMENT FOR SUCH A SYSTEM
US6831970B1 (en) * 2000-09-21 2004-12-14 International Business Machines Corporation Method and system for remote activation of a telephone profile
JP2002118638A (en) * 2000-10-06 2002-04-19 Kyocera Corp Mobile communication terminal
US20020055986A1 (en) * 2000-11-08 2002-05-09 Lavaflow, Llp Method of downloading a screen saver to a cellular telephone
JP2002152329A (en) 2000-11-08 2002-05-24 Tu-Ka Cellular Tokyo Inc Portable terminal system
WO2002039779A1 (en) * 2000-11-08 2002-05-16 King, John, J. Method of displaying a picture file on a cellular telephone
US6928300B1 (en) * 2000-11-09 2005-08-09 Palmsource, Inc. Method and apparatus for automated flexible configuring of notifications and activation
US7458080B2 (en) * 2000-12-19 2008-11-25 Microsoft Corporation System and method for optimizing user notifications for small computer devices
EP1219927B1 (en) * 2000-12-27 2008-05-28 FUJIFILM Corporation Information notification system and method
US7143433B1 (en) * 2000-12-27 2006-11-28 Infovalve Computing Inc. Video distribution system using dynamic segmenting of video data files
JP2002216460A (en) * 2001-01-16 2002-08-02 Matsushita Electric Ind Co Ltd Information recording medium
US20020138772A1 (en) * 2001-03-22 2002-09-26 Crawford Timothy James Battery management system employing software controls upon power failure to estimate battery duration based on battery/equipment profiles and real-time battery usage
GB2373887A (en) * 2001-03-28 2002-10-02 Hewlett Packard Co Context dependent operation, including power management, of a mobile computer
US6694418B2 (en) * 2001-03-30 2004-02-17 Intel Corporation Memory hole modification and mixed technique arrangements for maximizing cacheable memory space
US20020152193A1 (en) * 2001-04-13 2002-10-17 Thompson Robert S. System and method for displaying images
US7092740B1 (en) * 2001-04-20 2006-08-15 Trilogy Development Group, Inc. High density information presentation using space-constrained display device
CA2451178C (en) * 2001-06-18 2009-11-03 Research In Motion Limited System and method for managing message attachment and information processing from a mobile data communication device
JP2003018283A (en) * 2001-07-05 2003-01-17 Nec Corp Caller identifying method for telephone system, and telephone system with caller identification function applied with the method
US20030035529A1 (en) * 2001-08-14 2003-02-20 Charles Baker Presence detection by screen saver method and apparatus
US7362854B2 (en) * 2001-09-28 2008-04-22 Gateway Inc. Portable electronic device having integrated telephony and calendar functions
US7039784B1 (en) * 2001-12-20 2006-05-02 Info Value Computing Inc. Video distribution system using dynamic disk load balancing with variable sub-segmenting
JP3665615B2 (en) * 2002-01-30 2005-06-29 株式会社東芝 External storage device and battery remaining amount notification method in external storage device
US6961859B2 (en) * 2002-01-30 2005-11-01 Hewlett Packard Development Company, L.P. Computing device having programmable state transitions
US20030169306A1 (en) * 2002-03-07 2003-09-11 Nokia Corporation Creating a screen saver from downloadable applications on mobile devices
US20040002943A1 (en) * 2002-06-28 2004-01-01 Merrill John Wickens Lamb Systems and methods for application delivery and configuration management of mobile devices
CN100438664C (en) * 2002-07-19 2008-11-26 华为技术有限公司 Intelligent mobile phone service triggering method based on identifier
US6909878B2 (en) * 2002-08-20 2005-06-21 Ixi Mobile (Israel) Ltd. Method, system and computer readable medium for providing an output signal having a theme to a device in a short distance wireless network
US7123696B2 (en) * 2002-10-04 2006-10-17 Frederick Lowe Method and apparatus for generating and distributing personalized media clips
KR100532273B1 (en) * 2002-10-11 2005-11-29 삼성전자주식회사 A method for informing a battery availability time according to action modes in a complex terminal
US20050005043A1 (en) * 2002-11-01 2005-01-06 Pushplay Interactive, Llc Controller and removable user interface (RUI) for media event and additional media content
US20050060238A1 (en) * 2002-11-01 2005-03-17 Pushplay Interactive, Llc Controller and peripheral user interface (pui) for media event
US20040116155A1 (en) * 2002-12-12 2004-06-17 Alain Aisenberg Cellular telephone back-up and media system
US7113809B2 (en) * 2002-12-19 2006-09-26 Nokia Corporation Apparatus and a method for providing information to a user
US7500198B2 (en) * 2003-04-25 2009-03-03 Motorola, Inc. Method and apparatus for modifying skin and theme screens on a communication product
US7315882B1 (en) * 2003-10-14 2008-01-01 At&T Delaware Intellectual Property, Inc. Method, system, and storage medium for providing automated execution of pre-defined events
US7301451B2 (en) * 2003-12-31 2007-11-27 Ge Medical Systems Information Technologies, Inc. Notification alarm transfer methods, system, and device
US7496352B2 (en) * 2004-03-02 2009-02-24 International Business Machines Corporation Environmentally driven phone behavior
JP2005267147A (en) * 2004-03-18 2005-09-29 Fuji Xerox Co Ltd Information management system
US7395097B2 (en) * 2004-12-03 2008-07-01 Motorola, Inc. Communications device with low energy notification
US7400229B2 (en) * 2005-04-04 2008-07-15 International Business Machines Corporation Method, system, and computer program product for providing an intelligent event notification system
US7430675B2 (en) * 2007-02-16 2008-09-30 Apple Inc. Anticipatory power management for battery-powered electronic device

Also Published As

Publication number Publication date
EP1685709B9 (en) 2010-05-26
EP1685709A2 (en) 2006-08-02
KR101033085B1 (en) 2011-05-06
US20050114800A1 (en) 2005-05-26
US8166422B2 (en) 2012-04-24
JP4550068B2 (en) 2010-09-22
ATE455435T1 (en) 2010-01-15
WO2005052936A3 (en) 2005-06-30
DE602004025137D1 (en) 2010-03-04
ES2336004T3 (en) 2010-04-07
KR20060103437A (en) 2006-09-29
EP1685709B1 (en) 2010-01-13
JP2007518292A (en) 2007-07-05

Similar Documents

Publication Publication Date Title
US8166422B2 (en) System and method for arranging and playing a media presentation
CA2436872C (en) Methods and apparatuses for programming user-defined information into electronic devices
US7620427B2 (en) Methods and apparatuses for programming user-defined information into electronic devices
US8996038B2 (en) Method for providing idle screen layer endowed with visual effect and method for providing idle screen by using the same
US20100073382A1 (en) System and method for sequencing media objects
US20080070616A1 (en) Mobile Communication Terminal with Improved User Interface
US8170538B2 (en) Methods and apparatuses for programming user-defined information into electronic devices
CN1148093C (en) Method for providing function of specific data notice in telephone
US20060140141A1 (en) Method and an apparatus for providing multimedia services in mobile terminal
US20060122718A1 (en) Data processing device, control method of data processing device, control program and manufacturing method of data processing device
US20060205439A1 (en) System and method for background sound scan element of a user interface
KR20040026962A (en) System and Method for Providing Background Image on Mobile Phone
CN101005520A (en) Multimedia action customizing method for communication device and communication device capable of customizing said action

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1291/KOLNP/2006

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 2006541378

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 2004811438

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 1020067009909

Country of ref document: KR

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

WWP Wipo information: published in national office

Ref document number: 2004811438

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1020067009909

Country of ref document: KR