WO2004030381A1 - System and method for multiplexing media information over a network using reduced communications resources and prior knowledge/experience of a called or calling party - Google Patents

System and method for multiplexing media information over a network using reduced communications resources and prior knowledge/experience of a called or calling party Download PDF

Info

Publication number
WO2004030381A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
media information
information
video
avatar
Prior art date
Application number
PCT/KR2003/001893
Other languages
French (fr)
Inventor
Byung-Kwan Yi
Original Assignee
Lg Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lg Electronics Inc.
Priority to AU2003263618A (AU2003263618B2)
Priority to EP03798572A (EP1550326A4)
Priority to CA2501595A (CA2501595C)
Publication of WO2004030381A1
Priority to HK06102511.0A (HK1082626A1)

Links

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 - Data switching networks
    • H04L 12/02 - Details
    • H04L 12/16 - Arrangements for providing special services to substations
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 - Television systems
    • H04N 7/14 - Systems for two-way working
    • H04N 7/141 - Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/147 - Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/1066 - Session management
    • H04L 65/1083 - In-session procedures
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/1066 - Session management
    • H04L 65/1101 - Session protocols
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/60 - Network streaming of media packets
    • H04L 65/75 - Media network packet handling
    • H04L 65/752 - Media network packet handling adapting media to network capabilities
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 - Television systems
    • H04N 7/14 - Systems for two-way working
    • H04N 7/141 - Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/148 - Interfacing a video terminal to a particular transmission medium, e.g. ISDN

Definitions

  • This invention generally relates to managing network communications, and more particularly to a system and method for controlling the communication of media information over an integrated services network.
  • next-generation communications systems must provide a variety of multimedia services including real-time streaming video and video-clip swapping, while simultaneously conserving or reducing transmission bandwidth requirements and other network resources.
  • Present communications systems do not adequately provide these services, and the services they do provide are not implemented in an efficient manner. A need therefore exists for a system and method for providing enhanced multimedia services to the public which at the same time conserves or reduces network resources.
  • An object of the invention is to solve at least the above problems and/or disadvantages and to provide at least the advantages described hereinafter.
  • network resources are not used every time one terminal desires another terminal to display media information.
  • a user at one terminal can control the display of multimedia information based on an identity of a party at another terminal.
  • the foregoing and other objects of the invention are achieved by providing, in one respect, a highly compressed pseudo-video system which manages the transmission and display of media information in an integrated services network using fewer network resources than conventional systems.
  • the network may be a wireless network or the Internet, and the media information may include real-time video streams, short-time video scripts, images (e.g., snapshots), live animations, and still animations.
  • the invention reduces transmission bandwidth by having animation, image, and/or short-time video script information pre-stored in memories located within or attached to the communicating terminals. Instead of transmitting this media information over the network, the receiving terminal may therefore automatically retrieve and display the pre-stored media information in response to receiving a call from another user or other events that may transpire during a call.
  • the invention reduces transmission bandwidth by combining and transmitting high-bandwidth media such as streaming video and short-time video scripts to a receiving terminal, and then coordinating the display of that high-bandwidth media with lower-bandwidth media. Since the lower-bandwidth media is pre-stored in the receiving terminal, no transmission bandwidth is expended in order to display the lower-bandwidth media on the receiving terminal.
  • the present invention also allows for a shift in the telecommunications paradigm. In conventional systems, every communication link relies on the following assumption: communications links are established independently of prior knowledge and experience of the communicating parties. Conventional systems thus strictly allocate network resources based on communication protocols. The abstractions, imaginations, and emotions of callers are never taken into consideration.
  • the present invention takes the abstractions, imaginations, and emotions of the callers as well as their past knowledge and experience with one another into consideration when managing the communication and display of media information on user terminals. For example, using prior knowledge and experience, the communicating parties can control the amount of information they want to transmit based on how much they are willing to pay for various communication services. Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objects and advantages of the invention may be realized and attained as particularly pointed out in the appended claims. BRIEF DESCRIPTION OF THE DRAWINGS
  • Fig. 1 is a diagram showing an example of a communications system in which the present invention may be implemented.
  • Fig. 2 is a diagram showing a mobile terminal which may be configured to operate in accordance with the present invention.
  • Fig. 3 is a diagram showing an example of a control circuit which may be used to display media information on a wireless terminal configured in accordance with the present invention.
  • Fig. 4 is a flow diagram showing steps included in a method for controlling communication of media information in accordance with a first embodiment of the present invention.
  • Fig. 5 is a timing diagram showing an example of how the combined/multiplexed media information may be transmitted in accordance with the present invention.
  • Fig. 6 is a flow diagram showing steps included in a method of the present invention for controlling the manner in which media information is displayed based on the person who sent the information.
  • Fig. 7 is a timing diagram showing how an optional step of the method of the present invention may be performed.
  • FIG. 8 is a diagram showing an example of a table entry which may be stored in a memory of a receiving terminal for controlling the display of media information in accordance with the present invention.
  • Fig. 9 is a diagram showing how the display of media information may be controlled on two terminals that communicate in accordance with the present invention.
  • the present invention is a system and method for controlling communication of media information between two terminals in a network.
  • the invention controls the manner in which different types of media information are multiplexed and transmitted between the terminals.
  • the invention controls the manner in which media information is displayed or otherwise output on the terminals. This control may be performed based on past experience/knowledge one user may have about the other, or based on a service option selected by one or both of the users.
  • the invention thus may be customized to meet the specific desires of each user.
  • the invention may also be implemented to reduce network resources (including the transmission bandwidth) that conventional methods would require in order to send media information between terminals.
  • Fig. 1 shows an example of a communications system in which the present invention may be implemented.
  • the communications system includes a network 1 for receiving and transmitting calls within a predetermined geographic area.
  • the network may be a wired or wireless network operating in accordance with any one of a variety of communications standards.
  • a wired version of the network may be implemented as a wide-area network such as the Internet.
  • the network is integrated to include a plurality of access points or gateways 2 for connecting terminals through different networks.
  • These terminals include mobile terminals 3 such as mobile telephones, so-called web phones, personal digital assistants (PDAs), and pocket computers to name a few.
  • the network may also connect desktop or notebook personal computers 4 either to each other or to the mobile terminals.
  • Wireless content providers 5 and/or network content providers 6 may be included as desired.
  • Fig. 2 shows a mobile terminal 10 which may be configured to operate in accordance with the present invention.
  • This terminal includes a speaker 11, a microphone 12, a keypad 13, and a display 14 for displaying media information which has either been pre-stored in a memory of the terminal or transmitted to the terminal through antenna 15, or both.
  • the mobile terminal also includes as an optional feature a camera 16 which has the ability of capturing still images and/or acquiring real-time streaming video in a manner similar to a video phone or video-conferencing terminal.
  • An external data port 17 such as a USB port may also be included for receiving and/or downloading information including media information from another system 18 such as a personal computer. While the mobile terminal shown in Fig. 2 is preferable for use with the present invention, other types of terminals may be used provided they have the ability to output media information.
  • FIG. 3 shows an example of a control circuit which may be used to display media information on a wireless terminal configured in accordance with the present invention.
  • This control circuit includes a processor 20 connected to the antenna (not shown), an optional caller ID unit 21, a memory 22, a keyboard 23, a camera 24, and a data port 25.
  • the caller ID unit extracts identification information from a call indicating an identity of a caller. This information may include the caller's telephone number, name, address, etc.
  • the memory may include an area for storing media information which may either be displayed on the terminal itself or transmitted for display on another terminal.
  • the media information may be pre-stored in the terminal memory or received from another terminal for display.
  • the memory may include an on-board personal information management (PIM) database.
  • This database may operate based on information derived from the caller ID unit, control information input by a user, information downloaded to the terminal from an external computing system, or any combination thereof.
  • the PIM database may be located in a personal computer or other external computing device which interfaces to the terminal through the data port.
  • the data port may also be used to load media information into the memory of the terminal for subsequent transmission or display.
  • the processor of the terminal may also be used to combine, or multiplex, media data for transmission through its antenna.
  • This media information may be stored in the on-board memory, imported from the external computing system, or both.
  • a user may set the parameters of operating software 26 in the processor for controlling, inter alia, the manner in which media information is to be displayed on the terminal.
  • the user may also designate one or more media service options he or she would like to receive. These options may control the type of media information to be received in order to reduce service charges or, if cost is not an issue, enhance terminal operation to receive and display broadband media.
  • Service options may be negotiated with the carrier directly or set by inputting information using the terminal keyboard.
  • When a call connection is established, the processor displays media information based, for example, on the control parameters and/or service options designated by the user. This media information may be stored in memory 22, received from a transmitting terminal, or both, and is output on display 28. The manner in which media information is communicated between terminals for display will now be discussed.
  • Fig. 4 is a flow diagram showing steps included in a method for controlling communication of media information in accordance with a first embodiment of the present invention.
  • This embodiment combines, or multiplexes, different types of media information in one terminal for transmission to a receiving terminal within a same transmission period.
  • the method begins when the transmitting terminal initiates a call with the receiving terminal. (Block 30).
  • the call may be initiated through a wireless network, the Internet, or any other type of wired network.
  • the transmitting terminal combines, or multiplexes, first media information and second media information in an output transmission stream.
  • This media information may be selected by a user, for example, through operation of a terminal keypad in conjunction with a displayed menu.
  • the media information may take any one of a variety of forms, including the following:
  • streaming video (either pre-stored or real-time)
  • short-time video script (e.g., an MPEG file)
  • live animation (e.g., a moving GIF)
  • Fig. 5 shows that transmission of media information may take place in successive periods, which in this example equal two minutes each.
  • During the first period, a video stream is transmitted with animation information known as an avatar. The video stream is transmitted for the first 20 seconds of the period and the animation information is transmitted in the next succeeding second; the remainder of the first period is left idle.
  • Media information transmitted from the receiving terminal may also be received in the transmitting terminal during this time.
  • In a next period, different media information may be transmitted; for example, only 20 seconds of streaming video may be transmitted, and the remainder of the period may then be left idle.
  • a user of the receiving terminal may select a service option instructing a wireless carrier to limit or expand the media services that can be received. For example, receiving high-bandwidth transmissions such as streaming video and short-time video scripts may be very costly in terms of a user's subscription fee. To limit costs, the user of the receiving terminal may enter into an agreement with the wireless carrier to ensure that only lower-bandwidth media is displayed on the receiving terminal. This may be performed, for example, by a controller at a switching station which blocks or otherwise filters out higher-bandwidth media before it arrives at the receiving terminal.
  • the operating software of the receiving terminal may be configured to block display of higher-bandwidth media, even though it may have been sent by the transmitting terminal. If cost is not a consideration, the receiving terminal may be configured to receive the higher-bandwidth information or only a certain type of media information.
  • the present invention thus advantageously allows users to customize their terminals for purposes of displaying multimedia information.
  • the operating software of the receiving terminal may be programmed to replace received media information with alternative media information stored in a terminal memory.
  • an avatar received by the receiving terminal may not be displayed. Instead, this avatar may be replaced with another avatar stored in the terminal memory.
  • This feature of the invention is particularly desirable for purposes of customizing operation of the terminal to each specific user.
  • the terminal may also be interfaced to an external memory which stores and retrieves media information in accordance with the present invention.
  • the method of the present invention controls the manner in which media information is displayed based on the person who sent the information.
  • this method begins by receiving a call in a first terminal. (Block 40). Once a call connection has been established, a second step of the method includes extracting and processing caller identification information in the receiving terminal to determine the identity of the transmitting terminal and therefore of a user who likely placed the call. (Block 41).
  • the extracted information may include any one or more of the telephone number of the second terminal, the name of the owner of the terminal, his or her address, etc. If the terminal is connected to the internet, this caller ID information may be comparable information such as a website address.
  • a third step includes comparing the caller ID information (e.g., caller's telephone number) with information stored in a memory of the first terminal.
  • the stored information may include media information which has been pre-stored in association with the telephone number of the transmitting terminal.
  • the media information may be any of the types previously mentioned, and may even correspond to a characteristic of a user of the transmitting terminal.
  • the media information may be transmitted to the receiving terminal and stored in memory in association with the transmitting terminal's caller ID information for later retrieval.
  • the pre-stored media information may be an avatar which is related to a characteristic of the user of the transmitting terminal.
  • an avatar is understood to be an animated icon, symbol, or character which may be used, for example, to represent some characteristic or trait of a person.
  • This characteristic may be a physical attribute of the person or may relate to some non-physical feature. Examples of non-physical features include a relationship one user may have with another and a user's occupation.
  • the avatar may also be based on one user's opinion of the other formulated, for example, based on prior knowledge and experience.
  • a fourth step of the method includes outputting the stored media information on the receiving terminal based on the identity of the transmitting terminal. (Block 43). This involves retrieving the stored media information and then displaying it for a predetermined period of time. If the media information is a video script, for example, the script may be played until its conclusion. Alternatively, if the media information is an avatar, it may be displayed intermittently or even constantly throughout the call.
  • an avatar representing a characteristic of the user of the transmitting terminal is displayed before the call is answered by a user of the receiving terminal. This may be accomplished as follows.
  • an audible tone may sound to inform a user of the receiving terminal of the incoming call.
  • the call may include information which identifies the transmitting party or his terminal, for example, based on a telephone number or website address.
  • a processor of the first terminal may search a memory to locate an avatar which corresponds to the telephone number. This avatar may then be automatically displayed (in lieu of, for example, the transmitting terminal's telephone number).
  • the receiving terminal user may then instantly recognize who the calling party is. For example, if the wife of a receiving terminal user is calling, an avatar in the shape of a heart with her image may be displayed. In another case, if an avatar indicative of an undesirable person is displayed, the receiving terminal user has the option of not answering the call.
  • Another optional step is to allow the user to answer the call, but then continue to display the avatar identifying the transmitting terminal user either intermittently or continuously throughout the call. For example, if the only media information to be displayed during the call is a heart-shaped avatar with her image, then this avatar may be displayed until conclusion of the call. On the other hand, if streaming video or image information is to be displayed, then the avatar may be replaced by this additional media information. If desired, the avatar may then be redisplayed after the additional media information concludes to provide continuous visual communication effects.
  • Fig. 7 is a timing diagram showing how this optional step of the invention may be performed.
  • the receiving terminal receives 20 seconds of video streaming data. After this data concludes, an avatar representing a characteristic of the transmitting terminal user may be displayed. As previously discussed, this avatar may be pre-stored in a memory of the receiving terminal, or it may have been transmitted from the transmitting terminal and then stored in the receiving terminal memory for subsequent display. Whenever the receiving terminal displays the stored avatar image, it is not necessary to allocate communication resources.
  • Fig. 8 shows an example of a table entry which may be stored in a memory of the receiving terminal for controlling the display of media information in accordance with the present invention.
  • This table entry may be a data structure derived from a personal information management (PIM) database located within or interfaced to the receiving terminal.
  • the data structure preferably includes a user identification field 50 and media information identification field 60.
  • the user identification field may include information identifying a user (and/or his terminal) who either has called or may be expected to call the receiving terminal.
  • the information includes the name of a transmitting terminal user, his or her telephone number, and the type of terminal (e.g., mobile phone, home phone, or office phone) corresponding to the telephone number. If the user has an internet phone, the phone number field may be replaced by a website address, otherwise known as a Uniform Resource Locator (URL).
  • the media information identification field is stored in association with the user identification information and may include, for example, an address in either the same or an external memory in which the media information is stored. This situation may arise, for example, when the media information is an image file or video clip.
  • the media information identification field may contain information defining one or more attributes of an avatar which corresponds to the transmitting terminal user. If the avatar resembles a physical likeness of this user, the following sub-fields may be included: hair style, face model, eye glasses, body, jacket, pants, shoes, accessories, etc.
  • a processor of the receiving terminal may use graphics generation software to generate the avatar for display based on the information in these fields.
  • the media information identification field may also include an animation indicator sub-field (AIF) which includes a plurality of bits describing the avatar. For example, one value of a two-bit AIF field may indicate a non-composite avatar which can be accessed by the special address field. (A minimal sketch of such a table entry appears after this list.)
  • the receiving terminal processor may access the AIF field and generate the avatar accordingly.
  • the media information identification field may also include a code for instructing the receiving terminal processor to activate audio (e.g., a bell) or other visual effects.
  • Currently existing mobile terminals typically have more than 1 megabit of memory space which can be used to store this code along with a plurality of table entries for controlling the display of media information on the receiving terminal.
  • the table entries of the present invention may be updated, modified, or otherwise maintained by connecting the receiving terminal to an external computing system (e.g., a personal computer) via a data port, which, for example, may be a universal serial bus (USB) port.
  • This external system may be loaded with software which allows it to generate custom-designed avatars for each transmitting terminal user identified in the table.
  • the avatars may be two- or three-dimensional representations of these users, if desired.
  • the memory storing the table entries of the present invention may also store a plurality of default or factory-preset avatars which may be selected to correspond to different users.
  • the operating software of the receiving terminal may be written to allow these avatars to be switched, modified, or deleted either automatically or in response to a receiving terminal user's command.
  • Avatars may be switched for display based on the telephone number from which the transmitting terminal user is calling. For example, as shown in Fig. 8, one user may have multiple telephone numbers. In this case, a different avatar may be displayed based on the telephone number from which the user is calling.
  • Fig. 9 shows one way in which communications may take place between two mobile terminals 70 and 80 in accordance with the present invention.
  • terminal 70 initiated a call to terminal 80 through a wireless network 90.
  • when the terminal 80 receives the call, its processor determines the identity of the caller and then displays an avatar 85 in the shape of a man which bears the likeness of the caller, whose name is Christopher.
  • This avatar may have been transmitted to terminal 80 through the network or may have been created by the receiving terminal user based on his/her experience and knowledge about the caller and pre-stored in a memory of this terminal.
  • Terminal 70 also displays an avatar 75, representing his/her own desired avatar image.
  • This avatar may be created by the transmitting terminal user to portray himself or herself. It may be displayed on his or her terminal as a default image when the terminal is on, or it may be transmitted over the network to the receiving terminal as an optional form of visual communication.
  • the avatar 85 corresponding to the caller Christopher may not be the same as the avatar 75 displayed on the caller's own terminal. If desired, avatars 75 and 85 may be displayed throughout the call session.
  • one terminal transmits a control signal to the other terminal during the call to change an attribute of an avatar, or to replace the avatar displayed on the receiving terminal with one either transmitted from the transmitting terminal or pre-stored in the receiving terminal.
  • a control signal of this type may, for example, cause the avatar on the receiving terminal to display an emotion (crying, laughing, etc.) to coincide with a mood or feeling of the transmitting terminal user.
  • This may be implemented, for example, by including a mood sub-field in the table entry of Fig. 8.
  • This sub-field may be a two-bit field for controlling the emotion on the face of the avatar. Updating this field will cause the processor to automatically change the avatar in a corresponding manner.
  • the table in a terminal memory may include multiple entries for the same user. This may occur, for example, when the user has multiple phone numbers. In this case, the same avatar may be displayed for all phone numbers corresponding to that user or different avatars may be displayed, for example, depending on the number where the user is calling from.
  • the receiving terminal may be equipped with image-capture software that will allow a single frame (or image) from a received video stream to be stored and subsequently displayed.
  • the present invention outperforms conventional media communications management systems in terms of performance and convenience to the user. For example, the invention controls the communication of media information between terminals using fewer network resources than are conventionally required. These resources (which include transmission bandwidth) are reduced by allowing a memory of the receiving terminal to store media information that conventional systems must necessarily transmit over the network. This memory may be located within the receiver or externally connected to it.
  • the present invention allows users to control the types of media services that they would like to receive, thereby allowing the users to control costs and the extent of media services to be received.
  • the present invention also allows users to combine, or multiplex, different types of media information having different transmission bandwidth requirements within a single transmission period, thereby enhancing the content of conversations between users for both personal and business applications.
  • the present invention may be used to implement at least the following communications scenarios.
  • a user of the transmitting terminal may negotiate options for a video stream service with the network. This negotiation process may be performed using an intended visual message transmitter. An appropriate service may then be selected from various service options. Exemplary service options include:
    a) full-bandwidth streaming video service (e.g., MPEG format)
    b) short-time streaming video service (e.g., MPEG format)
    c) images
    d) still animation
    e) live animation
  • the transmitting terminal may transmit continuous video output from the camera unit in the terminal. This may continue for the duration of the call or until the user switches this function off.
  • the transmitting terminal may transmit a short-time streaming video script for a predetermined time period (e.g., 20 seconds) over a network.
  • the receiving terminal may capture and then display the video, and in the meantime may store the script in an internal memory or external memory.
  • a user of the receiving terminal may display this script repeatedly from an internal memory without requiring any additional allocation of network resources.
  • the video script may be refreshed periodically during the call, for example, every 2 minutes. If the user selects the option for receiving images, images may be transmitted to the receiving terminal once every predetermined time period (e.g., every two minutes).
  • animated information such as, for example, an icon may be transmitted to the receiving terminal, for example, in the manner shown in the timing diagrams of Figs. 5 or 7.
  • this icon may be an avatar created, for example, by the transmitting party in the form of a character, symbol, or other graphical representation of him or herself.
  • the avatar may be created by software stored inside the transmitting terminal, or may be downloaded from external software tools allowing the user to create his own avatar.
  • a physical resemblance is not necessary, as the avatar may correspond to any desired graphic of the transmitting user's choosing.
  • a system capable of generating an avatar of this type is disclosed in U.S. Patent 6,384,829, the contents of which are incorporated herein by reference.
  • the animated information transmitted in accordance with the present invention may be live animation or a moving avatar, one type of which is known as an animated GIF.
  • When the avatar resembles a character of some sort, its movement may cause the avatar to appear to be speaking, moving forward and backward, laughing, crying, closing and opening its eyes, pointing a hand, etc.
  • While the transmitting party has been identified as creating and transmitting the avatar to the receiving party, those skilled in the art can appreciate that the avatar may instead be stored in and subsequently displayed on the receiving terminal.
  • the avatar may be automatically displayed based on recognition of caller ID information by the receiving terminal, displayed in response to a control signal transmitted from the transmitting terminal to the receiving terminal, displayed based on control information input by the receiving party himself, or at any other time during the call session.
  • the control signal may cause different avatars to be displayed on the receiving terminal, for example, to commemorate an event (e.g., a happy birthday GIF) or to reflect an emotion or mood the transmitting party is feeling (e.g., a GIF resembling the transmitting party with a happy face).
  • the avatar sent by the transmitting user may be ignored and replaced with an avatar of the receiving user's choosing.
  • the receiving terminal may, based on previously stored settings, cause an avatar of a dog to be displayed in response to the detection of caller ID information. If desired, this avatar may be displayed even when the transmitting party transmits no avatar to the receiving terminal.
  • no media information may be displayed. In this case, no extra bandwidth is allocated for visual communications between the transmitting and receiving terminals. All these options include a simultaneous and continuous two-way voice conversation.
  • a default avatar stored in the receiving terminal or an avatar which corresponds to the caller ID may be displayed.
  • the transmitting terminal user may select one of the aforementioned service options or any combination of these options to control the display of media information during a call session. This selection may be based on his or her desire and willingness to pay for the service desired. Generally speaking, the higher the bandwidth requirement, the more expensive the service option. Thus, live streaming video may be expected to be the most expensive and still animation the least expensive.
  • Fig. 7 shows an exemplary scenario.
  • the call initiator sends a 20-second short-time streaming video every 2 minutes.
  • the call initiator sends his avatar just one time if the receiving party indicates that he wants to receive the call initiator's avatar or if the transmitting party is not sure whether the receiving terminal user has the call initiator's avatar stored in a memory of the receiving terminal.
  • the receiving terminal may store the avatar in a receiver-accessible memory, which may be internal or external memory.
  • the receiving terminal can then subsequently display the avatar at the user's discretion.
  • the call initiator does not transmit any media information.
  • the receiving terminal user may display an avatar created and/or selected by this user to represent the caller. This avatar may be stored in the receiving terminal's memory and recalled as previously discussed. Because the avatar was pre-stored in the receiver terminal, no extra bandwidth allocation is required to display this avatar and thus network resources are conserved.
  • the network may indicate that there is a particular service option request from the call initiator.
  • the service options at the receiver side can be the same as for the transmitter side.
  • one of the terminals may have a different service option setting than the other, to reflect that user's preference for either cost savings or enhanced media services.
  • the service options are the same as in the first example.
  • the service option request sent from the transmitting terminal is checked against the receiver's parameter settings. These settings may indicate the current software and hardware versions of the receiving terminal, and the willingness of a receiving terminal user to share the cost of communicating or receiving media services indicated in the service option request. If the requested services are acceptable to the receiving terminal user, the transmitting terminal user (or call initiator) transmits the media information to the receiving terminal in accordance with the service options mentioned in the request. (An illustrative sketch of this check appears after this list.)
  • the receiving terminal does not need to receive any information from the call initiator if the avatar of the call initiator has already been generated and stored in the receiver terminal's memory or PIM database.
  • in that case, the receiving terminal fetches and then displays the pre-stored avatar.
  • This feature of the invention is advantageous because it allows media information to be displayed on the receiving terminal without allocating additional transmission bandwidth.
  • the system and method of the present invention may communicate media information together with voice communications, e.g., video teleconferencing.
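
The table entry of Fig. 8 and its sub-fields (user identification, media information identification, avatar attributes, the two-bit AIF, and the optional mood sub-field) are described above only in prose. The following Python sketch is an illustrative assumption of how such an entry might be represented and updated by a control signal; none of the names, encodings, or helper functions come from the patent itself.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

# Hypothetical encodings; the patent only says the AIF and mood fields are two bits wide.
AIF_NON_COMPOSITE = 0b00   # avatar fetched from a stored address
AIF_COMPOSITE = 0b01       # avatar generated from the attribute sub-fields

MOODS = {0b00: "neutral", 0b01: "happy", 0b10: "sad", 0b11: "laughing"}

@dataclass
class TableEntry:
    """One entry of the receiving terminal's PIM table (cf. Fig. 8)."""
    user_name: str
    # Multiple numbers per user; a different avatar may be keyed to each number.
    phone_numbers: Dict[str, str]          # e.g. {"mobile": "+82...", "office": "+82..."}
    media_address: Optional[str] = None    # memory address/path of stored media (image, clip, avatar)
    avatar_attributes: Dict[str, str] = field(default_factory=dict)  # hair, face, glasses, body, ...
    aif: int = AIF_NON_COMPOSITE           # two-bit animation indicator field
    mood: int = 0b00                       # two-bit mood sub-field, updatable by a control signal

    def avatar_for_call(self) -> str:
        """Return a description of the avatar the terminal would render."""
        if self.aif == AIF_NON_COMPOSITE and self.media_address:
            return f"pre-built avatar at {self.media_address} ({MOODS[self.mood]})"
        attrs = ", ".join(f"{k}={v}" for k, v in self.avatar_attributes.items())
        return f"composite avatar [{attrs}] ({MOODS[self.mood]})"

# Example: a control signal from the transmitting terminal updates only the mood bits.
entry = TableEntry(
    user_name="Christopher",
    phone_numbers={"mobile": "+82-10-0000-0000"},
    avatar_attributes={"hair": "short", "face": "round", "glasses": "none"},
    aif=AIF_COMPOSITE,
)
entry.mood = 0b01   # e.g. a received control signal requesting a happy face
print(entry.avatar_for_call())
```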
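Similarly, the check of a transmitting terminal's service option request against the receiver's parameter settings is described above only in words. Below is a hedged sketch under assumed option names and a simple cost-sharing rule; it is illustrative only, not the patent's implementation.

```python
from dataclasses import dataclass

# Exemplary service options from the description, ordered roughly by bandwidth/cost.
SERVICE_OPTIONS = ["still_animation", "live_animation", "images",
                   "short_time_video", "full_streaming_video"]

@dataclass
class ReceiverSettings:
    software_version: str
    hardware_version: str
    max_accepted_option: str      # highest-bandwidth option the receiver will accept
    shares_cost: bool             # willingness to share the cost of the media service

def accept_request(requested_option: str, settings: ReceiverSettings) -> bool:
    """Return True if the call initiator may transmit media per the requested option."""
    if requested_option not in SERVICE_OPTIONS:
        return False
    requested_rank = SERVICE_OPTIONS.index(requested_option)
    allowed_rank = SERVICE_OPTIONS.index(settings.max_accepted_option)
    if requested_rank > allowed_rank:
        return False
    # In this sketch, the two video options are assumed to require cost sharing.
    if requested_option in ("short_time_video", "full_streaming_video"):
        return settings.shares_cost
    return True

receiver = ReceiverSettings("sw2.1", "hw1.0",
                            max_accepted_option="short_time_video", shares_cost=True)
print(accept_request("short_time_video", receiver))      # True
print(accept_request("full_streaming_video", receiver))  # False: exceeds receiver's setting
```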

Abstract

A system and method for managing communication in an integrated services network ensures that media information is displayed on at least one of two terminals during a call without using any transmission bandwidth. This is achieved by having at least one of the terminals pre-store the media information in a memory and then controlling the terminal to recall and display this information when certain events occur during the call. The media information may include animated information, images, and short-time video scripts. This information and broader band media such as streaming video may be displayed on the same terminal. The display of media information may also be controlled based on the past knowledge and experience callers have about one another and/or their communications equipment.

Description

SYSTEM AND METHOD FOR MULTIPLEXING MEDIA INFORMATION OVER A NETWORK USING REDUCED COMMUNICATIONS RESOURCES AND PRIOR KNOWLEDGE/EXPERIENCE OF A CALLED OR CALLING PARTY
TECHNICAL FIELD
This invention generally relates to managing network communications, and more particularly to a system and method for controlling the communication of media information over an integrated services network.
BACKGROUND ART
In recent years, remarkable technical advances have been made in the areas of wireless- and Internet-related communications. One application of this technology focuses on providing bi-directional video conferencing (e.g., video phone) services over wired and wireless networks. In order to provide real-time streaming video of reasonable quality, a transmission bandwidth of at least 64 K bits per second is required. This is approximately eight times the bandwidth required for voice communications, even if a highly efficient compression scheme is implemented.
One of the most significant stumbling blocks to providing high-quality multimedia video-conferencing services is insufficient transmission bandwidth. Also, there is great doubt as to whether two-way video phone services will be of interest to the public, even if these services can be provided at a reasonably affordable price. To elicit public interest and therefore to build a strong market for the video conferencing industry, the Inventor of the present invention has realized that social interactions between callers must be encouraged without sacrificing valuable transmission bandwidth. Taking this approach will likely increase minutes of usage and thus generate revenue sufficient to ensure the continued advancement of the telecommunications industry.
The Inventor of the present invention has also recognized that next-generation communications systems must provide a variety of multimedia services including real-time streaming video and video-clip swapping, while simultaneously conserving or reducing transmission bandwidth requirements and other network resources. Present communications systems do not adequately provide these services, and the services they do provide are not implemented in an efficient manner. A need therefore exists for a system and method for providing enhanced multimedia services to the public which at the same time conserves or reduces network resources.
Further, it is noted that currently existing telecommunications systems cannot deal with abstractions and emotions of callers that are natural to everyday interaction. For example, the majority of callers know one another. They know their characters, physical appearances, and other attributes through past shared experiences and knowledge. Parties also often have knowledge of other parties' mobile terminals, including the manner in which they are equipped and their ability to support multimedia and other services. Existing communications systems do not use the prior knowledge and experience of callers as a basis for reducing transmission bandwidth in providing multimedia services in a communications system. These systems also do not use prior knowledge and experience as a basis for reducing the costs associated with providing multimedia communications. Further, it is noted that conventional communications systems are required to transmit multimedia information over a network every time these services are desired to be displayed on a receiving terminal. This frustrates attempts to conserve transmission bandwidth and adversely affects the quality of communications of other users. There is, therefore, an additional need for a system and method that manages communication of multimedia services more efficiently than conventional systems, by ensuring that transmission bandwidth and other network resources are not used every time one terminal desires another terminal to display media information. There is also a need to provide a system and method of this type in a cost-effective manner.
DISCLOSURE OF THE INVENTION
An object of the invention is to solve at least the above problems and/or disadvantages and to provide at least the advantages described hereinafter.
It is one object of the present invention to provide a system and method for managing communication of media information in a network more efficiently than conventional systems of this type.
It is another object of the present invention to achieve the aforementioned object by ensuring that transmission bandwidth and other network resources are not used every time one terminal desires another terminal to display media information.
It is another object of the present invention to achieve the aforementioned objects in a cost-effective manner.
It is another object of the present invention to provide a system and method for managing communication of media information in a network using fewer network resources (including transmission bandwidth) than conventional systems, while simultaneously providing an equal or greater array of multimedia services to customers.
It is another object of the present invention to provide a system and method which allows a user at one terminal to control the display of multimedia information based on an identity of a party at another terminal.
It is an object of the present invention to provide a system and method which manages the communication of media information in a network based on prior knowledge and experience callers have with one another, and more specifically to use this prior knowledge and experience as a basis for reducing transmission bandwidth in providing multimedia services within a network without sacrificing the quality of those services.
The foregoing and other objects of the invention are achieved by providing, in one respect, a highly compressed pseudo-video system which manages the transmission and display of media information in an integrated services network using fewer network resources than conventional systems. The network may be a wireless network or the Internet, and the media information may include real-time video streams, short-time video scripts, images (e.g., snapshots), live animations, and still animations. In accordance with one embodiment, the invention reduces transmission bandwidth by having animation, image, and/or short-time video script information pre-stored in memories located within or attached to the communicating terminals. Instead of transmitting this media information over the network, the receiving terminal may therefore automatically retrieve and display the pre-stored media information in response to receiving a call from another user or other events that may transpire during a call.
In accordance with another embodiment, the invention reduces transmission bandwidth by combining and transmitting high-bandwidth media such as streaming video and short-time video scripts to a receiving terminal, and then coordinating the display of that high-bandwidth media with lower-bandwidth media. Since the lower-bandwidth media is pre-stored in the receiving terminal, no transmission bandwidth is expended in order to display the lower-bandwidth media on the receiving terminal. The present invention also allows for a shift in the telecommunications paradigm. In conventional systems, every communication link relies on the following assumption: communications links are established independently of prior knowledge and experience of the communicating parties. Conventional systems thus strictly allocate network resources based on communication protocols. The abstractions, imaginations, and emotions of callers are never taken into consideration.
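As an illustration of how a receiving terminal might coordinate transmitted high-bandwidth media with locally pre-stored, lower-bandwidth media, consider the hypothetical Python sketch below. It is not taken from the patent; the function and variable names are assumptions made only for illustration.

```python
from typing import Optional

def choose_display(received_high_bw: Optional[bytes],
                   prestored_low_bw: Optional[bytes]) -> str:
    """Decide what the receiving terminal shows at a given instant.

    High-bandwidth media (e.g. a streaming-video segment) is shown while it is
    being received; otherwise the terminal falls back to media already stored
    locally (e.g. an avatar), which consumes no transmission bandwidth.
    """
    if received_high_bw is not None:
        return "display received video segment"
    if prestored_low_bw is not None:
        return "display pre-stored avatar/animation from local memory"
    return "display nothing (voice-only)"

# Example: during the first 20 seconds a video segment arrives, afterwards it does not.
print(choose_display(b"...video bytes...", b"...avatar bytes..."))
print(choose_display(None, b"...avatar bytes..."))
```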
The present invention takes the abstractions, imaginations, and emotions of the callers as well as their past knowledge and experience with one another into consideration when managing the communication and display of media information on user terminals. For example, using prior knowledge and experience, the communicating parties can control the amount of information they want to transmit based on how much they are willing to pay for various communication services. Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objects and advantages of the invention may be realized and attained as particularly pointed out in the appended claims. BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be described in detail with reference to the following drawings in which like reference numerals refer to like elements wherein:
Fig. 1 is a diagram showing an example of a communications system in which the present invention may be implemented.
Fig. 2 is a diagram showing a mobile terminal which may be configured to operate in accordance with the present invention.
Fig. 3 is a diagram showing an example of a control circuit which may be used to display media information on a wireless terminal configured in accordance with the present invention.
Fig. 4 is a flow diagram showing steps included in a method for controlling communication of media information in accordance with a first embodiment of the present invention.
Fig. 5 is a timing diagram showing an example of how the combined/multiplexed media information may be transmitted in accordance with the present invention.
Fig. 6 is a flow diagram showing steps included in a method of the present invention for controlling the manner in which media information is displayed based on the person who sent the information.
Fig. 7 is a timing diagram showing how an optional step of the method of the present invention may be performed.
Fig. 8 is a diagram showing an example of a table entry which may be stored in a memory of a receiving terminal for controlling the display of media information in accordance with the present invention.
Fig. 9 is a diagram showing how the display of media information may be controlled on two terminals that communicate in accordance with the present invention.
MODES FOR CARRYING OUT THE PREFERRED EMBODIMENTS
The present invention is a system and method for controlling communication of media information between two terminals in a network. In one respect, the invention controls the manner in which different types of media information are multiplexed and transmitted between the terminals. In another respect, the invention controls the manner in which media information is displayed or otherwise output on the terminals. This control may be performed based on past experience/knowledge one user may have about the other, or based on a service option selected by one or both of the users. The invention thus may be customized to meet the specific desires of each user. Advantageously, the invention may also be implemented to reduce network resources (including the transmission bandwidth) that conventional methods would require in order to send media information between terminals.
Fig. 1 shows an example of a communications system in which the present invention may be implemented. The communications system includes a network 1 for receiving and transmitting calls within a predetermined geographic area. The network may be a wired or wireless network operating in accordance with any one of a variety of communications standards. To maximize customer subscriptions, a wired version of the network may be implemented as a wide-area network such as the Internet. Preferably, the network is integrated to include a plurality of access points or gateways 2 for connecting terminals through different networks. These terminals include mobile terminals 3 such as mobile telephones, so-called web phones, personal digital assistants (PDAs), and pocket computers to name a few. The network may also connect desktop or notebook personal computers 4 either to each other or to the mobile terminals. Wireless content providers 5 and/or network content providers 6 may be included as desired.
Fig. 2 shows a mobile terminal 10 which may be configured to operate in accordance with the present invention. This terminal includes a speaker 11, a microphone 12, a keypad 13, and a display 14 for displaying media information which has either been pre-stored in a memory of the terminal or transmitted to the terminal through antenna 15, or both. The mobile terminal also includes as an optional feature a camera 16 which is capable of capturing still images and/or acquiring real-time streaming video in a manner similar to a video phone or video-conferencing terminal. An external data port 17 such as a USB port may also be included for receiving and/or downloading information including media information from another system 18 such as a personal computer. While the mobile terminal shown in Fig. 2 is preferable for use with the present invention, those skilled in the art can appreciate that other types of terminals may be used provided they have the ability to output media information in a manner which will now be described.
Fig. 3 shows an example of a control circuit which may be used to display media information on a wireless terminal configured in accordance with the present invention. This control circuit includes a processor 20 connected to the antenna (not shown), an optional caller ID unit 21, a memory 22, a keyboard 23, a camera 24, and a data port 25. The caller ID unit extracts identification information from a call indicating an identity of a caller. This information may include the caller's telephone number, name, address, etc.
The memory may include an area for storing media information which may either be displayed on the terminal itself or transmitted for display on another terminal. The media information may be pre-stored in the terminal memory or received from another terminal for display. To control the display of media information, the memory may include an on-board personal information management (PIM) database. This database may operate based on information derived from the caller ID unit, control information input by a user, information downloaded to the terminal from an external computing system, or any combination thereof. If desired, the PIM database may be located in a personal computer or other external computing device which interfaces to the terminal through the data port. The data port may also be used to load media information into the memory of the terminal for subsequent transmission or display.
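A minimal sketch of the kind of PIM lookup described above, keyed by caller ID information, is given below. The dictionary-based store and the function names are illustrative assumptions only and do not appear in the patent.

```python
from typing import Dict, Optional

class PIMDatabase:
    """Toy in-memory PIM database mapping caller ID info to stored media references."""

    def __init__(self) -> None:
        self._entries: Dict[str, str] = {}   # caller number or URL -> media reference

    def load_entry(self, caller_id: str, media_ref: str) -> None:
        """Pre-store media (e.g. loaded via the data port from an external computer)."""
        self._entries[caller_id] = media_ref

    def media_for(self, caller_id: str) -> Optional[str]:
        """Return the media reference associated with this caller, if any."""
        return self._entries.get(caller_id)

pim = PIMDatabase()
pim.load_entry("+82-2-555-0100", "avatars/heart_with_wife_image.gif")
print(pim.media_for("+82-2-555-0100"))   # pre-stored avatar is found locally
print(pim.media_for("+82-2-555-9999"))   # unknown caller -> None (default behaviour applies)
```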
The processor of the terminal may also be used to combine, or multiplex, media data for transmission through its antenna. This media information may be stored in the on-board memory, imported from the external computing system, or both.
Prior to receiving a call, a user may set the parameters of operating software 26 in the processor for controlling, inter alia, the manner in which media information is to be displayed on the terminal. The user may also designate one or more media service options he or she would like to receive. These options may control the type of media information to be received in order to reduce service charges or, if cost is not an issue, enhance terminal operation to receive and display broadband media. Service options may be negotiated with the carrier directly or set by inputting information using the terminal keyboard.
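The following configuration sketch is a hypothetical illustration of the kind of control parameters and service options a user might set before receiving a call; the parameter and option names are assumptions, not terminology from the patent.

```python
from dataclasses import dataclass

# Hypothetical service options, ordered roughly from cheapest to most expensive.
SERVICE_OPTIONS = ("still_animation", "live_animation", "images",
                   "short_time_video", "full_streaming_video")

@dataclass
class TerminalConfig:
    """User-settable operating-software parameters (names are illustrative only)."""
    service_option: str = "still_animation"   # media service designated by the user
    replace_received_avatars: bool = False    # prefer locally stored avatars over received ones

def negotiate_with_carrier(requested: str, carrier_supported: set) -> str:
    """Fall back to the cheapest option if the carrier does not offer the requested one."""
    return requested if requested in carrier_supported else SERVICE_OPTIONS[0]

config = TerminalConfig(service_option=negotiate_with_carrier(
    "short_time_video", {"still_animation", "images", "short_time_video"}))
print(config)
```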
When a call connection is established, the processor displays media information based, for example, on the control parameters and/or service options designated by the user. This media information may be stored in memory 22, received from a transmitting terminal, or both, and is output on display 28. The manner in which media information is communicated between terminals for display will now be discussed.
Fig. 4 is a flow diagram showing steps included in a method for controlling communication of media information in accordance with a first embodiment of the present invention. This embodiment combines, or multiplexes, different types of media information in one terminal for transmission to a receiving terminal within a same transmission period. The method begins when the transmitting terminal initiates a call with the receiving terminal. (Block 30). As previously discussed, the call may be initiated through a wireless network, the Internet, or any other type of wired network.
Once a call connection is established, the transmitting terminal combines, or multiplexes, first media information and second media information in an output transmission stream. (Block 31). This media information may be selected by a user, for example, through operation of a terminal keypad in conjunction with a displayed menu. The media information may take any one of a variety of forms, including the following:
• streaming video (either pre-stored or real-time)
• short-time video script (e.g., an MPEG file)
• image (e.g., a JPEG file)
• still animation (e.g., a Graphics Interchange Format (GIF) file)
• live animation (e.g., a moving GIF)
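To make the multiplexing step (Block 31) concrete, here is a hedged sketch of how two kinds of media information might be tagged and interleaved into a single output transmission stream. The framing format (one type byte plus a four-byte length) is an assumption made for illustration and is not specified by the patent.

```python
import struct
from typing import Iterable, List, Tuple

# Hypothetical one-byte media-type tags.
MEDIA_TYPES = {"video": 1, "video_script": 2, "image": 3, "still_animation": 4, "live_animation": 5}

def multiplex(chunks: Iterable[Tuple[str, bytes]]) -> bytes:
    """Combine (media_type, payload) chunks into one stream: [type:1][length:4][payload]."""
    out = bytearray()
    for media_type, payload in chunks:
        out += struct.pack("!BI", MEDIA_TYPES[media_type], len(payload))
        out += payload
    return bytes(out)

def demultiplex(stream: bytes) -> List[Tuple[int, bytes]]:
    """Recover the individual media chunks at the receiving terminal."""
    chunks, offset = [], 0
    while offset < len(stream):
        media_type, length = struct.unpack_from("!BI", stream, offset)
        offset += 5
        chunks.append((media_type, stream[offset:offset + length]))
        offset += length
    return chunks

stream = multiplex([("video", b"20s of encoded video"), ("live_animation", b"avatar frames")])
print(demultiplex(stream))
```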
The types of media information listed above may be combined, or multiplexed, in any order, or this information may be combined based on the transmitting terminal user's knowledge of media information that is pre-stored in the receiving terminal. These features of the invention will be described in greater detail below. Once the media information has been combined, it is transmitted in tandem or multiplexed form to the receiving terminal preferably with voice information. (Block 32). Fig. 5 is a timing diagram showing an example of
how the combined/multiplexed media information may be transmitted in accordance with the present invention. (For convenience purposes, transmission of the voice information has been omitted in this drawing).
Fig. 5 shows that transmission of media information may take place in successive periods, which in this case equal two minutes each. During the first period, a video stream is transmitted along with animation information known as an avatar. The video stream is transmitted for the first 20 seconds of the period, and the animation information is transmitted in the next succeeding second. In this example, the remainder of the first period is left idle, which means that the transmitting terminal does not transmit any media information, although it may very well be continuously transmitting and receiving voice information. Media information transmitted from the receiving terminal may also be received in the transmitting terminal during this time. In a next period, different media information may be transmitted. For example, in the next period only 20 seconds of streaming video may be transmitted, and the remainder of the period may then be left idle.
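The following sketch illustrates, under the numbers assumed in the example above (two-minute periods, 20 seconds of video, a brief avatar transmission), how a transmitting terminal might lay out media segments within each period; the plan_period function and segment layout are hypothetical and are not prescribed by the specification.

    # Illustrative sketch of the Fig. 5 timing example; segment layout is hypothetical.
    PERIOD_SECONDS = 120          # two-minute transmission period (from the example)
    VIDEO_SECONDS = 20            # streaming video at the start of the period
    AVATAR_SECONDS = 1            # avatar sent in the second following the video


    def plan_period(include_avatar: bool):
        """Return (media_type, start, duration) tuples for one transmission period."""
        segments = [("video", 0, VIDEO_SECONDS)]
        cursor = VIDEO_SECONDS
        if include_avatar:
            segments.append(("avatar", cursor, AVATAR_SECONDS))
            cursor += AVATAR_SECONDS
        # Remainder of the period is idle: no media is sent, although voice may
        # continue on its own channel.
        segments.append(("idle", cursor, PERIOD_SECONDS - cursor))
        return segments


    if __name__ == "__main__":
        print(plan_period(include_avatar=True))   # first period: video + avatar + idle
        print(plan_period(include_avatar=False))  # next period: video only + idle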
Returning to Fig. 4, in a subsequent step of the method, the receiving terminal controls the manner in which the media information is output on its display. (Block 33). This control may be performed in a variety of ways. First, a user of the receiving terminal may select a service option instructing a wireless carrier to limit or expand the media services that can be received. For example, receiving high-bandwidth transmissions such as streaming video and short-time video scripts may be very costly in terms of a user's subscription fee. To limit costs, the user of the receiving terminal may enter into an agreement with the wireless carrier to ensure that only lower-bandwidth media is displayed on the receiving terminal. This may be accomplished, for example, by a controller at a switching station which blocks or otherwise filters out higher-bandwidth media before it arrives at the receiving terminal.
Second, the operating software of the receiving terminal may be configured to block display of higher-bandwidth media, even though it may have been sent by the transmitting terminal. If cost is not a consideration, the receiving terminal may be configured to receive the higher-bandwidth information or only a certain type of media information. The present invention, thus, advantageously allows users to customize their terminals for purposes of displaying multimedia information.
Third, the operating software of the receiving terminal may be programmed to replace received media information with alternative media information stored in a terminal memory. For example, an avatar received by the receiving terminal may not be displayed. Instead, this avatar may be replaced with another avatar stored in the terminal memory. This feature of the invention is particularly desirable for purposes of customizing operation of the terminal to each specific user.
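A minimal sketch of the receiver-side control logic described in the preceding paragraphs is given below; the bandwidth classification, the settings keys, and the replacement policy are assumptions introduced purely for illustration.

    # Illustrative receiver-side control sketch; classifications and keys are assumptions.
    HIGH_BANDWIDTH = {"streaming_video", "short_time_video"}


    def handle_incoming_media(media_type, payload, settings, stored_media):
        """Decide what the receiving terminal displays for one incoming media item."""
        # Second mechanism: local software blocks higher-bandwidth media.
        if settings.get("block_high_bandwidth") and media_type in HIGH_BANDWIDTH:
            return None  # nothing displayed

        # Third mechanism: replace received media with locally stored media.
        replacement = settings.get("replace_with")
        if replacement and replacement in stored_media:
            return stored_media[replacement]

        return payload


    if __name__ == "__main__":
        settings = {"block_high_bandwidth": True, "replace_with": "my_avatar"}
        stored = {"my_avatar": "<locally stored avatar>"}
        print(handle_incoming_media("streaming_video", "<video frames>", settings, stored))
        print(handle_incoming_media("avatar", "<received avatar>", settings, stored))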
While the invention has been described as storing media information in a terminal memory, those skilled in the art can appreciate that the terminal may also be interfaced to an external memory which stores and retrieves media information in accordance with the present invention.
In accordance with another embodiment, the method of the present invention controls the manner in which media information is displayed based on the person who sent the information. Referring to Fig. 6, this method begins by receiving a call in a first terminal. (Block 40). Once a call connection has been established, a second step of the method includes extracting and processing caller identification information in the receiving terminal to determine the identity of the transmitting terminal and, therefore, of the user who likely placed the call. (Block 41). The extracted information may include any one or more of the telephone number of the second terminal, the name of the owner of the terminal, his or her address, etc. If the terminal is connected to the Internet, the caller ID information may be comparable information, such as a website address.
A third step includes comparing the caller ID information (e.g., the caller's telephone number) with information stored in a memory of the first terminal. (Block 42). The stored information may include media information which has been pre-stored in association with the telephone number of the transmitting terminal. The media information may be any of the types previously mentioned, and may even correspond to a characteristic of a user of the transmitting terminal. Alternatively, the media information may be transmitted to the receiving terminal and stored in memory in association with the transmitting terminal's caller ID information for later retrieval. In accordance with one particularly advantageous feature of the invention, the pre-stored media information may be an avatar which is related to a characteristic of the user of the transmitting terminal. In the graphics world, an avatar is understood to be an animated icon, symbol, or character which may be used, for example, to represent some characteristic or trait of a person. This characteristic may be a physical attribute of the person or may relate to some non-physical feature. Examples of non-physical features include a relationship one user may have with another and a user's occupation. The avatar may also be based on one user's opinion of the other, formulated, for example, from prior knowledge and experience.
A fourth step of the method includes outputting the stored media information on the receiving terminal based on the identity of the transmitting terminal. (Block 43). This involves retrieving the stored media information and then displaying it for a predetermined period of time. If the media information is a video script, for example, the script may be played until its conclusion. Alternatively, if the media information is an avatar, it may be displayed intermittently or even constantly throughout the call.
In an optional but desirable step, an avatar representing a characteristic of the user of the transmitting terminal is displayed before the call is answered by a user of the receiving terminal. This may be accomplished as follows.
When a call is received, an audible tone may sound to inform a user of the receiving terminal of the incoming call. As previously discussed, the call may include information which identifies the transmitting party or his terminal, for example, based on a telephone number or website address. When this information is received, a processor of the first terminal may search a memory to locate an avatar which corresponds to the telephone number. This avatar may then be automatically displayed (in lieu of, for example, the transmitting terminal's telephone number). The receiving terminal user may then instantly recognize who the calling party is. For example, if the wife of a receiving terminal user is calling, an avatar in the shape of a heart with her image may be displayed. In another case, if an avatar indicative of an undesirable person is displayed, the receiving terminal user has the option of not answering the call.
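As a sketch of this look-up, the example below assumes a simple dictionary keyed by caller ID (telephone number or URL); the table contents, the on_incoming_call function, and the return strings are illustrative only.

    # Illustrative caller-ID-to-avatar lookup; data layout is an assumption.
    avatar_table = {
        "+82-2-555-0100": "heart-shaped avatar with wife's image",
        "+82-2-555-0199": "avatar indicating an undesirable caller",
    }


    def on_incoming_call(caller_id):
        """Display a pre-stored avatar (if any) before the call is answered."""
        avatar = avatar_table.get(caller_id)
        if avatar is not None:
            return f"display: {avatar}"
        # Fall back to showing the caller ID itself when no avatar is stored.
        return f"display: {caller_id}"


    if __name__ == "__main__":
        print(on_incoming_call("+82-2-555-0100"))
        print(on_incoming_call("+82-2-555-0123"))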
Another optional step is to allow the user to answer the call, but then continue to display the avatar identifying the transmitting terminal user either intermittently or continuously throughout the call. For example, if the only media information to be displayed during the call is the heart-shaped avatar with the caller's image, then this avatar may be displayed until the conclusion of the call. On the other hand, if streaming video or image information is to be displayed, then the avatar may be replaced by this additional media information. If desired, the avatar may then be redisplayed after the additional media information concludes to provide continuous visual communication effects.
Fig. 7 is a timing diagram showing how this optional step of the invention may be performed. In this timing diagram, the receiving terminal receives 20 seconds of video streaming data. After this data concludes, an avatar representing a characteristic of the transmitting terminal user may be displayed. As previously discussed, this avatar may be pre-stored in a memory of the receiving terminal, or it may have been transmitted from the transmitting terminal and then stored in the receiving terminal memory for subsequent display. Whenever the receiving terminal displays the stored avatar image, it is not necessary to allocate communication resources.
Fig. 8 shows an example of a table entry which may be stored in a memory of the receiving terminal for controlling the display of media information in accordance with the present invention. This table entry may be a data structure derived from a personal information management (PIM) database located within or interfaced to the receiving terminal. The data structure preferably includes a user identification field 50 and a media information identification field 60. The user identification field may include information identifying a user (and/or his terminal) who either has called or may be expected to call the receiving terminal. The information includes the name of a transmitting terminal user, his or her telephone number, and the type of terminal (e.g., mobile phone, home phone, or office phone) corresponding to the telephone number. If the user has an internet phone, the phone number field may be replaced by a website address, otherwise known as a Uniform Resource Locator (URL).
The media information identification field is stored in association with the user identification information and may include, for example, an address in either the same or an external memory in which the media information is stored. This situation may arise, for example, when the media information is an image file or video clip. Alternatively, or additionally, the media information identification field may contain information defining one or more attributes of an avatar which corresponds to the transmitting terminal user. If the avatar resembles a physical likeness of this user, the following sub-fields may be included: hair style, face model, eye glasses, body, jacket, pants, shoes, accessories, etc. A processor of the receiving terminal may use graphics generation software to generate the avatar for display based on the information in these fields. The media information identification field may also include an animator indicator sub-field (AIF) which includes a plurality of bits describing the avatar. For example, a two-bit AIF field may indicate the following:
0 0 - No avatar exists for the particular telephone number
0 1 - Composite avatar
1 0 - Composite avatar with gestures and body movement
1 1 - Non-composite avatar which can be accessed by the special address field
In operation, the receiving terminal processor may access the AIF field and generate the avatar accordingly.
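A sketch of one possible table entry, including decoding of the two-bit AIF values listed above, is shown below; the field names, storage layout, and make_entry helper are assumptions and are not prescribed by the specification.

    # Illustrative table-entry sketch; field names and layout are assumptions.
    AIF_MEANING = {
        0b00: "no avatar exists for this telephone number",
        0b01: "composite avatar",
        0b10: "composite avatar with gestures and body movement",
        0b11: "non-composite avatar accessed via the special address field",
    }


    def make_entry(name, phone, phone_type, aif, media_address=None, avatar_attrs=None):
        """Build a PIM-style table entry associating a caller with media information."""
        return {
            "user": {"name": name, "number": phone, "type": phone_type},
            "media": {
                "aif": aif,
                "aif_meaning": AIF_MEANING[aif],
                "address": media_address,            # used when the avatar is stored elsewhere
                "avatar_attrs": avatar_attrs or {},  # hair style, face model, glasses, etc.
            },
        }


    if __name__ == "__main__":
        entry = make_entry("Christopher", "+82-2-555-0100", "mobile", 0b10,
                           avatar_attrs={"hair": "short", "glasses": True})
        print(entry["media"]["aif_meaning"])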
The media information identification field may also include a code for instructing the receiving terminal processor to activate audio (e.g., a bell) or other visual effects. Currently existing mobile terminals typically have more than 1 megabit of memory space, which can be used to store this code along with a plurality of table entries for controlling the display of media information on the receiving terminal.
The table entries of the present invention may be updated, modified, or otherwise maintained by connecting the receiving terminal to an external computing system (e.g., a personal computer) via a data port, which, for example, may be a universal serial bus (USB) port. This external system may be loaded with software which allows it to generate custom-designed avatars for each transmitting terminal user identified in the table. The avatars may be two- or three-dimensional representations of these users, if desired.
The memory storing the table entries of the present invention may also store a plurality of default or factory-preset avatars which may be selected to correspond to different users. The operating software of the receiving terminal may be written to allow these avatars to be switched, modified, or deleted either automatically or in response to a receiving terminal user's command. Avatars may be switched for display based on the telephone number from which the transmitting terminal user is calling. For example, as shown in Fig. 8, one user may have multiple telephone numbers. In this case, a different avatar may be displayed based on the telephone number from which the user is calling.
Fig. 9 shows one way in which communications may take place between two mobile terminals 70 and 80 in accordance with the present invention. In this figure, terminal 70 initiated a call to terminal 80 through a wireless network 90. When terminal 80 receives the call, its processor determines the identity of the caller and then displays an avatar 85 in the shape of a man which bears the likeness of the caller, whose name is Christopher. This avatar may have been transmitted to terminal 80 through the network, or it may have been created by the receiving terminal user based on his/her experience and knowledge about the caller and pre-stored in a memory of this terminal.
Terminal 70 also displays an avatar 75 representing the transmitting user's own desired avatar image. This avatar may be created by the transmitting terminal user to portray him/herself. It may be displayed on his/her terminal as a default image when the terminal is on, or it may be transmitted over the network to the receiving terminal as one form of optional visual communication. In the example shown, avatar 75 corresponds to a panda bear, which was affectionately selected as the caller Christopher's desired representative image. At the receiving terminal, the avatar 85 corresponding to caller Christopher need not be the same as the avatar 75 shown on the caller's display. If desired, avatars 75 and 85 may be displayed throughout the call session. To make the conversation more animated, one terminal may transmit a control signal to the other terminal during the call to change an attribute of an avatar, or to replace the avatar displayed on the receiving terminal with one either transmitted from the transmitting terminal or pre-stored in the receiving terminal. A control signal of this type may, for example, cause the avatar on the receiving terminal to display an emotion (crying, laughing, etc.) to coincide with a mood or feeling of the transmitting terminal user. This may be implemented, for example, by including a mood sub-field in the table entry of Fig. 8. This sub-field may be a two-bit field for controlling the emotion on the face of the avatar. Updating this field will cause the processor to automatically change the avatar in a corresponding manner.
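The sketch below illustrates how a two-bit mood sub-field could drive a change in the displayed avatar's emotion in response to a control signal; the specific bit assignments, the MOODS mapping, and the handler class are hypothetical.

    # Illustrative mood-control sketch; bit assignments are hypothetical.
    MOODS = {0b00: "neutral", 0b01: "happy", 0b10: "laughing", 0b11: "crying"}


    class DisplayedAvatar:
        def __init__(self, name):
            self.name = name
            self.mood = "neutral"

        def apply_control_signal(self, mood_bits):
            """Update the avatar's emotion when a mood control signal arrives."""
            self.mood = MOODS.get(mood_bits, self.mood)
            return f"{self.name} avatar now shown {self.mood}"


    if __name__ == "__main__":
        avatar = DisplayedAvatar("Christopher")
        print(avatar.apply_control_signal(0b10))  # transmitting user signals laughter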
As previously discussed, the table in a terminal memory may include multiple entries for the same user. This may occur, for example, when the user has multiple phone numbers. In this case, the same avatar may be displayed for all phone numbers corresponding to that user, or different avatars may be displayed, for example, depending on the number the user is calling from. Also, the receiving terminal may be equipped with image-capture software that will allow a single frame (or image) from a received video stream to be stored and subsequently displayed.
The present invention outperforms conventional media communications management systems in terms of performance and convenience to the user. For example, the invention controls the communication of media information between terminals using fewer network resources than are conventionally required. These resources (which include transmission bandwidth) are reduced by allowing a memory of the receiving terminal to store media information that conventional systems must necessarily transmit over the network. This memory may be located within the receiver or externally connected to it.
The present invention allows users to control the types of media services that they would like to receive, thereby allowing the users to control costs and the extent of media services to be received.
The present invention also allows users to combine, or multiplex, different types of media information having different transmission bandwidth requirements within a single transmission period, thereby enhancing the content of conversations between users for both personal and business applications.
With the foregoing in mind, the present invention may be used to implement at least the following communications scenarios.
Visual Message Initiator (Transmitter) with Real-Time, Two-Way Video Stream Phone Capability Including Voice Conversation
Despite the fact that many terminals today, including mobile terminals, have the capability to display streaming video, many customers may not want to pay the high rates carriers charge for these and other broadband services. The present invention takes this into consideration by giving users the option to control which media services they would like to receive. For example, during a call set-up process, a user of the transmitting terminal may negotiate options for a video stream service with the network. This negotiation process may be performed using an intended visual message transmitter. An appropriate service may then be selected from various service options. Exemplary service options include:
a) full-bandwidth streaming video service (e.g., MPEG format)
b) short-time streaming video service (e.g., MPEG format)
c) images
d) still animation
e) live animation
If full-bandwidth streaming video is selected, the transmitting terminal may transmit continuous video output from the camera unit in the terminal. This may continue for the duration of the call or until the user switches this function off. If the short-time streaming video service is selected, the transmitting terminal may transmit a short-time streaming video script for a predetermined time period (e.g., 20 seconds) over a network. The receiving terminal may capture and then display the video, and in the meantime may store the script in an internal or external memory. A user of the receiving terminal may display this script repeatedly from an internal memory without requiring any additional allocation of network resources. In addition, the video script may be refreshed periodically during the call, for example, every 2 minutes. If the user selects the option for receiving images, images may be transmitted to the receiving terminal once every predetermined time period (e.g., every two minutes).
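As an illustration of the short-time video script option, the sketch below shows a receiver that caches a received script and replays it locally until a refresh is due; the ScriptCache class and its methods are assumptions, with timings borrowed from the example figures.

    # Illustrative short-time video script cache; structure and timings are assumptions.
    import time


    class ScriptCache:
        REFRESH_SECONDS = 120  # example refresh interval from the description

        def __init__(self):
            self.script = None
            self.received_at = None

        def store(self, script):
            """Store a newly received 20-second video script for local replay."""
            self.script = script
            self.received_at = time.monotonic()

        def replay(self):
            """Replay the cached script without allocating any network resources."""
            return self.script

        def needs_refresh(self):
            return (self.received_at is None or
                    time.monotonic() - self.received_at >= self.REFRESH_SECONDS)


    if __name__ == "__main__":
        cache = ScriptCache()
        cache.store("<20-second MPEG script>")
        print(cache.replay())
        print(cache.needs_refresh())  # False until about 2 minutes have elapsed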
If the user selects the option for receiving animation, animated information such as, for example, an icon may be transmitted to the receiving terminal, for example, in the manner shown in the timing diagrams of Figs. 5 or 7. As previously discussed, this icon may be an avatar created, for example, by the transmitting party in the form of a character, symbol, or other graphical representation of him or herself. The avatar may be created by software stored inside the transmitting terminal, or may be downloaded from external software tools allowing the user to create his own avatar. Those skilled in the art can appreciate that a physical resemblance is not necessary, as the avatar may correspond to any desired graphic of the transmitting user's choosing. A system capable of generating an avatar of this type is disclosed in U.S. Patent 6,384,829, the contents of which are incorporated herein by reference.
In addition to still avatars, the animated information transmitted in accordance with the present invention may be live animation or a moving avatar, one type of which is known as an animated GIF. When the avatar resembles a character of some sort, its movement may cause the avatar to appear to be speaking, moving forward and backward, laughing, crying, closing and opening its eyes, pointing a hand, etc. While in this example the transmitting party is identified as creating and transmitting the avatar to the receiving party, those skilled in the art can appreciate that the avatar may be stored in and subsequently displayed on the receiving terminal. In this case, the avatar may be automatically displayed based on recognition of caller ID information by the receiving terminal, displayed in response to a control signal transmitted from the transmitting terminal to the receiving terminal, displayed based on control information input by the receiving party himself, or displayed at any other time during the call session.
When displayed in response to a control signal from the transmitting terminal, the control signal may cause different avatars to be displayed on the receiving terminal, for example, to commemorate an event (e.g., a happy birthday GIF) or to reflect an emotion or mood the transmitting party is feeling (e.g., a GIF resembling the transmitting party with a happy face).
When displayed based on control information input by the user, the avatar sent by the transmitting user may be ignored and replaced with an avatar of the receiving user's choosing. For example, if the receiving terminal user does not like the transmitting terminal user, the receiving terminal may, based on previously stored settings, cause an avatar of a dog to be displayed in response to the detection of caller ID information. If desired, this avatar may be displayed even when the transmitting party transmits no avatar to the receiving terminal.
If no media service is selected, no media information may be displayed. In this case, no extra bandwidth is allocated for visual communications between the transmitting and receiving terminals. All of these options include a simultaneous and continuous two-way voice conversation. At the receiving terminal, a default avatar stored in the receiving terminal or an avatar which corresponds to the caller ID may be displayed.
In this exemplary embodiment of the invention, the transmitting terminal user (or call initiator) may select one of the aforementioned service options or any combination of these options to control the display of media information during a call session. This selection may be based on his or her desire and willingness to pay for the service desired. Generally speaking, the higher the bandwidth requirement, the more expensive the service option. Thus, live streaming video may be expected to be the most expensive and still animation the least expensive.
Fig. 7, which has been previously discussed, shows an exemplary scenario. To reiterate, in this scenario the call initiator sends a 20-second short-time stream video every 2 minutes. During the first 2-minute period, the call initiator sends his avatar just one time if the receiving party indicates that he wants to receive the call initiator's avatar or if the transmitting party is not sure whether the receiving terminal user has the call initiator's avatar stored in a memory of the receiving terminal.
After the receiving terminal receives the transmitted avatar, the receiving terminal may store the avatar in a receiver-accessible memory, which may be internal or external memory. The receiving terminal can then subsequently display the avatar at the user's discretion. In another scenario, the call initiator does not transmit any media information. In this case, the receiving terminal user may display an avatar created and/or selected by this user to represent the caller. This avatar may be stored in the receiving terminal's memory and re-called as previously discussed. Because the avatar was pre-stored in the receiver terminal, no extra bandwidth allocation is required to display this avatar and thus network resources are conserved.
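Under the assumptions of this scenario, the decision whether to spend bandwidth transmitting the avatar at all might look like the following sketch; the should_send_avatar function and both flags are hypothetical.

    # Illustrative "send avatar only when needed" decision; flags are hypothetical.
    def should_send_avatar(receiver_requested, known_stored_at_receiver):
        """Send the call initiator's avatar once, only if it may not already be at the receiver."""
        if receiver_requested:
            return True
        # If the transmitter is unsure whether the receiver already holds the avatar,
        # send it once; otherwise rely on the pre-stored copy and save bandwidth.
        return not known_stored_at_receiver


    if __name__ == "__main__":
        print(should_send_avatar(receiver_requested=False, known_stored_at_receiver=True))   # False
        print(should_send_avatar(receiver_requested=False, known_stored_at_receiver=False))  # True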
Real-Time, Two-Way Video Stream Phone Communications Including Voice Conversation for the Call Receiver
During a call set-up process, the network may indicate that there is a particular service option request from the call initiator. The service options at the receiver side can be the same as those for the transmitter side. Alternatively, one of the terminals may have a different service option setting than the other, reflecting that user's preference for either cost savings or enhanced media services. In this example, the service options are the same as in the first example.
At the receiver, the service option request sent from the transmitting terminal is checked against the receiver's parameter settings. These settings may indicate the current software and hardware versions of the receiving terminal, and the willingness of a receiving terminal user to share the cost of communicating or receiving media services indicated in the service option request. If the requested services are acceptable to the receiving terminal user, the transmitting terminal user (or call initiator) transmits the media information to the receiving terminal in accordance with the service options mentioned in the request.
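A sketch of the receiver-side check of a requested service option against local parameter settings is given below; the settings fields, cost tiers, and the accept_service_request function are assumptions for illustration only.

    # Illustrative service-option negotiation check; field names are assumptions.
    def accept_service_request(requested_options, receiver_settings):
        """Return the subset of requested options the receiving terminal will accept."""
        accepted = []
        for option in requested_options:
            supported = option in receiver_settings["supported_options"]
            cost = receiver_settings["cost_tier"].get(option, float("inf"))
            affordable = cost <= receiver_settings["max_cost_tier"]
            if supported and affordable:
                accepted.append(option)
        return accepted


    if __name__ == "__main__":
        settings = {
            "supported_options": {"image", "still_animation", "short_time_video"},
            "cost_tier": {"image": 1, "still_animation": 1, "short_time_video": 2,
                          "full_streaming_video": 3},
            "max_cost_tier": 2,
        }
        print(accept_service_request(["full_streaming_video", "image"], settings))  # ['image']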
For the avatar option, the receiving terminal does not need to receive any information from the call initiator if the avatar of the call initiator has already been generated and stored in the receiver terminal's memory or PIM database. In this case, the receiving terminal fetches and then displays the particular avatar from the memory or PIM database based on the call initiator's phone number (caller ID). This feature of the invention is advantageous because it allows media information to be displayed on the receiver terminal without using any bandwidth resources of the network.
The system and method of the present invention may include a number of additional features. For example, the invention may communicate media information with voice communications (e.g., video teleconferencing, video telephone applications, etc.) or without voice communications (e.g., real-time instant messaging systems, etc.). This may be accomplished, for example, by transmitting voice signals over a circuit-switched logical channel and the media information over one or more packet-switched logical channels. Those skilled in the art can appreciate that other known methods may also be used.
Other modifications and variations to the invention will be apparent to those skilled in the art from the foregoing disclosure. Thus, while only certain embodiments of the invention have been specifically described herein, it will be apparent that numerous modifications may be made thereto without departing from the spirit and scope of the invention.
The foregoing embodiments and advantages are merely exemplary and are not to be construed as limiting the present invention. The present teaching can be readily applied to other types of apparatuses. The description of the present invention is intended to be illustrative, and not to limit the scope of the claims. Many alternatives, modifications, and variations will be apparent to those skilled in the art. In the claims, means- plus-function clauses are intended to cover the structures described herein as performing the recited function and not only structural equivalents but also equivalent structures.

CLAIMS:
1. A method of reducing bandwidth in a communication network, comprising: sending a visual signal within a prescribed n-length transmission time having at least one of: a) a first video stream having a prescribed x-length transmission time, wherein x is less than or equal to n, b) a second video stream having a prescribed y-length transmission time, wherein y is less than x, c) a still image, and d) a virtual image; receiving the video signal; and displaying the received video signal during a prescribed m-length reception time in a prescribed display format, wherein the video signal is transmitted through a wireless network.
2. The method of claim 1, wherein combination during the n-length transmission time includes an idle state having none of a, b, c or d above within a remaining time period of the n-length transmission time.
3. The method of claim 2, wherein the idle state does not allocate required communication resources during transmission.
4. The method of claim 3, wherein a virtual image is displayed at the receiver terminal during the idle state.
5. The method of claim 1, wherein the prescribed display format includes a third video stream having a prescribed z-length time period.
6. The method of claim 5, wherein the third video stream is the same as the second video stream if the video signal includes the second video stream.
7. The method of claim 5, wherein the third video stream is different from the second video stream even though the video signal includes the second video stream.
8. The method of claim 7, wherein the third video stream is a previously stored video stream.
9. The method of claim 1, wherein the received video signal is displayed by a mobile terminal.
10. The method of claim 1, wherein the first video stream is a full-bandwidth streaming video which is operating continuously.
11. The method of claim 1, wherein the second video stream is a short-time stream video.
12. The method of claim 1, wherein the snapshot of a virtual image is an image of the calling party.
13. The method of claim 1, wherein the virtual image is an avatar.
14. The method of claim 13, wherein the avatar is an animated avatar.
15. A method of dynamically multiplexing different types of visual information sent over a network, comprising: establishing a call connection by a first mobile terminal of a calling party; multiplexing a plurality of first video signals provided by the first mobile terminal during a call connection, wherein the first video signals include an idle state, a first video stream, a first still image, and a first graphical representation/depiction; and transmitting the multiplexed first video signals.
16. The method of claim 15, wherein the network transmits a signal indicative of the multiplexed video signal to a second mobile terminal of the called party, and the second mobile terminal displays a video image based on multiplexed second video signals.
17. The method of claim 16, wherein the multiplexed second video signals comprise at least one of a second video stream, a second still image, and a second graphical representation/depiction.
18. The method of claim 17, wherein the first and second video streams are different.
19. The method of claim 17, wherein the first and second still images are different.
20. The method of claim 17, wherein the first and second graphical representations/depictions are different.
21. The method of claim 20, wherein the second graphical representation/depiction is stored in a memory of the second mobile terminal.
22. The method of claim 21, wherein the second graphical representation/depiction is based on the called party's past experience with the calling party.
23. The method of claim 21, wherein the second graphical representation/depiction is replaced by the first graphical representation/depiction of the calling party.
24. The method of claim 22, wherein the second graphical representation/depiction elicits an image likeness and look and feel of the calling party to the called party.
25. The method of claim 17, wherein the second video stream includes an idle video state.
26. The method of claim 25, wherein in an idle video state, the receiver terminal displays the second graphical representation.
27. A method of imprinting/eliciting an image/likeness/look and feel of the caller to a called party, comprising: sending caller related information from a mobile terminal for visual display in a first prescribed format; and displaying on a display of the called party the caller related information in a second prescribed display format, wherein the first prescribed display format is different from the second prescribed display format.
28. The method of claim 27, wherein the display is a display of a mobile terminal.
29. A method for communicating information, comprising: initiating a call between a first terminal and a second terminal; multiplexing first media information and second media information in the first terminal; and transmitting the multiplexed information with voice information from the first terminal to the second terminal.
30. The method of claim 29, wherein the first media information and the second media information are selected from the group consisting of a video stream, a short-time video script, a still image, moving animation, and still animation.
31. The method of claim 30, wherein said video stream includes real-time streaming video.
32. The method of claim 29, wherein the first media information is a video stream and the second media information is still animation.
33. The method of claim 29, wherein the multiplexed information is transmitted during a first call period, and wherein different multiplexed information is transmitted during a second call period.
34. The method of claim 29, further comprising: controlling output of the multiplexed media information on the second terminal, said controlling step including blocking output of one of the first media information and the second media information.
35. The method of claim 34, further comprising: controlling output of the multiplexed media information on the second terminal, said controlling step including setting a service option of the second terminal which controls types of media information to be output on the second terminal.
36. The method of claim 29, further comprising: controlling output of the multiplexed media information on the second terminal, said controlling step including replacing the first media information with third media information stored in a memory of the second terminal.
37. The method of claim 36, wherein the first media information and the third media information include different avatars.
38. A method for managing communications over a network, comprising: receiving a call in a first terminal; identifying a second terminal from which the call was placed; and retrieving media information from a memory of the first terminal based on the identity of the second terminal.
39. The method of claim 38, wherein the network includes a wireless network.
40. The method of claim 38, wherein the network includes a wide area network.
41. The method of claim 40, wherein the wide area network is the Internet.
42. The method of claim 38, wherein at least one of the first terminal and the second terminal is a mobile terminal.
43. The method of claim 38, wherein at least one of the first terminal and the second terminal is equipped for communications over a wide area network.
44. The method of claim 43, wherein the wide area network is the Internet.
45. The method of claim 38, further comprising: storing information indicative of a telephone number of the second terminal in association with said media information.
46. The method of claim 45, wherein the identifying step includes: extracting information identifying the telephone number of the second terminal from said call; and retrieving said media information from memory based on the extracted telephone number information.
47. The method of claim 38, wherein the media information includes video information.
48. The method of claim 38, wherein the media information includes image information.
49. The method of claim 38, wherein the media information includes an avatar.
50. The method of claim 49, wherein the avatar resembles a characteristic of a caller using the second terminal.
51. The method of claim 38, wherein said memory stores a plurality of avatars.
52. The method of claim 51, further comprising: receiving a control signal from the first terminal which selects one of the avatars stored in said memory, wherein said retrieved media information includes the selected avatar.
53. The method of claim 51, wherein each of the stored avatars exhibits a different emotion of the caller using the second terminal.
54. A method for managing communications over a network, comprising: receiving a call in a first terminal; identifying a second terminal from which the call was placed; and outputting media information on the first terminal based on the identity of the second terminal.
55. The method of claim 54, wherein said outputting step includes: retrieving the media information from a memory of the second terminal, said memory storing the media information in association with information identifying the second terminal.
56. The method of claim 54, further comprising: receiving first media information from the first terminal; blocking output of the first media information; and outputting second media information stored in a memory of the second terminal in place of the first media information.
57. The method of claim 54, wherein the media information includes an avatar.
58. The method of claim 56, wherein the first media information is a first avatar and the second media information is a second avatar.
59. The method of claim 58, wherein the second avatar is selected or generated based on past knowledge of a user of the second terminal.
60. The method of claim 58, wherein the second avatar includes a characteristic which reflects a relationship or opinion a user of the second terminal has concerning a user of the first terminal.
61. The method of claim 54, further comprising: selecting a service option to control type of media information to be output on the first terminal.
62. The method of claim 61, wherein the selecting step includes: selecting a service option which displays low-bandwidth media information and blocks display of high-bandwidth media information on the first terminal.

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5907604A (en) 1997-03-25 1999-05-25 Sony Corporation Image icon associated with caller ID
WO2001033781A1 (en) 1999-11-05 2001-05-10 Nokia Corporation A method for implementing a multimedia messaging service, a multimedia messaging system, a server of a multimedia messaging system and a multimedia terminal
KR20010046956A (en) 1999-11-16 2001-06-15 윤종용 Method for informing caller id of mobile wireless phone
KR20010049041A (en) 1999-11-30 2001-06-15 윤종용 Method for transmitting and receiving multimedia data using short message service in portable radio telephone
KR20020060489A (en) 2001-01-11 2002-07-18 윤종용 Method for transmitting and receiving image file in mobile phone

Family Cites Families (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4090577A (en) * 1977-04-18 1978-05-23 Moore Wallace H Solar celled hybrid vehicle
US5253724A (en) * 1991-10-25 1993-10-19 Prior Ronald E Power wheelchair with transmission using multiple motors per drive wheel
ATE213887T1 (en) * 1991-12-16 2002-03-15 Xircom Wireless Inc SPREAD SPECTRUM DATA PUBLISHING SYSTEM
US5325423A (en) 1992-11-13 1994-06-28 Multimedia Systems Corporation Interactive multimedia communication system
WO1997019429A1 (en) 1995-11-20 1997-05-29 Motorola Inc. Displaying graphic messages in a radio receiver
US6675384B1 (en) * 1995-12-21 2004-01-06 Robert S. Block Method and apparatus for information labeling and control
CA2193764A1 (en) 1995-12-25 1997-06-25 Yasuyuki Mochizuki Selective call receiver
US6111863A (en) * 1995-12-29 2000-08-29 Lsi Logic Corporation Method and apparatus for the dynamic allocation of signal bandwidth between audio, video and data signals
JP3039619B2 (en) 1996-06-28 2000-05-08 静岡日本電気株式会社 Radio selective call receiver and message display method thereof
US5711795A (en) 1996-08-23 1998-01-27 Battelle Memorial Institute Compressible and moldable toy sand composition
US5870683A (en) 1996-09-18 1999-02-09 Nokia Mobile Phones Limited Mobile station having method and apparatus for displaying user-selectable animation sequence
US6343120B1 (en) 1996-10-08 2002-01-29 At&T Wireless Services, Inc. Method and apparatus for providing a caller ID alias
SE510664C2 (en) 1996-10-29 1999-06-14 Ericsson Telefon Ab L M Methods and apparatus for message management in a communication system
US6314302B1 (en) 1996-12-09 2001-11-06 Siemens Aktiengesellschaft Method and telecommunication system for supporting multimedia services via an interface and a correspondingly configured subscriber terminal
JP3671590B2 (en) 1997-01-23 2005-07-13 ソニー株式会社 Display method, display device, and communication device
US5934397A (en) * 1997-01-28 1999-08-10 Schaper; Douglas Modular land vehicle
US6084951A (en) 1997-04-23 2000-07-04 Nortel Networks Corporation Iconized name list
US6226367B1 (en) 1997-04-23 2001-05-01 Nortel Networks Limited Calling line identification with location icon
US6424369B1 (en) 1997-10-06 2002-07-23 Edwin L. Adair Hand-held computers incorporating reduced area imaging devices
US6307836B1 (en) * 1997-12-10 2001-10-23 Mci Communications Corporation High-speed transparent access to multiple services
US6205128B1 (en) * 1998-01-07 2001-03-20 Nokia Telecommunications, Oy Enhanced handoff signaling for high speed data and multimedia
US6181954B1 (en) 1998-01-12 2001-01-30 David A. Monroe Method and apparatus for image capture, compression and transmission of a visual image over telephonic or radio transmission system
US6226512B1 (en) 1998-06-04 2001-05-01 Nortel Networks Limited Apparatus and method for displaying caller attributes
US6335753B1 (en) 1998-06-15 2002-01-01 Mcdonald Arcaster Wireless communication video telephone
US6452915B1 (en) * 1998-07-10 2002-09-17 Malibu Networks, Inc. IP-flow classification in a wireless point to multi-point (PTMP) transmission system
US6317039B1 (en) * 1998-10-19 2001-11-13 John A. Thomason Wireless video audio data remote system
ATE302517T1 (en) 1999-04-19 2005-09-15 Nokia Corp MESSAGE DELIVERY METHOD
CN1178508C (en) * 1999-06-07 2004-12-01 松下电器产业株式会社 Data receiving and transmitting system and its method
US6793619B1 (en) * 1999-06-09 2004-09-21 Yaacov Blumental Computer-implemented method and system for giving a user an impression of tactile feedback
JP2001024776A (en) * 1999-07-06 2001-01-26 Matsushita Electric Ind Co Ltd Telephone equipment
US6473631B1 (en) 1999-12-20 2002-10-29 Motorola, Inc. Video swivel phone
JP4139031B2 (en) * 1999-12-27 2008-08-27 富士通株式会社 Caller information display method
KR20010058785A (en) 1999-12-30 2001-07-06 윤종용 Method for displying caller identification in a mobile wireless communication terminal
JP2001274923A (en) * 2000-03-28 2001-10-05 Nec Eng Ltd Portable telephone transmission system
JP3738383B2 (en) * 2000-05-26 2006-01-25 富士通株式会社 Communication device
US6453294B1 (en) 2000-05-31 2002-09-17 International Business Machines Corporation Dynamic destination-determined multimedia avatars for interactive on-line communications
JP2002044285A (en) 2000-07-28 2002-02-08 Seiko Instruments Inc Mobile communication terminal device and portable display terminal device
KR20000064041A (en) 2000-08-18 2000-11-06 최용화 Multimedia Messaging System and the Method
US7792676B2 (en) * 2000-10-25 2010-09-07 Robert Glenn Klinefelter System, method, and apparatus for providing interpretive communication on a network
KR20020046671A (en) 2000-12-15 2002-06-21 조정남 Method for displaying image related to caller information
JP3697180B2 (en) * 2001-03-21 2005-09-21 キヤノン株式会社 Line communication apparatus and control method thereof
GB2373966B (en) * 2001-03-30 2003-07-09 Toshiba Res Europ Ltd Mode monitoring & identification through distributed radio
US7200855B2 (en) * 2001-05-24 2007-04-03 Vixs Systems, Inc. Method and apparatus of multiplexing a plurality of channels in a multimedia system
US7574474B2 (en) * 2001-09-14 2009-08-11 Xerox Corporation System and method for sharing and controlling multiple audio and video streams
JP3954834B2 (en) * 2001-10-30 2007-08-08 三洋電機株式会社 Communication equipment
US20030109219A1 (en) * 2001-12-10 2003-06-12 Zak Amselem System and method for real-time simultaneous recording on playback over communication network
US20030112821A1 (en) * 2001-12-14 2003-06-19 Samsung Electronics Co., Ltd. System and method for increasing a data transmission rate in mobile wireless communication channels
US20030142648A1 (en) * 2002-01-31 2003-07-31 Samsung Electronics Co., Ltd. System and method for providing a continuous high speed packet data handoff
US20030224830A1 (en) * 2002-05-30 2003-12-04 Zhi-Yun Zhang Mobile telephone capable of displaying image of caller automatically and method for realize it
JP2004072636A (en) * 2002-08-08 2004-03-04 Fujitsu Ltd Image transfer method, image transfer device, computer program and recording medium
US6839417B2 (en) * 2002-09-10 2005-01-04 Myriad Entertainment, Inc. Method and apparatus for improved conference call management
US7058023B2 (en) * 2002-09-11 2006-06-06 Wynn Sol H Self-configuring network telephone system and method
US7130282B2 (en) * 2002-09-20 2006-10-31 Qualcomm Inc. Communication device for providing multimedia in a group communication network

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5907604A (en) 1997-03-25 1999-05-25 Sony Corporation Image icon associated with caller ID
WO2001033781A1 (en) 1999-11-05 2001-05-10 Nokia Corporation A method for implementing a multimedia messaging service, a multimedia messaging system, a server of a multimedia messaging system and a multimedia terminal
KR20010046956A (en) 1999-11-16 2001-06-15 윤종용 Method for informing caller id of mobile wireless phone
KR20010049041A (en) 1999-11-30 2001-06-15 윤종용 Method for transmitting and receiving multimedia data using short message service in portable radio telephone
KR20020060489A (en) 2001-01-11 2002-07-18 윤종용 Method for transmitting and receiving image file in mobile phone

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1550326A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1983750A3 (en) * 2007-04-16 2010-10-27 NTT DoCoMo, Inc. Control device, mobile communication system, and communication terminal
WO2009003536A1 (en) 2007-06-29 2009-01-08 Sony Ericsson Mobile Communications Ab Methods and terminals that control avatars during videoconferencing and other communications
US8111281B2 (en) 2007-06-29 2012-02-07 Sony Ericsson Mobile Communications Ab Methods and terminals that control avatars during videoconferencing and other communications

Also Published As

Publication number Publication date
CN101136923B (en) 2011-02-16
KR100617183B1 (en) 2006-08-31
AU2003263618B2 (en) 2008-02-21
US7003040B2 (en) 2006-02-21
RU2005112445A (en) 2006-02-27
CA2501595C (en) 2011-02-22
EP1550326A4 (en) 2011-11-16
US20040060067A1 (en) 2004-03-25
CN1695390A (en) 2005-11-09
EP1550326A1 (en) 2005-07-06
US20040056887A1 (en) 2004-03-25
HK1082626A1 (en) 2006-06-09
AU2003263618A1 (en) 2004-04-19
US7882532B2 (en) 2011-02-01
KR20040026606A (en) 2004-03-31
CA2501595A1 (en) 2004-04-08
CN101136923A (en) 2008-03-05
CN100440988C (en) 2008-12-03
RU2334371C2 (en) 2008-09-20

Similar Documents

Publication Publication Date Title
CA2501595C (en) System and method for multiplexing media information over a network using reduced communications resources and prior knowledge/experience of a called or calling party
KR100751184B1 (en) Method for changing graphical data like avatars by mobile telecommunications terminals
US7508413B2 (en) Video conference data transmission device and data transmission method adapted for small display of mobile terminals
WO2006025461A1 (en) Push information communication system accompanied by telephone communication
CN100373851C (en) Communication method, communication server, communication terminal equipment and communication system
WO2003063483A1 (en) Communication apparatus
JP2008500750A5 (en)
KR20020079713A (en) Method of providing a phone connection screen of mobile phone and a system for thereof
WO2003021924A1 (en) A method of operating a communication system
JP4945932B2 (en) Video distribution system, call control-video distribution linking server, video distribution method used for the same
US7822014B2 (en) Voice communication system and a server apparatus
JP5802116B2 (en) Call system with data sharing function
AU2008202012B2 (en) System and method for multiplexing media information over a network using reduced communications resources and prior knowledge/experience of a called or calling party
KR100945162B1 (en) System and method for providing ringback tone
CN114567704A (en) Interaction method applied to call and related device
Yi et al. Enhancing personal communications with multimedia
JP2002344580A (en) Communication system, transmission-side communication terminal, and reception-side communication terminal
Subramanya et al. User-controlled, multimedia-enhanced communication using prior knowledge and experience
JP4759980B2 (en) Mobile phone with TV phone function, mobile phone system with TV phone function, and image transmission/reception methods thereof
CN101185320A (en) Image display system, terminal device, image display method, and program
KR100639252B1 (en) Method for operating virtual image communication in mobile communication terminal
JP2002152384A (en) Character data delivery support system, user terminal, network service server and character data delivering method
KR20090054724A (en) A mobile communication terminal for a video call and method for servicing a video call using the same
JP2005142873A (en) Mobile telephone and program
JP2003333555A (en) Method for utilizing video phone system using real time polygon processing, video phone system and terminal using real time polygon processing, service management center apparatus, contents holder, video phone utilizing program, and storage medium with video phone utilizing program stored thereon

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2501595

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2003263618

Country of ref document: AU

Ref document number: 661/KOLNP/2005

Country of ref document: IN

REEP Request for entry into the european phase

Ref document number: 2003798572

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2003798572

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2005112445

Country of ref document: RU

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2003824649X

Country of ref document: CN

WWP Wipo information: published in national office

Ref document number: 2003798572

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Ref document number: JP