|Publication number||US8185583 B2|
|Application number||US 11/144,426|
|Publication date||May 22, 2012|
|Filing date||Jun 3, 2005|
|Priority date||Jun 3, 2005|
|Also published as||CN101238729A, CN101238729B, EP1886497A2, US20060277256, US20120200701, WO2006132724A2, WO2006132724A3|
|Inventors||Saravanakumar V. Tiruthani, Marcelo Oliveira|
|Original Assignee||Siemens Enterprise Communications, Inc.|
This invention relates to messaging systems which communicate presence information. In particular, this invention relates to expanding presence system capabilities to include video or image presence information.
Businesses are critically reliant on effective and efficient communication. The drive to improve communication, in conjunction with rapid advances in processing technology, has led to sophisticated messaging systems that handle voice, text, facsimiles, and other types of data. For example, instant messaging systems are available that support a text exchange between parties, along with basic presence indicators.
Presence indicators attempt to give a potential caller an indication of whether another individual is available to take a call, answer an instant message, or otherwise engage in a communication session. However, the presence indicators (e.g., ‘Busy’) are primarily manually set and adjusted, leading to inaccurate indications of presence. Thus, even though a caller may check presence prior to calling, the presence indicator is frequently incorrect. The caller then wastes time initiating a call and waiting for the callee to answer, only to be redirected to voice mail.
In limited cases, presence indicators are automatically set. As one example, an instant messaging program may watch for mouse, keyboard, or other user input. When no input is detected for a predetermined time period, the instant messaging program changes the presence state to ‘Away’ or another indicator of unavailability. However, whether an individual is present is not necessarily dependent on whether they are interacting with their computer. In other words, automatically set presence indicators are often no more accurate than manually set presence.
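The idle-timeout behavior described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the five-minute threshold and the state names are assumptions.

```python
import time

class PresenceTracker:
    """Minimal sketch of idle-timeout presence: derive a presence
    state from the time of the last user input. The threshold and
    state names are illustrative."""

    def __init__(self, timeout_s=300):
        self.timeout_s = timeout_s
        self.last_input = time.monotonic()

    def record_input(self):
        # Called on every mouse, keyboard, or other input event.
        self.last_input = time.monotonic()

    def state(self):
        idle = time.monotonic() - self.last_input
        return "Away" if idle >= self.timeout_s else "Available"
```

As the passage notes, this heuristic only measures interaction with the computer, so a person standing away from the keyboard but still in the room is misreported as "Away".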
Each attempt to initiate a messaging session consumes valuable, limited resources. Each time a caller places a call, for example, the supporting messaging system and network infrastructure consume a portion of those limited resources. Each call consumes processor time, network bandwidth, physical channel (e.g., TDMA time slot) capacity, and other resources. Nevertheless, in prior messaging systems, a caller would often attempt to establish a messaging session based on inaccurate or incomplete presence indicators.
A need has long existed for improved presence indication for messaging services.
A messaging system supports visual presence indication. Before establishing a communication session, the destination endpoint provides an image, video, or other visualization to the originating endpoint of the potential communication session. The visualization shows the surroundings of the destination endpoint. The originating endpoint thereby obtains a supplemented or independent indication of presence status associated with the destination endpoint. In the context of a voice call, for example, the person calling need not waste time allowing the callee's phone to ring, only to be redirected to voicemail. Instead, the caller may immediately see that the callee is not available to answer his phone and may immediately end the call attempt and return to productivity.
A presence enabled communication system determines a destination endpoint with which to establish a desired communication session. The communication system may be the originating endpoint, or may be another system interacting with the originating and destination endpoints. The communication system sends a notification message to the destination endpoint regarding the desired communication session. Prior to establishing the desired communication session and in response to the notification message, the communication system receives image capture data.
The image capture data provides a visualization of the surroundings of the endpoint. The originating endpoint uses the image capture data to provide an image capture display that may be employed as a supplemental or independent presence indicator. The originating endpoint may also obtain a decision (e.g., from an operator of the originating endpoint) regarding whether to continue or terminate the attempt to establish the desired communication session.
The image capture data may be a digital picture of the environment surrounding the destination endpoint. Alternatively or additionally, the image capture data may be a video or video stream of the environment. The image capture data may visualize any environment in which an automated or non-automated endpoint may be located, including offices, conference rooms, parking garages, video, audio or other program providers, or other environments.
The present invention is defined by the following claims, and nothing in this section should be taken as a limitation on those claims. Further aspects and advantages of the invention are discussed below in conjunction with the preferred embodiments. Any one or more of the above described aspects or aspects described below may be used independently or in combination with other aspects herein.
The elements illustrated in the Figures interoperate as explained in more detail below. Before setting forth the detailed explanation, however, it is noted that all of the discussion below, regardless of the particular implementation being described, is exemplary in nature, rather than limiting. For example, although selected aspects, features, or components of the implementations are depicted as being stored in memories, all or part of systems and methods consistent with the messaging systems may be stored on, distributed across, or read from other machine-readable media, for example, secondary storage devices such as hard disks, floppy disks, and CD-ROMs; a signal received from a network; or other forms of ROM or RAM either currently known or later developed.
Furthermore, although specific components of the messaging and presence systems will be described, methods, systems, and articles of manufacture consistent with the messaging systems may include additional or different components. For example, a processor may be implemented as a microprocessor, microcontroller, application specific integrated circuit (ASIC), discrete logic, or a combination of other types of circuits or logic. Similarly, memories may be DRAM, SRAM, Flash or any other type of memory. Flags, data, databases, tables, and other data structures may be separately stored and managed, may be incorporated into a single memory or database, may be distributed, or may be logically and physically organized in many different ways. The programs discussed below may be parts of a single program, separate programs, or distributed across several memories and processors.
The entities interacting in the network 100 include the originating endpoint 102 (e.g., a caller), the destination endpoint 104 (e.g., a callee), and the networks 106. The entities also include a messaging system 108 and a presence information system 110 which communicate through the networks 106. The messaging system 108 and presence information system 110 may be systems commercially available from Siemens Communications, Inc. Furthermore, the endpoints 102 and 104, messaging system 108, and/or presence information system 110 may incorporate or perform any of the processing described below with regard to any of the entities interacting in the network 100. For example, the originating endpoint 102 may implement the functions and/or features of both an endpoint and the messaging system 108.
In the example shown in
The image capture devices may be positioned to provide a field of view of any location in whole or in part. Thus, the image capture devices may cover the entire office 112, a part of the office around the endpoint 104 (e.g., an office desk and chair), or any other portion of the office surroundings. The image capture devices thereby provide the image capture data for rendering a visualization of whether the individual 114 is present in the office 112 and/or available to interact with the destination endpoint 104.
The image capture devices may be added to any location where a visual representation of presence is desired. As examples, image capture devices may be added to conference rooms, lunch rooms, or other office locations; elevators, parking garages, hallways, stairwells, or other publicly accessible locations; and street signs, lampposts, intersections, or other traffic locations. The image capture devices may provide image capture data for interactive destination endpoints such as office phones, cell phones, and personal data assistants, or may provide image capture data for non-interactive destination endpoints such as automated response or information systems. Examples of non-interactive destination endpoints include cable television providers which may respond with image capture data showing movies that are currently playing or which are available for play, weather information providers which may respond with image capture data showing weather conditions, or any other automated response system.
The individual 114 subscribes to the presence information system 110 and/or messaging system 108. Accordingly, the destination endpoint 104 provides presence information for the individual 114 to the entities communicating over the networks 106 directly, or indirectly through the presence information system 110. The presence information includes image capture data provided by the video camera 116, picture camera 118, or other image capture devices.
The entities and networks 106 may exchange information using a packet based protocol. For example, the messaging system 108, presence information system 110, and endpoints 102 and 104 may employ the Real-time Transport Protocol (RTP) over the User Datagram Protocol (UDP). Other protocols, including the Transmission Control Protocol/Internet Protocol (TCP/IP) or other network protocols, may be additionally or alternatively employed. In addition, the signaling between the entities may proceed according to the H.323 packet-based multimedia communications system standard published by the International Telecommunication Union (ITU). The networks 106, or an interconnection of networks, may include the Public Switched Telephone Network (PSTN) and may deliver data to cell phones, wireline phones, internet phones, or other communication devices.
The entities in the network 100 may employ protocols that adhere to any desired specification. For example, the entities may employ the Session Initiation Protocol (SIP) developed for Internet conferencing, telephony, presence, events notification and instant messaging, the Jabber protocol, or SIP for Instant Messaging and Presence Leveraging Extensions (SIMPLE). The form and content of the presence information may be established according to protocols consistent with the Internet Engineering Task Force (IETF) Request for Comments (RFC) 2778 or IETF RFC 2779. Alternatively, the entities may employ extensions to RFC 2778 or RFC 2779, or may employ proprietary protocols.
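The RFC 2778 model referenced above describes a presentity whose presence tuples are delivered to subscribed watchers. A minimal sketch of that model follows; the class and field names are illustrative, not taken from the patent or the RFC's wire formats.

```python
class Presentity:
    """Sketch of the RFC 2778 presence model: a presentity publishes
    presence tuples, and subscribed watchers are notified of changes.
    Class and field names are illustrative."""

    def __init__(self, identity):
        self.identity = identity
        self.tuples = {}      # tuple id -> status, e.g. 'open' / 'closed'
        self.watchers = []    # callables notified on every publish

    def subscribe(self, watcher):
        self.watchers.append(watcher)

    def publish(self, tuple_id, status):
        self.tuples[tuple_id] = status
        for notify in self.watchers:
            notify(self.identity, tuple_id, status)

events = []
alice = Presentity("pres:alice@example.com")
alice.subscribe(lambda who, tid, status: events.append((who, tid, status)))
alice.publish("desk-phone", "open")
```

In this system, the image capture data can be viewed as one more kind of published presence information flowing from presentity to watcher.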
The endpoints 102 and 104 may be communication devices, automated response systems, or other types of devices. The endpoints 102 and 104 may include audio reproduction capability employed to deliver voice messages to a subscriber. The endpoints 102 and 104 may alternatively or additionally be cellular phones, desk phones, pagers, Personal Data Assistants (PDAs), computers, specific programs executing on the computers, or other devices or programs.
The individual 114 may have one or more presence states with respect to one or more endpoints, including the destination endpoint 104. Examples of presence states include ‘Available’, when the individual 114 is in the office 112 and available to receive messages; ‘Out of Office’, when the individual 114 is not in the office and is not available to receive messages; and ‘On Vacation’, when the individual 114 is out of the office on vacation.
As an addition to, or as an alternative to such presence states, the video camera 116 and picture camera 118 provide image capture data. The image capture data provides a visual representation of presence for the individual 114. The entities communicating in the network 100 may communicate the image capture data between the endpoints 102 and 104. In one implementation, the destination endpoint 104 communicates the image capture data directly to the originating endpoint 102. In other implementations, the image capture data may be stored and/or archived in the presence information system 110. The presence information system 110 may then provide the image capture data to the originating endpoint 102.
Accordingly, for example, rather than allow a callee's phone to repeatedly ring until a voicemail system answers, an operator at the originating endpoint 102 may observe that no one is present to answer the call. The operator may then instruct the originating endpoint 102 to hang up without wasting time as the callee's phone continues to ring. The early termination of the call attempt may also save network bandwidth and processing resources for handling what would otherwise be a continued, but fruitless, call attempt.
The originating endpoint 102 may provide a decision of whether to continue the call attempt. For example, when the image capture display 120 reveals that the individual 114 is absent, the originating endpoint 102 may provide a decision to terminate the call attempt. Otherwise, the call attempt may proceed, and the endpoints 102 and 104 may establish the communication session 122 with or without the assistance of the messaging system 108 and/or the presence information system 110.
The originating endpoint 102 sends a notification message 202 to the destination endpoint 104. Alternatively, the originating endpoint 102 may inform the messaging system 108 that a communication session should be established between the endpoints 102 and 104. The messaging system 108 may then send the notification message 202 to the destination endpoint 104.
The notification message 202 may be accompanied by a media specifier 204 which is part of the notification message 202 or which may be a separate message. The media specifier 204 includes one or more data fields that inform the destination endpoint 104 of the media handling capabilities and/or media requests of the originating endpoint 102. Accordingly, the media specifier 204 may indicate that the originating endpoint 102 requests and/or can process images, video, video streams, or any combination of image data.
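One way to represent the media specifier's data fields is a small structure listing which visualizations the originating endpoint requests and can process. This is a sketch; the field names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class MediaSpecifier:
    """Illustrative rendering of the media specifier 204: the media
    handling capabilities and requests of the originating endpoint.
    Field names are assumptions."""
    accepts_image: bool = True
    accepts_video: bool = False
    accepts_stream: bool = False
    formats: list = field(default_factory=lambda: ["jpg"])

    def requests_capture(self):
        # True when the originating endpoint asks for any visualization.
        return self.accepts_image or self.accepts_video or self.accepts_stream

spec = MediaSpecifier(accepts_video=True, formats=["jpg", "mpg"])
```

The destination endpoint can consult such a structure to decide whether to return a still image, a video clip, or a stream.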
The destination endpoint 104 communicates directly or indirectly with the capture device 206. In response to the notification 202 and informed by the media specifier 204, the destination endpoint 104 may command the capture device 206 to capture an image or multiple images, record a video, begin a streaming video, or provide any combination of video information. Also, in reply to the notification 202, the destination endpoint 104 or the messaging system 108 may provide a response 208.
The response 208 may provide a status of the initiation of the communication session to the originating endpoint 102. For example, the response 208 may indicate that the destination endpoint 104 is ‘ringing’, or is otherwise awaiting a response from an operator of the destination endpoint 104. In the same response 208, or in one or more additional messages, the destination endpoint may communicate the image capture data 210 to the originating endpoint 102. Alternatively or additionally, the destination endpoint 104 may provide the image capture data 210 to the presence information system 110.
The originating endpoint 102 receives the image capture data 210. An image processing program in the originating endpoint 102 interprets and renders the image capture data as an image capture display 120. The image processing program may be a program which displays .jpg, .gif, .bmp, .mpg, .avi, or .wmv files or any other type of image or video file.
With the visualization of presence, the operator interacting with the originating endpoint 102 may decide whether to continue the call attempt or terminate the call attempt. To that end, the operator provides a decision 212 to the originating endpoint 102. When the decision is to terminate the call attempt, the originating endpoint 102 and/or messaging system 108 may release the resources previously devoted to the attempt and thereby conserve valuable and limited communication and processing resources.
Otherwise, the call attempt continues, and the callee may answer. The endpoints 102 and 104 establish the communication session 122. Communication data 214 flows between both endpoints 102 and 104. The communication data 214 may represent packetized voice data, or any other type of information.
In one implementation, the notification message 202 may be a SIP INVITE message. The INVITE message may be followed by the media specifier 204. Similarly, the response message 208 may be a SIP RINGING message. Other notification and response messages may be employed, however.
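An abbreviated sketch of that SIP exchange follows, using the message shapes of RFC 3261 (cited in the references). The addresses, tags, and Call-ID values are illustrative, and several mandatory headers are omitted for brevity.

```python
# Illustrative SIP request (notification 202) and provisional
# response (response 208); values are made up for the example.
INVITE = (
    "INVITE sip:callee@example.com SIP/2.0\r\n"
    "From: <sip:caller@example.com>;tag=1928301774\r\n"
    "To: <sip:callee@example.com>\r\n"
    "Call-ID: a84b4c76e66710\r\n"
    "CSeq: 314159 INVITE\r\n\r\n"
)

RINGING = (
    "SIP/2.0 180 Ringing\r\n"
    "To: <sip:callee@example.com>;tag=a6c85cf\r\n"
    "Call-ID: a84b4c76e66710\r\n"
    "CSeq: 314159 INVITE\r\n\r\n"
)

def start_line(message):
    # The first line distinguishes a request (method first) from a
    # response (SIP-version first).
    return message.split("\r\n", 1)[0]
```

In such an exchange, the media specifier could travel as a message body or extension header of the INVITE, and the image capture data could accompany the provisional response or arrive in a separate message.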
The memory 304 stores presence state data 308, image capture data 310, and an image enable flag 312. The presence state data 308 may represent manually or automatically derived presence states obtained, for example, from the presence information system 110. For example, the presence state data 308 may indicate whether the individual 114 is unavailable, in a meeting, on vacation, or any other presence information. The image capture data 310 may be image files, video files, or any other type of visualization data.
The individual 114 may set or clear one or more image enable flags 312 to determine when the destination endpoint 104 is allowed to acquire image captures. An image enable flag 312 may apply to every request for image capture data. Alternatively, image enable flags 312 may be established for image capture data requests from specific individuals or groups of individuals, for certain times or dates, or may otherwise have specific application. Furthermore, image enable flags 312 may be provided on a global or individual basis for any of the image capture devices in communication with the destination endpoint 104.
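The authorization policy described above can be sketched as a single check over a global flag, an optional per-requester allow set, and an allowed time window. This is a sketch under assumed field names, not the patent's implementation.

```python
def image_capture_authorized(flags, requester, hour):
    """Illustrative evaluation of the image enable flags 312: a global
    on/off flag, an optional per-requester allow set, and an allowed
    hour range. Field names are assumptions."""
    if not flags.get("global", False):
        return False
    allowed = flags.get("requesters")
    if allowed is not None and requester not in allowed:
        return False
    start, end = flags.get("hours", (0, 24))
    return start <= hour < end

flags = {"global": True,
         "requesters": {"sip:caller@example.com"},
         "hours": (8, 18)}
```

A per-device variant would keep one such flag set for each image capture device in communication with the destination endpoint.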
The memory 304 also stores a notification processing program 314 and a capture device control program 316. The notification processing program 314 receives the notification 202 of the desired communication session and the media specifiers 204. When the originating endpoint 102 has requested pre-communication session visualization, the notification processing program 314 may check the enable flags 312 to determine whether image capture is authorized.
When image capture is requested and authorized, the capture device control program 316 issues a capture command 318 to the capture device 206. The capture command 318 may direct the capture device 206 to obtain one or more images, obtain a video, start video streaming, or take any other visualization action. The capture device 206 thereby obtains a visualization of the surroundings of the destination endpoint 104. The capture device 206 returns capture device data 320 to the destination endpoint 104. The capture device data 320 may be raw compressed or uncompressed image data or video frames, or may be pre-processed images or video data in any format (e.g., an industry standard .jpg file), or any other type of image data.
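The translation from an authorized media request into a capture command might look like the following sketch; the command vocabulary and the five-second clip length are assumptions for illustration.

```python
def build_capture_command(request):
    """Illustrative mapping from an authorized media request to a
    capture command 318; the command names are assumptions."""
    if request == "stream":
        return {"action": "start_stream"}
    if request == "video":
        return {"action": "record", "seconds": 5}
    # Default to a single still image.
    return {"action": "snapshot", "count": 1}
```

The capture device would answer such a command with capture device data 320, whether raw frames or a pre-processed file.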
The memory 404 includes presence state data 408, image capture data 410, and media capability flags 412. The presence state data 408 may represent manually or automatically derived presence states as described above. The originating endpoint 102 receives the presence state data 408 for the destination endpoint 104 directly from the destination endpoint 104, the presence information system 110, or another presence provider. The image capture data 410 may be image files, video files, or any other type of visualization data received from the destination endpoint 104, the presence information system 110, or another presence provider.
The media capability flags 412 may be set or cleared to indicate what types of visualizations the originating endpoint 102 is capable of processing and/or displaying. As examples, the media capability flags 412 may specify image, video, video streaming, certain types, formats, encoding, or protocols for images, video, or video streaming, or other media types. The media capability flags 412 may also specify from which destination endpoints the originating endpoint 102 will request image capture data.
The memory 404 also includes an image processing program 414 and a notification processing program 416. The notification processing program 416 prepares and communicates the notification message 202. The notification processing program 416 may execute in response to a user requesting that a communication session be established between the endpoints 102 and 104. The notification processing program 416 also may read the media capability flags 412, prepare and communicate the media specifiers 204, and receive the response 208 and/or image capture data 210.
The image processing program 414 renders the image capture data as an image capture display. To that end, the image processing program 414 may read the image capture data 410, interpret the image capture data 410, and provide the image capture display 120 to the display 408. The image processing program 414 may be a picture viewer, media player, or any other type of program which renders images on the display 408.
The originating endpoint 102 may then prepare and send the notification message 202 (Act 504) and the media specifier 204 (Act 506). The originating endpoint 102 receives a response 208 (Act 508), such as an indication that a callee's phone is ‘ringing’. If requested and the destination endpoint 104 has authorized it, the originating endpoint 102 may receive image capture data 210 from the destination endpoint 104 (Act 510).
The image capture data may represent the surroundings or environment of the destination endpoint 104. Pictures, movies, streaming video, or other types of visualization may capture the surroundings. The display 408 associated with the endpoint 102 shows the visualization (Act 512). The operator of the endpoint 102 may then refer to the visualization as a supplement to other presence information for the endpoint 104, or as an independent statement of presence for the endpoint 104.
The operator of the endpoint 102 may issue a decision 212 of whether to continue the call attempt or terminate the call attempt. The endpoint 102 receives the decision (Act 514) through a keyboard, mouse, voice command, or other input mechanism. Alternatively or additionally, the endpoint 102 may perform automated or semi-automated image processing on the image capture data to determine the presence or absence of image features relevant to presence. For example, the endpoint 102 may apply an image processing algorithm to locate, identify, or determine the presence of a person, shape, or environmental condition. The endpoint 102 may then provide its own decision of whether to continue the call attempt, depending on the presence or absence of the person, shape, or environmental condition present in the image.
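A simple form of the automated check is to compare the captured frame against a known "empty room" baseline and report presence when enough pixels change. The sketch below operates on flat lists of grayscale intensities; the thresholds are assumptions, not values from the patent, and a production system would use a proper vision algorithm.

```python
def presence_detected(baseline, frame, pixel_delta=30, changed_fraction=0.05):
    """Illustrative automated presence check: compare a captured
    grayscale frame against an empty-room baseline and report presence
    when enough pixels changed. Thresholds are assumptions."""
    changed = sum(1 for b, f in zip(baseline, frame)
                  if abs(b - f) > pixel_delta)
    return changed / len(baseline) > changed_fraction

empty_room = [10] * 100              # baseline intensities
occupied = [10] * 90 + [200] * 10    # a shape enters the field of view
```

A result of True could feed the decision 212 to continue the call attempt; False could trigger automatic termination.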
The endpoint evaluates the decision 212 (Act 516). When the decision 212 is to terminate, the endpoint 102 ends the call attempt. Otherwise, the endpoint 102 continues the call attempt and may establish the communication session (Act 518). Any established communication session eventually terminates (Act 520).
The media specifier 204 may indicate that the originating endpoint 102 has asked for image capture. Before filling that request, the destination endpoint 104 may read the image enable flags 312 (Act 608) and may determine whether the originating endpoint 102 is authorized to receive image captures (Act 610). When authorized, the destination endpoint 104 captures image data (Act 612) and communicates the image capture data 210 to the originating endpoint 102 or other entity in the network 100 (Act 614).
The destination endpoint 104 may or may not be capable or authorized to provide the image capture data 210. Regardless, the destination endpoint 104 may establish the communication session (Act 616). The established communication session eventually terminates (Act 618).
The originating endpoint 102 and destination endpoint 104 support visualization of presence. Before establishing a communication session 122, the destination endpoint 104 captures a picture or video and provides the picture or video to the originating endpoint. The visualization shows the surroundings of the destination endpoint. The visualized presence provides an independent indication of presence status associated with the destination endpoint.
The visualization may show the operator of the originating endpoint 102 that no one is present to respond to the instant message. Rather than allowing an instant messaging system to repeatedly prompt for an answer, the originating endpoint 102 may then terminate the attempt to establish the instant messaging session. The originating endpoint 102 thereby saves network bandwidth and processing resources devoted to what will be an unsuccessful attempt.
The destination endpoint 104 may be an automated (e.g., a parking garage camera) or non-automated (e.g., a personal cell phone) endpoint. Accordingly, the image capture data may provide a view of an office, conference room, parking garage or other space. Automated endpoints may be established at service providers, such as cable television providers. The cable television endpoints may return image capture data representative of movies available for pay-per-view, movies currently playing on one or more television channels, services available from the cable television provider, or image capture data for any other purpose.
It is therefore intended that the foregoing detailed description be regarded as illustrative rather than limiting, and that it be understood that it is the following claims, including all equivalents, that are intended to define the spirit and scope of this invention.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5892856 *||Dec 23, 1996||Apr 6, 1999||Intel Corporation||Method of presence detection using video input|
|US5914747 *||Jun 17, 1998||Jun 22, 1999||Dialogic Corporation||Automatic control of video conference membership|
|US6583813 *||Oct 7, 1999||Jun 24, 2003||Diebold, Incorporated||System and method for capturing and searching image data associated with transactions|
|US6693662 *||Jan 28, 2002||Feb 17, 2004||Sprint Communications Company, L.P.||Coordination of video sessions when calling an auto-attendant system|
|US7003795 *||Jun 26, 2001||Feb 21, 2006||Digeo, Inc.||Webcam-based interface for initiating two-way video communication|
|US7187402 *||Mar 12, 2003||Mar 6, 2007||Canon Kabushiki Kaisha||Communication apparatus, image processing apparatus, communication method, and image processing method|
|US7242421 *||Nov 13, 2001||Jul 10, 2007||Perceptive Network Technologies, Inc.||Methods of establishing a communications link using perceptual sensing of a user's presence|
|US7299286 *||Dec 27, 2001||Nov 20, 2007||Nortel Networks Limited||Personal user agent|
|US7499528 *||Feb 28, 2005||Mar 3, 2009||Lg Electronics Inc.||Method and communication system for identifying calling/called party|
|US7595816 *||Jun 23, 2003||Sep 29, 2009||Diebold, Incorporated||System and method for capturing and searching image data associated with transactions|
|US7852783 *||Dec 7, 2006||Dec 14, 2010||Cisco Technology, Inc.||Identify a secure end-to-end voice call|
|US7920847 *||May 16, 2005||Apr 5, 2011||Cisco Technology, Inc.||Method and system to protect the privacy of presence information for network users|
|US20010051984 *||Jan 27, 1997||Dec 13, 2001||Toshihiko Fukasawa||Coordinative work environment construction system, method and medium therefor|
|US20020118809||Dec 3, 2001||Aug 29, 2002||Alfred Eisenberg||Initiation and support of video conferencing using instant messaging|
|US20020163572 *||Nov 13, 2001||Nov 7, 2002||Center Julian L.||Methods of establishing a communications link using perceptual sensing of a user's presence|
|US20020196746 *||Jun 26, 2001||Dec 26, 2002||Allen Paul G.||Webcam-based interface for initiating two-way video communication|
|US20030187940 *||Mar 4, 2003||Oct 2, 2003||Collaboration Properties, Inc.||Teleconferencing employing multiplexing of video and data conferencing signals|
|US20040098491 *||Nov 14, 2002||May 20, 2004||Jose Costa-Requena||Accessing presence information|
|US20040240434 *||Jul 5, 2002||Dec 2, 2004||Matsushita Electric Industrial Co., Ltd.||Mobile terminal apparatus|
|US20040267885 *||Jun 27, 2003||Dec 30, 2004||Logitech Europe S.A.||Device based instant messenger client|
|US20050086311 *||Dec 30, 2003||Apr 21, 2005||Noel Enete||Regulating self-disclosure for video messenger|
|US20050195950 *||Feb 28, 2005||Sep 8, 2005||Lg Electronics Inc.||Method and communication system for identifying calling/called party|
|US20050223097 *||Dec 27, 2001||Oct 6, 2005||Ramsayer Christopher G||Personal user agent|
|US20060031291 *||Jun 4, 2004||Feb 9, 2006||Beckemeyer David S||System and method of video presence detection|
|US20060103762 *||Nov 18, 2004||May 18, 2006||Ly Ha M||Interactive imaging apparatus and method of operation thereof|
|US20060105794 *||Nov 12, 2004||May 18, 2006||International Business Machines Corporation||Push to view system for telephone communications|
|US20060123086 *||Dec 2, 2004||Jun 8, 2006||Morris Robert P||System and method for sending an image from a communication device|
|US20070005804 *||Nov 12, 2003||Jan 4, 2007||Neil Rideout||Multicast videoconferencing|
|US20070186002 *||Mar 27, 2002||Aug 9, 2007||Marconi Communications, Inc.||Videophone and method for a video call|
|US20070229652 *||May 30, 2007||Oct 4, 2007||Center Julian L Jr||Methods of establishing a communications link using perceptual sensing of a user's presence|
|US20080098068 *||May 20, 2004||Apr 24, 2008||Tomoichi Ebata||Space-time communication system|
|US20090203365 *||Aug 9, 2007||Aug 13, 2009||Sk Telecom Co., Ltd||Method for providing a receiver's terminal with multimedia contents before a call is connected|
|WO2003081892A2||Mar 27, 2003||Oct 2, 2003||Marconi Intellectual Property (Ringfence) Inc||Telecommunications system|
|1||Gross T: "Towards ubiquitous awareness: the PRAVTA prototype," Parallel and Distributed Processing, 2001. Proceedings, Ninth Euromicro Workshop on Feb. 7-9, 2001, Piscataway, NJ, USA, IEEE, Feb. 7, 2001, pp. 139-146.|
|2||H. Sugano et al., Presence Information Data Format (PIDF), IETF Instant Messaging and Presence Protocol (IMPP) Working Group, May 2003.|
|3||J. Peterson, Common Profile for Presence (CPP), IETF Instant Messaging and Presence Protocol (IMPP) Working Group, Aug. 14, 2003.|
|4||*||J. Rosenberg et al., SIP: Session Initiation Protocol, Jun. 2002, p. 1-17, http://tools.ietf.org/html/rfc3261.|
|5||J. Rosenberg, et al., A Data Format for Presence Using XML, IETF Internet Draft, Jun. 15, 2000.|
|6||*||J. Rosenberg; SIP: Session Initiation Protocol; Jun. 2002; http://www.ietf.org/rfc/rfc3261.txt.|
|7||M. Day et al., A model for Presence and Instant Messaging (Request for Comments: 2778), The Internet Society, Feb. 2000.|
|8||M. Day et al., Instant Messaging / Presence Protocol Requirements (Request for Comments: 2779), IETF Instant Messaging and Presence Protocol (IMPP) Working Group, Feb. 2000.|
|9||P. Saint-Andre, Ed., Extensible Messaging and Presence Protocol (XMPP): Core, Jabber Software Foundation, May 6, 2004.|
|10||P. Saint-Andre, Ed., Extensible Messaging and Presence Protocol (XMPP): Instant Messaging and Presence, Jabber Software Foundation, Apr. 12, 2004.|
|11||P. Saint-Andre, Mapping the Extensible Messaging and Presence Protocol (XMPP) to Common Presence and Instant Messaging (CPIM), Jabber Software Foundation, May 3, 2004.|
|12||Rishi L et al.: "Presence and its effect on network," Personal Wireless Communications, 2005. ICPWC 2005, 2005 IEEE International Conference on New Delhi, India Jan. 23-25, 2005, Piscataway, NJ, USA, IEEE, Jan. 23, 2005, pp. 368-372. ISBN: 0-7803-8964-6.|
|13||Rosenberg et al., SIP Extensions for Presence, Internet Engineering Task Force, Jul. 20, 2001.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US20100110160 *||Oct 30, 2008||May 6, 2010||Brandt Matthew K||Videoconferencing Community with Live Images|
|U.S. Classification||709/204, 348/14.01|
|Cooperative Classification||H04N21/4788, H04N7/147, H04N21/44218|
|European Classification||H04N21/4788, H04N21/442E1, H04N7/14A3|
|Aug 17, 2005||AS||Assignment|
Owner name: SIEMENS COMMUNICATIONS, INC., FLORIDA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TIRUTHANI, SARAVANAKUMAR V.;OLIVEIRA, MARCELO;REEL/FRAME:016643/0970;SIGNING DATES FROM 20050727 TO 20050810
Owner name: SIEMENS COMMUNICATIONS, INC., FLORIDA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TIRUTHANI, SARAVANAKUMAR V.;OLIVEIRA, MARCELO;SIGNING DATES FROM 20050727 TO 20050810;REEL/FRAME:016643/0970
|Apr 27, 2010||AS||Assignment|
Owner name: SIEMENS ENTERPRISE COMMUNICATIONS, INC., FLORIDA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIEMENS COMMUNICATIONS, INC.;REEL/FRAME:024294/0040
Effective date: 20100304
Owner name: SIEMENS ENTERPRISE COMMUNICATIONS, INC., FLORIDA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SIEMENS COMMUNICATIONS, INC.;REEL/FRAME:024294/0040
Effective date: 20100304
|Nov 10, 2010||AS||Assignment|
Owner name: WELLS FARGO TRUST CORPORATION LIMITED, AS SECURITY
Free format text: GRANT OF SECURITY INTEREST IN U.S. PATENTS;ASSIGNOR:SIEMENS ENTERPRISE COMMUNICATIONS, INC.;REEL/FRAME:025339/0904
Effective date: 20101109
|Nov 11, 2015||AS||Assignment|
Owner name: UNIFY, INC., FLORIDA
Free format text: CHANGE OF NAME;ASSIGNOR:SIEMENS ENTERPRISE COMMUNICATIONS, INC.;REEL/FRAME:037090/0909
Effective date: 20131015
|Nov 16, 2015||FPAY||Fee payment|
Year of fee payment: 4
|Jan 20, 2016||AS||Assignment|
Owner name: UNIFY INC. (F/K/A SIEMENS ENTERPRISE COMMUNICATION
Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:WELLS FARGO TRUST CORPORATION LIMITED, AS SECURITY AGENT;REEL/FRAME:037564/0703
Effective date: 20160120
|Feb 1, 2016||AS||Assignment|
Owner name: UNIFY INC., FLORIDA
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:WELLS FARGO TRUST CORPORATION LIMITED;REEL/FRAME:037661/0781
Effective date: 20160120