|Publication number||US8140973 B2|
|Application number||US 12/018,786|
|Publication date||Mar 20, 2012|
|Filing date||Jan 23, 2008|
|Priority date||Jan 23, 2008|
|Also published as||US20090187825|
|Inventors||Jeff D Sandquist, Grace G Francisco, David D Shadle, Sanjay Parthasarathy|
|Original Assignee||Microsoft Corporation|
|Patent Citations (26), Non-Patent Citations (4), Referenced by (24), Classifications (14), Legal Events (4)|
Current content sharing mechanisms allow a user to embed a video player on a Web page. The user links the video player to video content that exists somewhere on the Web. When someone browses to the Web page, the video player presents the video in the browser. This provides a convenient way for people to share pre-defined content with each other, but it does not allow users to personalize or add content to the shared pre-defined content.
Briefly, aspects of the subject matter described herein relate to annotating and sharing content. In aspects, an annotation tool presents a user interface that allows a user to enter and view annotations associated with content such as a video. The annotation tool allows the user to associate each annotation with a particular time segment of the video such that when that time segment is played in a video player, its associated annotation is presented. The annotation tool also presents a user interface that allows the user to share the video as annotated with other users as desired. Other users receiving the annotated video may further annotate the video and share it with others.
This Summary is provided to briefly identify some aspects of the subject matter that is further described below in the Detailed Description. This Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The phrase “subject matter described herein” refers to subject matter described in the Detailed Description unless the context clearly indicates otherwise. The term “aspects” is to be read as “at least one aspect.” Identifying aspects of the subject matter described in the Detailed Description is not intended to identify key or essential features of the claimed subject matter.
The aspects described above and other aspects of the subject matter described herein are illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:
Aspects of the subject matter described herein are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with aspects of the subject matter described herein include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microcontroller-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
Aspects of the subject matter described herein may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types. Aspects of the subject matter described herein may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
With reference to
Computer 110 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by the computer 110 and includes both volatile and nonvolatile media, and removable and non-removable media. By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVDs) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 110. Communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation,
The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media, discussed above and illustrated in
The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in
When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160 or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
Annotating and Sharing Content
As mentioned previously, current content sharing mechanisms do not allow a user to easily add content to pre-existing content. A user may select different content to display, but is not given the option of changing or annotating existing content. Aspects of the subject matter described herein relate to providing a mechanism for annotating and sharing content.
The various entities may communicate with each other via various networks including intra- and inter-office networks and the network 215. In an embodiment, the network 215 may comprise the Internet. In an embodiment, the network 215 may comprise one or more private networks, virtual private networks, and the like.
Each of the clients 205-207 and the servers 210-212 may be implemented on or as one or more computers (e.g., the computer 110 as described in conjunction with
The Web server 212 provides Web pages to requesting clients. As is known to those skilled in the art, Web pages may display a variety of content including text and graphics each of which may be static or dynamic. A browser may be capable of displaying video in a Web page displayed in the browser. Tags or other data within the Web page may indicate where the video content is stored (e.g., on the content server 210). A video player component associated with the browser may use the tags or other data to locate a content server (e.g., the content server 210) upon which the video content is stored. Using this information, the video player component may download the video from the content server 210 and display the video within the browser.
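The tag-based lookup described above can be sketched as follows. The tag name, attribute, and URLs here are invented for illustration; a real player component might instead inspect object/param tags or other page metadata.

```python
import re


def find_content_url(page_html):
    """Pull the content-server URL for a video out of tags in a Web page.

    A video player component could use data like this to locate the
    content server on which the video is stored, then download and
    display the video within the browser.
    """
    match = re.search(r'<video-source\s+src="([^"]+)"', page_html)
    return match.group(1) if match else None
```

A player embedded in a page containing `<video-source src="http://content.example/v1.wmv"/>` would resolve that URL and fetch the video from the content server it names.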
Although the term Web page is sometimes used herein, aspects of the subject matter described herein are also applicable to e-mail, a blog, other content sharing mechanisms, and the like.
Furthermore, in one embodiment, the term video as used herein is to be read to include any type of content that includes more than one displayable screen arranged in a time-sequenced manner. For example, the graphics display of a game may be considered as a video. As another example, a presentation that is presented as a set of slides may be considered as video. As some other examples, podcasts and animations may be considered as video. Based on the definition above, other examples of videos will be apparent to those skilled in the art.
In another embodiment, the term video is to be read to include any type of content that is displayed according to the traditional video format having a particular number of frames per second, and so forth.
The video players 225-227 have the capability of downloading and presenting annotations that are associated with a video. An annotation may be associated with a particular time segment of the video, with the video as a whole, with a combination of the above, and the like. Annotations may include additional content a user has associated with the video. Some exemplary annotations include text, audio including voice, music, and other sound, and graphics.
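The association between an annotation and a time segment (or the video as a whole) described above can be sketched as a minimal data model. All field and function names here are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Annotation:
    """An annotation attached to a video, per the scheme described above."""
    author: str
    kind: str                       # e.g. "text", "audio", "graphic", or "skin"
    content: str                    # text body, or a locator for audio/graphic data
    start: Optional[float] = None   # segment start in seconds; None = whole video
    end: Optional[float] = None     # segment end in seconds; None = whole video


def active_annotations(annotations, position):
    """Return the annotations to present at the given playback position.

    Whole-video annotations (start is None) are always active; segment
    annotations are active only while playback is inside their segment.
    """
    return [a for a in annotations
            if a.start is None or a.start <= position <= a.end]
```

Under this sketch, a text comment pinned to seconds 165-170 surfaces only during that segment, while a skin annotation (no segment) applies throughout playback.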
An annotation may include a skin, theme, or the like applied to user interface elements associated with a video player. A skin may comprise a custom graphical appearance associated with a video player or other user interface elements. For example, a user may annotate a video such that the video player displays a frame having a certain color and pattern. In conjunction with sharing a video, an annotator may be presented with a user interface allowing the annotator to select the skin, theme, or the like to be applied to user interface elements associated with the video player. This provides a way of customizing the playback experience in a manner selected by the annotator.
In one embodiment, graphics may be presented in the form of digital ink. For example, an annotation tool may allow a user to use a mouse or other input device to draw lines or other graphics on a segment of the video. When the video is replayed, these annotations may be presented as drawn by the user at the specific segment of the video. An image annotation may also have an accompanying text annotation. Some representations of exemplary annotations are described in conjunction with
In conjunction with presenting a video, a video player may present any annotations associated with the video. For example, at a time segment in the presentation of the video, the video player may present a message box that provides a comment about what is being presented. As another example, the video player may have a skin or theme applied to various elements of its user interface. This skin or theme may be viewable before a video is played within the video player and may be persisted until changed. As yet another example, the video player may display a persistent message regarding the video at a location next to or close to the video.
A user may be able to select an indicator associated with an annotation to jump to a segment of the video with which the annotation is associated. For example, referring to
By hovering over an annotation indicator a user may be able to see the comment as well as an icon associated with the annotator and a link that causes playback to jump to the video segment associated with the annotation. For example, by hovering over the annotation indicator 320 of
The annotations associated with a particular video may be stored on the annotation server 211. Tags within a Web page or other data may be used to identify the annotations associated with a video. The video player may retrieve all of the annotations associated with a particular video before presenting the video or may retrieve the annotations while presenting the video.
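The two retrieval strategies mentioned above (fetch everything before playback, or fetch while presenting the video) can be sketched with a small client. The class, its methods, and the `fetch_range` callback are assumptions for illustration, not an API from the patent.

```python
class AnnotationClient:
    """Illustrative client for an annotation server supporting both
    retrieval strategies described above."""

    def __init__(self, fetch_range):
        # fetch_range(video_id, start, end) -> annotations in [start, end)
        self._fetch_range = fetch_range
        self._cache = []
        self._fetched_until = 0.0

    def fetch_all(self, video_id, duration):
        """Eager strategy: one request covering the whole video,
        issued before playback begins."""
        self._cache = list(self._fetch_range(video_id, 0.0, duration))
        self._fetched_until = duration
        return self._cache

    def ensure_window(self, video_id, position, window=30.0):
        """Lazy strategy: fetch ahead of the playhead in fixed windows
        while the video is being presented."""
        if position + window > self._fetched_until:
            self._cache += self._fetch_range(
                video_id, self._fetched_until, position + window)
            self._fetched_until = position + window
        return self._cache
```

The lazy strategy trades extra round trips for a faster start; the eager strategy guarantees every annotation indicator can be drawn on the timeline before playback.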
An annotation tool (e.g., each of the annotation tools 220-222) may be used to annotate videos. The annotation tool may present one or more user interfaces that are described in more detail in conjunction with
Although the content server 210 and the Web server 212 are shown as separate entities, in other embodiments, they may be the same entity or may by distributed over many entities. In some embodiments, one or more of the servers 210-211 or portions of the content contained thereon may be included on one of the clients 205-207.
Although when requesting Web pages and other content, the clients 205-207 may act as clients, at other times they may act as servers (e.g., when serving content or responding to requests) or peers to each other or other entities. Likewise, the servers 210-212 are not restricted to providing services such as providing content and may engage in activities in which they request content, services, a combination of content and services, and the like from other devices.
Although the environment described above includes three clients and three servers, it will be recognized that more, fewer, or a different combination of these or other entities may be employed without departing from the spirit or scope of aspects of the subject matter described herein. Furthermore, the entities and communication networks included in the environment may be configured in a variety of ways as will be understood by those skilled in the art without departing from the spirit or scope of aspects of the subject matter described herein.
The title 310 is where text indicating the title of the video may be displayed. The additional information indicator 315 may be selected or hovered over with a pointer to display additional information about the video such as an expanded title, author, chapter, section, other information, and the like.
The annotation indicators 320-321 indicate where annotations have been added to time segments of the video. When an annotation indicator is selected or hovered over, the annotation tool may display the annotation associated with the annotation indicator as described in more detail in conjunction with
The progress bar 325 indicates the current position in the video relative to its length. The progress slider 330 may be used to quickly move the current position in the video forwards or backwards. As playback progresses, the progress bar 325 may fill a timeline that corresponds to the length of the video.
The play button 335 allows a user to indicate that the video is to begin or resume playing. Clicking on the play button 335 while the video is playing may cause the video to pause until the play button 335 is again selected.
The stop button 336 may cause playback of the video to stop. In some embodiments, pressing the stop button 336 may cause the current position in the video to be reset to the beginning of the video.
Clicking the download button 336 may allow the user to store a video to a hard drive or other non-volatile memory. In addition, clicking the download button 336 may also present a user interface that allows the user to select the video format (e.g., MP3, PSP, WMA, WMV, etc.) in which the video is to be stored.
The annotate button 337 may allow the user to annotate the video at the current or another position in the video. Clicking the annotate button 337 may cause a user interface to be displayed that allows the user to select what type of annotation (e.g., graphics, text, audio) the user desires to associate with the current position in the video.
The share button 338 may allow the user to share the annotated video with others. Clicking on the share button 338 may cause a user interface to be displayed that allows the user to share the annotated video.
The status indicator 340 may indicate how many minutes into the video the current position is. In another embodiment, the status indicator 340 may indicate how much time is left in the video from the current position.
The status indicator 341 may indicate how many minutes long the video is. In another embodiment, the status indicator 341 may indicate how much time is left in the video from the current position.
The volume indicator 342 may indicate the volume level of the video and may allow the user to adjust the volume.
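The position arithmetic behind the progress bar and the two status indicators described above can be sketched as follows; the function names are illustrative.

```python
def progress_fraction(position, duration):
    """Fraction of the timeline the progress bar should fill, clamped to [0, 1]."""
    if duration <= 0:
        return 0.0
    return min(max(position / duration, 0.0), 1.0)


def mm_ss(seconds):
    """Format a position as M:SS, e.g. 165 seconds -> '2:45'."""
    minutes, secs = divmod(int(seconds), 60)
    return f"{minutes}:{secs:02d}"


def status_labels(position, duration, show_remaining=False):
    """Labels for the two status indicators: elapsed (or, in the alternate
    embodiment, remaining) time, plus the total length of the video."""
    elapsed = duration - position if show_remaining else position
    return mm_ss(elapsed), mm_ss(duration)
```

For a 5-minute video paused at one minute, the indicators would read 1:00 and 5:00, or 4:00 and 5:00 in the remaining-time embodiment.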
The video display pane 350 may be used to display video and annotations. In one embodiment, the entire user interface 305 may be embedded in a browser. In another embodiment, the user interface 305 may be launched by a browser to display annotated video.
The elements described above are meant to be exemplary. In other embodiments, more, fewer, or different elements may be included in a user interface of an annotation tool. For example, there may be buttons that allow video loading, full screen video playback, the ability to navigate the video (e.g., via chapters, segments, or the like), and the like.
When the user selects or hovers a pointer over the annotation indicator 320, the message box 405 may be displayed. The message box 405 may include a user name 406 (e.g., DShadle) and a time stamp 407 (e.g., 2:45) indicating the point in the video with which the annotation is associated. Content of the text annotation (e.g., the text 410) may be displayed in the message box 405. The user interface elements 415 and 416 may be displayed to allow the user to jump to the previous or next annotation or to resume play of the video.
Audio and graphic annotations may also be presented in a similar user interface. For example, audio annotations may include the user name 406, time stamp 407, user interface elements 415-416, and may also include buttons for playing, stopping, rewinding, fast forwarding, and resuming the audio annotation.
Graphics annotations may include similar buttons for displaying a graphic annotation. A graphic annotation may be static (e.g., an illustration that does not change over the length that it is displayed) or dynamic. A dynamic graphic illustration may comprise a set of one or more frames that are associated with a video segment of the video. When a dynamic graphic annotation is encountered, the set of one or more annotated frames may be displayed and overlaid over the frames of the video.
In an embodiment, the user interface 600 may include a user interface element (e.g., the embed button 630). When the user clicks on the embed button 630, the user interface 600 may display a message box that displays a string that includes tags that identify where the video content and the annotations are stored. By pasting this string into a Web page, the user may share the annotated video with anyone who views the Web page who has a browser with an appropriate video player.
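The embed string described above might look like the following sketch. The tag structure, parameter names, and URLs are all invented for illustration; the patent specifies only that the string's tags identify where the video content and the annotations are stored.

```python
def embed_snippet(player_url, video_url, annotations_url,
                  width=480, height=360):
    """Build an embed string whose tags identify where the video and its
    annotations are stored, in the spirit of the embed button above."""
    return (
        f'<object width="{width}" height="{height}">'
        f'<param name="movie" value="{player_url}"/>'
        f'<param name="video" value="{video_url}"/>'
        f'<param name="annotations" value="{annotations_url}"/>'
        f'</object>'
    )
```

Pasting such a string into a Web page would let any visitor with an appropriate video player fetch both the video and its annotations.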
The share pane 605 may also include a contact box 615 and a message box 620. In the contact box 615, the user may type contact names and/or contact e-mails of people with which the annotated video is to be shared. The contact box 615 may allow the user to select contacts from a list maintained by a contact database. In the message box 620, the user may type a message to be associated with the annotated video.
When the user clicks on the share video button 625, a message may be sent to each contact in the contact box 615. In one embodiment, the message may include the message indicated in the message box 620 together with data (e.g., a hyperlink or other data) that allows the user to access the annotated video. In some embodiments, when an e-mail client is capable of displaying the annotated video, the message box 620 may be displayed in the e-mail client as an annotation to the video.
In some embodiments, when the user clicks on the share video button 625, the actions associated with the embed button 630 as described above and/or the actions described in the preceding paragraph may occur. For example, referring to
In yet another embodiment, a message may be sent to each contact in the contact box 615 together with a link to the annotated video. The message entered in the message box 620 may be displayed next to or near the video. An example of this embodiment is illustrated in
The annotation pane 700 may include an image 705 associated with the user who made the annotation, a time stamp 407, annotation text 410, and user interface elements 715-716. In one embodiment, the annotation pane 700 is displayed during the time segment 710. In another embodiment, the annotation pane 700 is displayed throughout the playback of the video. In yet another embodiment, the annotation pane 700 is displayed during video playback to present one or more annotations to the video.
The add comment user interface element 715 allows a user who is viewing the annotation to add an additional annotation that is associated with the same location in the video. If the user shares this annotated video, the users who receive the annotated video can see the annotation text 410 and the added annotation text and may also be allowed to further annotate the video.
The previous/next annotation user interface element 716 allows the user to navigate to previous and subsequent annotations associated with the video.
In addition, the pane 805 may include a user interface element 810 by which the pane 805 may be closed. Furthermore, the user interface pane 805 may include or be associated with a user interface element 815 that allows a user to scroll through information to view information that is not currently displayed in the pane 805.
The pane 805 may also include a link (not shown) to the annotated video if desired.
The format of display indicated in
Furthermore, in some embodiments, the user interface may be modified to indicate in real-time when a contact has created an annotation to a video that has been shared with the contact. For example, an entry in the column 820 may be used to indicate if a corresponding entity referenced in the entity column 825 has further annotated the shared video.
At block 915, a second user interface element is displayed that is associated with annotating the video. For example, referring to
At block 920, an indication of a selection of the second user interface element is received. For example, referring to
At block 925, in response to receiving an indication of selection of the second user interface element, an annotation to the video is received. For example, referring to
At block 930, one or more annotations are stored outside of the data that represents the video. For example, referring to
At block 935, a request to share the annotated video is received. For example, referring to
At block 940, data regarding with whom the annotated video is to be shared is received. For example, referring to
At block 945, the video is shared with the requested entities. For example, referring to
At block 950, the actions end.
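The sequence of blocks 915 through 950 described above can be sketched as a single flow. The collaborator objects and their method names are assumptions for illustration, not interfaces defined in the patent.

```python
def annotate_and_share_flow(video_id, ui, annotation_store, share_service):
    """Sketch of the annotate-and-share actions described above."""
    ui.show_annotate_button()                    # block 915: display element
    ui.wait_for_annotate_selected()              # block 920: selection received
    annotation = ui.receive_annotation()         # block 925: annotation received
    annotation_store.save(video_id, annotation)  # block 930: stored outside video data
    recipients = ui.wait_for_share_request()     # blocks 935-940: share request + recipients
    share_service.share(video_id, recipients)    # block 945: share with requested entities
    return recipients
```

Storing the annotation separately from the video data (block 930) is what lets each recipient layer further annotations without re-encoding the video itself.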
At block 1015, annotation data that includes one or more annotations to the video is retrieved. For example, referring to
At block 1020, a view of the video is displayed in a first portion of a user interface. For example, referring to
At block 1025, an annotation user interface element is displayed that provides the capability of navigating to an annotation associated with the video. For example, referring to
At block 1030, the actions end.
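The playback-side actions of blocks 1015 through 1030 can likewise be sketched as a flow; the server and UI collaborators here are illustrative, not defined by the patent.

```python
def playback_flow(video_id, content_server, annotation_server, ui):
    """Sketch of the playback actions described above."""
    annotations = annotation_server.get(video_id)      # block 1015: retrieve annotation data
    ui.display_video(content_server.get(video_id))     # block 1020: video in first portion of UI
    ui.display_annotation_navigator(annotations)       # block 1025: annotation navigation element
    return annotations
```

Fetching the annotation data before rendering the navigation element is what allows annotation indicators to appear on the timeline as soon as the player loads.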
As can be seen from the foregoing detailed description, aspects have been described related to annotating and sharing content. While aspects of the subject matter described herein are susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit aspects of the claimed subject matter to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of various aspects of the subject matter described herein.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5717468||Dec 2, 1994||Feb 10, 1998||International Business Machines Corporation||System and method for dynamically recording and displaying comments for a video movie|
|US5742730||Mar 9, 1995||Apr 21, 1998||Couts; David A.||Tape control system|
|US6332144||Dec 3, 1998||Dec 18, 2001||Altavista Company||Technique for annotating media|
|US6988245 *||Jun 18, 2002||Jan 17, 2006||Koninklijke Philips Electronics N.V.||System and method for providing videomarks for a video program|
|US7257774||Jul 30, 2002||Aug 14, 2007||Fuji Xerox Co., Ltd.||Systems and methods for filtering and/or viewing collaborative indexes of recorded media|
|US7536713 *||Dec 11, 2003||May 19, 2009||Alan Bartholomew||Knowledge broadcasting and classification system|
|US7647555 *||Apr 13, 2000||Jan 12, 2010||Fuji Xerox Co., Ltd.||System and method for video access from notes or summaries|
|US20020089519||Jan 4, 2002||Jul 11, 2002||Vm Labs, Inc.||Systems and methods for creating an annotated media presentation|
|US20020099552 *||Jan 25, 2001||Jul 25, 2002||Darryl Rubin||Annotating electronic information with audio clips|
|US20040125133||Dec 30, 2002||Jul 1, 2004||The Board Of Trustees Of The Leland Stanford Junior University||Methods and apparatus for interactive network sharing of digital video content|
|US20040237032||Sep 26, 2002||Nov 25, 2004||David Miele||Method and system for annotating audio/video data files|
|US20050234958 *||Dec 7, 2001||Oct 20, 2005||Sipusic Michael J||Iterative collaborative annotation system|
|US20050289453 *||Jun 21, 2005||Dec 29, 2005||Tsakhi Segal||Apparatys and method for off-line synchronized capturing and reviewing notes and presentations|
|US20060161838 *||Jan 14, 2005||Jul 20, 2006||Ronald Nydam||Review of signature based content|
|US20060218481||Dec 19, 2003||Sep 28, 2006||Adams Jr Hugh W||System and method for annotating multi-modal characteristics in multimedia documents|
|US20070011206||Sep 14, 2006||Jan 11, 2007||Microsoft Corporation||Interactive playlist generation using annotations|
|US20070133437 *||Dec 13, 2005||Jun 14, 2007||Wengrovitz Michael S||System and methods for enabling applications of who-is-speaking (WIS) signals|
|US20070136656 *||Aug 7, 2006||Jun 14, 2007||Adobe Systems Incorporated||Review of signature based content|
|US20070240060||Jan 22, 2007||Oct 11, 2007||Siemens Corporate Research, Inc.||System and method for video capture and annotation|
|US20070245243||Mar 27, 2007||Oct 18, 2007||Michael Lanza||Embedded metadata in a media presentation|
|US20070250901 *||Mar 30, 2007||Oct 25, 2007||Mcintire John P||Method and apparatus for annotating media streams|
|US20090064005 *||Aug 29, 2007||Mar 5, 2009||Yahoo! Inc.||In-place upload and editing application for editing media assets|
|US20090119169 *||Sep 30, 2008||May 7, 2009||Blinkx Uk Ltd||Various methods and apparatuses for an engine that pairs advertisements with video files|
|US20090198685 *||Apr 8, 2009||Aug 6, 2009||Alan Bartholomew||Annotation system for creating and retrieving media and methods relating to same|
|US20100293190 *||May 13, 2010||Nov 18, 2010||Kaiser David H||Playing and editing linked and annotated audiovisual works|
|WO1999046702A1||Mar 1, 1999||Sep 16, 1999||Siemens Corporate Research, Inc.||Apparatus and method for collaborative dynamic video annotation|
|1||"BT Technology", pp. 1-3.|
|2||"YouTube Launches Custom Video Player | Personalised Viewing Video Tool", Webtvwire.com, pp. 1-4.|
|3||Adams, et al. "IBM Multimodal Annotation Tool" Aug. 2002, pp. 1-2.|
|4||Gruber, "Viddler: Embed Comments & Tag Video", Apr. 25, 2007, pp. 1-3.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8392821 *||Mar 16, 2007||Mar 5, 2013||Viddler, Inc.||Methods and systems for displaying videos with overlays and tags|
|US8869046 *||Jul 3, 2012||Oct 21, 2014||Wendell Brown||System and method for online rating of electronic content|
|US8990692 *||Mar 26, 2009||Mar 24, 2015||Google Inc.||Time-marked hyperlinking to video content|
|US9332302||Jul 24, 2015||May 3, 2016||Cinsay, Inc.||Interactive product placement system and method therefor|
|US9338499||Jul 24, 2015||May 10, 2016||Cinsay, Inc.||Interactive product placement system and method therefor|
|US9338500||Jul 24, 2015||May 10, 2016||Cinsay, Inc.||Interactive product placement system and method therefor|
|US9344754||Jul 24, 2015||May 17, 2016||Cinsay, Inc.||Interactive product placement system and method therefor|
|US9351032||Jul 24, 2015||May 24, 2016||Cinsay, Inc.||Interactive product placement system and method therefor|
|US9412372 *||May 8, 2013||Aug 9, 2016||SpeakWrite, LLC||Method and system for audio-video integration|
|US9459754 *||Oct 27, 2011||Oct 4, 2016||Edupresent, Llc||Interactive oral presentation display system|
|US9612726||Dec 28, 2013||Apr 4, 2017||Google Inc.||Time-marked hyperlinking to video content|
|US9674584||Mar 7, 2016||Jun 6, 2017||Cinsay, Inc.||Interactive product placement system and method therefor|
|US20070260677 *||Mar 16, 2007||Nov 8, 2007||Viddler, Inc.||Methods and systems for displaying videos with overlays and tags|
|US20100251120 *||Mar 26, 2009||Sep 30, 2010||Google Inc.||Time-Marked Hyperlinking to Video Content|
|US20120042265 *||Jul 13, 2011||Feb 16, 2012||Shingo Utsuki||Information Processing Device, Information Processing Method, Computer Program, and Content Display System|
|US20120066715 *||Dec 21, 2010||Mar 15, 2012||Jain Shashi K||Remote Control of Television Displays|
|US20120297411 *||Nov 19, 2010||Nov 22, 2012||Dwango Co., Ltd.||Communication system and communication method|
|US20120308195 *||May 31, 2012||Dec 6, 2012||Michael Bannan||Feedback system and method|
|US20130174007 *||Feb 14, 2013||Jul 4, 2013||Viddler, Inc.||Methods and systems for displaying videos with overlays and tags|
|US20130298025 *||Oct 27, 2011||Nov 7, 2013||Edupresent Llc||Interactive Oral Presentation Display System|
|US20130304465 *||May 8, 2013||Nov 14, 2013||SpeakWrite, LLC||Method and system for audio-video integration|
|US20140047022 *||Aug 13, 2012||Feb 13, 2014||Google Inc.||Managing a sharing of media content among client computers|
|US20170132821 *||Feb 16, 2016||May 11, 2017||Microsoft Technology Licensing, Llc||Caption generation for visual media|
|CN104487926A *||Jun 5, 2013||Apr 1, 2015||谷歌公司||Mobile user interface for contextual browsing while playing digital content|
|U.S. Classification||715/719, 707/764, 725/146, 715/723, 715/721, 707/769, 704/270, 726/7, 715/751|
|Cooperative Classification||G06F17/241, G06F17/30781|
|European Classification||G06F17/24A, G06F17/30V|
|Jan 23, 2008||AS||Assignment|
Owner name: MICROSOFT CORPORATION, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SANDQUIST, JEFF D;FRANCISCO, GRACE G;SHADLE, DAVID D;ANDOTHERS;REEL/FRAME:020405/0402
Effective date: 20080122
|Dec 9, 2014||AS||Assignment|
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034542/0001
Effective date: 20141014
|Feb 3, 2015||CC||Certificate of correction|
|Sep 2, 2015||FPAY||Fee payment|
Year of fee payment: 4