
Publication number: US20050132418 A1
Publication type: Application
Application number: US 11/051,347
Publication date: Jun 16, 2005
Filing date: Feb 4, 2005
Priority date: Jul 30, 1998
Also published as: CA2333460A1, CA2333460C, CN1169358C, CN1311955A, CN1314265C, CN1571496A, DE69935861D1, DE69935861T2, DE69938616D1, DE69938616T2, EP1101356A1, EP1101356B1, EP1729515A1, EP1729515B1, US6233389, US7529465, US8526781, US8948569, US9002173, US9521356, US9800823, US20010019658, US20020146233, US20070166001, US20090208185, US20140003791, US20150215572, US20160360147, US20170094218, WO2000007368A1
Inventors: James Barton, Roderick McInnis, Alan Moskowitz, Andrew Goodman, Ching Chow, Jean Kao
Original Assignee: TiVo Inc.
External Links: USPTO, USPTO Assignment, Espacenet
Multimedia time warping system
US 20050132418 A1
Abstract
A multimedia time warping system allows the user to store selected television broadcast programs while the user is simultaneously watching or reviewing another program. The system accepts television (TV) input streams in a multitude of forms, converts them to an encoded, formatted stream for internal transfer and manipulation, and parses the resulting stream. Events are recorded that indicate the type of component that has been found, where it is located, and when it occurred. The program logic is notified that an event has occurred, and the data is extracted from the buffers. The encoded streams are stored on a storage device, and a decoder converts the encoded stream into TV output signals. User control commands affect the flow of the encoded stream.
Images (13)
Claims (89)
1. An apparatus for storage and playback of audiovisual data, comprising:
a local storage device;
a first circuit, coupled to said local storage device, that parses an encoded stream of audiovisual data to generate event data, stores said event data, writes said encoded stream of audiovisual data to said local storage device, and utilizes said event data to facilitate reading said encoded stream of audiovisual data from said local storage device.
2. The apparatus according to claim 1, wherein said local storage device is a hard drive.
3. The apparatus according to claim 1, wherein said first circuit includes a media switch.
4. The apparatus according to claim 2, wherein said first circuit includes a media switch.
5. The apparatus according to claim 1, wherein said first circuit includes a central processing unit.
6. The apparatus according to claim 2, wherein said first circuit includes a central processing unit.
7. The apparatus according to claim 3, wherein said first circuit includes a central processing unit.
8. The apparatus according to claim 4, wherein said first circuit includes a central processing unit.
9. The apparatus according to claim 1, wherein said stream of audiovisual data is an MPEG stream.
10. The apparatus according to claim 2, wherein said stream of audiovisual data is an MPEG stream.
11. The apparatus according to claim 3, wherein said stream of audiovisual data is an MPEG stream.
12. The apparatus according to claim 4, wherein said stream of audiovisual data is an MPEG stream.
13. The apparatus according to claim 5, wherein said stream of audiovisual data is an MPEG stream.
14. The apparatus according to claim 6, wherein said stream of audiovisual data is an MPEG stream.
15. The apparatus according to claim 7, wherein said stream of audiovisual data is an MPEG stream.
16. The apparatus according to claim 8, wherein said stream of audiovisual data is an MPEG stream.
17. The apparatus according to claim 9, wherein said event data comprises data indicating the start of a video I-frame.
18. The apparatus according to claim 10, wherein said event data comprises data indicating the start of a video I-frame.
19. The apparatus according to claim 11, wherein said event data comprises data indicating the start of a video I-frame.
20. The apparatus according to claim 12, wherein said event data comprises data indicating the start of a video I-frame.
21. The apparatus according to claim 13, wherein said event data comprises data indicating the start of a video I-frame.
22. The apparatus according to claim 14, wherein said event data comprises data indicating the start of a video I-frame.
23. The apparatus according to claim 15, wherein said event data comprises data indicating the start of a video I-frame.
24. The apparatus according to claim 16, wherein said event data comprises data indicating the start of a video I-frame.
25. An apparatus for storage and playback of audiovisual data, comprising:
a local storage device;
a control unit; and
a first circuit, coupled to said local storage device and to said control unit, that accepts and parses an encoded stream of audiovisual data to generate event data, writes said event data into an event buffer, transmits said parsed encoded stream of audiovisual data to said local storage device, notifies said control unit when said event data are written into said event buffer, subsequently extracts said parsed encoded stream of audiovisual data from said local storage device, and transmits said extracted encoded stream of audiovisual data to a decoder, wherein said first circuit transmits said parsed encoded stream of audiovisual data to said local storage device autonomously with respect to said control unit.
26. An apparatus for storage and playback of audiovisual data, comprising:
a local storage device;
a central processing unit; and
a first circuit, coupled to said local storage device and to said central processing unit, comprising means for accepting an encoded stream of audiovisual data, means for parsing said encoded stream of audiovisual data to generate event data, means for writing said event data into an event buffer, means for transmitting said parsed encoded stream of audiovisual data to said local storage device, means for notifying said central processing unit when said event data are written to said event buffer, means for subsequently extracting said parsed encoded stream of audiovisual data from said local storage device, and means for transmitting said extracted encoded stream of audiovisual data to a decoder, wherein said first circuit transmits said parsed encoded stream of audiovisual data to said local storage device autonomously with respect to said central processing unit.
27. The apparatus of claims 25 or 26, wherein said stream of audiovisual data is an MPEG stream.
28. The apparatus according to claims 25 or 26, wherein said local storage device is a hard drive.
29. The apparatus according to claims 25 or 26, wherein said local storage device is a hard drive and said stream of audiovisual data is an MPEG stream.
30. The apparatus according to claim 27, wherein said event data comprises data indicating the start of a video I-frame.
31. The apparatus according to claim 29, wherein said event data comprises data indicating the start of a video I-frame.
32. An apparatus for storage and playback of audiovisual data, comprising:
a local storage device;
a central processing unit; and
a first circuit, coupled to said local storage device and asynchronously coupled to said central processing unit, comprising an input interface module for parsing an MPEG stream to generate event data and for writing said event data into an event buffer, a local storage device interface module for transmitting said parsed MPEG stream to said local storage device and for subsequently extracting said parsed MPEG stream from said local storage device, and an output interface module for transmitting said extracted MPEG stream to a decoder, wherein said first circuit transmits said parsed encoded stream of audiovisual data to said local storage device autonomously with respect to said central processing unit.
33. The apparatus according to claim 32, wherein said local storage device is a hard drive.
34. The apparatus of claim 32 or 33, wherein said central processing unit executes program logic and said first circuit communicates with said program logic via an interrupt mechanism.
35. The apparatus of claim 32, 33, or 34, further comprising a memory, wherein said first circuit further comprises a memory interface module for buffering said parsed MPEG stream into said memory.
36. The apparatus according to claim 32, wherein said event data comprises data indicating the start of a video I-frame.
37. A method for storage and playback of audiovisual data, comprising:
receiving an encoded stream of audiovisual data;
parsing said encoded stream of audiovisual data to generate event data;
writing said event data into an event buffer;
transmitting said encoded stream of audiovisual data from a first circuit to a local storage device;
transmitting said encoded stream of audiovisual data from said local storage device to said first circuit;
storing said encoded stream of audiovisual data in an output buffer; and
varying a flow of said encoded audiovisual data from said output buffer to an output device upon receipt of one or more control commands from a user.
38. A method for storage and playback of audiovisual data, comprising:
receiving an encoded stream of audiovisual data;
parsing said encoded stream of audiovisual data to generate event data;
transmitting said encoded stream of audiovisual data from a first circuit to a local storage device; and
utilizing said event data to facilitate subsequently reading said encoded stream of audiovisual data from said local storage device.
39. The method according to claims 37 or 38, wherein said local storage device is a hard drive.
40. The method according to claims 37 or 38, wherein said stream of audiovisual data is an MPEG stream.
41. The method according to claims 39 or 40, wherein said local storage device is a hard drive and said stream of audiovisual data is an MPEG stream.
42. The method according to claim 37, wherein said control commands comprise any of: a reverse control command, a play control command, a fast forward control command, or a pause control command.
43. The method according to claim 40, wherein said event data comprises data indicating the start of a video I-frame.
44. The method according to claim 41, wherein said event data comprises data indicating the start of a video I-frame.
45. A method for storage and playback of audiovisual data, comprising:
receiving an encoded stream of audiovisual data;
parsing said encoded stream of audiovisual data to generate event data;
storing said event data in an event buffer;
transmitting said encoded stream of audiovisual data from a first circuit to a local storage device; and
utilizing said stored event data to facilitate subsequently reading said encoded stream of audiovisual data from said local storage device.
46. A method for storage and playback of audiovisual data, comprising the steps of:
accepting an encoded stream of audiovisual data;
parsing said encoded stream of audiovisual data to generate event data;
storing said event data in an event buffer;
transmitting said parsed encoded stream of audiovisual data to a local storage device;
extracting said parsed encoded stream of audiovisual data from said local storage device;
transmitting said extracted encoded stream of audiovisual data to a decoder;
communicating asynchronously via an interrupt mechanism with program logic executed by a central processing unit to control said storage and playback of said audiovisual data.
47. The method according to claims 45 or 46, wherein said local storage device is a hard drive.
48. The method according to claims 45 or 46, wherein said stream of audiovisual data is an MPEG stream.
49. The method according to claims 45 or 46, wherein said local storage device is a hard drive and said stream of audiovisual data is an MPEG stream.
50. The method according to claims 45 or 46 further comprising the step of buffering said parsed encoded stream of audiovisual data into a memory.
51. The method according to claims 45 or 46, wherein said local storage device is a hard drive and said stream of audiovisual data is an MPEG stream, and further comprising the step of buffering said parsed encoded stream of audiovisual data into a memory.
52. The method according to claim 48, wherein said event data comprises data indicating the start of a video I-frame.
53. The method according to claim 49, wherein said event data comprises data indicating the start of a video I-frame.
54. The method according to claim 51, wherein said event data comprises data indicating the start of a video I-frame.
55. An apparatus for storage and playback of audiovisual data, comprising:
a local storage device;
a central processing unit executing program logic; and
a first circuit, coupled to said local storage device and to said central processing unit, for accepting an encoded stream of audiovisual data, parsing said encoded stream of audiovisual data to generate event data indicating the start of video or audio components within said encoded stream of audiovisual data, writing said event data into an event buffer, transmitting said parsed encoded stream of audiovisual data to said local storage device, subsequently extracting said parsed encoded stream of audiovisual data from said local storage device, transmitting said extracted encoded stream of audiovisual data to a decoder, and notifying said program logic when said event data are written into said event buffer, wherein said event data contains the location of said video or audio components in said event buffer.
56. An apparatus for storage and playback of audiovisual data, comprising:
a local storage device;
a central processing unit executing program logic; and
a first circuit, coupled to said local storage device and to said central processing unit, comprising means for accepting an encoded stream of audiovisual data, means for parsing said encoded stream of audiovisual data to generate event data indicating the start of video or audio components within said encoded stream of audiovisual data, means for writing said event data into an event buffer, means for transmitting said parsed encoded stream of audiovisual data to said local storage device, means for subsequently extracting said parsed encoded stream of audiovisual data from said local storage device, means for transmitting said extracted encoded stream of audiovisual data to a decoder, and means for notifying said program logic when said event data are written into said event buffer, wherein said event data contains the location of said video or audio components in said event buffer.
57. The apparatus according to claims 55 or 56, wherein said local storage device is a hard drive.
58. The apparatus according to claims 55 or 56, wherein said stream of audiovisual data is an MPEG stream.
59. The apparatus according to claims 55 or 56, wherein said local storage device is a hard drive and said stream of audiovisual data is an MPEG stream.
60. The apparatus according to claims 55 or 56, further comprising the step of buffering said parsed encoded stream of audiovisual data into a memory.
61. The apparatus according to claims 55 or 56, wherein said local storage device is a hard drive and said stream of audiovisual data is an MPEG stream, and further comprising the step of buffering said parsed encoded stream of audiovisual data into a memory.
62. The apparatus according to claim 58, wherein said event data comprises data indicating the start of a video I-frame.
63. The apparatus according to claim 59, wherein said event data comprises data indicating the start of a video I-frame.
64. The apparatus according to claim 61, wherein said event data comprises data indicating the start of a video I-frame.
65. An apparatus for storage and playback of audiovisual data, comprising:
a local storage device; and
a first circuit, coupled to said local storage device, that parses an encoded stream of audiovisual data to generate event data, stores said event data in an event buffer, writes said encoded stream of audiovisual data to said local storage device, subsequently transmits said encoded stream of audiovisual data from said local storage device to an output buffer, and varies a flow of said encoded audiovisual data from said output buffer to an output device upon receipt of one or more control commands from a user.
66. The apparatus according to claim 65, wherein said local storage device is a hard drive.
67. The apparatus according to claim 65, wherein said first circuit includes a media switch.
68. The apparatus according to claim 66, wherein said first circuit includes a media switch.
69. The apparatus according to claim 65, wherein said first circuit includes a central processing unit.
70. The apparatus according to claim 66, wherein said first circuit includes a central processing unit.
71. The apparatus according to claim 67, wherein said first circuit includes a central processing unit.
72. The apparatus according to claim 68, wherein said first circuit includes a central processing unit.
73. The apparatus according to claim 65, wherein said stream of audiovisual data is an MPEG stream.
74. The apparatus according to claim 66, wherein said stream of audiovisual data is an MPEG stream.
75. The apparatus according to claim 67, wherein said stream of audiovisual data is an MPEG stream.
76. The apparatus according to claim 68, wherein said stream of audiovisual data is an MPEG stream.
77. The apparatus according to claim 69, wherein said stream of audiovisual data is an MPEG stream.
78. The apparatus according to claim 70, wherein said stream of audiovisual data is an MPEG stream.
79. The apparatus according to claim 71, wherein said stream of audiovisual data is an MPEG stream.
80. The apparatus according to claim 72, wherein said stream of audiovisual data is an MPEG stream.
81. The apparatus according to claim 73, wherein said event data comprises data indicating the start of a video I-frame.
82. The apparatus according to claim 74, wherein said event data comprises data indicating the start of a video I-frame.
83. The apparatus according to claim 75, wherein said event data comprises data indicating the start of a video I-frame.
84. The apparatus according to claim 76, wherein said event data comprises data indicating the start of a video I-frame.
85. The apparatus according to claim 77, wherein said event data comprises data indicating the start of a video I-frame.
86. The apparatus according to claim 78, wherein said event data comprises data indicating the start of a video I-frame.
87. The apparatus according to claim 79, wherein said event data comprises data indicating the start of a video I-frame.
88. The apparatus according to claim 80, wherein said event data comprises data indicating the start of a video I-frame.
89. The apparatus according to claim 65, wherein said control commands comprise any of: a reverse control command, a play control command, a fast forward control command, or a pause control command.
Description
    CLAIM OF PRIORITY AND RELATED APPLICATION
  • [0001]
    This application is a continuation of and claims benefit to U.S. patent application Ser. No. 09/827,029 (Attorney Docket No. 60097-0026), filed Apr. 5, 2001, entitled “Multimedia Time Warping System”, which is a continuation of U.S. Pat. No. 6,233,389 B1, (Attorney Docket No. 60097-0025), issued on May 15, 2001, entitled “Multimedia Time Warping System”, the entire contents of which are incorporated by reference as if fully set forth herein; and is further related to U.S. patent application Ser. No. 09/935,426 (Attorney Docket No. 60097-0027), filed Aug. 22, 2001, entitled “Multimedia Signal Processing System” and U.S. patent application Ser. No. 10/081,776 (Attorney Docket No. 60097-0029), filed Feb. 20, 2002, entitled “Multimedia Time Warping System”.
  • BACKGROUND OF THE INVENTION
  • [0002]
    1. Technical Field
  • [0003]
    The invention relates to the time shifting of television broadcast signals. More particularly, the invention relates to the real time capture, storage, and display of television broadcast signals.
  • [0004]
    2. Description of the Prior Art
  • [0005]
    The Video Cassette Recorder (VCR) has changed the lives of television (TV) viewers throughout the world. The VCR has offered viewers the flexibility to time-shift TV programs to match their lifestyles.
  • [0006]
    The viewer stores TV programs onto magnetic tape using the VCR. The VCR gives the viewer the ability to play, rewind, fast forward and pause the stored program material. These functions enable the viewer to pause the program playback whenever he desires, fast forward through unwanted program material or commercials, and to replay favorite scenes. However, a VCR cannot both capture and play back information at the same time.
  • [0007]
    One approach to solving this problem is to use several VCRs. For example, if two video tape recorders are available, it might be possible to Ping-Pong between the two. In this case, the first recorder is started at the beginning of the program of interest. If the viewer wishes to rewind the broadcast, the second recorder begins recording, while the first recorder is halted, rewound to the appropriate place, and playback initiated. However, at least a third video tape recorder is required if the viewer wishes to fast forward to some point in time after the initial rewind was requested. In this case, the third recorder starts recording the broadcast stream while the second is halted and rewound to the appropriate position. Continuing this exercise, one can quickly see that the equipment becomes unwieldy, unreliable, expensive, and hard to operate, while never supporting all desired functions. In addition, tapes are of finite length, and may potentially end at inconvenient times, drastically lowering the value of the solution.
  • [0008]
The use of digital computer systems to solve this problem has been suggested. U.S. Pat. No. 5,371,551, issued to Logan et al. on 6 Dec. 1994, teaches a method for concurrent video recording and playback. It presents a microprocessor controlled broadcast and playback device. Said device compresses and stores video data onto a hard disk. However, this approach is difficult to implement because the processor requirements for keeping up with the high video rates make the device expensive and problematic. The microprocessor must be extremely fast to keep up with the incoming and outgoing video data.
  • [0009]
    It would be advantageous to provide a multimedia time warping system that gives the user the ability to simultaneously record and play back TV broadcast programs. It would further be advantageous to provide a multimedia time warping system that utilizes an approach that decouples the microprocessor from the high video data rates, thereby reducing the microprocessor and system requirements which are at a premium.
  • SUMMARY OF THE INVENTION
  • [0010]
    The invention provides a multimedia time warping system. The invention utilizes an easily manipulated, low cost multimedia storage and display system that allows the user to view a television broadcast program with the option of instantly reviewing previous scenes within the program. In addition, the invention allows the user to store selected television broadcast programs while the user is simultaneously watching or reviewing another program.
  • [0011]
An embodiment of the invention accepts television (TV) input streams in a multitude of forms, for example, analog forms such as National Television Standards Committee (NTSC) or PAL broadcast, and digital forms such as Digital Satellite System (DSS), Digital Broadcast Services (DBS), or Advanced Television Standards Committee (ATSC). Analog TV streams are converted to a Moving Pictures Experts Group (MPEG) formatted stream for internal transfer and manipulation, while pre-formatted MPEG streams are extracted from the digital TV signal and presented in a similar format to encoded analog streams.
  • [0012]
    The invention parses the resulting MPEG stream and separates it into its video and audio components. It then stores the components into temporary buffers. Events are recorded that indicate the type of component that has been found, where it is located, and when it occurred. The program logic is notified that an event has occurred and the data is extracted from the buffers.
  • [0013]
    The parser and event buffer decouple the CPU from having to parse the MPEG stream and from the real time nature of the data streams. This decoupling allows for slower CPU and bus speeds which translate to lower system costs.
  • [0014]
    The video and audio components are stored on a storage device. When the program is requested for display, the video and audio components are extracted from the storage device and reassembled into an MPEG stream. The MPEG stream is sent to a decoder. The decoder converts the MPEG stream into TV output signals and delivers the TV output signals to a TV receiver.
  • [0015]
    User control commands are accepted and sent through the system. These commands affect the flow of said MPEG stream and allow the user to view stored programs with at least the following functions: reverse, fast forward, play, pause, index, fast/slow reverse play, and fast/slow play.
  • [0016]
    Other aspects and advantages of the invention will become apparent from the following detailed description in combination with the accompanying drawings, illustrating, by way of example, the principles of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0017]
    FIG. 1 is a block schematic diagram of a high level view of an embodiment of the invention according to the invention;
  • [0018]
    FIG. 2 is a block schematic diagram of an embodiment of the invention using multiple input and output modules according to the invention;
  • [0019]
FIG. 3 is a schematic diagram of a Moving Pictures Experts Group (MPEG) data stream and its video and audio components according to the invention;
  • [0020]
    FIG. 4 is a block schematic diagram of a parser and four direct memory access (DMA) input engines contained in the Media Switch according to the invention;
  • [0021]
    FIG. 5 is a schematic diagram of the components of a packetized elementary stream (PES) buffer according to the invention;
  • [0022]
    FIG. 6 is a schematic diagram of the construction of a PES buffer from the parsed components in the Media Switch output circular buffers;
  • [0023]
    FIG. 7 is a block schematic diagram of the Media Switch and the various components that it communicates with according to the invention;
  • [0024]
    FIG. 8 is a block schematic diagram of a high level view of the program logic according to the invention;
  • [0025]
    FIG. 9 is a block schematic diagram of a class hierarchy of the program logic according to the invention;
  • [0026]
    FIG. 10 is a block schematic diagram of an embodiment of the clip cache component of the invention according to the invention;
  • [0027]
    FIG. 11 is a block schematic diagram of an embodiment of the invention that emulates a broadcast studio video mixer according to the invention;
  • [0028]
    FIG. 12 is a block schematic diagram of a closed caption parser according to the invention; and
  • [0029]
    FIG. 13 is a block schematic diagram of a high level view of an embodiment of the invention utilizing a VCR as an integral component of the invention according to the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • [0030]
    The invention is embodied in a multimedia time warping system. A system according to the invention provides a multimedia storage and display system that allows the user to view a television broadcast program with the option of instantly reviewing previous scenes within the program. The invention additionally provides the user with the ability to store selected television broadcast programs while simultaneously watching or reviewing another program and to view stored programs with at least the following functions: reverse, fast forward, play, pause, index, fast/slow reverse play, and fast/slow play.
  • [0031]
    Referring to FIG. 1, an embodiment of the invention has an Input Section 101, Media Switch 102, and an Output Section 103. The Input Section 101 takes television (TV) input streams in a multitude of forms, for example, National Television Standards Committee (NTSC) or PAL broadcast, and digital forms such as Digital Satellite System (DSS), Digital Broadcast Services (DBS), or Advanced Television Standards Committee (ATSC). DBS, DSS and ATSC are based on standards called Moving Pictures Experts Group 2 (MPEG2) and MPEG2 Transport. MPEG2 Transport is a standard for formatting the digital data stream from the TV source transmitter so that a TV receiver can disassemble the input stream to find programs in the multiplexed signal. The Input Section 101 produces MPEG streams. An MPEG2 transport multiplex supports multiple programs in the same broadcast channel, with multiple video and audio feeds and private data. The Input Section 101 tunes the channel to a particular program, extracts a specific MPEG program out of it, and feeds it to the rest of the system. Analog TV signals are encoded into a similar MPEG format using separate video and audio encoders, such that the remainder of the system is unaware of how the signal was obtained. Information may be modulated into the Vertical Blanking Interval (VBI) of the analog TV signal in a number of standard ways; for example, the North American Broadcast Teletext Standard (NABTS) may be used to modulate information onto lines 10 through 20 of an NTSC signal, while the FCC mandates the use of line 21 for Closed Caption (CC) and Extended Data Services (EDS). Such signals are decoded by the input section and passed to the other sections as if they were delivered via an MPEG2 private data channel.
  • [0032]
    The Media Switch 102 mediates between a microprocessor CPU 106, hard disk or storage device 105, and memory 104. Input streams are converted to an MPEG stream and sent to the Media Switch 102. The Media Switch 102 buffers the MPEG stream into memory. It then performs two operations if the user is watching real time TV: the stream is sent to the Output Section 103 and it is written simultaneously to the hard disk or storage device 105.
  • [0033]
The Output Section 103 takes MPEG streams as input and produces an analog TV signal according to the NTSC, PAL, or other required TV standards. The Output Section 103 contains an MPEG decoder, On-Screen Display (OSD) generator, analog TV encoder and audio logic. The OSD generator allows the program logic to supply images which will be overlaid on top of the resulting analog TV signal. Additionally, the Output Section can modulate information supplied by the program logic onto the VBI of the output signal in a number of standard formats, including NABTS, CC and EDS.
  • [0034]
With respect to FIG. 2, the invention easily expands to accommodate multiple Input Sections (tuners) 201, 202, 203, 204, each of which can be tuned to a different type of input. Multiple Output Modules (decoders) 206, 207, 208, 209 are added as well. Special effects such as picture-in-picture can be implemented with multiple decoders. The Media Switch 205 records one program while the user is watching another. This means that a stream can be extracted off the disk while another stream is being stored onto the disk.
  • [0035]
    Referring to FIG. 3, the incoming MPEG stream 301 has interleaved video 302, 305, 306 and audio 303, 304, 307 segments. These elements must be separated and recombined to create separate video 308 and audio 309 streams or buffers. This is necessary because separate decoders are used to convert MPEG elements back into audio or video analog components. Such separate delivery requires that time sequence information be generated so that the decoders may be properly synchronized for accurate playback of the signal.
  • [0036]
    The Media Switch enables the program logic to associate proper time sequence information with each segment, possibly embedding it directly into the stream. The time sequence information for each segment is called a time stamp. These time stamps are monotonically increasing and start at zero each time the system boots up. This allows the invention to find any particular spot in any particular video segment. For example, if the system needs to read five seconds into an incoming contiguous video stream that is being cached, the system simply has to start reading forward into the stream and look for the appropriate time stamp.
  • [0037]
    A binary search can be performed on a stored file to index into a stream. Each stream is stored as a sequence of fixed-size segments enabling fast binary searches because of the uniform timestamping. If the user wants to start in the middle of the program, the system performs a binary search of the stored segments until it finds the appropriate spot, obtaining the desired results with a minimal amount of information. If the signal were instead stored as an MPEG stream, it would be necessary to linearly parse the stream from the beginning to find the desired location.
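For illustration, a minimal C++ sketch of such an index lookup follows, assuming each fixed-size stored segment begins with a header carrying its monotonically increasing time stamp; the type names, segment size, and file layout are illustrative assumptions, not details taken from the patent.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>

// Assumed layout: the stream is stored as fixed-size segments, each starting
// with a header whose time stamp increases monotonically through the file.
constexpr std::size_t kSegmentSize = 128 * 1024;

struct SegmentHeader {
    uint64_t timestamp;  // monotonically increasing since system boot
    // ... segment payload follows, up to kSegmentSize bytes total
};

// Read only the header of segment `index` from the stored file.
bool readHeader(std::FILE* f, uint64_t index, SegmentHeader* hdr) {
    if (std::fseek(f, static_cast<long>(index * kSegmentSize), SEEK_SET) != 0)
        return false;
    return std::fread(hdr, sizeof(*hdr), 1, f) == 1;
}

// Return the index of the first segment whose time stamp is >= target.
// Because the segments are fixed size and uniformly time stamped, this takes
// O(log n) header reads instead of a linear parse of the whole stream.
uint64_t findSegment(std::FILE* f, uint64_t segmentCount, uint64_t target) {
    uint64_t lo = 0, hi = segmentCount;
    while (lo < hi) {
        uint64_t mid = lo + (hi - lo) / 2;
        SegmentHeader hdr{};
        if (!readHeader(f, mid, &hdr))
            break;
        if (hdr.timestamp < target)
            lo = mid + 1;
        else
            hi = mid;
    }
    return lo;
}
```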
  • [0038]
With respect to FIG. 4, the Media Switch contains four input Direct Memory Access (DMA) engines 402, 403, 404, 405. Each DMA engine has an associated buffer 410, 411, 412, 413. Conceptually, each DMA engine has a pointer 406, a limit for that pointer 407, a next pointer 408, and a limit for the next pointer 409. Each DMA engine is dedicated to a particular type of information, for example, video 402, audio 403, and parsed events 405. The buffers 410, 411, 412, 413 are circular and collect the specific information. The DMA engine increments the pointer 406 into the associated buffer until it reaches the limit 407 and then loads the next pointer 408 and limit 409. Setting the pointer 406 and next pointer 408 to the same value, along with the corresponding limit values, creates a circular buffer. The next pointer 408 can be set to a different address to provide vector DMA.
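A minimal sketch of one such DMA engine and its pointer/limit registers, in C++ for illustration; the register model and names are assumptions based on the description above, not the actual hardware interface.

```cpp
#include <cstddef>
#include <cstdint>

// Each engine holds a pointer/limit pair plus a "next" pair that is loaded
// when the current region is exhausted. Pointing both pairs at the same
// region yields a circular buffer; pointing them at different regions
// provides vector (scatter) DMA.
struct DmaEngine {
    uint8_t* pointer;      // current write position (406)
    uint8_t* limit;        // end of the current region (407)
    uint8_t* nextPointer;  // region loaded when the limit is reached (408)
    uint8_t* nextLimit;    // limit for the next region (409)

    // Deposit one byte of the incoming stream into the engine's buffer.
    void put(uint8_t byte) {
        if (pointer == limit) {    // current region exhausted:
            pointer = nextPointer; // reload from the "next" pair
            limit = nextLimit;
        }
        *pointer++ = byte;
    }
};

// Configure an engine as a simple circular buffer over [base, base + size).
void makeCircular(DmaEngine& engine, uint8_t* base, std::size_t size) {
    engine.pointer = engine.nextPointer = base;
    engine.limit = engine.nextLimit = base + size;
}
```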
  • [0039]
    The input stream flows through a parser 401. The parser 401 parses the stream looking for MPEG distinguished events indicating the start of video, audio or private data segments. For example, when the parser 401 finds a video event, it directs the stream to the video DMA engine 402. The parser 401 buffers up data and DMAs it into the video buffer 410 through the video DMA engine 402. At the same time, the parser 401 directs an event to the event DMA engine 405 which generates an event into the event buffer 413. When the parser 401 sees an audio event, it redirects the byte stream to the audio DMA engine 403 and generates an event into the event buffer 413. Similarly, when the parser 401 sees a private data event, it directs the byte stream to the private data DMA engine 404 and directs an event to the event buffer 413. The Media Switch notifies the program logic via an interrupt mechanism when events are placed in the event buffer.
  • [0040]
    Referring to FIGS. 4 and 5, the event buffer 413 is filled by the parser 401 with events. Each event 501 in the event buffer has an offset 502, event type 503, and time stamp field 504. The parser 401 provides the type and offset of each event as it is placed into the buffer. For example, when an audio event occurs, the event type field is set to an audio event and the offset indicates the location in the audio buffer 411. The program logic knows where the audio buffer 411 starts and adds the offset to find the event in the stream. The address offset 502 tells the program logic where the next event occurred, but not where it ended. The previous event is cached so the end of the current event can be found as well as the length of the segment.
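The event record and the offset-to-address computation can be sketched as follows; the offset, type, and time stamp fields come from the description above, while the field widths and helper names are illustrative assumptions.

```cpp
#include <cstddef>
#include <cstdint>

// One event record as described above (FIG. 5): offset 502, type 503,
// time stamp 504. Field widths are assumptions for illustration.
struct Event {
    uint32_t offset;     // byte offset of the component in its circular buffer
    uint16_t type;       // e.g. start of video I-frame, audio PES, private data
    uint64_t timestamp;  // generated time stamp for the segment
};

// The program logic adds the offset to the base of the buffer the event
// refers to in order to locate the component in the stream.
uint8_t* eventAddress(uint8_t* bufferBase, const Event& e) {
    return bufferBase + e.offset;
}

// The previously cached event is closed off by the next event: its segment
// length is the distance between the two offsets (assuming no wrap of the
// circular buffer between them).
std::size_t segmentLength(const Event& cachedPrevious, const Event& current) {
    return current.offset - cachedPrevious.offset;
}
```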
  • [0041]
With respect to FIGS. 5 and 6, the program logic reads accumulated events in the event buffer 602 when it is interrupted by the Media Switch 601. From these events the program logic generates a sequence of logical segments 603 which correspond to the parsed MPEG segments 615. The program logic converts the offset 502 into the actual address 610 of each segment, and records the event length 609 using the last cached event. If the stream was produced by encoding an analog signal, it will not contain Presentation Time Stamp (PTS) values, which are used by the decoders to properly present the resulting output. Thus, the program logic uses the generated time stamp 504 to calculate a simulated PTS for each segment and places that into the logical segment timestamp 607. In the case of a digital TV stream, PTS values are already encoded in the stream. The program logic extracts this information and places it in the logical segment timestamp 607.
  • [0042]
The program logic continues collecting logical segments 603 until it reaches the fixed buffer size. When this occurs, the program logic generates a new buffer, called a Packetized Elementary Stream (PES) 605 buffer, containing these logical segments 603 in order, plus ancillary control information. Each logical segment points 604 directly to the circular buffer, e.g., the video buffer 613, filled by the Media Switch 601. This new buffer is then passed to other logic components, which may further process the stream in the buffer in some way, such as presenting it for decoding or writing it to the storage media. Thus, the MPEG data is not copied from one location in memory to another by the processor. This results in a more cost-effective design since lower memory bandwidth and processor bandwidth are required.
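A hedged sketch of the resulting data structures: a logical segment records where a parsed piece lives, and a PES buffer is simply an ordered list of such segments plus control data, so the payload itself is never copied in DRAM. The structure and member names are assumptions for illustration.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// A logical segment points directly into a Media Switch circular buffer.
struct LogicalSegment {
    uint8_t*    address;    // actual address of the parsed component (610)
    std::size_t length;     // event length recorded from the cached event (609)
    uint64_t    timestamp;  // real or simulated PTS for this segment (607)
};

// A PES buffer collects logical segments in order, plus ancillary control
// information; downstream logic hands this descriptor around instead of
// copying the underlying MPEG data.
struct PesBuffer {
    std::vector<LogicalSegment> segments;
    // ... ancillary control information would go here
};
```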
  • [0043]
    A unique feature of the MPEG stream transformation into PES buffers is that the data associated with logical segments need not be present in the buffer itself, as presented above. When a PES buffer is written to storage, these logical segments are written to the storage medium in the logical order in which they appear. This has the effect of gathering components of the stream, whether they be in the video, audio or private data circular buffers, into a single linear buffer of stream data on the storage medium. The buffer is read back from the storage medium with a single transfer from the storage media, and the logical segment information is updated to correspond with the actual locations in the buffer 606. Higher level program logic is unaware of this transformation, since it handles only the logical segments, thus stream data is easily managed without requiring that the data ever be copied between locations in DRAM by the CPU.
  • [0044]
    A unique aspect of the Media Switch is the ability to handle high data rates effectively and inexpensively. It performs the functions of taking video and audio data in, sending video and audio data out, sending video and audio data to disk, and extracting video and audio data from the disk on a low cost platform. Generally, the Media Switch runs asynchronously and autonomously with the microprocessor CPU, using its DMA capabilities to move large quantities of information with minimal intervention by the CPU.
  • [0045]
    Referring to FIG. 7, the input side of the Media Switch 701 is connected to an MPEG encoder 703. There are also circuits specific to MPEG audio 704 and vertical blanking interval (VBI) data 702 feeding into the Media Switch 701. If a digital TV signal is being processed instead, the MPEG encoder 703 is replaced with an MPEG2 Transport Demultiplexor, and the MPEG audio encoder 704 and VBI decoder 702 are deleted. The demultiplexor multiplexes the extracted audio, video and private data channel streams through the video input Media Switch port.
  • [0046]
    The parser 705 parses the input data stream from the MPEG encoder 703, audio encoder 704 and VBI decoder 702, or from the transport demultiplexor in the case of a digital TV stream. The parser 705 detects the beginning of all of the important events in a video or audio stream, the start of all of the frames, the start of sequence headers—all of the pieces of information that the program logic needs to know about in order to both properly play back and perform special effects on the stream, e.g. fast forward, reverse, play, pause, fast/slow play, indexing, and fast/slow reverse play.
  • [0047]
The parser 705 places tags 707 into the FIFO 706 when it identifies video or audio segments, or is given private data. The DMA 709 controls when these tags are taken out. The tags 707 and the DMA addresses of the segments are placed into the event queue 708. The frame type information, whether it is a start of a video I-frame, video B-frame, video P-frame, video PES, audio PES, a sequence header, an audio frame, or private data packet, is placed into the event queue 708 along with the offset in the related circular buffer where the piece of information was placed. The program logic operating in the CPU 713 examines events in the circular buffer after they are transferred to the DRAM 714.
  • [0048]
The Media Switch 701 has a data bus 711 that connects to the CPU 713 and DRAM 714. An address bus 712 is also shared between the Media Switch 701, CPU 713, and DRAM 714. A hard disk or storage device 710 is connected to one of the ports of the Media Switch 701. The Media Switch 701 outputs streams to an MPEG video decoder 715 and a separate audio decoder 717. The signals sent to the audio decoder 717 contain audio cues generated by the system in response to the user's commands on a remote control or to other internal events. The decoded audio output from the MPEG decoder is digitally mixed 718 with the separate audio signal. The resulting signals contain video, audio, and on-screen displays and are sent to the TV 716.
  • [0049]
The Media Switch 701 takes in 8-bit data and sends it to the disk, while at the same time extracting another stream of data off the disk and sending it to the MPEG decoder 715. All of the DMA engines described above can be working at the same time. The Media Switch 701 can be implemented in hardware using a Field Programmable Gate Array (FPGA), ASIC, or discrete logic.
  • [0050]
Rather than having to parse through an immense data stream looking for the start of where each frame would be, the program logic only has to look at the circular event buffer in DRAM 714 and it can tell where the start of each frame is and the frame type. This approach saves a large amount of CPU power, keeping the real time requirements of the CPU 713 small. The CPU 713 does not have to be very fast at any point in time. The Media Switch 701 gives the CPU 713 as much time as possible to complete tasks. The parsing mechanism 705 and event queue 708 decouple the CPU 713 from parsing the audio, video, and event buffers and from the real time nature of the streams, which allows for lower costs. It also allows the use of a bus structure in a CPU environment that operates at a much lower clock rate with much cheaper memory than would be required otherwise.
  • [0051]
The CPU 713 has the ability to queue up one DMA transfer and can set up the next DMA transfer at its leisure. This gives the CPU 713 large time intervals within which it can service the DMA controller 709. The CPU 713 may respond to a DMA interrupt within a larger time window because of the large latency allowed. MPEG streams, whether extracted from an MPEG2 Transport or encoded from an analog TV signal, are typically encoded using a technique called Variable Bit Rate encoding (VBR). This technique varies the amount of data required to represent a sequence of images by the amount of movement between those images. This technique can greatly reduce the required bandwidth for a signal; however, sequences with rapid movement (such as a basketball game) may be encoded with much greater bandwidth requirements. For example, the Hughes DirecTV satellite system encodes signals with anywhere from 1 to 10 Mb/s of required bandwidth, varying from frame to frame. It would be difficult for any computer system to keep up with such rapidly varying data rates without this structure.
  • [0052]
With respect to FIG. 8, the program logic within the CPU has three conceptual components: sources 801, transforms 802, and sinks 803. The sources 801 produce buffers of data. Transforms 802 process buffers of data and sinks 803 consume buffers of data. A transform is responsible for allocating and queuing the buffers of data on which it will operate. Buffers are allocated as "empty" to sources of data, which give them back "full". The buffers are then queued and given to sinks as "full", and the sink will return the buffer "empty".
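The buffer handshake between sources, transforms, and sinks can be expressed as a small interface. The method names (allocEmptyBuf, pushFullBuf, nextFullBuf, releaseEmptyBuf) follow the description in this document, but the signatures and the Buffer type are assumptions, not the actual TiVo class definitions.

```cpp
// Minimal sketch of the buffer handshake; signatures are assumptions.
class Buffer;  // opaque buffer of stream data

class Transform {
public:
    virtual ~Transform() = default;

    // Upstream (source) side: hand out an empty buffer, accept it back full.
    virtual Buffer* allocEmptyBuf() = 0;           // may block if memory is tight
    virtual void pushFullBuf(Buffer* buf) = 0;

    // Downstream (sink) side: hand out a full buffer, accept it back empty.
    virtual Buffer* nextFullBuf() = 0;             // may block until data is ready
    virtual void releaseEmptyBuf(Buffer* buf) = 0;
};
```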
  • [0053]
    A source 801 accepts data from encoders, e.g., a digital satellite receiver. It acquires buffers for this data from the downstream transform, packages the data into a buffer, then pushes the buffer down the pipeline as described above. The source object 801 does not know anything about the rest of the system. The sink 803 consumes buffers, taking a buffer from the upstream transform, sending the data to the decoder, and then releasing the buffer for reuse.
  • [0054]
    There are two types of transforms 802 used: spatial and temporal. Spatial transforms are transforms that perform, for example, an image convolution or compression/decompression on the buffered data that is passing through. Temporal transforms are used when there is no time relation that is expressible between buffers going in and buffers coming out of a system. Such a transform writes the buffer to a file 804 on the storage medium. The buffer is pulled out at a later time, sent down the pipeline, and properly sequenced within the stream.
  • [0055]
    Referring to FIG. 9, a C++ class hierarchy derivation of the program logic is shown. The TiVo Media Kernel (Tmk) 904, 908, 913 mediates with the operating system kernel. The kernel provides operations such as: memory allocation, synchronization, and threading. The TmkCore 904, 908, 913 structures memory taken from the media kernel as an object. It provides operators, new and delete, for constructing and deconstructing the object. Each object (source 901, transform 902, and sink 903) is multi-threaded by definition and can run in parallel.
  • [0056]
    The TmkPipeline class 905, 909, 914 is responsible for flow control through the system. The pipelines point to the next pipeline in the flow from source 901 to sink 903. To pause the pipeline, for example, an event called “pause” is sent to the first object in the pipeline. The event is relayed on to the next object and so on down the pipeline. This all happens asynchronously to the data going through the pipeline. Thus, similar to applications such as telephony, control of the flow of MPEG streams is asynchronous and separate from the streams themselves. This allows for a simple logic design that is at the same time powerful enough to support the features described previously, including pause, rewind, fast forward and others. In addition, this structure allows fast and efficient switching between stream sources, since buffered data can be simply discarded and decoders reset using a single event, after which data from the new stream will pass down the pipeline. Such a capability is needed, for example, when switching the channel being captured by the input section, or when switching between a live signal from the input section and a stored stream.
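A minimal sketch of this asynchronous control path, with events relayed node to node independently of the data moving through the pipeline; the class shape is an assumption based on the TmkPipeline description, not the actual implementation.

```cpp
#include <string>

// Control events ("pause", "play", "reset", "switch", ...) travel from the
// first object toward the sink, separately from the buffers of stream data.
class PipelineNode {
public:
    explicit PipelineNode(PipelineNode* next = nullptr) : next_(next) {}
    virtual ~PipelineNode() = default;

    virtual void handleEvent(const std::string& event) {
        // Default behavior: take no local action, just relay downstream.
        if (next_ != nullptr)
            next_->handleEvent(event);
    }

private:
    PipelineNode* next_;  // next object in the flow from source to sink
};
```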
  • [0057]
The source object 901 is a TmkSource 906 and the transform object 902 is a TmkXfrm 910. These are intermediate classes that define standard behaviors for the classes in the pipeline. Conceptually, they handshake buffers down the pipeline. The source object 901 takes data out of a physical data source, such as the Media Switch, and places it into a PES buffer. To obtain the buffer, the source object 901 asks the downstream object in its pipeline for a buffer (allocEmptyBuf). The source object 901 is blocked until there is sufficient memory. This means that the pipeline is self-regulating; it has automatic flow control. When the source object 901 has filled up the buffer, it hands it back to the transform 902 through the pushFullBuf function.
  • [0058]
    The sink 903 is flow controlled as well. It calls nextFullBuf which tells the transform 902 that it is ready for the next filled buffer. This operation can block the sink 903 until a buffer is ready. When the sink 903 is finished with a buffer (i.e., it has consumed the data in the buffer) it calls releaseEmptyBuf. ReleaseEmptyBuf gives the buffer back to the transform 902. The transform 902 can then hand that buffer, for example, back to the source object 901 to fill up again. In addition to the automatic flow-control benefit of this method, it also provides for limiting the amount of memory dedicated to buffers by allowing enforcement of a fixed allocation of buffers by a transform. This is an important feature in achieving a cost-effective limited DRAM environment.
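Continuing the hypothetical Transform interface sketched after the FIG. 8 discussion, the self-regulating flow control can be illustrated with the two pump loops below: each side simply blocks inside the transform when no buffer is available.

```cpp
// Illustrative pump loops built on the Transform interface sketched earlier.
void sourceLoop(Transform& downstream /*, physical data source */) {
    for (;;) {
        Buffer* buf = downstream.allocEmptyBuf();  // blocks until memory is free
        // ... fill buf from the physical source (e.g. the Media Switch) ...
        downstream.pushFullBuf(buf);               // hand the full buffer back
    }
}

void sinkLoop(Transform& upstream /*, hardware decoder */) {
    for (;;) {
        Buffer* buf = upstream.nextFullBuf();      // blocks until data is ready
        // ... send the buffer's contents to the decoder ...
        upstream.releaseEmptyBuf(buf);             // return the buffer for reuse
    }
}
```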
  • [0059]
    The MediaSwitch class 909 calls the allocEmptyBuf method of the TmkClipCache 912 object and receives a PES buffer from it. It then goes out to the circular buffers in the Media Switch hardware and generates PES buffers. The MediaSwitch class 909 fills the buffer up and pushes it back to the TmkClipCache 912 object.
  • [0060]
    The TmkClipCache 912 maintains a cache file 918 on a storage medium. It also maintains two pointers into this cache: a push pointer 919 that shows where the next buffer coming from the source 901 is inserted; and a current pointer 920 which points to the current buffer used.
  • [0061]
    The buffer that is pointed to by the current pointer is handed to the Vela decoder class 916. The Vela decoder class 916 talks to the decoder 921 in the hardware. The decoder 921 produces a decoded TV signal that is subsequently encoded into an analog TV signal in NTSC, PAL or other analog format. When the Vela decoder class 916 is finished with the buffer it calls releaseEmptyBuf.
  • [0062]
    The structure of the classes makes the system easy to test and debug. Each level can be tested separately to make sure it performs in the appropriate manner, and the classes may be gradually aggregated to achieve the desired functionality while retaining the ability to effectively test each object.
  • [0063]
The control object 917 accepts commands from the user and sends events into the pipeline to control what the pipeline is doing. For example, if the user has a remote control and is watching TV, the user presses pause, and the control object 917 sends an event to the sink 903 that tells it to pause. The sink 903 stops asking for new buffers. The current pointer 920 stays where it is. The sink 903 starts taking buffers out again when it receives another event that tells it to play. The system is in perfect synchronization; it starts from the frame that it stopped at.
  • [0064]
    The remote control may also have a fast forward key. When the fast forward key is pressed, the control object 917 sends an event to the transform 902 that tells it to move forward two seconds. The transform 902 finds that the two second time span requires it to move forward three buffers. It then issues a reset event to the downstream pipeline, so that any queued data or state that may be present in the hardware decoders is flushed. This is a critical step, since the structure of MPEG streams requires maintenance of state across multiple frames of data, and that state will be rendered invalid by repositioning the pointer. It then moves the current pointer 920 forward three buffers. The next time the sink 903 calls nextFullBuf it gets the new current buffer. The same method works for fast reverse in that the transform 902 moves the current pointer 920 backwards.
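A hedged sketch of the skip-forward handling described above; the class, member, and timing values are illustrative assumptions, not the patent's implementation.

```cpp
#include <cmath>
#include <cstdint>

// Downstream side of the pipeline, able to flush queued data and reset the
// hardware decoders (state becomes invalid once playback is repositioned).
class DownstreamPipeline {
public:
    void reset() { /* issue a reset event down the pipeline */ }
};

// Cache transform holding the current-buffer pointer into the stored stream.
class CacheTransform {
public:
    explicit CacheTransform(double secondsPerBuffer)
        : secondsPerBuffer_(secondsPerBuffer), currentBuffer_(0) {}

    // Handle a "move forward N seconds" event from the control object.
    void skipForward(double seconds, DownstreamPipeline& downstream) {
        // e.g. a 2-second skip over ~0.7 s buffers advances three buffers
        const uint64_t skip =
            static_cast<uint64_t>(std::ceil(seconds / secondsPerBuffer_));
        downstream.reset();      // flush stale data and decoder state first
        currentBuffer_ += skip;  // the next nextFullBuf call serves this buffer
    }

private:
    double   secondsPerBuffer_;
    uint64_t currentBuffer_;
};
```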
  • [0065]
    A system clock reference resides in the decoder. The system clock reference is sped up for fast play or slowed down for slow play. The sink simply asks for full buffers faster or slower, depending on the clock speed.
  • [0066]
With respect to FIG. 10, two other objects derived from the TmkXfrm class are placed in the pipeline for disk access. One is called TmkClipReader 1003 and the other is called TmkClipWriter 1001. Buffers come into the TmkClipWriter 1001 and are pushed to a file on a storage medium 1004. TmkClipReader 1003 asks for buffers which are taken off a file on a storage medium 1005. A TmkClipWriter 1001 provides only the allocEmptyBuf and pushFullBuf methods, while a TmkClipReader 1003 provides only the nextFullBuf and releaseEmptyBuf methods. A TmkClipWriter 1001 therefore performs the same function as the input, or "push", side of a TmkClipCache 1002, while a TmkClipReader 1003 performs the same function as the output, or "pull", side of a TmkClipCache 1002.
  • [0067]
Referring to FIG. 11, an embodiment that accomplishes multiple functions is shown. A source 1101 has a TV signal input. The source sends data to a PushSwitch 1102, which is a transform derived from TmkXfrm. The PushSwitch 1102 has multiple outputs that can be switched by the control object 1114. This means that one part of the pipeline can be stopped and another can be started at the user's whim. The user can switch to different storage devices. The PushSwitch 1102 could output to a TmkClipWriter 1106, which writes to a storage device 1107, or to the cache transform 1103.
  • [0068]
    An important feature of this apparatus is the ease with which it can selectively capture portions of an incoming signal under the control of program logic. Based on information such as the current time, or perhaps a specific time span, or perhaps via a remote control button press by the viewer, a TmkClipWriter 1106 may be switched on to record a portion of the signal, and switched off at some later time. This switching is typically caused by sending a “switch” event to the PushSwitch 1102 object.
  • [0069]
    An additional method for triggering selective capture is through information modulated into the VBI or placed into an MPEG private data channel. Data decoded from the VBI or private data channel is passed to the program logic. The program logic examines this data to determine if the data indicates that capture of the TV signal into which it was modulated should begin. Similarly, this information may also indicate when recording should end, or another data item may be modulated into the signal indicating when the capture should end. The starting and ending indicators may be explicitly modulated into the signal or other information that is placed into the signal in a standard fashion may be used to encode this information.
  • [0070]
With respect to FIG. 12, an example is shown which demonstrates how the program logic scans the words contained within the closed caption (CC) fields to determine starting and ending times, using particular words or phrases to trigger the capture. A stream of NTSC or PAL fields 1201 is presented. CC bytes are extracted from each odd field 1202, and entered in a circular buffer 1203 for processing by the Word Parser 1204. The Word Parser 1204 collects characters until it encounters a word boundary, usually a space, period or other delineating character. Recall from above that the MPEG audio and video segments are collected into a series of fixed-size PES buffers. A special segment is added to each PES buffer to hold the words extracted from the CC field 1205. Thus, the CC information is preserved in time synchronization with the audio and video, and can be correctly presented to the viewer when the stream is displayed. This also allows the stored stream to be processed for CC information at the leisure of the program logic, which spreads out load, reducing cost and improving efficiency. In such a case, the words stored in the special segment are simply passed to the state table logic 1206.
  • [0071]
    During stream capture, each word is looked up in a table 1206 which indicates the action to take on recognizing that word. This action may simply change the state of the recognizer state machine 1207, or may cause the state machine 1207 to issue an action request, such as “start capture”, “stop capture”, “phrase seen”, or other similar requests. Indeed, a recognized word or phrase may cause the pipeline to be switched; for example, to overlay a different audio track if undesirable language is used in the program.
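For illustration, a minimal C++ sketch of such a word-trigger table follows; the action names and table contents are assumptions, since the patent describes only the mechanism, not specific trigger words.

```cpp
#include <string>
#include <unordered_map>

// Actions the recognizer state machine may request when a word is seen.
enum class Action { None, StartCapture, StopCapture, PhraseSeen };

class CaptionRecognizer {
public:
    // Register a trigger word and the action it should produce.
    void addTrigger(const std::string& word, Action action) {
        table_[word] = action;
    }

    // Feed one word extracted from the closed caption field by the word
    // parser; unknown words leave the recognizer state unchanged.
    Action onWord(const std::string& word) const {
        const auto it = table_.find(word);
        return it == table_.end() ? Action::None : it->second;
    }

private:
    std::unordered_map<std::string, Action> table_;  // parsing state table
};
```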
  • [0072]
    Note that the parsing state table 1206 and recognizer state machine 1207 may be modified or changed at any time. For example, a different table and state machine may be provided for each input channel. Alternatively, these elements may be switched depending on the time of day, or because of other events.
  • [0073]
    Referring again to FIG. 11, a PullSwitch 1104 is added which outputs to the sink 1105. The sink 1105 calls nextFullBuf and releaseEmptyBuf to get or return buffers from the PullSwitch 1104. The PullSwitch 1104 can have any number of inputs. One input could be an ActionClip 1113. The remote control can switch between input sources. The control object 1114 sends an event to the PullSwitch 1104, telling it to switch. It then switches from the current input source to whichever input source the control object selects.
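    The sketch below illustrates this pull-side arrangement. The nextFullBuf and releaseEmptyBuf names come from the description above, but the signatures, the PullSource base class, and the simplification of returning buffers to the currently active input are assumptions made for illustration.

#include <cstddef>
#include <memory>
#include <vector>

struct Buf { std::vector<unsigned char> data; };

// Assumed pull-side interface: the sink asks for full buffers and hands
// empty ones back upstream.
class PullSource {
public:
    virtual ~PullSource() = default;
    virtual std::shared_ptr<Buf> nextFullBuf() = 0;
    virtual void releaseEmptyBuf(std::shared_ptr<Buf> b) = 0;
};

// PullSwitch: the sink pulls from it, and a "switch" event from the control
// object selects which upstream source the buffers are drawn from.
class PullSwitch : public PullSource {
public:
    void addInput(std::shared_ptr<PullSource> in) { inputs_.push_back(std::move(in)); }
    void switchTo(std::size_t i) { if (i < inputs_.size()) active_ = i; }   // "switch" event

    std::shared_ptr<Buf> nextFullBuf() override {
        return inputs_[active_]->nextFullBuf();
    }
    void releaseEmptyBuf(std::shared_ptr<Buf> b) override {
        // Simplification: empty buffers go back to the currently active input.
        inputs_[active_]->releaseEmptyBuf(std::move(b));
    }

private:
    std::vector<std::shared_ptr<PullSource>> inputs_;
    std::size_t active_ = 0;
};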
  • [0074]
    An ActionClip class provides for sequencing a number of different stored signals in a predictable and controllable manner, possibly with the added control of viewer selection via a remote control. Thus, it appears as a derivative of a TmkXfrm object that accepts a “switch” event for switching to the next stored signal.
  • [0075]
    This allows the program logic or user to create custom sequences of video output. Any number of video segments can be lined up and combined as if the program logic or user were using a broadcast studio video mixer. TmkClipReaders 1108, 1109, 1110 are allocated and each is hooked into the PullSwitch 1104. The PullSwitch 1104 switches between the TmkClipReaders 1108, 1109, 1110 to combine video and audio clips. Flow control is automatic because of the way the pipeline is constructed. The Push and Pull Switches are analogous to the video switchers used in a broadcast studio.
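    The following sketch shows the sequencing idea in simplified form: several stored clips are lined up and the next is selected automatically when the current one is exhausted. The ClipReader and ClipSequencer classes are illustrative stand-ins, not the actual TmkClipReader or ActionClip implementations.

#include <cstddef>
#include <memory>
#include <optional>
#include <string>
#include <vector>

// Minimal stand-in for a stored-clip reader: yields the next buffer of its
// clip, or nothing once the clip is exhausted.
class ClipReader {
public:
    explicit ClipReader(std::vector<std::string> bufs) : bufs_(std::move(bufs)) {}
    std::optional<std::string> nextFullBuf() {
        if (pos_ >= bufs_.size()) return std::nullopt;
        return bufs_[pos_++];
    }
private:
    std::vector<std::string> bufs_;
    std::size_t pos_ = 0;
};

// Plays the role the ActionClip / PullSwitch combination plays here: stored
// clips are consumed in order, switching to the next when one runs out.
class ClipSequencer {
public:
    void add(std::shared_ptr<ClipReader> r) { readers_.push_back(std::move(r)); }
    std::optional<std::string> nextFullBuf() {
        while (current_ < readers_.size()) {
            if (auto b = readers_[current_]->nextFullBuf()) return b;
            ++current_;                 // "switch" to the next stored clip
        }
        return std::nullopt;            // all clips consumed
    }
private:
    std::vector<std::shared_ptr<ClipReader>> readers_;
    std::size_t current_ = 0;
};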
  • [0076]
    The derived class and resulting objects described here may be combined in an arbitrary way to create a number of different useful configurations for storing, retrieving, switching and viewing of TV streams. For example, if multiple input and output sections are available, one input is viewed while another is stored, and a picture-in-picture window generated by the second output is used to preview previously stored streams. Such configurations represent a unique and novel application of software transformations to achieve the functionality expected of expensive, sophisticated hardware solutions within a single cost-effective device.
  • [0077]
    With respect to FIG. 13, a high-level system view is shown which implements a VCR backup. The Output Module 1303 sends TV signals to the VCR 1307. This allows the user to record TV programs directly onto video tape. The invention allows the user to queue up programs from disk to be recorded onto video tape and to schedule the time that the programs are sent to the VCR 1307. Title pages (EPG data) can be sent to the VCR 1307 before a program is sent. Longer programs can be scaled to fit onto shorter video tapes by speeding up the play speed or dropping frames.
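    As a simple illustration of that scaling step, the required speed-up factor (and the equivalent fraction of frames to drop) can be computed from the program and tape lengths; the helpers below are an assumption-level sketch, not part of the described system.

#include <algorithm>

// Play-speed factor needed so that a program of programMinutes fits within
// tapeMinutes.  For example, a 130-minute program on a 120-minute tape needs
// roughly a 1.083x play speed.
double requiredSpeedup(double programMinutes, double tapeMinutes) {
    return std::max(1.0, programMinutes / tapeMinutes);
}

// Equivalent fraction of frames to drop to achieve the same fit at normal
// play speed (about 7.7% in the example above).
double frameDropFraction(double programMinutes, double tapeMinutes) {
    double s = requiredSpeedup(programMinutes, tapeMinutes);
    return 1.0 - 1.0 / s;
}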
  • [0078]
    The VCR 1307 output can also be routed back into the Input Module 1301. In this configuration the VCR acts as a backup system for the Media Switch 1302. Any overflow storage or lower priority programming is sent to the VCR 1307 for later retrieval.
  • [0079]
    The Input Module 1301 can decode and pass to the remainder of the system information encoded on the Vertical Blanking Interval (VBI). The Output Module 1303 can encode into the output VBI data provided by the remainder of the system. The program logic may arrange to encode identifying information of various kinds into the output signal, which will be recorded onto tape using the VCR 1307. Playing this tape back into the input allows the program logic to read back this identifying information, such that the TV signal recorded on the tape is properly handled. For example, a particular program may be recorded to tape along with information about when it was recorded, the source network, etc. When this program is played back into the Input Module, this information can be used to control storage of the signal, presentation to the viewer, etc.
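    A hypothetical sketch of such an identifying record follows; the field layout and text encoding are assumptions for illustration, since the actual format of the information encoded into the output VBI is not specified here.

#include <string>

// Hypothetical identifying record written into the output VBI before a program
// is sent to the VCR, and parsed again when the tape is played back into the
// Input Module so the recorded signal can be handled properly.
struct TapeLabel {
    std::string sourceNetwork;
    std::string recordedAt;      // e.g. an ISO-8601 timestamp
};

std::string encodeLabel(const TapeLabel& l) {
    return l.sourceNetwork + "|" + l.recordedAt;
}

TapeLabel decodeLabel(const std::string& s) {
    TapeLabel l;
    auto sep = s.find('|');
    l.sourceNetwork = s.substr(0, sep);
    if (sep != std::string::npos) l.recordedAt = s.substr(sep + 1);
    return l;
}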
  • [0080]
    One skilled in the art will readily appreciate that such a mechanism may be used to introduce various data items to the program logic which are not properly conceived of as television signals. For instance, software updates or other data may be passed to the system. The program logic receiving this data from the television stream may impose controls on how the data is handled, such as requiring certain authentication sequences and/or decrypting the embedded information according to some previously acquired key. Such a method works for normal broadcast signals as well, leading to an efficient means of providing non-TV control information and data to the program logic.
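    The fragment below sketches only the gating idea: embedded data is accepted by the program logic after an authentication check against a previously acquired key. The keyed checksum shown is a placeholder; a real system would use a proper cryptographic MAC or signature.

#include <cstdint>
#include <vector>

// Placeholder authentication gate for data (e.g. a software update) carried
// in the television stream: accept the payload only if its trailing tag
// matches a value derived from a previously acquired key.
bool authenticate(const std::vector<std::uint8_t>& payload, std::uint32_t tag,
                  std::uint32_t key) {
    std::uint32_t acc = key;
    for (std::uint8_t b : payload) acc = acc * 31u + b;   // toy keyed checksum
    return acc == tag;
}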
  • [0081]
    Additionally, one skilled in the art will readily appreciate that although a VCR is specifically mentioned above, any multimedia recording device (e.g., a Digital Video Disk-Random Access Memory (DVD-RAM) recorder) may easily be substituted in its place.
  • [0082]
    Although the invention is described herein with reference to the preferred embodiment, one skilled in the art will readily appreciate that other applications may be substituted for those set forth herein without departing from the spirit and scope of the present invention. For example, the invention can be used in the detection of gambling casino crime. The input section of the invention is connected to the casino's video surveillance system. Recorded video is cached and simultaneously output to external VCRs. The user can switch to any video feed and examine (i.e., rewind, play, slow play, fast forward, etc.) a specific segment of the recorded video while the external VCRs are being loaded with the real-time input video. Accordingly, the invention should only be limited by the Claims included below.