Publication number: US 20080031213 A1
Publication type: Application
Application number: US 11/871,859
Publication date: Feb 7, 2008
Filing date: Oct 12, 2007
Priority date: Jan 2, 2002
Also published as: US7305467, US20030154262
Inventors: William Kaiser, Lars Newberg, Gregory Pottie
Original Assignee: Kaiser William J, Newberg Lars F, Pottie Gregory J
Autonomous tracking wireless imaging sensor network
US 20080031213 A1
Abstract
A wireless integrated network sensor (WINS) system is provided that integrates articulating tracking systems with WINS network components including visual or infrared sensors and imaging devices to enable precise tracking and targeting of objects moving through a sensor field or past a single integrated sensing and targeting unit. Further, arrays of sensors together with local signal processing are used to trigger cameras and tracking systems, and to provide an alternative location capability for improved robustness. The system is self-configuring and remotely controllable, and enables remote systems and operators to query for collected data, including sensory and image data, and control the system in response to the collected data.
Claims (25)
1. A wireless sensor network comprising: a plurality of nodes, the plurality of nodes are self-configurable and coupled to at least one remote system via at least one coupling and components over a wide area network, the nodes automatically organize to form the sensor network in response to information communicated among the nodes, the information including amount of power available to each of the nodes, the automatic organizing comprises automatically coupling and configuring the nodes to form the sensor network and automatically controlling data transfer, processing, and storage within the sensor network, functions of the nodes are remotely controllable and programmable via internetworking among the nodes by the remote system, and the nodes include at least one sensor to collect data from an environment.
2. The sensor network of claim 1, wherein at least one of the nodes includes an articulating sensor, and wherein the articulating sensor is at least one of a tracking system, an imaging system, and an antenna.
3. The sensor network of claim 1, wherein the plurality of nodes includes two or more node types, wherein a first node type includes at least one passive sensor and a second node type includes an articulating sensor.
4. The sensor network of claim 1, wherein at least one of the plurality of nodes is a gateway that communicates with the components of the wide area network.
5. A sensor node of a wireless sensor network comprising:
at least one processor coupled to at least one communication device, the at least one processor automatically couples the sensor node to and configures the sensor node among a plurality of network elements and automatically controls communication with and control of a flow of information among the network elements, the information including amount of power available to each of the network elements, the network elements are self-configurable and couple among an environment and at least one remote client system, via at least one coupling and components over a wide area network, to support remote controllability of the sensor node via the remote client system; and
at least one articulating sensor to gather information from the environment.
6. The node of claim 5, further comprising at least one sensor coupled to the processor to detect at least one target.
7. The node of claim 5, further comprising a photographic system.
8. The node of claim 5, wherein the articulating sensor is at least one of a tracking system, a laser tracking system, and an optical tracking system.
9. The node of claim 5, wherein the plurality of network elements includes at least one gateway, at least one server, and components of at least one wide area network.
10. A method of collecting data in a wireless sensor network, comprising:
automatically organizing a plurality of network elements including a plurality of nodes locally disposed among an environment and at least one remote client system, the plurality of nodes are self-configurable, the organizing includes automatically coupling and configuring the plurality of nodes for self-assembly and further includes coupling and controlling a flow of information among the network elements, the information including amount of power available to each of the nodes, and at least one of the plurality of nodes includes an articulating sensor;
remotely controlling at least one function of the plurality of nodes, via at least one coupling and components over a wide area network, using the at least one remote client system;
detecting a target in the environment using information gathered by the articulating sensor; and
collecting and transferring data associated with the target to the remote client system.
11. The method of claim 10, further comprising manipulating the collected data, wherein manipulating includes at least one of routing, fusing, processing, evaluating, and storing the collected data.
12. The method of claim 11, wherein the plurality of nodes comprises a first node, and wherein fusing comprises the first node collecting and processing data from at least another of the plurality of nodes.
13. The method of claim 10, wherein:
the plurality of nodes comprises at least one passive sensor; and
detecting the target comprises using the at least one passive sensor in addition to the articulating sensor.
14. The method of claim 10, wherein the articulating sensor is at least one of a tracking system, an antenna, and an active sensor.
15. The method of claim 14, wherein the tracking system is at least one of a laser tracking system, an optical tracking system, and an imaging system.
16. The method of claim 10, further comprising:
collecting optical data of the target using at least one optical sensor of the plurality of nodes; and
identifying and designating the target using the optical data.
17. The method of claim 13, wherein the at least one passive sensor and the articulating sensor are on different nodes of the plurality of nodes.
18. The method of claim 10, wherein the plurality of network elements includes at least one gateway, at least one server, and components of at least one communication network.
19. A sensor network comprising a plurality of nodes, wherein:
the plurality of nodes comprises a first node including at least one sensor to collect data from an environment;
the plurality of nodes is coupled to communicate with at least one remote system;
the plurality of nodes automatically couples and configures to form the sensor network in response to information communicated among the plurality of nodes;
at least one function of the plurality of nodes is remotely controllable; and
the plurality of nodes distributes processing of the collected data from the first node to another of the plurality of nodes in response to the available node energy of at least one of the plurality of nodes.
20. The sensor network of claim 19, wherein automatically couples and configures includes organizing the plurality of nodes to reduce energy consumption within the sensor network.
21. A method for collecting and processing data in a sensor network, comprising:
automatically coupling a plurality of nodes among an environment, wherein the plurality of nodes includes a first node having at least one processor and at least one sensor coupled to provide data to the at least one processor, and wherein the plurality of nodes are surveyed at random intervals for new nodes;
collecting data from the environment using the at least one sensor; and
remotely controlling at least one function of the plurality of nodes.
22. The method of claim 21, wherein the at least one processor is operable to cycle into and out of a low power state.
23. The method of claim 21, wherein the at least one processor comprises:
a first processor operable to perform real-time processing of sensor data received from the at least one sensor and to provide output data corresponding to the sensor data; and
a second processor, coupled to the first processor, operable to perform processing of the output data.
24. The method of claim 23, wherein the first processor and the second processor are disposed on a single chip.
25. The method of claim 21, wherein at least one node of the plurality of nodes comprises two radio ports for automatically coupling the at least one node to two other nodes of the plurality of nodes.
Description
    RELATED APPLICATIONS
  • [0001]
    This application is a continuation of U.S. patent application Ser. No. 10/329,069, filed Dec. 23, 2002, which claims the benefit of U.S. Provisional Application No. 60/345,198, filed Jan. 2, 2002, and of U.S. Provisional Application No. 60/366,877, filed Mar. 22, 2002. Each of the foregoing applications is incorporated in its entirety herein by reference. This application is related to U.S. patent application Ser. Nos. 09/684,706, 09/684,565, 09/685,020, 09/685,019, 09/684,387, 09/684,490, 09/684,742, 09/680,550, 09/685,018, 09/684,388, 09/684,162, and 09/680,608, all filed Oct. 4, 2000, 10/184,527, filed Jun. 28, 2002, 10/188,514, filed Jul. 3, 2002, and 60/366,877 filed Mar. 22, 2002.
  • FIELD
  • [0002]
    The present invention relates to the sensing and tracking of moving objects using wireless integrated sensor networks.
  • BACKGROUND
  • [0003]
    The Related Applications referenced above describe a network of wireless sensor nodes, referred to as wireless integrated network sensors (WINS). These nodes include communications, signal processing, data storage, and sensing capabilities, and the ability to autonomously form networks and perform cooperative signal processing tasks. These processing tasks include, for example, cooperative acoustic or seismic beam forming to locate targets or other nodes. This information can then, for example, control a camera to train upon the indicated location, if associated identification algorithms indicate that the target is of an interesting class. Human operators can be involved in the identification if information is conveyed from the sensor network. For example, the images and sensor data may be displayed using standard browsing tools, and commands sent to re-prioritize the activities of the remote network.
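    As a rough illustration of this kind of cooperative location processing, the sketch below estimates a target position by brute-force fitting of acoustic time-difference-of-arrival measurements from several nodes, then hands the estimate off for camera cueing. The grid search, node layout, and timing values are illustrative assumptions, not an algorithm taken from this disclosure or the Related Applications.

```python
"""Toy cooperative acoustic localization: a grid search over candidate
positions minimizes time-difference-of-arrival (TDOA) residuals, then
the best estimate can be used to cue a camera node. Purely illustrative."""
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, nominal; a real deployment must calibrate

def locate(node_xy, arrival_t, extent=100.0, step=0.5):
    """Brute-force TDOA fit: returns the grid point whose predicted
    arrival-time differences best match the measured ones."""
    node_xy = np.asarray(node_xy, dtype=float)
    arrival_t = np.asarray(arrival_t, dtype=float)
    xs = np.arange(0.0, extent, step)
    best, best_err = None, np.inf
    for x in xs:
        for y in xs:
            dist = np.hypot(node_xy[:, 0] - x, node_xy[:, 1] - y)
            pred = dist / SPEED_OF_SOUND
            # Compare differences relative to node 0 so the unknown
            # emission time cancels out.
            err = np.sum(((pred - pred[0]) - (arrival_t - arrival_t[0])) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

nodes = [(0, 0), (80, 0), (0, 80), (80, 80)]
times = [0.231, 0.164, 0.257, 0.212]          # measured onsets, seconds
target_xy = locate(nodes, times)
print("estimated target position:", target_xy)
# A camera node would then be trained on target_xy if classification
# of the acoustic signature flags an interesting class.
```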
  • [0004]
    The seismic and acoustic location techniques can be vulnerable to a variety of environmental factors, and thus can have limited accuracy in some deployment circumstances. For example, non-homogeneity of the terrain results in multipath propagation and variable propagation speeds, while wind and certain thermal conditions can affect the usefulness of acoustic ranging systems. Such systems can also have difficulty separating targets that are in close proximity. These deficiencies can, to some extent, be ameliorated using a sufficiently dense network of sensors, but the signal processing tasks then become very complicated. Moreover, coherent processing across a dense network may demand energy-intensive communication of large quantities of data.
  • [0005]
    By contrast, if a line of sight exists between a node and a target, laser tracking systems like those described in U.S. Pat. No. 4,063,819, for example, are highly selective among targets and insensitive to most environmental conditions on the ground except extreme fog. Numerous commercial realizations of laser tracking systems exist in compact form factors, such as the AN/PAQ-1 compact laser designator. On the other hand, constant scanning by active lasers is power intensive because of the laser and the associated servo mechanisms, and the resulting power requirements can be problematic in compact, self-sufficient node packages.
  • BRIEF DESCRIPTION OF THE FIGURES
  • [0006]
    FIG. 1 is a block diagram of a wireless integrated network sensor (WINS) system or network configured to locate and track objects, under an embodiment.
  • [0007]
    FIG. 2 is a block diagram of an imaging node including a tracking system, referred to herein as an imaging and tracking node, under an embodiment.
  • [0008]
    FIG. 3 is a block diagram of an imaging and tracking node, under an alternative embodiment of FIG. 2.
  • [0009]
    FIG. 4 is a flow diagram of a method for collecting data, under the embodiment of FIG. 1.
  • [0010]
    In the drawings, the same reference numbers identify identical or substantially similar elements or acts. To easily identify the discussion of any particular element or act, the most significant digit or digits in a reference number refer to the Figure number in which that element is first introduced (e.g., element 104 is first introduced and discussed with respect to FIG. 1).
  • [0011]
    The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the claimed invention.
  • DETAILED DESCRIPTION
  • [0012]
    A wireless integrated sensor network is described below that includes articulating tracking systems. In the following description, numerous specific details are included to provide a thorough understanding of, and enabling description for, embodiments of the invention. One skilled in the relevant art, however, will recognize that the invention can be practiced without one or more of the specific details, or with other components, systems, etc. In other instances, well-known structures or operations are not shown, or are not described in detail, to avoid obscuring aspects of the invention.
  • [0013]
    The wireless integrated sensor network described herein combines the power and efficiency of passive sensors with the accuracy and selectivity of high-performance optical systems by integrating tracking systems like laser tracking systems, for example, with wireless integrated sensor networks equipped with visual or infrared imaging devices. Use of the tracking system allows components of the network to provide precise location, tracking, and targeting of objects moving through a sensor field or past a single integrated sensing and targeting unit. Further embodiments support arrays of sensors together with local signal processing in order to trigger cameras and laser tracking systems, or to provide an alternative location means for improved robustness. The wireless integrated sensor network of an embodiment is remotely controllable and configurable, with communication links enabling remote operators to receive information from the network via queries for sensory and image data, and re-task the system.
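    The division of labor just described, in which inexpensive passive sensing runs continuously and the power-hungry articulating tracker is energized only on a detection, can be sketched as below. The class names and thresholds are hypothetical; they stand in for whatever sensors and tracking hardware a given node carries.

```python
"""Minimal sketch of the passive-trigger/active-track division of labor:
a low-power sensor runs continuously, and the power-hungry articulating
tracker is activated only on a detection. Names are illustrative."""

class PassiveSensor:
    def __init__(self, threshold):
        self.threshold = threshold
    def detect(self, sample):
        return sample >= self.threshold

class ArticulatingTracker:
    def __init__(self):
        self.active = False
    def wake(self):
        self.active = True     # spin up laser/servo power rails
    def sleep(self):
        self.active = False    # return to low-power standby
    def track(self, bearing):
        assert self.active
        print(f"tracking target at bearing {bearing:.1f} deg")

def run(samples, bearings, sensor, tracker):
    for sample, bearing in zip(samples, bearings):
        if sensor.detect(sample):
            if not tracker.active:
                tracker.wake()
            tracker.track(bearing)
        elif tracker.active:
            tracker.sleep()    # nothing detected: save energy

run(samples=[0.1, 0.9, 0.8, 0.2], bearings=[0, 42.0, 44.5, 0],
    sensor=PassiveSensor(threshold=0.5), tracker=ArticulatingTracker())
```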
  • [0014]
    The sensor node technology described in the Related Applications referenced above combines signal processing, sensing, and radio communications in one package. The nodes are capable of self-configuration, that is, the organization and maintenance of their own network. Gateways provide connections to the outside world. Such systems enable monitoring and control of the physical world through sensors and actuators, and their reach is greatly expanded by technology that permits their control and monitoring with standard web browsing tools. Using this WINS web server technology, parameters of the remote nodes can be updated and new software and/or data loaded as it becomes available. Standard web protocols enable secure communications sessions. Thus, the WINS nodes can manage communications to outside entities, providing low installation cost and allowing remote upgrades of software.
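    As a hedged sketch of what node monitoring over standard web protocols might look like, the following exposes a node's latest readings to any HTTP client. The endpoint path and JSON fields are invented for illustration; the disclosure does not specify a particular interface.

```python
"""Hypothetical sketch of node monitoring over standard web protocols:
a node serves its latest readings to ordinary HTTP clients. The
endpoint and JSON schema are invented for illustration."""
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

LATEST = {"node": "sensor-17", "acoustic_db": 62.4, "battery_pct": 81}

class NodeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = json.dumps(LATEST).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    # Any standard browser or HTTP client can now query the node.
    HTTPServer(("", 8080), NodeHandler).serve_forever()
```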
  • [0015]
    In security applications, there is a need for systems that can locate and track, in real time, objects that have penetrated a security perimeter. FIG. 1 is a block diagram of a wireless integrated network sensor (WINS) system or network 100 configured to locate and track objects, under an embodiment. The network 100 of an embodiment includes a variety of nodes 102-106, including gateway nodes 102, imaging nodes 104, and sensor nodes 106. The nodes 102-106 function to couple an environment 199 to a remote command system 120, or remote system, via a communication network like a large-area network 110. In general, the nodes 102-106 accommodate any type of sensor input, so that any physical input can be handled by the network, as described in the Related Applications.
  • [0016]
    The sensor nodes 106 include non-imaging sensors, such as acoustic or thermal sensors, and may be used to relay communications, establish approximate target locations, and trigger activation of cameras. The sensor nodes 106 of an embodiment can also include tracking systems, but are not so limited.
  • [0017]
    The imaging nodes 104 use information propagated among components of the network 100 to focus on target regions and, once targets are detected or acquired, track the targets. The imaging nodes 104 provide imaging capability using cameras coupled to the sensor ports of the imaging node 104, but the embodiment is not so limited. The imaging nodes can also track the targets using a tracking system, for example a laser tracking system or a video tracking system where the tracking system includes articulating components. The imaging nodes 104 of various alternative embodiments include components of the sensor nodes 106, like the non-imaging or other passive sensors, to form hybrid sensor/imaging nodes.
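    Cueing an articulating camera from a network-supplied position estimate reduces, in the simplest case, to converting a relative position into pan and tilt angles. The sketch below assumes a flat local coordinate frame and a north-referenced pan axis, which are conventions chosen for illustration rather than anything specified by the disclosure.

```python
"""Sketch of cueing an articulating camera from a network-supplied
target estimate: convert relative position into pan/tilt angles.
A flat local frame and north-referenced pan axis are assumed."""
import math

def pan_tilt(node_xyz, target_xyz):
    dx = target_xyz[0] - node_xyz[0]
    dy = target_xyz[1] - node_xyz[1]
    dz = target_xyz[2] - node_xyz[2]
    pan = math.degrees(math.atan2(dx, dy))                  # bearing from north
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy))) # elevation
    return pan, tilt

# Camera mast at 3 m; target estimate from the sensor field at ground level.
pan, tilt = pan_tilt((0.0, 0.0, 3.0), (40.0, 25.0, 0.0))
print(f"pan {pan:.1f} deg, tilt {tilt:.1f} deg")
```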
  • [0018]
    The gateway nodes 102, often referred to as gateways 102, while communicating with various combinations and configurations of network components or elements like imaging nodes 104 and/or sensor nodes 106, establish links with wide- or large-area networks 110. The links between the gateway nodes 102 and the large-area network, for example, can be through a local command post or base station, and thence possibly to the Internet, but are not so limited. In this manner the gateway nodes 102 couple the components of the network 100, and hence information of the environment 199, to the large-area network 110. The gateway nodes 102 can also include any number and/or combination of sensor suites, imaging devices, and tracking devices; indeed, the local network 100 might comprise only a small number of the gateway nodes 102. The gateway nodes 102 of various alternative embodiments can include different combinations of components of the imaging nodes 104 and the sensor nodes 106 to form hybrid nodes.
  • [0019]
    A remote command system or remote system 120 collects and stores data from the nodes 102-106 of the deployed sensor network via the large-area network 110. The data is made available to users who can then query for particular information from the nodes 102-106 or command actions of the nodes 102-106, as described in the Related Applications. The network 100 of an embodiment might include a single gateway 102 equipped with imaging and non-imaging sensors, or multiple gateway nodes 102 that support different views of the objects entering the field, or a mix of components that include tags that get attached to objects entering the area under surveillance.
  • [0020]
    Using the software architecture described in the Related Applications above, the nodes 102-106 can accept downloads of new or additional software, grant secure and prioritized access to sensing and communications devices, and access remote services. For example, each node 102-106 of an embodiment can include templates of identifying information of vehicles for use in processing collected data; the templates can include acoustic, thermal, and image data or information, for example. In cases where vehicle identification certainty is insufficient based on local node processing, the nodes 102-106 can access information of larger databases accessible via couplings with other nodes and/or the large-area network. Also, the decision may be made using more sophisticated algorithms and merging data from many sources; this can be accomplished by a combination of automatic processing and decisions by human operators.
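    One minimal sketch of this local-template-then-remote-database pattern follows; the similarity measure, confidence floor, and remote query interface are hypothetical stand-ins for whatever a deployed node would actually use.

```python
"""Illustration of the local-template-then-remote-database pattern:
identify locally when confident, escalate over the network otherwise.
Scores, the similarity measure, and the fallback API are stand-ins."""
import numpy as np

LOCAL_TEMPLATES = {           # e.g. condensed acoustic signatures
    "light_truck": np.array([0.2, 0.7, 0.4]),
    "tracked_vehicle": np.array([0.9, 0.3, 0.8]),
}
CONFIDENCE_FLOOR = 0.95

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(signature, query_remote):
    scores = {k: cosine(signature, v) for k, v in LOCAL_TEMPLATES.items()}
    label, score = max(scores.items(), key=lambda kv: kv[1])
    if score >= CONFIDENCE_FLOOR:
        return label, score                 # cheap, local decision
    return query_remote(signature)          # escalate over the network

def fake_remote(signature):                 # stand-in for a WAN database
    return "tracked_vehicle", 0.97

print(identify(np.array([0.5, 0.5, 0.5]), fake_remote))
```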
  • [0021]
    The WINS node architecture supports integration of numerous types and/or combinations of components, including the imaging and tracking systems described above, as well as being incrementally and remotely upgradeable with software in support of the integrated components, as described in the Related Applications. FIG. 2 is a block diagram of an imaging node 104 including a tracking system, under an embodiment. The imaging node 104 includes, but is not limited to, at least one main processor 201 and a real-time processor 202 or set of real time processors coupled to one or more buses 204. In an embodiment, the real-time processor 202 mediates the buses 204 to control real-time processes, including sensors, actuators, and communications components.
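    A software analogue of this two-processor split, assuming a queue in place of the buses 204 and threads in place of the processors, might look like the following; the event format is invented for illustration.

```python
"""Minimal software analogue of the real-time/main processor split:
a 'real-time' loop services the sensor at a fixed cadence and hands
compact events to a 'main' consumer for heavier work."""
import queue, threading, time, random

events = queue.Queue()

def realtime_loop(n_samples=20, period_s=0.01):
    for _ in range(n_samples):          # stands in for bus-mediated sampling
        sample = random.random()
        if sample > 0.8:                # cheap threshold test only
            events.put(("detection", sample, time.time()))
        time.sleep(period_s)
    events.put(None)                    # shutdown sentinel

def main_loop():
    while (evt := events.get()) is not None:
        kind, value, ts = evt           # heavier work would go here:
        print(f"{kind}: {value:.2f}")   # ID, compression, net config...

rt = threading.Thread(target=realtime_loop)
rt.start(); main_loop(); rt.join()
```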
  • [0022]
    As an example of on-board processes, the imaging node 104 of an embodiment includes and/or couples to a Global Positioning System (GPS) 210, an imaging system/device 212, a tracking system/device 214, sensors 216 and 218, and communication components 220 such as radios. Additional components are added to the node 104 via couplings through the appropriate node mating ports with the buses 204, using the appropriate device drivers as described in the Related Applications. Higher level functions such as target identification, data and image compression, tracking, and network configuration can be hosted on the main processor 201, but are not so limited.
  • [0023]
    The processors 201 and 202, as described in this embodiment, couple among the buses 204 and the components 210-220 of the imaging and tracking node 104, under program control. Alternatively, various other components (not shown) of the network of which the imaging nodes 104 are components can also couple among and communicate with the processors 201 and 202 and the components 210-220 of the imaging nodes 104 to provide data of the environment from the imaging nodes 104 to a remote operator.
  • [0024]
    While one main processor 201, one real-time processor 202, one bus 204, two sensors 216 and 218, and one each of the GPS 210, imaging system 212, tracking system 214, and communications system 220 are shown, various alternative embodiments include any number and/or type of each of these components coupled in various configurations or combinations contemplated by one skilled in the art. Further, while the components 201-220 of the imaging node 104 are shown as separate blocks, some or all of these blocks can be monolithically integrated onto a single chip, distributed among a number of chips or components of a host system or network, and/or provided by some combination of algorithms. The algorithms of the node components 210-220 can be implemented in software algorithm(s), firmware, hardware, and any combination of software, firmware, and hardware. The term “processor” as generally used herein refers to any logic processing unit, such as one or more central processing units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASIC), etc.
  • [0025]
    FIG. 3 is a block diagram of an imaging node 300, under an alternative embodiment of FIG. 2. The imaging node 300 includes, but is not limited to, a main processor 201 and a real-time processor 202 or set of real time processors coupled to one or more buses 204. In an embodiment, the real-time processor 202 mediates the buses 204 to control real-time processes of components coupled to the buses 204. As an example, the node 300 of an embodiment includes Global Positioning System (GPS) 210, an imaging system/device in the form of a camera 312, a tracking system/device in the form of a laser tracking system 314, sensors 216 and 218, and communication components 220. These components are added to the imaging node 300 using couplings through the appropriate node mating ports to the buses 204, with appropriate device drivers.
  • [0026]
    The camera system 312 of an embodiment includes any combination of visual and thermal or infrared imaging elements. The camera system 312 can share servo mechanisms (not shown) with the laser tracking system 314 to enable two degrees of rotational freedom or, alternatively, employ a less finely calibrated set of motors. The imaging devices of the camera system 312 can include various zoom capabilities, but are not so limited. Acoustic sensors like directional microphones or microphone arrays can likewise share any servo mechanisms of the imaging node 300 in support of the gathering of directional acoustic information, as can any number/type of antenna systems.
  • [0027]
    The imaging node of an embodiment can be constructed using a variety of form factors. One embodiment can include a camera, sensor, laser designator, and antenna assembly mounted on a telescoping appendage to provide improved line of sight and elevation, but which may be lowered for unobtrusiveness or protection from harsh environmental conditions. In another embodiment, the imager is coupled to the host node/platform via wiring and mounted on a fixed facility (e.g., a building, a post, a tree).
  • [0028]
    Articulating tracking imaging systems improve the deployability of the networks of which they are a component because, when camera orientation is fixed, precise deployment of the network is required to ensure overlapping fields of view. The ability to change both orientation and zoom allows far more freedom in node deployment, making possible alternatives to hand emplacement. Further, attention can be focused upon interesting events in the field of view, permitting a smaller number of network elements to be deployed. Likewise, articulation enables directional antennas to be employed, enabling extended-range communication at low power without the need for manual alignment of the antennas. In this way, images can be conveyed over longer distances than would be possible with fixed omnidirectional elements. Such features are significant advantages in situations such as military operations, in which rapid, autonomous deployment of sensing systems frees personnel from risk and undue use of their time and attention. Given that the WINS technology also provides for autonomous establishment of the sensor network and for remote re-tasking, the complete tracking imaging system can be conveniently established.
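    To see why articulated directional antennas extend range at fixed power, consider a free-space (Friis) link budget: each 6 dB of combined antenna gain roughly doubles the achievable free-space range. The numbers below are illustrative assumptions, not a characterization of any particular WINS radio.

```python
"""Back-of-envelope free-space link budget: with fixed transmit power,
each 6 dB of combined antenna gain roughly doubles free-space range
(Friis equation). All numbers are illustrative assumptions."""
import math

def friis_range_m(pt_dbm, sens_dbm, gt_dbi, gr_dbi, freq_hz):
    """Max free-space distance at which received power still meets the
    receiver sensitivity."""
    lam = 3.0e8 / freq_hz
    link_margin_db = pt_dbm + gt_dbi + gr_dbi - sens_dbm
    return (lam / (4 * math.pi)) * 10 ** (link_margin_db / 20.0)

omni = friis_range_m(pt_dbm=0, sens_dbm=-95, gt_dbi=0, gr_dbi=0, freq_hz=2.4e9)
steered = friis_range_m(pt_dbm=0, sens_dbm=-95, gt_dbi=10, gr_dbi=10, freq_hz=2.4e9)
print(f"omni: {omni/1000:.1f} km, 10 dBi both ends: {steered/1000:.1f} km")
```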
  • [0029]
    Regarding tracking systems of the imaging node 300, the use of a laser tracking system 314 provides a tight beam and a long technological history, enabling reliable tracking of particular targets even in the presence of many targets. However, as noted above, this may be supplemented with or replaced by other tracking devices such as tags, acoustic or seismic beam forming, and/or proximity detection in dense sensor fields to deal with loss of line of sight due to weather or physical obstructions. These other tracking devices can assist with acquisition and reacquisition of targets or enable a lower level of tracking accuracy that may suffice in certain instances. Moreover, integration of the optical systems with other components can increase the number of events that can automatically be identified, reducing the frequency of human operator interactions and the bandwidth required for communications with remote networks.
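    The acquire/track/reacquire behavior described above can be pictured as a small state machine in which the laser tracker holds the target while line of sight persists and coarser network cues drive reacquisition when it is lost. The states and inputs below are illustrative.

```python
"""Sketch of acquire/track/reacquire as a small state machine: the
laser tracker holds the target while line of sight lasts, and coarser
cues (tags, proximity, beamforming) drive reacquisition when lost."""

ACQUIRING, TRACKING, REACQUIRING = "acquiring", "tracking", "reacquiring"

def step(state, laser_lock, coarse_cue):
    if state == TRACKING:
        return TRACKING if laser_lock else REACQUIRING
    # In both non-tracking states a laser lock wins; otherwise steer
    # the tracker toward whatever coarse cue the network supplies.
    if laser_lock:
        return TRACKING
    return REACQUIRING if coarse_cue else state

state = ACQUIRING
for lock, cue in [(False, True), (True, True), (False, True), (True, False)]:
    state = step(state, laser_lock=lock, coarse_cue=cue)
    print(state)
# prints: reacquiring, tracking, reacquiring, tracking
```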
  • [0030]
    Images, whether alone or in combination with acoustic signals, are a particularly effective means for human operators to identify particular objects, in that natural faculties are engaged. Image or acoustic processing software, together with software for analysis of other sensory outputs as is known in the art, may be used in the triggering decision or to assist the human operator in the identification. However, such software is rarely definitive in making decisions for actions. Thus, any of the nodes 102-106 of an embodiment, with reference to FIG. 1, can host software that fuses information from different sensor types, like imaging and non-imaging sensors, so that vehicle types of interest can automatically be made subjects of the node tracking system.
  • [0031]
    Tracking by the network as a whole can be enhanced by fusing information from multiple sensors, including cameras, and forwarding information on targets being tracked to nearby nodes. In this way, nodes go to higher levels of alertness to resume tracking of targets that may temporarily have been occluded by obstructions. With the use of fusing, the role of the remote operator becomes that of determining which vehicles are the subject of targeting or surveillance by other network assets. This decision can be assisted, for example, by confidence levels from signal processing algorithms operating on the seismic, magnetic, or acoustic sensor array outputs, or can be made purely from the images or acoustic streams.
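    A toy version of this track handoff might forward the last fused track toward nodes lying along the predicted heading and raise their alert level, as sketched below; the message fields and neighbor-selection cone are assumptions made for illustration.

```python
"""Toy track handoff: a node that loses its target forwards the last
fused track to neighbors along the predicted heading, which raise
their alert level. Fields and the selection rule are assumptions."""
import math

NODES = {"n1": (0, 0), "n2": (50, 10), "n3": (10, 60)}
ALERT = {k: "normal" for k in NODES}

def hand_off(track, from_node, cone_deg=60):
    """Alert neighbors lying within a cone around the target heading."""
    ox, oy = NODES[from_node]
    for name, (x, y) in NODES.items():
        if name == from_node:
            continue
        bearing = math.degrees(math.atan2(y - oy, x - ox))
        diff = abs((bearing - track["heading_deg"] + 180) % 360 - 180)
        if diff <= cone_deg / 2:
            ALERT[name] = "search"      # resume tracking when it reappears

hand_off({"pos": (5, 2), "heading_deg": 12.0, "class": "truck"}, "n1")
print(ALERT)   # n2 falls in the cone; n3 (bearing ~80 deg) does not
```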
  • [0032]
    FIG. 4 is a flow diagram 400 for collecting data using imaging nodes, under the embodiment of FIG. 1. The nodes of an embodiment are self-organizing in that they automatically organize among the elements of the network of which they are a member, at block 402. The organizing includes coupling and configuring the nodes for information gathering and transfer among other nodes of the network and at least one remote system, as described in the Related Applications. In an embodiment, the nodes are coupled to the remote system via a communication network like a large-area network, but are not so limited. The nodes are remotely controlled via the remote system, at block 404.
  • [0033]
    Component systems of the nodes include at least one of location systems, communication systems, numerous types of sensors, and articulating sensors like imaging systems and tracking systems. These component systems use the on-board sensors along with couplings to information of neighboring nodes to detect targets in the environment local to the node, at block 406. The articulating sensors use the on-board sensor information to track the detected targets, at block 408. The articulating sensors include tracking systems like laser tracking systems, but are not so limited. The information gathered by the sensors and the articulating sensors is transferred to the remote system, at block 410, via a combination of other network nodes and components of the large-area network.
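    Compressed into one illustrative pipeline, blocks 402-410 might read as follows; each helper function is a hypothetical stand-in for the behavior the flow diagram attributes to that block.

```python
"""The flow of FIG. 4, blocks 402-410, compressed into one illustrative
pipeline. Each helper is a hypothetical stand-in."""

def self_organize(nodes):            # block 402: couple and configure
    return {"members": sorted(nodes), "routes": "auto"}

def apply_remote_commands(net, cmd): # block 404: remote control
    net.update(cmd)

def detect_targets(net):             # block 406: sensors + neighbor info
    return [{"id": 1, "pos": (40.0, 25.0)}]

def track(net, targets):             # block 408: articulating sensors
    return [dict(t, trace=[t["pos"]]) for t in targets]

def transfer(tracks):                # block 410: via nodes and the WAN
    print("to remote system:", tracks)

net = self_organize(["sensor-3", "imager-1", "gateway-0"])
apply_remote_commands(net, {"tasking": "perimeter-watch"})
transfer(track(net, detect_targets(net)))
```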
  • [0034]
    As an operational example, consider the scenario in which a single vehicle enters a secure perimeter or environment. Sensor nodes detect the vehicle's presence using acoustic or thermal sensors, for example, and possibly provide a preliminary indication of the vehicle type. The sensor nodes can also cooperate to determine the approximate vehicle position. Two imaging nodes or sensor nodes with imaging systems are controlled to take pictures. A remote operator is alerted, who then selects the target of interest in the browser image. The laser tracking system thereafter tracks the selected target while it remains within a line of sight of the imaging nodes.
  • [0035]
    In another example scenario, multiple vehicles enter the perimeter under surveillance. The remote operator selects particular vehicles for tracking (for example, the first and last vehicles of a convoy), and the imaging nodes begin tracking the selected vehicles using information associated with each selected vehicle. The remote system can further link the tracking information of the imaging nodes to weapon or other targeting systems in a situation where further ingress of the area by the vehicles is to be prevented.
  • [0036]
    Alternatively, tracking can be accomplished without the assistance of a laser tracking system or designator by using recognition software operating on the image data. The recognition software can be hosted on any nodes or components of the network or alternatively, distributed among the nodes and components of the network. In this embodiment, the camera moves to keep the target vehicle or person within the field of view. Tracking can be assisted by the use of other sensors, either resident on the node with the camera or elsewhere in the network.
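    Camera-only tracking of this kind amounts to a servo loop that recenters the detected target in the image. The sketch below uses a simple proportional controller; the field of view, gain, and detector interface are illustrative assumptions.

```python
"""Sketch of camera-only tracking: recognition software reports the
target's pixel offset from image center, and a proportional controller
nudges pan/tilt to recenter it. All parameters are illustrative."""

FOV_DEG = (60.0, 40.0)       # horizontal, vertical field of view
IMAGE_PX = (640, 480)
GAIN = 0.5                   # fraction of the error corrected per step

def recenter(pan_deg, tilt_deg, target_px):
    """One servo update from a detected target pixel position."""
    ex = target_px[0] - IMAGE_PX[0] / 2      # +right of center
    ey = target_px[1] - IMAGE_PX[1] / 2      # +below center
    pan_deg += GAIN * ex * FOV_DEG[0] / IMAGE_PX[0]
    tilt_deg -= GAIN * ey * FOV_DEG[1] / IMAGE_PX[1]
    return pan_deg, tilt_deg

pan, tilt = 10.0, 0.0
for detection in [(500, 300), (410, 270), (365, 255)]:  # detector output
    pan, tilt = recenter(pan, tilt, detection)
    print(f"pan {pan:.2f}, tilt {tilt:.2f}")
```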
  • [0037]
    Examples of security applications using the WINS systems described herein include establishing perimeters around factories, airports and other public facilities, and military forces, as well as securing borders. Such systems can also include face or speech recognition software along with the targeting to improve recognition probabilities.
  • [0038]
    While object location, identification, and tracking have been described largely in the context of sensor networks, it will be apparent to those skilled in the art that the architecture described above will be of use in a wide variety of other human-machine interface applications. These applications include, but are not limited to, notebook computers, personal digital assistants, personal computers, security posts, and situations in which computing devices and/or peripherals are upgraded over time.
  • [0039]
    Aspects of the invention may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (PLDs), such as field programmable gate arrays (FPGAs), programmable array logic (PAL) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits (ASICs). Some other possibilities for implementing aspects of the invention include: microcontrollers with memory (such as electronically erasable programmable read-only memory (EEPROM)), embedded microprocessors, firmware, software, etc. If aspects of the invention are embodied as software, the software may be carried by any computer readable medium, such as magnetically- or optically-readable disks (fixed or floppy), modulated on a carrier signal or otherwise transmitted, etc. Furthermore, aspects of the invention may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. The underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, etc.
  • [0040]
    Unless the context clearly requires otherwise, throughout the description, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of “including, but not limited to.” Words using the singular or plural number also include the plural or singular number respectively. Additionally, the words “herein,” “hereunder,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application.
  • [0041]
    The above description of illustrated embodiments of the invention is not intended to be exhaustive or to limit the invention to the precise form disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize. The teachings of the invention provided herein can be applied to other processing and sensor systems, not only for the processing and sensor systems described above.
  • [0042]
    The elements and acts of the various embodiments described above can be combined to provide further embodiments. All of the above references and U.S. patents and patent applications are incorporated herein by reference. Aspects of the invention can be modified, if necessary, to employ the systems, functions and concepts of the various patents and applications described above to provide yet further embodiments of the invention.
  • [0043]
    These and other changes can be made to the invention in light of the above detailed description. In general, the terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification and the claims, but should be construed to include all systems that operate under the claims. Accordingly, the invention is not limited by the disclosure, but instead the scope of the invention is to be determined entirely by the claims.
  • [0044]
    While certain aspects of the invention are presented below in certain claim forms, the inventors contemplate the various aspects of the invention in any number of claim forms. Accordingly, the inventors reserve the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the invention.
Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US3475755 * | Apr 21, 1967 | Oct 28, 1969 | Us Army | Quarter wave-length ring antenna
US4062819 * | Sep 7, 1976 | Dec 13, 1977 | Emery Industries, Inc. | Polyamide blends having improved processing characteristics
US4063819 * | Aug 27, 1976 | Dec 20, 1977 | The United States Of America As Represented By The Secretary Of The Air Force | High energy laser pointing and tracking system utilizing beam angle/focus dither method of operation
US4405016 * | Jun 29, 1981 | Sep 20, 1983 | Smith International, Inc. | Underwater Christmas tree cap and lockdown apparatus
US4406016 * | Nov 27, 1981 | Sep 20, 1983 | The United States Of America As Represented By The Secretary Of The Army | VHF Sensor in-band radio relay
US4520674 * | Nov 14, 1983 | Jun 4, 1985 | Technology For Energy Corporation | Vibration monitoring device
US4649524 * | Aug 22, 1983 | Mar 10, 1987 | Potash Corporation Of Saskatchewan Mining Limited | Integrated acoustic network
US4812820 * | Jul 23, 1986 | Mar 14, 1989 | Chatwin Ian Malcolm | Electronic surveillance system and transceiver unit therefor
US4855713 * | Oct 7, 1988 | Aug 8, 1989 | Interactive Technologies, Inc. | Learn mode transmitter
US4951029 * | Feb 16, 1988 | Aug 21, 1990 | Interactive Technologies, Inc. | Micro-programmable security system
US5184311 * | Jun 19, 1990 | Feb 2, 1993 | At&T Bell Laboratories | Method of operating an installation that comprises one or more long electrical conductors
US5241542 * | Aug 23, 1991 | Aug 31, 1993 | International Business Machines Corporation | Battery efficient operation of scheduled access protocol
US5247564 * | Oct 24, 1990 | Sep 21, 1993 | Gte Mobile Communications Service Corp. | Adaptive vehicle alarm detection and reporting system
US5295154 * | May 3, 1993 | Mar 15, 1994 | Norand Corporation | Radio frequency local area network
US5377189 * | Dec 6, 1993 | Dec 27, 1994 | British Telecommunications Public Limited Company | Hybrid data communications systems
US5428636 * | May 7, 1993 | Jun 27, 1995 | Norand Corporation | Radio frequency local area network
US5475687 * | Jul 27, 1993 | Dec 12, 1995 | Echelon Corporation | Network and intelligent cell for providing sensing, bidirectional communications and control
US5553076 * | May 2, 1994 | Sep 3, 1996 | Tcsi Corporation | Method and apparatus for a wireless local area network
US5608643 * | Sep 1, 1994 | Mar 4, 1997 | General Programming Holdings, Inc. | System for managing multiple dispensing units and method of operation
US5615175 * | Sep 19, 1995 | Mar 25, 1997 | The United States Of America As Represented By The Secretary Of The Navy | Passive direction finding device
US5659195 * | Jun 8, 1995 | Aug 19, 1997 | The Regents Of The University Of California | CMOS integrated microsensor with a precision measurement circuit
US5701120 * | Jun 1, 1995 | Dec 23, 1997 | Siemens Business Communication Systems, Inc. | Partitioned point-to-point communications networks
US5726911 * | Aug 22, 1996 | Mar 10, 1998 | Csi Technology, Inc. | Electric motor monitor
US5729542 * | Nov 21, 1995 | Mar 17, 1998 | Motorola, Inc. | Method and apparatus for communication system access
US5732074 * | Jan 16, 1996 | Mar 24, 1998 | Cellport Labs, Inc. | Mobile portable wireless communication system
US5734699 * | May 4, 1995 | Mar 31, 1998 | Interwave Communications International, Ltd. | Cellular private branch exchanges
US5737529 * | Apr 12, 1996 | Apr 7, 1998 | Echelon Corporation | Networked variables
US5742829 * | Mar 10, 1995 | Apr 21, 1998 | Microsoft Corporation | Automatic software installation on heterogeneous networked client computer systems
US5745759 * | Oct 14, 1994 | Apr 28, 1998 | Qnx Software Systems, Ltd. | Window kernel
US5794164 * | Nov 29, 1995 | Aug 11, 1998 | Microsoft Corporation | Vehicle computer system
US5852351 * | Aug 20, 1997 | Dec 22, 1998 | Csi Technology | Machine monitor
US5854994 * | Aug 23, 1996 | Dec 29, 1998 | Csi Technology, Inc. | Vibration monitor and transmission system
US5889477 * | Mar 25, 1997 | Mar 30, 1999 | Mannesmann Aktiengesellschaft | Process and system for ascertaining traffic conditions using stationary data collection devices
US5907491 * | Apr 4, 1997 | May 25, 1999 | Csi Technology, Inc. | Wireless machine monitoring and communication system
US5946083 * | Oct 1, 1997 | Aug 31, 1999 | Texas Instruments Incorporated | Fixed optic sensor system and distributed sensor network
US5958009 * | Feb 27, 1997 | Sep 28, 1999 | Hewlett-Packard Company | System and method for efficiently monitoring quality of service in a distributed processing environment
US5973309 * | Aug 27, 1997 | Oct 26, 1999 | Trw Inc. | Target-tracking laser designation
US6009363 * | Jun 24, 1996 | Dec 28, 1999 | Microsoft Corporation | Vehicle computer system with high speed data buffer and serial interconnect
US6023223 * | Mar 18, 1999 | Feb 8, 2000 | Baxter, Jr.; John Francis | Early warning detection and notification network for environmental conditions
US6028537 * | Jun 13, 1997 | Feb 22, 2000 | Prince Corporation | Vehicle communication and remote control system
US6028857 * | Jul 25, 1997 | Feb 22, 2000 | Massachusetts Institute Of Technology | Self-organizing network
US6078269 * | Nov 10, 1997 | Jun 20, 2000 | Safenight Technology Inc. | Battery-powered, RF-interconnected detector sensor system
US6100925 * | Nov 25, 1997 | Aug 8, 2000 | Princeton Video Image, Inc. | Image insertion in video streams using a combination of physical sensors and pattern recognition
US6140957 * | Feb 18, 1999 | Oct 31, 2000 | Trimble Navigation Limited | Method and apparatus for navigation guidance
US6144905 * | Mar 18, 1998 | Nov 7, 2000 | Motorola, Inc. | Method for registering vehicular bus functionality
US6145082 * | Mar 20, 1998 | Nov 7, 2000 | Motorola, Inc. | Method for a vehicular gateway to transport information, including a method for programming the gateway
US6175789 * | Sep 10, 1999 | Jan 16, 2001 | Microsoft Corporation | Vehicle computer system with open platform architecture
US6181994 * | Apr 7, 1999 | Jan 30, 2001 | International Business Machines Corporation | Method and system for vehicle initiated delivery of advanced diagnostics based on the determined need by vehicle
US6185491 * | Jul 31, 1998 | Feb 6, 2001 | Sun Microsystems, Inc. | Networked vehicle controlling attached devices using JavaBeans™
US6202008 * | Sep 10, 1999 | Mar 13, 2001 | Microsoft Corporation | Vehicle computer system with wireless internet connectivity
US6208247 * | Aug 18, 1998 | Mar 27, 2001 | Rockwell Science Center, Llc | Wireless integrated sensor network using multiple relayed communications
US6245013 * | Jan 27, 1999 | Jun 12, 2001 | Medtronic, Inc. | Ambulatory recorder having synchronized communication between two processors
US6246935 * | Dec 28, 1998 | Jun 12, 2001 | Daimlerchrysler Corporation | Vehicle instrument panel computer interface and display
US6252544 * | Jan 25, 1999 | Jun 26, 2001 | Steven M. Hoffberg | Mobile communication device
US6330562 * | Jan 29, 1999 | Dec 11, 2001 | International Business Machines Corporation | System and method for managing security objects
US6392692 * | Feb 25, 1999 | May 21, 2002 | David A. Monroe | Network communication techniques for security surveillance and safety system
US6400281 * | Mar 17, 1998 | Jun 4, 2002 | Albert Donald Darby, Jr. | Communications system and method for interconnected networks having a linear topology, especially railways
US6414955 * | Mar 23, 1999 | Jul 2, 2002 | Innovative Technology Licensing, Llc | Distributed topology learning method and apparatus for wireless networks
US6452910 * | Jul 20, 2000 | Sep 17, 2002 | Cadence Design Systems, Inc. | Bridging apparatus for interconnecting a wireless PAN and a wireless LAN
US6480900 * | Dec 28, 1998 | Nov 12, 2002 | Bull, S.A. | Communication method in a set of distributed systems via an internet type network
US6505086 * | Aug 13, 2001 | Jan 7, 2003 | William A. Dodd, Jr. | XML sensor system
US6532494 * | May 28, 1999 | Mar 11, 2003 | Oracle International Corporation | Closed-loop node membership monitor for network clusters
US6545601 * | Feb 25, 1999 | Apr 8, 2003 | David A. Monroe | Ground based security surveillance system for aircraft and other commercial vehicles
US6546419 * | May 7, 1999 | Apr 8, 2003 | Richard Humpleman | Method and apparatus for user and device command and control in a network
US6580979 * | Jul 10, 2001 | Jun 17, 2003 | Hrl Laboratories, Llc | Method and apparatus for terrain reasoning with distributed embedded processing elements
US6584382 * | May 17, 2001 | Jun 24, 2003 | Abraham E. Karem | Intuitive vehicle and machine control
US6615088 * | Jun 9, 1999 | Sep 2, 2003 | Amx Corporation | System and method of device interface configuration for a control system
US6640145 * | Jun 3, 2002 | Oct 28, 2003 | Steven Hoffberg | Media recording device with packet data interface
US6662091 * | Dec 20, 2001 | Dec 9, 2003 | Battelle Memorial Institute | Diagnostics/prognostics using wireless links
US6728514 * | Dec 13, 2000 | Apr 27, 2004 | Wi-Lan Inc. | Scalable wireless network topology systems and methods
US6735630 * | Oct 4, 2000 | May 11, 2004 | Sensoria Corporation | Method for collecting data using compact internetworked wireless integrated network sensors (WINS)
US6751455 * | Sep 15, 2000 | Jun 15, 2004 | The Regents Of The University Of California | Power- and bandwidth-adaptive in-home wireless communications system with power-grid-powered agents and battery-powered clients
US6801662 * | Oct 10, 2000 | Oct 5, 2004 | Hrl Laboratories, Llc | Sensor fusion architecture for vision-based occupant detection
US6813542 * | Feb 12, 2001 | Nov 2, 2004 | The Stanley Works | Modules for use in an integrated intelligent assist system
US6826607 * | Oct 4, 2000 | Nov 30, 2004 | Sensoria Corporation | Apparatus for internetworked hybrid wireless integrated network sensors (WINS)
US6832251 * | Oct 4, 2000 | Dec 14, 2004 | Sensoria Corporation | Method and apparatus for distributed signal processing among internetworked wireless integrated network sensors (WINS)
US6859831 * | Oct 4, 2000 | Feb 22, 2005 | Sensoria Corporation | Method and apparatus for internetworked wireless integrated network sensor (WINS) nodes
US6911997 * | Oct 12, 2000 | Jun 28, 2005 | Matsushita Electric Industrial Co., Ltd. | Monitoring system, camera adjusting method and vehicle monitoring system
US6990080 * | Aug 7, 2001 | Jan 24, 2006 | Microsoft Corporation | Distributed topology control for wireless multi-hop sensor networks
US7020701 * | Oct 4, 2000 | Mar 28, 2006 | Sensoria Corporation | Method for collecting and processing data using internetworked wireless integrated network sensors (WINS)
US7027773 * | May 24, 2000 | Apr 11, 2006 | Afx Technology Group International, Inc. | On/off keying node-to-node messaging transceiver network with dynamic routing and configuring
US7049953 * | Dec 16, 2002 | May 23, 2006 | E-Watch, Inc. | Ground based security surveillance system for aircraft and other commercial vehicles
US7069188 * | May 12, 2004 | Jun 27, 2006 | Eye On Solutions, Llc | Information management system
US7161926 * | Jul 3, 2002 | Jan 9, 2007 | Sensoria Corporation | Low-latency multi-hop ad hoc wireless network
US7468661 * | Mar 31, 2006 | Dec 23, 2008 | Hunt Technologies, Inc. | System and method for monitoring and controlling remote devices
US7797367 * | Oct 4, 2000 | Sep 14, 2010 | Gelvin David C | Apparatus for compact internetworked wireless integrated network sensors (WINS)
US20020036750 * | Sep 22, 2001 | Mar 28, 2002 | Eberl Heinrich A. | System and method for recording the retinal reflex image
US20020067475 * | Oct 25, 2001 | Jun 6, 2002 | Reinhard Waibel | Optoelectronic laser distance-measuring instrument
US20020111213 * | Feb 13, 2001 | Aug 15, 2002 | Mcentee Robert A. | Method, apparatus and article for wagering and accessing casino services
US20020154631 * | Jul 31, 1998 | Oct 24, 2002 | Tarek Makansi | Method and apparatus for transmitting messages
US20040006424 * | Jun 30, 2003 | Jan 8, 2004 | Joyce Glenn J. | Control system for tracking and targeting multiple autonomous objects
US20040008651 * | Jan 28, 2003 | Jan 15, 2004 | Osman Ahmed | Building system with reduced wiring requirements and apparatus for use therein
US20040049428 * | Sep 5, 2002 | Mar 11, 2004 | Soehnlen John Pius | Wireless environmental sensing in packaging applications
US20050267638 * | Jul 13, 2005 | Dec 1, 2005 | The Stanley Works | System and architecture for providing a modular intelligent assist system
US20060083217 * | Sep 2, 2005 | Apr 20, 2006 | Keun-Ah Bae | Element management method and system in multiple networks
US20100148940 * | Feb 22, 2010 | Jun 17, 2010 | Gelvin David C | Apparatus for internetworked wireless integrated network sensors (wins)
US20100201516 * | Apr 12, 2010 | Aug 12, 2010 | Gelvin David C | Apparatus for Compact Internetworked Wireless Integrated Network Sensors (WINS)
US20110029644 * | Oct 13, 2010 | Feb 3, 2011 | Gelvin David C | Method for Vehicle Internetworks
US20110035491 * | Oct 21, 2010 | Feb 10, 2011 | Gelvin David C | Method for Internetworked Hybrid Wireless Integrated Network Sensors (WINS)
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US7844687 | Oct 4, 2000 | Nov 30, 2010 | Gelvin David C | Method for internetworked hybrid wireless integrated network sensors (WINS)
US7873043 | Jul 25, 2008 | Jan 18, 2011 | Dust Networks, Inc. | Digraph network superframes
US7881239 | May 10, 2005 | Feb 1, 2011 | Dust Networks, Inc. | Low-powered autonomous radio node with temperature sensor and crystal oscillator
US7891004 | Oct 4, 2000 | Feb 15, 2011 | Gelvin David C | Method for vehicle internetworks
US7904569 | Oct 4, 2000 | Mar 8, 2011 | Gelvin David C | Method for remote access of vehicle components
US7961664 | Jun 13, 2005 | Jun 14, 2011 | Dust Networks, Inc. | Digraph network subnetworks
US8059629 | Jun 13, 2005 | Nov 15, 2011 | Dust Networks, Inc. | Digraph network timing synchronization
US8079118 | Oct 13, 2010 | Dec 20, 2011 | Borgia/Cummins, Llc | Method for vehicle internetworks
US8090264 * | Nov 24, 2008 | Jan 3, 2012 | The Boeing Company | Architecture for enabling network centric communications, sensing, computation, and information assurance
US8140658 | Oct 4, 2000 | Mar 20, 2012 | Borgia/Cummins, Llc | Apparatus for internetworked wireless integrated network sensors (WINS)
US8193936 | Dec 9, 2009 | Jun 5, 2012 | Solarbeam Security, Llc | Solar powered security system
US8194655 * | Aug 5, 2004 | Jun 5, 2012 | Dust Networks, Inc. | Digraph based mesh communication network
US8364136 | Sep 23, 2011 | Jan 29, 2013 | Steven M Hoffberg | Mobile system, a method of operating mobile system and a non-transitory computer readable medium for a programmable control of a mobile system
US8369967 | Mar 7, 2011 | Feb 5, 2013 | Hoffberg Steven M | Alarm system controller and a method for controlling an alarm system
US8416120 * | Feb 17, 2009 | Apr 9, 2013 | Sungkyunkwan University Foundation For Corporate Collaboration | Method of sensor network localization through reconstruction of radiation pattern
US8447847 * | Jun 28, 2007 | May 21, 2013 | Microsoft Corporation | Control of sensor networks
US8601595 | Dec 1, 2011 | Dec 3, 2013 | Borgia/Cummins, Llc | Method for vehicle internetworks
US8812654 | Oct 21, 2010 | Aug 19, 2014 | Borgia/Cummins, Llc | Method for internetworked hybrid wireless integrated network sensors (WINS)
US8829821 | Mar 1, 2013 | Sep 9, 2014 | Cree, Inc. | Auto commissioning lighting fixture
US8832244 | Feb 22, 2010 | Sep 9, 2014 | Borgia/Cummins, Llc | Apparatus for internetworked wireless integrated network sensors (WINS)
US8836503 | Apr 12, 2010 | Sep 16, 2014 | Borgia/Cummins, Llc | Apparatus for compact internetworked wireless integrated network sensors (WINS)
US8892495 | Jan 8, 2013 | Nov 18, 2014 | Blanding Hovenweep, Llc | Adaptive pattern recognition based controller apparatus and method and human-interface therefore
US8912735 | Mar 1, 2013 | Dec 16, 2014 | Cree, Inc. | Commissioning for a lighting network
US8975827 | Mar 1, 2013 | Mar 10, 2015 | Cree, Inc. | Lighting fixture for distributed control
US9155165 | Mar 1, 2013 | Oct 6, 2015 | Cree, Inc. | Lighting fixture for automated grouping
US9155166 | Mar 1, 2013 | Oct 6, 2015 | Cree, Inc. | Efficient routing tables for lighting networks
US9247216 * | Jul 20, 2010 | Jan 26, 2016 | Verint Systems Ltd. | Systems and methods for video- and position-based identification
US9338858 | Mar 1, 2013 | May 10, 2016 | Cree, Inc. | Handheld device for communicating with lighting fixtures
US9433061 | Mar 1, 2013 | Aug 30, 2016 | Cree, Inc. | Handheld device for communicating with lighting fixtures
US9456482 | Apr 8, 2015 | Sep 27, 2016 | Cree, Inc. | Daylighting for different groups of lighting fixtures
US20050213612 * | May 10, 2005 | Sep 29, 2005 | Dust Networks | Low-powered autonomous radio node with temperature sensor and crystal
US20060029060 * | Aug 5, 2004 | Feb 9, 2006 | Dust Networks | Digraph based mesh communication network
US20080285582 * | Jul 25, 2008 | Nov 20, 2008 | Dust Networks, Inc. | Digraph network superframes
US20090006589 * | Jun 28, 2007 | Jan 1, 2009 | Microsoft Corporation | Control of sensor networks
US20100085242 * | Feb 17, 2009 | Apr 8, 2010 | Sungkyunkwan University Foundation For Corporate Collaboration | Method of sensor network localization through reconstruction of radiation pattern
US20100127852 * | Nov 24, 2008 | May 27, 2010 | Hunt Jeffrey H | Architecture for enabling network centric communications, sensing, computation, and information assurance
US20100194565 * | Dec 9, 2009 | Aug 5, 2010 | Robert Houston | Solar powered security system
US20110018995 * | Jul 20, 2010 | Jan 27, 2011 | Verint Systems Ltd. | Systems and methods for video- and position-based identification
USD744669 | Apr 22, 2013 | Dec 1, 2015 | Cree, Inc. | Module for a lighting fixture
Classifications
U.S. Classification: 370/338
International Classification: H04Q7/24, G01S13/86, H04L29/06, H04L29/08, G01S13/72, G01S13/87, G01S17/87, G01S17/66, G01S19/19, G01S5/14
Cooperative Classification: H04L67/04, H04L67/12, G01S13/72, G01S13/87, G01S17/87, G01S19/14, H04L29/06, G01S17/66, G01S13/867, G01S17/023
European Classification: G01S17/87, H04L29/08N3, H04L29/06, G01S13/72, G01S13/87, H04L29/08N11, G01S13/86, G01S17/66
Legal Events
Date | Code | Event | Description
Nov 15, 2007 | AS | Assignment
Owner name: BORGIA/CUMMINS, LLC, DELAWARE
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SENSORIA CORPORATION;REEL/FRAME:020120/0802
Effective date: 20060310
Owner name: SENSORIA CORPORATION, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KAISER, WILLIAM J.;NEWBERG, FREDRIC;POTTIE, GREGORY J.;REEL/FRAME:020120/0785;SIGNING DATES FROM 20030319 TO 20030320
Jul 23, 2010 | AS | Assignment
Owner name: BORGIA/CUMMINS, LLC, DELAWARE
Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE SPELLING OF THE ASSIGNEE S CITY OF RESIDENCE ERRONEOUSLY SPELLED "WILLINGTON" PREVIOUSLY RECORDED ON REEL 020120 FRAME 0802. ASSIGNOR(S) HEREBY CONFIRMS THE CORRECT SPELLING OF THE ASSIGNEE S CITY OF RESIDENCE IS "WILMINGTON";ASSIGNOR:SENSORIA CORPORATION;REEL/FRAME:024743/0755
Effective date: 20060310