|Publication number||US7096120 B2|
|Application number||US 10/635,869|
|Publication date||Aug 22, 2006|
|Filing date||Aug 5, 2003|
|Priority date||Aug 6, 2002|
|Also published as||US20040030491|
|Publication number||10635869, 635869, US 7096120 B2, US 7096120B2, US-B2-7096120, US7096120 B2, US7096120B2|
|Original Assignee||Hewlett-Packard Development Company, L.P.|
|Export Citation||BiBTeX, EndNote, RefMan|
|Patent Citations (50), Non-Patent Citations (24), Referenced by (9), Classifications (13), Legal Events (5)|
|External Links: USPTO, USPTO Assignment, Espacenet|
The present invention relates to a method and arrangement for guiding a user along a target path such as, for example, a path through an exhibition space.
In many mobile computing applications, there may be a requirement that users follow a particular path through a physical space. However, the physical space may be devoid of physical signs to indicate a specified path through that space. There are many uses of audio to guide navigation, including the use of audio beacons to attract users to their source, and the use of sonar to indicate obstacles ahead. A system of audio cues known as the “Oboe” system was also used in the Second World War to guide the pilots of RAF (British Royal Air Force) bombers to targets; in this system, monaural audio cues were presented to the pilot through headphones and represented three ternary states, namely: turn left, turn right, and straight ahead.
It is an object of the present invention to provide sound-based cues for guiding a user along a target path.
According to a first aspect of the present invention, there is provided a method of guiding a user along a target path, comprising the steps of:
According to a second aspect of the present invention, there is provided an arrangement for guiding a user along a target path, the arrangement comprising:
According to a third aspect of the present invention, there is provided a method of guiding a user along a target path, comprising the steps of:
According to a fourth aspect of the present invention, there is provided a method of guiding a user along a target path, comprising the steps of:
According to a fifth aspect of the present invention, there is provided a method of guiding a user along a target path, comprising the steps of:
Embodiments of the invention will now be described, by way of non-limiting example, with reference to the accompanying diagrammatic drawings, in which:
On entering the exhibition hall 10, a user 30 collects a mobile device 31 from the reception desk 18 (or the user may have their own device). This device 31 cooperates with location-related infrastructure to permit the location of the user in the hall 10 to be determined. A number of techniques exist for enabling the location of the user to be determined with reasonable accuracy and any such technique can be used; in the present example, the technique used is based on an array of ultrasonic emitters 33 (represented in
The exhibition hall is equipped with a wireless LAN infrastructure 36 comprising a distribution system and access points 37. The wireless LAN has a coverage encompassing substantially all of the hall 10, the boundary of the coverage being indicated by chain-dashed line 38 in
It will be appreciated that communication between the device 31 and service system 35 can be effected by any suitable means and is not limited to being a wireless LAN.
Much of the foregoing functionality will typically be provided by a program-controlled general-purpose processor, though other implementations are, of course, possible.
The visit data held by memory 44 will typically include user/device profile data (for example, indicating the subjects of interest to the user, the intended visit duration, and the media types that can be handled by the device), an electronic map of the hall 10, the user's current location as determined by the subsystem 40, and details of a planned route being followed by the user.
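The visit data enumerated above can be pictured as a simple record; the sketch below is illustrative only, and all field names are assumptions rather than anything specified in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class VisitData:
    """Illustrative layout of the visit data held in device memory.

    All names here are assumptions made for illustration; the patent
    does not specify a storage format.
    """
    user_profile: dict                    # subjects of interest, intended duration, media types
    hall_map: dict                        # electronic map of the hall
    current_location: tuple = (0.0, 0.0)  # latest fix from the location subsystem
    planned_route: list = field(default_factory=list)  # waypoints of the planned route
```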
The service system 35 comprises the following main functional elements:
The functional elements of the service system 35 can be configured as a set of servers all connected to the LAN 51, or be arranged in any other suitable manner as will be apparent to persons skilled in the art.
It is to be understood that the split of functionality between the mobile device 31 and service subsystem 35 can be varied substantially from that indicated for the
In general terms, a user starting a visit can request a route to follow using the user interface 48 of the mobile device 31 to indicate parameters to be satisfied by the route. This route request is sent by the visit manager to route planner 59 and results in the download to the mobile device 31 of a planned route. The path guide 49 then provides the user (typically, though not necessarily, only when asked) with guide indications to assist the user in following the planned route. Where the interface 48 includes a visual display, this can conveniently be done by displaying a map showing the user's current location and the planned route; where only an audio interface is available, it can be done by audio cues indicating the direction to follow. A user need not request a planned route, in which case no guide indications are received. A user may request a route plan at any stage of a visit (for example, a route to an exhibit of interest).
As the user moves through the hall, the location determination subsystem 40 sends periodic location reports (see
Having described the general operation of the mobile device 31 and service system 35, a more detailed description will now be given of how a user can be guided along a route by an audio-based implementation of the path guide unit 49 of device 31.
However a route is determined by the route planner 59, details of the planned route are passed back to the mobile device 31 for storage in the memory 43. Alternatively, a route to follow may have been determined in the device itself, for example by the user specifying on the stored map locations to be visited and the visit manager 47 locally determining the shortest path between these locations. Typically, the route will have been specified by a series of locations defining a path. The path guide unit 49 is operative to use these stored details to provide guidance to the user for following the path concerned. Whilst the path guide unit 49 can be arranged to use a visual display of user interface 48 to provide this guidance, an audio-based embodiment of unit 49 is described below for using non-verbal audio output to guide a user along a specified path (referred to below as the “target” path).
In the audio-based embodiment of the path guide unit 49, a 3D audio spatialisation processor is used to project a virtual audio beacon ahead of the user along the target path. As the user approaches (or arrives at) the location of the virtual beacon, the beacon is repositioned further along the path. This process repeats until the user has traversed the entire path. The user is preferably guided along the whole length of the path, both where the path is not physically defined (for example, in the middle of a hall) and where the path is physically defined (for example, in a corridor). The process is illustrated in
When the user is positioned at start point 121 and the unit activated for guiding the user along target path 120, the unit determines at least a first segment 123A of the piecewise linear approximation to the target, this approximation being generated according to a heuristic which, for example, both keeps the area between this first segment and the target path to below a predetermined limit value, and keeps the length of the segment to no more than a predetermined maximum length. After determining this first segment 123A, the unit 49 determines a position 125A for a virtual sound beacon such that it lies a fixed distance “K” beyond the end of the first segment 123A in the direction of extent of the latter. The unit 49 then uses its 3D audio spatialisation processor to produce a world-stabilised virtual sound beacon at this position 125A in the sound field of the user, the output of the 3D audio spatialisation processor being via stereo headphones 60 or other suitable audio output devices (such as the shoulder mounted speakers previously mentioned). Also as previously mentioned, in order to render the virtual beacon in a world stabilized position, the unit 49 is provided with the direction of facing of the user's head or body, depending on where the audio output devices are carried. An electronic compass mounted with the audio output devices can be used to provide the required direction of facing data to the unit 49.
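By way of a sketch (the patent leaves the exact heuristic open), the segment fitting and beacon placement described above might be implemented as follows; the triangle-area deviation measure, the function names, and the thresholds are all assumptions.

```python
import math

def tri_area(a, b, c):
    # Unsigned area of the triangle a-b-c (shoelace formula).
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def chord_deviation(path, j):
    # Rough measure of the area between the chord path[0] -> path[j]
    # and the intermediate path points (an assumed heuristic).
    return sum(tri_area(path[0], path[j], path[i]) for i in range(1, j))

def first_segment_end(path, max_area, max_len):
    """Index of the farthest path point that can end the first linear
    segment while keeping both the area deviation and the segment
    length within their predetermined limits."""
    end = 1
    for j in range(1, len(path)):
        if math.dist(path[0], path[j]) > max_len:
            break
        if chord_deviation(path, j) > max_area:
            break
        end = j
    return end

def beacon_position(seg_start, seg_end, k):
    """Place the virtual beacon a fixed distance K beyond the segment
    end, in the segment's direction of extent."""
    dx, dy = seg_end[0] - seg_start[0], seg_end[1] - seg_start[1]
    d = math.hypot(dx, dy) or 1.0
    return (seg_end[0] + k * dx / d, seg_end[1] + k * dy / d)
```

A real implementation would pass the resulting position to the 3D audio spatialisation processor together with the direction-of-facing data from the electronic compass, in order to render the beacon world-stabilised.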
As a result, regardless of the direction of facing of the user, the user is provided with a sound beacon positioned in space to indicate the direction in which the user should move to follow the target path 120.
The user now sets off towards the position 125A of the virtual sound beacon. Upon the user approaching to within distance “K” of the sound beacon, the above process is repeated with the user's current position being taken as the start of the target path (whether or not actually on the target path). Thus, a second linear piecewise approximation segment 123B is determined and the virtual sound beacon is re-positioned to appear to be at a location 125B a distance “K” beyond this newly-determined segment in the direction of extent of the latter.
In this manner, the virtual sound beacon is moved in a succession of steps to guide the user along a piecewise linear approximation of the target path until the user reaches the path end point 122.
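The stepping behaviour just described can be sketched as below; for simplicity, this version precomputes a fixed chain of beacon positions from the path waypoints instead of re-planning from the user's actual position at each step, and all names are illustrative.

```python
import math

def beacon_chain(start, waypoints, k):
    """Beacon positions, each lying a distance K beyond its waypoint
    along the direction of the segment leading to that waypoint."""
    beacons, prev = [], start
    for w in waypoints:
        d = math.dist(prev, w) or 1.0
        beacons.append((w[0] + k * (w[0] - prev[0]) / d,
                        w[1] + k * (w[1] - prev[1]) / d))
        prev = w
    return beacons

def active_beacon(user, beacons, idx, k):
    """Advance to the next beacon whenever the user approaches to
    within distance K of the current one; the last beacon marks the
    path end point."""
    while idx < len(beacons) - 1 and math.dist(user, beacons[idx]) <= k:
        idx += 1
    return idx
```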
It should be noted that where the target path includes a long straight section, this will be split up into several segments by the above process in order to keep the segment length down to the aforesaid predetermined maximum length, thereby ensuring that the virtual sound beacon is never more than that distance plus the value “K” beyond the user. This is illustrated in
Many variants are possible to the above-described embodiment of the path guide unit 49. For example, the distance “K” may have a value of zero.
Furthermore, the unit 49 can advantageously be arranged to check that no obstruction exists between the current location of the user and the position, or proposed position, of the sound beacon. This check is made using the electronic map of the hall 10 held in the visit data memory 43, obstructions (or what are to be treated as obstructions) being marked on this map. If the unit 49 determines that the straight line path from the user to the sound beacon passes through an obstruction, the unit 49 modifies the position (or proposed position) of the beacon until this is no longer the case. This check can be carried out either simply when the beacon is first positioned or on a continuous basis each time the user's location is determined. Of course, rather than a check being carried out in respect of a particular location of the audio beacon, the position of the latter can be chosen such that it lies outside a “dead zone” of locations formed by the shadow of the obstruction (considering the user as a point “light source”).
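A minimal form of this obstruction check, assuming obstructions are stored on the electronic map as polygons, can use a standard segment-intersection test; the function names and the polygon representation are assumptions.

```python
def _ccw(a, b, c):
    # Signed cross product: >0 if a-b-c turns counter-clockwise.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, q1, q2):
    """True if segment p1-p2 properly intersects segment q1-q2."""
    d1, d2 = _ccw(q1, q2, p1), _ccw(q1, q2, p2)
    d3, d4 = _ccw(p1, p2, q1), _ccw(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def beacon_obstructed(user, beacon, obstacles):
    """Check whether the straight line from the user to the beacon
    passes through any obstruction marked on the map; each obstacle
    is given as a list of polygon vertices."""
    for poly in obstacles:
        n = len(poly)
        for i in range(n):
            if segments_cross(user, beacon, poly[i], poly[(i + 1) % n]):
                return True
    return False
```

When this returns true, the unit would nudge the proposed beacon position until the line of sight clears, or pick a position outside the obstruction's “dead zone” as described above.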
In another variant, rather than providing only a single virtual sound beacon, one or more further beacons can be provided beyond the first beacon, for example, at positions at the end of (or a distance “K” beyond) second and third segments of the piecewise linear approximation of the target path; in this case, the maximum length of each path segment will typically be less than (for example, half) that used where only a single virtual beacon is presented. For example, and as illustrated in
The beacons can be caused to vary in intensity, frequency, or some other audible characteristic to indicate the order in which they should be approached, each beacon sounding in turn (potentially with overlap) in a cyclic manner, with the farther beacons sounding quieter. In the
The virtual beacons are rendered to appear static and as the user approaches or reaches the first, it is removed and replaced by a new distant virtual sound beacon, last in the series of three beacons, positioned at the end of a further piecewise linear approximation segment. The new beacon may be caused to appear just before, at the same time as, or just after the first beacon is caused to disappear. This process of replacing beacons as they are approached or reached is repeated as the user moves along the target path.
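The cyclic sounding order with quieter further beacons could be driven by a simple schedule of (beacon index, start time, gain) tuples; the slot length and gain falloff below are illustrative assumptions, not values from the patent.

```python
def beacon_schedule(num_beacons=3, slot=0.5, base_gain=1.0, falloff=0.5):
    """One cycle of the sounding order: beacons sound in turn, with
    each further beacon along the path rendered progressively quieter.

    Returns (beacon_index, start_time_s, gain) tuples for one cycle.
    """
    return [(i, i * slot, base_gain * falloff ** i)
            for i in range(num_beacons)]
```

A renderer would loop this cycle (optionally overlapping slots), dropping the first entry and appending a new quietest one each time a beacon is approached and replaced.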
The above-described check for obstructions can, of course, also be carried out where multiple beacons are being used.
As a further general variant, it is possible, though not preferred, to arrange for the beacon or beacons to remain a constant distance ahead of the user, at least over a substantial portion of the target path, as the user seeks to move towards them.
As already noted, the distribution of functionality between mobile devices and the service system is not limited to the distributions described above, since the availability of communication resources makes it possible to place functionality where most suitable from technical and commercial considerations. Furthermore, in the foregoing, reference to a mobile device is not to be construed as requiring functionality housed in a single unit; the functionality associated with a mobile device can be provided by a local aggregation of units.
The above described methods and arrangements are not limited to use in exhibition halls or similar public or private buildings; the methods and arrangements disclosed can be applied not only to internal spaces but also to external spaces or combinations of the two.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US1942327 *||Jun 24, 1933||Jan 2, 1934||Aircraft Radio Corp||Radioreceiver|
|US2107155 *||Jul 27, 1935||Feb 1, 1938||Bell Telephone Labor Inc||Radio directional indicator|
|US2489248 *||Sep 1, 1943||Nov 29, 1949||Sperry Corp||Navigation system|
|US2726039 *||Nov 27, 1945||Dec 6, 1955||Richard K Mosher||Beacon navigation system|
|US2784307 *||Aug 25, 1952||Mar 5, 1957||Flite Tronics Inc||Marker beacon receiver|
|US3108223 *||Oct 23, 1961||Oct 22, 1963||Arthur Leinwohl||Miniature radio beacon apparatus|
|US3161881 *||Sep 7, 1962||Dec 15, 1964||Beacon locator system|
|US3247464 *||Sep 8, 1961||Apr 19, 1966||Rca Corp||Audio amplifier including volume compression means|
|US4001828 *||Apr 2, 1975||Jan 4, 1977||Texas Instruments Incorporated||Beacon tracking receiver|
|US4021807 *||Apr 2, 1975||May 3, 1977||Texas Instruments Incorporated||Beacon tracking system|
|US4023176 *||Apr 2, 1975||May 10, 1977||Texas Instruments Incorporated||Beacon tracking display system|
|US4106023 *||Feb 24, 1975||Aug 8, 1978||Baghdady Elie J||Navigation aid system|
|US4819053 *||Apr 17, 1987||Apr 4, 1989||Halavais Richard A||Single-point locating system|
|US4991126||May 13, 1987||Feb 5, 1991||Lothar Reiter||Electronic-automatic orientation device for walkers and the blind|
|US5299227 *||Apr 13, 1993||Mar 29, 1994||American Electronics, Inc.||Individual beacon identification system|
|US5334987||Apr 1, 1993||Aug 2, 1994||Spectra-Physics Laserplane, Inc.||Agricultural aircraft control system using the global positioning system|
|US5577961 *||Jun 28, 1994||Nov 26, 1996||The Walt Disney Company||Method and system for restraining a leader object in a virtual reality presentation|
|US5689270 *||Mar 12, 1996||Nov 18, 1997||Pinterra Corporation||Navigation and positioning system and method using uncoordinated beacon signals|
|US5889843||Mar 4, 1996||Mar 30, 1999||Interval Research Corporation||Methods and systems for creating a spatial auditory environment in an audio conference system|
|US6133867 *||Dec 28, 1998||Oct 17, 2000||Eberwine; David Brent||Integrated air traffic management and collision avoidance system|
|US6314406 *||Aug 29, 1997||Nov 6, 2001||Telxon Corporation||Customer information network|
|US6405107 *||Jan 11, 2001||Jun 11, 2002||Gary Derman||Virtual instrument pilot: an improved method and system for navigation and control of fixed wing aircraft|
|US6490513 *||Aug 22, 2001||Dec 3, 2002||Matsushita Electrical Industrial Co., Ltd.||Automobile data archive system having securely authenticated instrumentation data storage|
|US6513015 *||Sep 25, 1998||Jan 28, 2003||Fujitsu Limited||System and method for customer recognition using wireless identification and visual data transmission|
|US6539393 *||Sep 30, 1999||Mar 25, 2003||Hill-Rom Services, Inc.||Portable locator system|
|US6594044 *||Mar 15, 2000||Jul 15, 2003||Lucent Technologies Inc.||Apparatus and method for automatic port identity discovery in heterogenous optical communications systems|
|US6741856 *||May 9, 2001||May 25, 2004||Vesuvius Inc.||Communique system for virtual private narrowcasts in cellular communication networks|
|US6963795 *||Jul 16, 2002||Nov 8, 2005||Honeywell International Inc.||Vehicle position keeping system|
|US7009980 *||Mar 13, 2000||Mar 7, 2006||Lucent Technologies Inc.||Apparatus and method for automatic port identity discovery in hierarchical heterogenous systems|
|US20020017989 *||Jul 3, 2001||Feb 14, 2002||Forster Ian J.||Deactivation of field-emitting electronic device upon detection of a transportation vessel|
|US20020037716 *||May 9, 2001||Mar 28, 2002||Vesuvius, Inc.||Communique system for virtual private narrowcasts in cellular communication networks|
|US20020165731 *||Mar 11, 2002||Nov 7, 2002||Sentinel Wireless, Llc||System and method for performing object association at a tradeshow using a location tracking system|
|US20020174021 *||May 15, 2001||Nov 21, 2002||International Business Machines Corporation||Optimized shopping list process|
|US20030086562||Nov 8, 2001||May 8, 2003||Wong John Patrick||Hands-free speaker telephone|
|US20030223602||Jun 4, 2002||Dec 4, 2003||Elbit Systems Ltd.||Method and system for audio imaging|
|US20040013232||Jul 17, 2002||Jan 22, 2004||Jeffrey Rahn||Pixel circuitry for imaging system|
|US20040030491 *||Aug 5, 2003||Feb 12, 2004||Hewlett-Packard Development Company, L.P.||Method and arrangement for guiding a user along a target path|
|US20050120200 *||Dec 27, 2004||Jun 2, 2005||Cyril Brignone||Limiting access to information corresponding to a context|
|US20050151662 *||Oct 22, 2004||Jul 14, 2005||Douglas Kashuba||Avalanche transceiver|
|CN1600640A *||Mar 20, 2000||Mar 30, 2005||弗朗索瓦·贝尔纳德||Appts for deploying load to underwater target position with enhanced accuracy and method to control such appts.|
|EP0503214A1||Mar 11, 1991||Sep 16, 1992||CONTRAVES ITALIANA S.p.A.||Orientation device|
|GB2219658A||Title not available|
|GB2287535A||Title not available|
|GB2382288A||Title not available|
|JPH0719887A||Title not available|
|JPH0757190A||Title not available|
|WO1997043599A1||May 13, 1997||Nov 20, 1997||Rockwell-Collins France||Personal direction finding apparatus|
|WO1999067904A1||Jun 23, 1998||Dec 29, 1999||Swisscom Ag||Display device for displaying received information and for transmitting received digital data to external mobile apparatuses and corresponding request method|
|WO2001035600A2||Oct 26, 2000||May 17, 2001||Kaplan Richard D||Method and apparatus for web enabled wireless tour-guide system|
|WO2001055833A1||Jan 29, 2001||Aug 2, 2001||Lake Technology Limited||Spatialized audio system for use in a geographical environment|
|1||"Learning's in the air: museums, microcosms, and the future of the mobile net," Mpulse: A Cooltown Magazine, INTERNET: <http://www.cooltown.com/mpulse/0901-museums.asp?print=yes> 3 pages total (Mar. 13, 2002).|
|2||"Radar Recollections - A Bournemouth University / ChiDE / HLF project," INTERNET: <http://chide.bournemouth.ac.uk/Oral_History/Talking_About_Technology/radar_research/glossary.html> 10 pages total (Jul. 18, 2003).|
|3||*||Azuma et al., Advanced human-computer interfaces for air traffic management and simulation, American Institute of Aeronautics and Astronautics, 1996.|
|4||Bederson, B.B., "Audio Augmented Reality: A Prototype Automated Tour Guide," ACM Human Computer in Computing Systems conference (CHI '95), pp. 210-211, INTERNET: <http://www.cs.umd.edu/~bederson/papers/chi-95-aar/index.html> 4 pages total (Feb. 2, 2002).|
|5||*||Benjamin B. Bederson, Audio augmented reality: a prototype automated tour guide, Feb. 21, 2002 from http://www.cs.umd.edu/ (4 pages).|
|6||*||Dorfmuller et al., Real-time hand and head tracking for virtual environments using infrared beacons, published in 1998 (from Dialog(R) File 34, acc. No. 08029802).|
|7||*||Engel, Image features as virtual beacons for local navigation, Proc. of the SPIE, vol. 1002, p. 626-33, published in 1989 (from Dialog(R) File 2, acc. No. 04418939).|
|8||*||Mirjana Spasojevic et al., A study of an augmented museum experience, Internet and Mobile Systems Laboratory, Hewlett-Packard Laboratories, Palo Alto, CA 94304, USA (6 pages).|
|9||*||Naomi Ehrich Leonard et al., Virtual leaders, artificial potentials and coordinated control of groups, Proc. 40th IEEE Conf. Decision & Control, 2001, p. 2968-2973.|
|10||*||Naomi Ehrich Leonard, Optimization & Systems Theory Seminar, the abstract, Apr. 27, 2001 (1 page).|
|11||*||Ogren et al., Obstacle avoidance in formation, Proc. IEEE International Conference on Robotics & Automation, 2003 (6 pages).|
|12||*||Payton et al., Pheromone robotics, Proceedings of the SPIE, vol. 4195, p. 67-75, published 2001 (from Dialog(R) File 2, acc. No. 08044009).|
|13||*||Petter Ogren et al., A tractable convergent dynamic window approach to obstacle avoidance, Proceedings of IEEE IROS, Oct. 2002 (6 pages).|
|14||*||R Dan Jacobson et al., GIS & people with visual impairments or blindness . . . , School of Geosciences, Queen's University of Belfast, UK, Pearson Professional Limited 1997 (18 pages).|
|15||*||R. Anderson et al., GAINS instrumentation, AIAA International Balloon Technology Conference, Norfolk, VA, Jun. 28-Jul. 1, 1999 (Conf. Technical Papers A99-33301 Aug. 01).|
|16||*||R. Dan Jacobson, Talking tactile maps & environmental audio beacons: . . . , Institute of Earth Studies, Univ. of Wales Aberystwyth, Ceredigion, SY 23 3DB, UK, published date Mar. 30, 1999 from ica-llub.doc (22 pages).|
|17||*||Ross et al., Development of a wearable computer orientation system, Personal and Ubiquitous Computing, vol. 6 No. 1 p. 49-63, 2002 (from Dialog(R) File 2 acc. No. 08286003).|
|18||*||Ross, D.A. et al., Evaluation of orientation interfaces for wearable computers, IEEE Explore, The 4th International Symposium on wearable computers, Oct. 16, 2000-Oct. 17, 2000 at Atlanta, GA, USA.|
|19||Ross, D.A., et al., "Evaluation of orientation interfaces for wearable computers," The Fourth International Symposium on Wearable Computers 2000, IEEE, pp. 51-58 (2002).|
|20||*||Shu Du, Routing in large-scale ad hoc networks based on a self-organizing coordinate system, Rice University, vol. 42/05 of Masters Abstracts, p. 1748 (from Dialog(R) File 35, acc. No. 02002572).|
|21||Spasojevic, M., et al., "A Study of an Augmented Museum Experience," 6 pages total.|
|22||U.S. Serial No. 10/635,874, filed Aug. 5, 2003, Hull.|
|23||*||Wall et al., Integrated communications architecture for road transport informatics, Vehicle Navigation & Information Systems Conference Proceedings, Part 2 (of 2)-Proceedings-Society of Automotive Engineers n P-253 pt 2, published in 1991 (from Dialog(R) File 8, acc. No. 03431396).|
|24||*||Wichtel et al., Virtual beacons for RTI/VHS data distribution, IEEE published on 1994 (from Dialog(R) File 2 acc. No. 05839887).|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8532962||Dec 23, 2009||Sep 10, 2013||Honeywell International Inc.||Approach for planning, designing and observing building systems|
|US8766763 *||Jan 6, 2010||Jul 1, 2014||Sony Corporation||Function control method using boundary definition, function control system using boundary definition, function control server using boundary definition and program|
|US8773946||Apr 21, 2011||Jul 8, 2014||Honeywell International Inc.||Portable housings for generation of building maps|
|US8885442||Jun 23, 2011||Nov 11, 2014||Sony Corporation||Method for determining an acoustic property of an environment|
|US8990049||May 3, 2010||Mar 24, 2015||Honeywell International Inc.||Building structure discovery and display from various data artifacts at scene|
|US9342928||Jun 29, 2012||May 17, 2016||Honeywell International Inc.||Systems and methods for presenting building information|
|US20100171585 *||Jan 6, 2010||Jul 8, 2010||Yuichiro Takeuchi||Function control method using boundary definition, function control system using boundary definition, function control server using boundary definition and program|
|US20110153279 *||Dec 23, 2009||Jun 23, 2011||Honeywell International Inc.||Approach for planning, designing and observing building systems|
|EP2410769A1 *||Jul 23, 2010||Jan 25, 2012||Sony Ericsson Mobile Communications AB||Method for determining an acoustic property of an environment|
|U.S. Classification||701/433, 340/988, 701/441|
|International Classification||G01C21/26, H04L12/28, H04L29/08, H04L29/06|
|Cooperative Classification||H04L67/18, H04L69/329, H04W4/04, H04L29/06|
|European Classification||H04L29/06, H04L29/08N17|
|Aug 5, 2003||AS||Assignment|
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEWLETT-PACKARD LIMITED;HULL, RICHARD;REEL/FRAME:015106/0715
Effective date: 20030723
|Feb 22, 2010||FPAY||Fee payment|
Year of fee payment: 4
|Apr 4, 2014||REMI||Maintenance fee reminder mailed|
|Aug 1, 2014||FPAY||Fee payment|
Year of fee payment: 8
|Aug 1, 2014||SULP||Surcharge for late payment|
Year of fee payment: 7