|Publication number||US20020067372 A1|
|Application number||US 09/945,771|
|Publication date||Jun 6, 2002|
|Filing date||Sep 4, 2001|
|Priority date||Mar 2, 1999|
|Also published as||EP1157314A1, EP1157314B1, EP1157315A1, EP1157315B1, EP1157316A1, EP1157316B1, EP1159657A1, EP1159657B1, EP1183578A1, EP1183578B1, US6941248, US8373618, US20020046368, US20020049566, US20020069072, US20080100570, WO2000052536A1, WO2000052537A1, WO2000052538A1, WO2000052539A1, WO2000052540A1, WO2000052541A1, WO2000052542A1|
|Publication number||09945771, 945771, US 2002/0067372 A1, US 2002/067372 A1, US 20020067372 A1, US 20020067372A1, US 2002067372 A1, US 2002067372A1, US-A1-20020067372, US-A1-2002067372, US2002/0067372A1, US2002/067372A1, US20020067372 A1, US20020067372A1, US2002067372 A1, US2002067372A1|
|Inventors||Wolfgang Friedrich, Wolfgang Wohlgemuth|
|Original Assignee||Wolfgang Friedrich, Wolfgang Wohlgemuth|
 The invention relates to an augmented-reality system and method for transmitting first information data from a user, for example a skilled operator at a first location, to a remote expert at a second location.
 Such a system and method are used, for example, in the field of automation technology, for production machinery and machine tools, in diagnostic/service support systems, and for complex components, equipment and systems such as, for example, vehicles and industrial machinery and plants.
 The contribution by Daude, R. et al.: “Head-Mounted Display als facharbeiterorientierte Unterstützungskomponente an CNC-Werkzeugmaschinen” [“Head-Mounted Display as a component to assist skilled operators of CNC machine tools”], Werkstattstechnik, DE, Springer Verlag, Berlin, Vol. 86, No. 5, May 1, 1996, pp. 248-252, XP000585192, ISSN: 0340-4544, describes the head-mounted display (HMD) as a component to assist the skilled operator with the steps of setting up, feeding and malfunction management in milling operations. The technical integration of the HMD with modern NC control is explained, and the results of a laboratory trial of the HMD are mentioned.
 It is an object of the invention to specify a system and a method which, in concrete operational situations, permits rapid and reliable access to expert knowledge in a simple and cost-effective manner.
 This object is achieved by a system and by a method having the features specified in claims 1 and 6 respectively.
 The invention is based on the insight that present-day machine tools and production machinery are of increasingly complex design, which in many cases, for example in the event of service and repairs, requires the deployment of a specialist, often an expert from the manufacturer. This means that the expert has to travel to the site and that a number of experts must be trained to achieve adequate availability. The technical solution for mitigating this problem consists in the first information data, i.e. all the information necessary for accomplishing a task requiring an expert, being transmitted on-line with the aid of augmented-reality means from a user, e.g. a skilled operator, to an expert at a remote second location. As a result, the expert is virtually linked into the proceedings on site and, with the aid of the augmented-reality system, is in turn able to transmit his knowledge to the skilled operator in situ in the form of second information data. This results in the synchronous presence of service-relevant data at the location of the skilled operator and at that of the specialized expert, who works centrally, for example at the manufacturer's site.
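 The on-line exchange of first and second information data described above can be sketched, purely for illustration, as a bidirectional link between the operator in situ and the remote expert. All names and message shapes below are assumptions made for this sketch; they are not taken from the claims.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ARLink:
    """Illustrative model of the operator-expert link: 'first information
    data' (camera frames, speech) flow to the expert; 'second information
    data' (the expert's augmented-reality annotations) flow back."""
    to_expert: List[dict] = field(default_factory=list)
    to_operator: List[dict] = field(default_factory=list)

    def send_first_information(self, frame: bytes, audio: bytes) -> None:
        # Operator side: camera frame and microphone audio go to the expert.
        self.to_expert.append({"frame": frame, "audio": audio})

    def send_second_information(self, annotation: str) -> None:
        # Expert side: an AR annotation is returned to the operator in situ.
        self.to_operator.append({"annotation": annotation})


link = ARLink()
link.send_first_information(b"<video frame>", b"<speech>")
link.send_second_information("Tighten joint P2 to 20 Nm, clockwise")
```

 In this sketch the expert's annotation would be superposed on the operator's field of view by the visualization component, e.g. the data goggles.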
 The information and/or documentation data can, for example, be data compiled and collected while a plant, an automation-technology-controlled system or a process was set up, and/or documentation data maintained and, when necessary, updated according to predefinable criteria during operation of a plant or an automation system.
 Advantageous refinements consist in the documentation data being static and/or dynamic information data. Examples of such static information include technical data from manuals, exploded views, maintenance instructions, etc. Examples of dynamic information include process values such as temperature, pressure, signals, etc.
 Rapid, situationally appropriate access to the documentation data is further assisted by the feature that the acquisition means include an image recording device, that the analyzing means are provided for analyzing the real information in such a way that an operational context, particularly an object of the documentation data, is determined from the real information, and that the system includes visualization means for visualizing the documentation data.
 Rapid, situationally appropriate access to the documentation data is further assisted by the feature that the acquisition means are user-controlled and are designed, in particular, as speech-controlled acquisition means and/or acquisition means controlled by control data.
 The deployment of augmented-reality techniques on the basis of the static and/or dynamic documentation data and/or process data can be optimized for many applications by the acquisition means and/or the visualizing means being designed as data goggles.
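 The distinction drawn above between static documentation data (manuals, exploded views) and dynamic process data (temperature, pressure, signals) can be sketched as a minimal data model. The component name and the values below are invented for this illustration.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class StaticInfo:
    """Static documentation data, e.g. a manual excerpt or exploded view."""
    component: str
    document: str


@dataclass(frozen=True)
class DynamicInfo:
    """Dynamic process data, e.g. a live temperature or pressure value."""
    component: str
    quantity: str
    value: float
    unit: str


docs = [
    StaticInfo("pump_3", "maintenance manual, section 4.2"),
    DynamicInfo("pump_3", "pressure", 5.1, "bar"),
]

# A visualization component (handheld PC or data goggles) would merge
# both kinds of data per component for a superposed display.
for_display = [d for d in docs if d.component == "pump_3"]
```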
 The invention is described and explained below in more detail with reference to the specific embodiments depicted in the figures, in which:
FIG. 1 shows a block diagram of a first embodiment of an augmented-reality system;
FIG. 2 shows a further block diagram of a first embodiment of an augmented-reality system; and
FIG. 3 shows a specific application for situationally appropriate access to expert knowledge and/or documentation data.
FIG. 1 shows a schematic depiction of an augmented-reality system for transmitting first information data from a first location O1 to a remote second location O2 of an expert, for providing assistance to a user at the first location O1, for example in the event of servicing or a repair, by the remote expert at the second location. The user, not explicitly shown in FIG. 1, is equipped with mobile equipment 4, 6. The mobile equipment 4, 6 includes data goggles 4 fitted with a video camera 2 and a microphone 11. The data goggles are linked to a device for wireless communication, for example a radio transceiver 6, which can communicate with the automation system A1 . . . An via a radio interface 15. The automation system A1 . . . An can be linked, via a data link 14, to an augmented-reality system 10, hereinafter also abbreviated as AR system. The AR system includes an information module 1b for storing or accessing information data, an AR base module 8 and an AR application module 9. The AR system 10 can be linked to the Internet 5 via a data link 13, with optional access to further storage and documentation data 1a via an Internet link 12 shown by way of example.
 The user, who is equipped with the data goggles 4 and the mobile radio transceiver 6, is able to move freely within the plant A1 . . . An for maintenance and service purposes. If, for example, maintenance of, or repair to, a particular subcomponent of the plants A1 . . . An has to be carried out, appropriate access to the relevant documentation data 1a, 1b is established with the aid of the camera 2 of the data goggles 4, optionally controlled by speech commands detected by the microphone 11. To do this, a data link with the plant A1 . . . An or with an appropriate radio transceiver unit is set up via the radio interface 15, and the data are transmitted to the AR system 10. Within the AR system, the data obtained from the user are analyzed in accordance with the situation, and information data 1a, 1b are accessed either automatically or in a manner controlled interactively by the user. The relevant documentation data 1a, 1b obtained are transmitted via the data links 14, 15 to the transceiver 6, with the overall result that an analysis is carried out on the basis of the operational situation detected, this analysis forming the basis for the selection of data from the available static information. This results in a situationally appropriate, object-oriented or component-oriented selection of relevant knowledge from the most up-to-date data sources 1a, 1b. The information is displayed with the aid of the visualization component used in each case, for example a handheld PC or data goggles, using AR-based technologies. The operator in situ is therefore provided only with the information he needs, and this information is always up to date. The service technician therefore does not suffer from information overload from a “100-page manual”.
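 The situationally appropriate selection described above — an object determined from the camera image, optionally narrowed interactively by speech commands — might be sketched as follows. The documentation store and the object names are hypothetical; in the system described, the data sources 1a, 1b would supply these data.

```python
from typing import List, Optional

# Hypothetical documentation store keyed by the plant component
# detected in the camera image of the data goggles.
DOCUMENTATION = {
    "spindle": ["lubrication schedule", "bearing replacement guide"],
    "conduit": ["pipe-section operating instructions"],
}


def select_documentation(detected_object: str,
                         speech_command: Optional[str] = None) -> List[str]:
    """Return only the documentation relevant to the object in view,
    avoiding the "100-page manual" overload; an optional speech command
    narrows the selection interactively."""
    docs = DOCUMENTATION.get(detected_object, [])
    if speech_command:
        docs = [d for d in docs if speech_command.lower() in d.lower()]
    return docs
```

 An unknown object simply yields no documentation, leaving the choice of automatic versus interactive access to the surrounding system.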
FIG. 2 shows a further specific application of a documentation processing system for service and maintenance. The system consists of an augmented-reality system 10 which comprises an information module 1b for storing information data, an AR base system 8 and an AR application module 9. The AR system 10 can be linked to the Internet 5 via connecting lines 13, 18. From there, a link is possible, via an exemplary data link 12, to a remote PC 16 with a remote expert 22. Linkage between the individual modules of the AR system 10 is effected via links 19, 20, 21. User communication between a user 7 and the AR system is effected via interfaces 8, 23. To this end, the AR system can be linked to a transceiver which enables bidirectional data communication between the AR system 10 and the user 7 via data goggles 4, either directly via the interface 8 or via a transceiver 17, located in the vicinity of the user 7, via an interface 23. The link 23 can be implemented via a separate data link or via the mains using a “power-line” modem. As well as a display device disposed in the vicinity of the eyepieces, the data goggles 4 comprise an image acquisition device 2 in the form of a camera, and a microphone 11. With the aid of the data goggles 4, the user 7 can move around the plants A1 . . . An and carry out service or maintenance activities.
 With the aid of the data goggles 4 and the corresponding radio transceivers, e.g. the radio transceiver 17 worn by personnel directly on the body, it is possible to achieve preventive functionality: the initial step is the detection of the respective operational situation, for example by the camera 2 or by locating the personnel 7. On the basis of the operational situation detected, a selection of data on the plant A1 . . . An undergoing maintenance is made in the AR system. The fundamental advantage of the system depicted in FIG. 2 is that it assists the cooperation of the individual functionalities in an application-relevant manner: a concrete operational situation is detected automatically and then analyzed, the aspects relevant at that point being determined automatically from the most up-to-date available static information in conjunction with the dynamic data acquired at that instant. As a result, for example, assembly suggestions are correlated with current process data. Personnel 7 are thus provided with a situationally appropriate display of the relevant information, for example by a superposed visualization of the respective data such that the real operational situation in the field of view of the personnel is expanded by the information acquired. Personnel 7 are thereby very rapidly put in a position to act, ensuring the requisite machine operating times. Assistance to the maintenance technician 7 in situ can also be provided by the remote expert 22 and the knowledge 16 available at the location of the remote expert 22.
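 The correlation of a static assembly suggestion with an instantaneously acquired process value, as in the preventive functionality above, could be reduced to the following sketch. The threshold and the wording of the status message are illustrative assumptions.

```python
def annotate_suggestion(suggestion: str, process_value: float,
                        limit: float) -> str:
    """Combine a static assembly suggestion with a live process value:
    the superposed display clears the step only when the value is
    within its limit, otherwise it tells the personnel to wait."""
    status = "OK" if process_value <= limit else "wait: value above limit"
    return f"{suggestion} [{status}]"
```

 For example, a suggestion such as "Remove housing cover" would be annotated with "wait" while a monitored temperature still exceeds its limit, and with "OK" once it has fallen below it.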
FIG. 3 shows a specific application of situationally appropriate access to documentation data. FIG. 3 shows a first monitor region B1 which shows a plant component. Shown in the right-hand monitor region B2 is a user 7 who, for example, is looking at an individual plant component. The user 7 is equipped with data goggles 4 which comprise a camera 2 as an acquisition means. Additionally disposed on the data goggles 4 are a microphone 11 and a loudspeaker 16. The left-hand monitor region B1 shows a view of conduits which can be viewed with the data goggles shown in window B2. Marked in the left-hand monitor region B1 are two points P1, P2, each of which represents an image detail viewed with the aid of the data goggles 4. After the first point P1 has been viewed, i.e. after the conduit disposed at or near point P1 has been viewed, additional information I1 is visualized for the user 7 in the data goggles 4. This additional information I1 consists of documentation data which, regarding the first point P1, include operational instructions for this pipe section and, regarding point P2, comprise the installation instruction to be implemented in a second step. The installation instruction in this case consists of the user 7 being informed of the torque and the sense of rotation of the screwed joint at point P2 via visualization of the additional data I12. The user 7 is therefore very quickly provided with situationally appropriate instructions for the object being viewed. If an intelligent tool is used which is able to detect the torque applied at any given moment, it is also possible for the user to be told, on the basis of the current torque, to increase or reduce the torque as required.
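 The feedback loop with an intelligent tool described above reduces to a simple comparison of the applied torque against a target. The target and tolerance values below are invented for illustration; the instruction strings stand in for what would be visualized in the data goggles.

```python
def torque_instruction(current_nm: float, target_nm: float,
                       tolerance_nm: float = 0.5) -> str:
    """Based on the torque detected by the intelligent tool, tell the
    user whether to increase or reduce the torque, or that the target
    has been reached (all values in newton-metres)."""
    if current_nm < target_nm - tolerance_nm:
        return "increase torque"
    if current_nm > target_nm + tolerance_nm:
        return "reduce torque"
    return "torque reached"
```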
 Below, background information on the field of application of the invention is provided: it involves an application-oriented requirements analysis and the development of AR-based systems to support operational processes in the development, production and service of complex engineering products and plants in fabrication and process technology, for service support systems such as those for motor vehicles, and for the maintenance of any industrial equipment.
 Augmented reality, AR in brief, is a novel type of man-machine interaction of major potential for supporting industrial operational processes. With this technology, the field of view of the observer is enriched with computer-generated virtual objects, which means that intuitive use can be made of product or process information. In addition to the extremely simple interaction, the deployment of portable computers opens up AR application fields involving high mobility requirements, for example if process, measured or simulation data are linked to the real object.
 The situation of German industry is characterized by increasing customer requirements with regard to the individuality and quality of products, and by development processes that must take substantially less time. Especially in developing, producing and servicing complex industrial products and plants, innovative solutions to man-machine interaction make it possible both to achieve leaps in efficiency and productivity and to design the work so as to enhance competence and training, by supporting the users' need for knowledge and information in a situationally appropriate manner on the basis of data that are available in any case.
 Augmented reality is a technology with numerous innovative fields of application:
 In development, for example, a “mixed mock-up” approach based on a mixed-virtual environment can result in a distinct acceleration of the early phases of development. Compared with immersive “virtual reality” (VR) solutions, the user is at a substantial advantage in that the haptic properties can be depicted faithfully with the aid of a real model, whereas aspects of visual perception, e.g. for display variants, can be manipulated in a virtual manner. In addition, there is a major potential for user-oriented validation of computer-assisted models, e.g. for component verification or in crash tests.
 In flexible production it is possible, inter alia, to considerably facilitate the process of setting up machinery for qualified skilled operators by displaying, e.g. via mobile AR components, mixed-virtual clamping situations directly in the field of view. Fabrication planning and fabrication control appropriate to the skilled worker in the workshop is facilitated if information regarding the respective order status is perceived directly in situ in connection with the corresponding products. This also applies to fitting, with the option of presenting the individual procedural steps to the fitter in a mixed-virtual manner even in the training phase. In this connection it is possible, e.g. by comparing real fitting procedures with results of simulations, to achieve comprehensive optimizations which both improve the quality of operation scheduling and simplify and accelerate the fitting process in the critical start-up phase.
 Finally, regarding service, conventional technologies are by now barely adequate for supporting and documenting the complex diagnostic and repair procedures. Since, however, these processes in many fields are in any case planned on the basis of digital data, AR technologies offer the option of adopting these information sources for maintenance purposes and of explaining the dismantling process to an engineer, e.g. in the data goggles, via superposition on the real objects. Regarding cooperative operation, the AR-assisted “remote eye” permits distributed problem solving by virtue of a remote expert communicating across global distances with the member of staff in situ. This case is particularly relevant for the predominantly medium-sized machine tool manufacturers. Because of globalization, they are forced to set up production sites for their customers worldwide. However, a presence in the form of subsidiaries in all the important markets is not achievable on economic grounds, nor is it possible to dispense with the profound knowledge of the parent company's experienced service staff with respect to increasingly complex plants.
 The special feature of man-machine interaction in augmented reality is the very simple and intuitive communication with the computer, supplemented, for example, by multimode interaction techniques such as speech processing or gesture recognition. The use of portable computer units in addition enables entirely novel mobile utilization scenarios, with the option of requesting the specific data at any time via a wireless network. Novel visualization techniques permit direct annotation, e.g. of measured data or simulation data, to the real object or into the real environment. In conjunction with distributed applications, a number of users are able to operate in a real environment with the aid of a shared database (shared augmented environments) or to cooperate with AR support in different environments.
 Augmented reality has been the subject of intense research only in the last few years. Consequently, only a few applications exist, either on the national or the international level, usually in the form of scientific prototypes in research establishments.
 U.S.A.: As with many novel technologies, the potential uses of augmented reality were first tapped in North America. Examples include cockpit design and the maintenance of mechatronic equipment. The aircraft manufacturer Boeing has already carried out initial field trials using AR technology in assembly. The upshot is that in this high-tech area, too, the U.S.A. occupies a key position, potentially making it the technological leader.
 Japan: Various AR developments are being pushed in Japan, e.g. for mixed-virtual building design, telepresence or “cyber-shopping”. The nucleus is formed by the Mixed Reality Systems Laboratory founded in 1997, which is supported jointly as a center of competence by science and by commerce and industry. Particular stimuli in the consumer goods field are likely in the future from the Japanese home electronics industry.
 Europe: So far, only very few research groups have been active in Europe in the AR field. One group at the University of Vienna is working on approaches to mixed-reality visualization. The IGD group, as part of the ACTS project CICC, which has now come to an end, has developed initial applications for the building industry and a scientific prototype for staff training in car manufacturing.
 The invention in particular should be seen in the specific context of the fields of application “production machinery and machine tools” (NC-controlled, automation-technology processes) and “diagnostics/service support systems for complex engineering components/equipment/systems” (e.g. vehicles, but also industrial machinery and plants).
 The technical problem posed in this context is AR-based interaction between remote experts, operators in situ and the system. The complexity of present-day machine tools and production machinery in many cases requires the additional deployment of a specialist, often the expert of the firm that supplied the machine.
 The solutions up to now have required the expert to come on site. For increased availability, it is necessary for a number of experts to have been trained and to be available. With the support of AR technologies, not only the operator in situ but also the expert at the supplier's headquarters is provided, for example via a video camera built into the data goggles, with all the real and virtual information. The remote expert is integrated “live” into problem-solving support and is able to make additional suggestions to the personnel in situ, again via AR-based technologies.
 For small and medium-sized companies it is hardly possible to have the necessary experts available at all manufacturing sites, e.g. those of the car industry. AR makes it possible for these companies, which are so important for Germany, to compete with the “global players”. What constitutes the inventive step? The synchronous presence of service-relevant data at the location of the operator (in situ, at the client's plant, somewhere in the world) and at that of the specialized expert (centrally at the supplier's headquarters, for example in Germany). A particularly significant feature, for example, is that current process data are made available to the remote expert, or, conversely, that the operator in situ is assisted by the transmission of characteristic data/curves which are based on extensive measurement results and therefore could not have been compiled on site.
 To sum up, the invention relates to a system and a method for utilizing expert knowledge at a remote site, involving the transmission, by augmented-reality means, of information data, e.g. in the form of video images, from a first location occupied by a skilled operator to a remote expert at a second location, and further involving the transmission of additional information data, in the form of augmented-reality information, from the remote expert to the skilled operator at the first location.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5742263 *||Dec 18, 1995||Apr 21, 1998||Telxon Corporation||Head tracking system for a head mounted display system|
|US6091546 *||Oct 29, 1998||Jul 18, 2000||The Microoptical Corporation||Eyeglass interface system|
|US6172657 *||Feb 26, 1997||Jan 9, 2001||Seiko Epson Corporation||Body mount-type information display apparatus and display method using the same|
|US6345207 *||Jul 15, 1998||Feb 5, 2002||Honda Giken Kogyo Kabushiki Kaisha||Job aiding apparatus|
|US6349001 *||Jan 11, 2000||Feb 19, 2002||The Microoptical Corporation||Eyeglass interface system|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US6871322||Sep 6, 2001||Mar 22, 2005||International Business Machines Corporation||Method and apparatus for providing user support through an intelligent help agent|
|US6973620 *||Sep 6, 2001||Dec 6, 2005||International Business Machines Corporation||Method and apparatus for providing user support based on contextual information|
|US6976067||Sep 6, 2001||Dec 13, 2005||International Business Machines Corporation||Method and apparatus for providing entitlement information for interactive support|
|US7126558||Oct 19, 2001||Oct 24, 2006||Accenture Global Services Gmbh||Industrial augmented reality|
|US7171627 *||Jul 2, 2001||Jan 30, 2007||Sony Corporation||Device for displaying link information and method for displaying the same|
|US7372451||Oct 18, 2002||May 13, 2008||Accenture Global Services Gmbh||Industrial augmented reality|
|US7852355 *||Nov 3, 2004||Dec 14, 2010||Siemens Aktiengesellschaft||System and method for carrying out and visually displaying simulations in an augmented reality|
|US7998741||Aug 31, 2005||Aug 16, 2011||Sysmex Corporation||Remote control system|
|US8042050 *||Jul 24, 2002||Oct 18, 2011||Hewlett-Packard Development Company, L.P.||Method and apparatus for interactive broadcasting|
|US8158431||Mar 28, 2011||Apr 17, 2012||Sysmex Corporation||Remote control system|
|US8394636||Mar 8, 2012||Mar 12, 2013||Sysmex Corporation||Remote control method, remote control system, status informing device and control apparatus|
|US8743146||Dec 5, 2012||Jun 3, 2014||Huawei Device Co., Ltd.||Method and system for implementing augmented reality application|
|US8760471||Mar 30, 2011||Jun 24, 2014||Ns Solutions Corporation||Information processing system, information processing method and program for synthesizing and displaying an image|
|US20040183751 *||Oct 18, 2002||Sep 23, 2004||Dempski Kelly L||Industrial augmented reality|
|US20110316845 *||Dec 29, 2011||Palo Alto Research Center Incorporated||Spatial association between virtual and augmented reality|
|US20120019547 *||Jan 26, 2012||Pantech Co., Ltd.||Apparatus and method for providing augmented reality using additional data|
|US20130083063 *||Apr 4, 2013||Kevin A. Geisner||Service Provision Using Personal Audio/Visual System|
|CN102292707A *||May 11, 2011||Dec 21, 2011||华为终端有限公司||Method and system for implementing augmented reality applications|
|CN102292707B||May 11, 2011||Jan 8, 2014||华为终端有限公司||Method and system for implementing augmented reality applications|
|EP1630708A1 *||Aug 30, 2005||Mar 1, 2006||Sysmex Corporation||Remote control method, remote control system, status informing device and control apparatus|
|International Classification||G05B19/409, G05B23/02, G05B19/418, H04Q9/00|
|Cooperative Classification||G05B2219/35494, G05B2219/35482, G05B2219/31027, G05B19/409, G05B19/41875, G05B19/4183, G05B2219/35495, G05B2219/32014|
|European Classification||G05B19/418Q, G05B19/418D, G05B19/409|
|Jan 7, 2002||AS||Assignment|
Owner name: SIEMENS AKTIENGESELLSCHAFT, GERMANY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FRIEDRICH, WOLFGANG;WOHLGEMUTH, WOLFGANG;REEL/FRAME:012446/0830;SIGNING DATES FROM 20010809 TO 20010911