|Publication number||US8098235 B2|
|Application number||US 11/863,741|
|Publication date||Jan 17, 2012|
|Filing date||Sep 28, 2007|
|Priority date||Sep 28, 2007|
|Also published as||CN101809526A, CN101809526B, CN104881175A, EP2203798A1, EP2755114A2, EP2755114A3, US8451245, US8542216, US8963882, US20090085878, US20120081326, US20120081327, US20130314354, WO2009042424A1|
|Inventors||Robert W. Heubel, Danny A. Grant|
|Original Assignee||Immersion Corporation|
One embodiment of the present invention is directed to haptic effects. More particularly, one embodiment of the present invention is directed to haptic effects for a multi-touch touchscreen device.
Electronic device manufacturers strive to produce a rich interface for users. Conventional devices use visual and auditory cues to provide feedback to a user. In some interface devices, kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, texture, and heat) is also provided to the user, more generally known collectively as “haptic feedback” or “haptic effects”. Haptic feedback can provide cues that enhance and simplify the user interface. Specifically, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.
Haptic feedback has also been increasingly incorporated in portable electronic devices, such as cellular telephones, personal digital assistants (PDAs), portable gaming devices, and a variety of other portable electronic devices. For example, some portable gaming applications are capable of vibrating in a manner similar to control devices (e.g., joysticks, etc.) used with larger-scale gaming systems that are configured to provide haptic feedback. Additionally, devices such as cellular telephones and PDAs are capable of providing various alerts to users by way of vibrations. For example, a cellular telephone can alert a user to an incoming telephone call by vibrating. Similarly, a PDA can alert a user to a scheduled calendar item or provide a user with a reminder for a “to do” list item or calendar appointment.
Increasingly, portable devices are moving away from physical buttons in favor of touchscreen-only interfaces. This shift allows increased flexibility, reduced parts count, and reduced dependence on failure-prone mechanical buttons and is in line with emerging trends in product design. When using the touchscreen input device, a mechanical confirmation on button press or other user interface action can be simulated with haptic effects. Further, many devices are now capable of multi-touch in which a touchscreen recognizes multiple simultaneous touch points and includes software to interpret simultaneous touches.
Based on the foregoing, there is a need for a system and method for generating haptic effects for a multi-touch device.
One embodiment is a system for generating haptic effects. The system senses at least two generally simultaneous touches on a touchscreen and, in response, generates a dynamic haptic effect.
One embodiment is a touchscreen multi-touch device that includes a haptic feedback system for generating dynamic haptic effects in response to multi-touches.
The haptic feedback system includes a processor 12. Coupled to processor 12 is a memory 20 and an actuator drive circuit 16, which is coupled to a vibration actuator 18. Processor 12 may be any type of general purpose processor, or could be a processor specifically designed to provide haptic effects, such as an application-specific integrated circuit (“ASIC”). Processor 12 may be the same processor that operates the entire telephone 10, or may be a separate processor. Processor 12 can decide what haptic effects are to be played and the order in which the effects are played based on high level parameters. In general, the high level parameters that define a particular haptic effect include magnitude, frequency and duration. Low level parameters such as streaming motor commands could also be used to determine a particular haptic effect. A haptic effect may be considered “dynamic” if it includes some variation of these parameters when the haptic effect is generated or a variation of these parameters based on a user's interaction.
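The high-level parameters above (magnitude, frequency, duration) and the notion of a "dynamic" effect can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class, function, and ramp-rate parameter are hypothetical names introduced here.

```python
from dataclasses import dataclass

# Hypothetical sketch of the high-level effect parameters described above.
@dataclass
class HapticEffect:
    magnitude: float   # 0.0..1.0 drive strength
    frequency: float   # Hz
    duration: float    # seconds

def dynamic_magnitude(effect: HapticEffect, t: float, scale_rate: float) -> float:
    """Vary the magnitude over time -- one way an effect becomes 'dynamic'.

    Returns 0.0 outside the effect's duration; otherwise linearly ramps
    the base magnitude at scale_rate per second, clamped to [0, 1].
    """
    if not 0.0 <= t <= effect.duration:
        return 0.0
    m = effect.magnitude * (1.0 + scale_rate * t)
    return max(0.0, min(1.0, m))
```

A static effect would return the same magnitude for every `t`; the time-varying ramp is what makes the effect dynamic in the sense used above.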
Processor 12 outputs the control signals to drive circuit 16 which includes electronic components and circuitry used to supply actuator 18 with the required electrical current and voltage to cause the desired haptic effects. Actuator 18 is a haptic device that generates a vibration on telephone 10. Actuator 18 can include one or more force applying mechanisms which are capable of applying a vibrotactile force to a user of telephone 10 (e.g., via the housing of telephone 10). Actuator 18 may be, for example, an electromagnetic actuator, an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys. Memory device 20 can be any type of storage device or computer-readable medium, such as random access memory (“RAM”) or read-only memory (“ROM”). Memory 20 stores instructions executed by processor 12. Memory 20 may also be located internal to processor 12, or any combination of internal and external memory.
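For an actuator such as an LRA, the drive circuit ultimately supplies a periodic drive signal derived from the high-level parameters. The sketch below assumes a simple sinusoidal drive at the commanded frequency; the function name, sample rate, and signal shape are illustrative assumptions, not details from the patent.

```python
import math

def lra_drive_samples(magnitude, frequency_hz, duration_s, sample_rate=8000):
    """Illustrative sinusoidal drive signal, such as might drive an LRA
    near its resonant frequency. Parameters mirror the high-level effect
    parameters (magnitude, frequency, duration)."""
    n = int(duration_s * sample_rate)
    return [magnitude * math.sin(2.0 * math.pi * frequency_hz * i / sample_rate)
            for i in range(n)]
```

An ERM would instead be driven with a DC level (spin rate sets vibration frequency and amplitude together), which is one reason LRAs and smart-material actuators allow finer control of dynamic effects.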
Touchscreen 11 recognizes touches, as well as the position and magnitude of touches on a touch sensitive surface. The data corresponding to the touches is sent to processor 12, or another processor within telephone 10, and processor 12 interprets the touches and in response generates haptic effects. Touchscreen 11 may sense touches using any sensing technology, including capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, etc. Touchscreen 11 can sense multi-touch contacts and is capable of distinguishing multiple touches that occur at the same time. Touchscreen 11 may further display images for the user to interact with, such as keys, dials, etc., or may be a touchpad with minimal or no images.
Although the embodiment described is a cellular telephone 10, other embodiments may be any type of device having a touchscreen, such as a PDA or portable gaming device.
At 102, the multi-touch contact is sensed, and the position of each contact point and the number of contact points are determined.
At 104, a dynamic haptic effect is calculated based on the position and number of contact points, and on any number of other factors such as those disclosed above (e.g., distance between points, direction of motion of points, etc.). The haptic effect is dynamic in that one or more parameters such as amplitude, frequency, etc. are varied over time. The dynamic nature of the haptic effect provides additional information to the user, in contrast to static haptic effects. The need to provide additional information increases as two or more generally simultaneous touches are sensed on a multi-touch device. In one embodiment, multiple dynamic haptic effects may be calculated, one for each contact point, so that each contact object (e.g., each finger) may experience a different haptic effect rather than a single dynamic haptic effect applied to the entire telephone 10 or touchscreen 11.
At 106, the dynamic haptic effect calculated at 104 is output to drive circuit 16 and actuator 18 so that the effect is implemented in the form of vibrations or other haptics.
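The per-contact embodiment at 104 can be sketched as computing one effect per contact point, so each finger may receive different feedback. The spread-based mapping and its constants below are assumptions for illustration only.

```python
import math

def per_contact_effects(contacts):
    """Illustrative sketch: one effect per contact point, derived from each
    point's mean distance to the other contact points. The 300-pixel
    saturation distance and base frequency are assumed constants."""
    effects = []
    for i, (x, y) in enumerate(contacts):
        others = [c for j, c in enumerate(contacts) if j != i]
        if others:
            spread = sum(math.hypot(x - ox, y - oy)
                         for ox, oy in others) / len(others)
        else:
            spread = 0.0
        effects.append({
            "contact": i,
            "amplitude": min(1.0, spread / 300.0),  # clamp to full scale
            "frequency": 100.0 + spread,            # Hz, illustrative
        })
    return effects
```

With a single contact the spread is zero and the effect collapses to the base parameters; with multiple contacts, fingers farther from the others receive stronger feedback.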
In operation, embodiments create dynamic haptic effects in response to multi-touch contacts to enhance the functionality and usability of telephone 10. For example, when the multi-touch contacts are two or more fingers, a user may move their fingers apart while touching or in close proximity to touchscreen 11 in order to zoom in on a displayed image. In response, a dynamic haptic effect can be generated that increases in amplitude or frequency to communicate the sensation of a growing or increasing virtual window or object size and/or volume. The action of bringing the fingers back together can result in an equal and opposite decreasing amplitude or frequency to communicate the sensation of a shrinking or decreasing virtual window or object size and/or volume.
In another example, two or more fingers may be moved apart for the purpose of moving through a displayed list of contacts, text, or menu items, and in response a dynamic haptic effect of increasing amplitude or frequency may be generated that is based on the distance between the finger points. The further apart the user's fingers are, the greater the amplitude or frequency of the haptic effects would be, in order to communicate the sensation of increasing speed or movement through the list of contacts or menu items. The action of bringing the fingers back together would result in an equal and opposite decreasing amplitude or frequency to communicate the sensation of decreasing speed or movement through the list of contacts or menu items.
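The finger-spread mapping used in both the zoom and list-scrolling examples can be sketched as below. The saturation distance and the frequency range are assumed values chosen for illustration; the patent does not specify them.

```python
import math

MAX_SPREAD = 400.0  # pixels at which amplitude saturates (assumption)

def spread_to_effect(p1, p2, base_freq=80.0, max_freq=250.0):
    """Map the distance between two finger points to effect parameters:
    the farther apart the fingers, the greater the amplitude and frequency."""
    spread = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    ratio = min(1.0, spread / MAX_SPREAD)
    return {
        "amplitude": ratio,                                       # 0..1
        "frequency": base_freq + ratio * (max_freq - base_freq),  # Hz
    }
```

Because the mapping is monotonic in the distance, bringing the fingers back together automatically produces the "equal and opposite" decreasing effect described above.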
Further, two or more fingers can make a rotating gesture equivalent to turning a virtual knob on touchscreen 11. In response, dynamic haptic effects can be generated as the virtual knob is turned to simulate those sensations felt in turning a mechanical knob such as detents and barriers. Other dynamic effects can be generated that are not typically associated with a rotary knob but provide information such as scroll-rate control, end-of-list/top-of-list notifications, etc.
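One way to simulate the detents of a mechanical knob is to fire a short haptic pulse each time the rotation gesture's angle crosses a detent boundary. The 30-degree detent spacing below is an assumption for illustration.

```python
import math

DETENT_DEG = 30.0  # detent spacing in degrees (assumption)

def detents_crossed(prev_angle_deg, new_angle_deg):
    """Number of detent boundaries crossed between two knob angles.

    Each crossing would trigger one click-like haptic pulse, simulating
    the feel of turning a mechanical knob with detents."""
    return abs(math.floor(new_angle_deg / DETENT_DEG)
               - math.floor(prev_angle_deg / DETENT_DEG))
```

A barrier (e.g., an end-of-list stop) could be simulated the same way, by emitting a stronger effect when the angle reaches a limit instead of a periodic detent.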
In another embodiment, two or more fingers may set a boundary box (selection area) that allows the user to interact with all virtual grasped items contained in the bounded box. While sliding the bounded box, dynamic haptic effects may generate a sliding feeling and can vary depending on the speed of the dragging or how far the box is being dragged. Further, the interaction of resizing the items by increasing or decreasing the distance between two fingers may generate an equivalent dynamic haptic effect of increasing or decreasing amplitude or frequency, or a haptic effect of consistent amplitude and frequency that communicates the action of relative increasing or decreasing object sizes. Further, the interaction of rotating the grasped items by rotating fingers clockwise or counter-clockwise may generate an equivalent dynamic haptic event of increasing or decreasing haptic amplitude or frequency, or a haptic effect of consistent amplitude and frequency that communicates the action of rotating the object(s) away from their initial starting location or virtual setting. Further, the interaction of dragging the items by moving the fingers across the screen may generate an equivalent dynamic haptic effect of increasing or decreasing haptic amplitude or frequency, or a haptic effect of consistent amplitude and frequency that communicates the action of physically dragging the object(s) away from their initial starting location or virtual setting.
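The speed-dependent sliding feedback for the bounded box can be sketched as a velocity-to-amplitude mapping. The saturation speed is an assumed constant, not a value from the patent.

```python
import math

SATURATION_SPEED = 1000.0  # pixels/second at full amplitude (assumption)

def drag_amplitude(prev_pos, new_pos, dt):
    """Map the drag velocity of the selection box to a 0..1 effect amplitude:
    the faster the box is dragged, the stronger the sliding feedback."""
    if dt <= 0:
        return 0.0
    speed = math.hypot(new_pos[0] - prev_pos[0],
                       new_pos[1] - prev_pos[1]) / dt
    return min(1.0, speed / SATURATION_SPEED)
```

The resizing and rotating interactions described above could reuse the same pattern, substituting the change in finger spacing or rotation angle for the drag velocity.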
In another embodiment, telephone 10 includes a foot pedal or switch so that a user can use one foot to control a pedal button/switch while manipulating a touchscreen to interact with virtual objects. In one embodiment, the foot pedal's button action could be used in the same way a mouse button click could be used for anchoring a cursor point. The user's hands would then be free to manipulate the touchscreen and perform functions such as activating, navigating, resizing, reshaping, moving, and combining menus, menu items, windows, virtual shapes, or virtual objects. Each of these interactions could have a dynamic haptic effect triggered at the same time to better communicate these interactions in the absence of real mechanical buttons, switches, or the actual physical objects being represented.
In another embodiment, multiple users may apply multi-touch contacts to touchscreen 11 and each user may need different haptic effects based on the specific application or interaction they are making at any given moment in time. Further, a single, multi-hand user may need different haptic effects for each hand based on the specific actions each hand is making at any given moment. For example, one hand may be using two fingers to grab or pinch a virtual object while the other hand is using two fingers to manipulate the virtual object or to zoom in/out or even scroll through a separate menu list. Both actions may be happening simultaneously on the same surface and benefit from different dynamic haptic effects being generated at each hand.
As disclosed, embodiments generate dynamic haptic effects in response to multi-touch interactions on a touchscreen. As a result, a user can more easily and more effectively make use of functionality of a touchscreen multi-touch device.
Several embodiments are specifically illustrated and/or described herein. However, it will be appreciated that modifications and variations are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US6337678||Jul 21, 1999||Jan 8, 2002||Tactiva Incorporated||Force feedback computer input and output device with coordinated haptic elements|
|US6819312||Nov 1, 2001||Nov 16, 2004||Tactiva Incorporated||Force feedback computer input and output device with coordinated haptic elements|
|US20020044132 *||Nov 1, 2001||Apr 18, 2002||Fish Daniel E.||Force feedback computer input and output device with coordinated haptic elements|
|US20060026536||Jan 31, 2005||Feb 2, 2006||Apple Computer, Inc.||Gestures for touch sensitive input devices|
|US20060119586||Oct 11, 2005||Jun 8, 2006||Immersion Corporation, A Delaware Corporation||Haptic feedback for button and scrolling action simulation in touch input devices|
|US20060197752||Feb 16, 2006||Sep 7, 2006||Hurst G S||Multiple-touch sensor|
|US20080180406 *||Jan 31, 2008||Jul 31, 2008||Han Jefferson Y||Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques|
|US20080216001 *||Jan 4, 2007||Sep 4, 2008||Bas Ording||Portable electronic device with content-dependent touch sensitivity|
|US20090256817 *||Feb 26, 2009||Oct 15, 2009||New York University||Method and apparatus for providing input to a processor, and a sensor pad|
|US20100149134 *||Apr 10, 2009||Jun 17, 2010||Wayne Westerman||Writing using a touch sensor|
|US20100265208 *||Nov 6, 2007||Oct 21, 2010||Korea Research Institute Of Standards And Science||Touch screen using tactile sensors, method for manufacturing the same, and algorithm implementing method for the same|
|US20100313124 *||Dec 9, 2010||Xerox Corporation||Manipulation of displayed objects by virtual magnetism|
|US20100328053 *||Aug 18, 2009||Dec 30, 2010||J Touch Corporation||Array-type tactile feedback touch panel|
|US20110043527 *||Feb 24, 2011||Bas Ording||Portable Electronic Device with Multi-Touch Input|
|1||Buxton, Multi-Touch Systems that I Have Known and Loved, http://www.billbuxton.com/multitouchOverview.html.|
|2||Chang et al., ComTouch: Design of a Vibrotactile Communication Device, DIS2002, London Copyright 2002 ACM, 10 pages.|
|3||International Search Report and Written Opinion-PCT/US2008/076343.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8279193||May 16, 2012||Oct 2, 2012||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|US8451245 *||May 28, 2013||Immersion Corporation||Multi-touch device having dynamic haptic effects|
|US8493354||Aug 23, 2012||Jul 23, 2013||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|US8542216 *||Dec 13, 2011||Sep 24, 2013||Immersion Corporation||Multi-touch device having dynamic haptic effects|
|US8570296||May 16, 2012||Oct 29, 2013||Immersion Corporation||System and method for display of multiple data channels on a single haptic display|
|US8659571 *||Feb 21, 2013||Feb 25, 2014||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|US8711118||Feb 15, 2012||Apr 29, 2014||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|US8823674 *||Feb 10, 2014||Sep 2, 2014||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|US8866788 *||Jul 28, 2014||Oct 21, 2014||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|US8875054 *||Jul 30, 2010||Oct 28, 2014||Apple Inc.||Hybrid knob/slider control|
|US8963882||Aug 1, 2013||Feb 24, 2015||Immersion Corporation||Multi-touch device having dynamic haptic effects|
|US9069452||Dec 1, 2010||Jun 30, 2015||Apple Inc.||Morphing a user-interface control object|
|US9285880||Mar 14, 2013||Mar 15, 2016||Panasonic Intellectual Property Management Co., Ltd.||Touch panel device and method of controlling a touch panel device|
|US20090125848 *||Nov 14, 2007||May 14, 2009||Susann Marie Keohane||Touch surface-sensitive edit system|
|US20120030626 *||Jul 30, 2010||Feb 2, 2012||Apple Inc.||Hybrid Knob/Slider Control|
|US20120054667 *||Aug 31, 2010||Mar 1, 2012||Blackboard Inc.||Separate and simultaneous control of windows in windowing systems|
|US20120081326 *||Apr 5, 2012||Immersion Corporation||Multi-touch device having dynamic haptic effects|
|US20120081327 *||Apr 5, 2012||Immersion Corporation||Multi-touch device having dynamic haptic effects|
|US20130113715 *||Nov 7, 2011||May 9, 2013||Immersion Corporation||Systems and Methods for Multi-Pressure Interaction on Touch-Sensitive Surfaces|
|US20130300683 *||Feb 21, 2013||Nov 14, 2013||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|US20140184497 *||Feb 10, 2014||Jul 3, 2014||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|US20140333565 *||Jul 28, 2014||Nov 13, 2014||Immersion Corporation||Interactivity model for shared feedback on mobile devices|
|U.S. Classification||345/173, 345/156|
|Cooperative Classification||G06F2203/04808, G06F3/04883, G06F3/016|
|European Classification||G06F3/01F, G06F3/0488G|
|Sep 28, 2007||AS||Assignment|
Owner name: IMMERSION CORPORATION, CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEUBEL, ROBERT W.;GRANT, DANNY A.;REEL/FRAME:019897/0270;SIGNING DATES FROM 20070926 TO 20070927
|Jul 17, 2015||FPAY||Fee payment|
Year of fee payment: 4