|Publication number||US20020003469 A1|
|Application number||US 09/577,307|
|Publication date||Jan 10, 2002|
|Filing date||May 23, 2000|
|Priority date||May 23, 2000|
|Also published as||US6459364|
|Original Assignee||Hewlett-Packard Company|
 This invention relates to reading devices for the visually impaired, and to methods for displaying electronic files such as Internet web pages.
 As the Internet has become an important communication tool, the visually impaired require display devices that permit Internet content such as web pages to be displayed. Electro-mechanical devices have served to translate text into a tactilely readable format such as the Braille character set, which employs a matrix of tactile elements for each character, symbol, or word, with each element either being a flat spot or a raised bump. The standard Braille character set uses an 8-dot matrix (2 columns of 4 dots), providing 256 possible dot combinations, with the character matrices spaced apart on a surface to allow them to be distinguished.
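For illustration only (this encoding is not part of the patent), the eight dots of such a cell map directly onto the Unicode Braille Patterns block (U+2800 to U+28FF), in which each dot number sets one bit of the code point offset:

```python
# Sketch: encoding an 8-dot Braille cell as a Unicode character.
# The Braille Patterns block assigns one bit per dot:
# dot 1 -> 0x01, dot 2 -> 0x02, dot 3 -> 0x04, dot 4 -> 0x08,
# dot 5 -> 0x10, dot 6 -> 0x20, dot 7 -> 0x40, dot 8 -> 0x80.

def braille_cell(raised_dots):
    """Return the Unicode character for a set of raised dot numbers (1-8)."""
    mask = 0
    for dot in raised_dots:
        mask |= 1 << (dot - 1)
    return chr(0x2800 + mask)
```

An empty set yields the blank pattern U+2800; all eight dots raised yields U+28FF, one of the 256 permutations noted above.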
 To display graphic content, such as icons, symbols, a cursor, borders, arrows, drawings, and photo images, a tactile device requires that the tactile elements be evenly and closely spaced apart. Such devices have been proposed which use a matrix arranged on a standard Braille 1.5 mm dot pitch, so that Braille characters are displayed by leaving intervening columns and rows of dots blank or flat, and so that simple graphic images are displayed in a dot matrix fashion using all available tactile elements.
 Other systems may be developed having a tactile element matrix with a finer resolution than the standard Braille dot spacing, with each dot generated by raising a cluster of tactile elements, and a number of inactive tactile elements between each adjacent Braille dot. This would permit a finer resolution for graphical purposes than provided for by the standard Braille dot pitch.
 However, these systems are believed to be currently limited to the simple translation of electronic text (such as may be received in ASCII format) into strings of tactilely displayed Braille symbols. Since much of the content of a web page or other file may be in non-text form, that content is not discernible to a visually impaired user of current Braille display devices. Current systems provide no means of identifying which words are selectable hypertext links to other web pages or downloadable content, nor a convenient means to locate such links on a page of text or to select and activate them. Because the visually impaired user is unable to find these links "at a glance" in the manner of sighted users, he or she must serially read through the entire text to find a link of interest.
 Web browsing often involves proceeding through several "layers" of pages at a web site to reach the page with the desired content. For a sighted person, this can be rapid; for the visually impaired, it can be time consuming to read through the content of each page, possibly in its entirety, to find the desired link to the next page. This delay is exacerbated by the limited rate at which Braille text may be read.
 The present invention overcomes the limitations of the prior art by providing a method of communicating electronic information via a display device having a matrix of movable tactile elements. The method includes displaying a representation of a file containing hypertext links on a first portion of the matrix, and displaying a list of the hypertext links on a second portion of the matrix. The representation may include graphical elements and text symbols such as Braille.
FIG. 1 is a perspective view of a tactile interface device according to a preferred embodiment of the invention.
FIG. 2 is a sample tactile screen display of a device operated according to the embodiment of FIG. 1.
FIG. 1 shows a portable computer device 10 having a housing 12 containing a display screen 14 and a keyboard 16. The device is intended for connection to a computer network such as the Internet via any means, including a hard-wired connection (not shown) for use with desktop applications, or a wireless communication link for portable usage. The device is intended to serve as hardware for browsing the world wide web, but may be used for other computer or entertainment tasks, including creating and reading documents.
 The screen 14 is a tactile feedback display capable of displaying the Braille character set in standard format, and includes a matrix of individually addressable dot or tactile elements 20, as shown in FIG. 2. The elements are arranged in an evenly spaced apart grid, so that each Braille character is spaced apart from an adjacent character by at least one blank row or column between the 2-dot by 4-dot individual character fields, unlike conventional Braille devices that do not provide active dots between character fields. The use of active dots at all points on the grid permits the entire display or any portion to be used for displaying graphic images.
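A minimal sketch of the layout described above, under the stated assumptions that each character field is 2 dots wide by 4 dots tall and that one blank column of elements separates adjacent fields (the function name and grid representation are illustrative, not from the patent):

```python
# Sketch: placing Braille character fields on a uniform grid of
# individually addressable tactile elements, with one blank
# intervening column between adjacent 2x4 character fields.

CELL_COLS, CELL_ROWS = 2, 4
GAP = 1  # blank elements separating adjacent character fields

def layout_row(cells, grid_cols):
    """Render a row of Braille cells onto a matrix of 0/1 dot states.

    `cells` is a list of 4x2 nested lists (1 = raised dot)."""
    grid = [[0] * grid_cols for _ in range(CELL_ROWS)]
    col = 0
    for cell in cells:
        if col + CELL_COLS > grid_cols:
            break  # no room for another character field
        for r in range(CELL_ROWS):
            for c in range(CELL_COLS):
                grid[r][col + c] = cell[r][c]
        col += CELL_COLS + GAP  # skip the intervening blank column
    return grid
```

Because every element on the grid is active, the same matrix could instead be filled edge to edge with a dot-matrix graphic, as the paragraph above notes.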
 Each tactile element 20 is switchable between an inactive position in which the dot is flush with the surrounding surface, and an active position in which the dot is raised above the surrounding surface, in the manner of a raised embossed dot on a Braille-imprinted paper document page. The mechanism for this may be of any type, including electromechanical actuators, electromagnetic elements, and switchable dimpled plastic film. The mechanism may be transparent, to permit a conventional visual flat panel display to be positioned behind the tactile screen, to aid sighted users including those assisting the visually impaired, as well as users with limited visual impairment that may be able to perceive some visual elements. Thus, a Braille word may be superimposed over the same word in conventional text. A graphical image such as an icon, a picture, a line, a cursor, or an arrow may underlay the raised dots corresponding to the image.
 The display screen may also include a touch sensor operable to detect pressure from a user's finger on a particular part of the screen. The touch sensor may be a film behind a flexible tactile transducer film, or may be a thin film in front of the tactile elements, with sufficient flexibility and compliance that it does not impair the tactile perception of the screen. The touch sensor may also be a film layer of the tactile surface itself, or may include touch-sensitive elements interspersed adjacent to the tactile dot elements so that neither the touch sensor nor the tactile display elements overlay the other. Other touch screens may use grids of interruptible beams, capacitive discharge sensors, and conductive grids sensing a circuit made across nearby nodes. Similarly, the visual display may include picture elements in the form of switchable emitters or reflectors adjacent to each tactile element, aiding a partially sighted user to identify the locations on the screen where tactilely readable elements are raised.
FIG. 2 shows a sample display screen in detail. The screen includes three main portions: a content frame 22, an index frame 24, and a button frame 26. The content frame displays a Braille-translated version of a web page or other document. All text from the original downloaded file is translated to Braille, and graphic images are selectively simplified and converted to a dot matrix image of suitable scale on the screen. A text title 30 identifies the web page contents; a text block 32 is displayed in Braille. A number of images and symbols are displayed in the content frame, including a cursor arrow or icon 34, a small line drawing 36 (showing a star shape in the example), and a larger image rendering 40 (showing a pine tree).
 Like typical web page text content, the text block includes several hypertext links 42, 44, 46, 50, each corresponding to a different web address, and leading to a different web page. On conventional visually displayed web pages the links may be in text form, either listing the actual URL or web address, or including other words (e.g. “click here for more information”). Links may conventionally be tied to an icon, image, or region of the screen, which if clicked with a user's cursor will cause the web browsing software to connect to an associated site. However, with tactile displays, a detailed image is unlikely to convey to the user adequate information about the link. Therefore, the link is preferably indicated by a Braille text string identifying the link.
 Because the links are often dispersed amongst the other words of text on the content frame, there is no way for the user relying on touch to quickly locate and identify the links. All the text must be read to ensure that all links are located. Therefore, the system software extracts all active links in the displayed page, and lists them in the index frame. In the illustrated example, the four links are listed in the order they are found in the text. The user may select other sort modes, such as segregating links by class (e.g. those pointing to other pages at the current site, to other sites, to commercial advertisements, and essential index links such as "search site", "contact us", "what's new", and the like, often found in a separate page frame or at the bottom of each page).
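The patent does not specify an implementation for this extraction step; a minimal sketch using Python's standard `html.parser`, collecting anchor text and target in document order for listing in the index frame, might look like:

```python
# Sketch (illustrative, not the patent's implementation): extract the
# hypertext links from a downloaded page, in the order they appear,
# so they can be listed separately in an index frame.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []      # (anchor text, URL) pairs in document order
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links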
 Using pressure- or presence-sensing transducers, the device may detect a user's finger hovering over a link, and offer additional information about the link before actually connecting to the associated site or page. This information may be provided by an audible voice synthesizer, or by creating a temporary Braille text box containing the descriptive text adjacent to the touched link, in the manner of a drop down menu used in a graphical user interface. The matrix tactile elements may be the pressure sensors, with the signal-to-motion transducers operating in reverse from the display function, such that a pressure generates a signal.
 When a link is selected or clicked by application of a deliberate pressure, or by other input such as tapping the link with a given force profile or pattern (e.g. a double tap), the system software retrieves the page associated with the clicked link, converts it to the tactile format, and displays it in the content frame by raising the appropriate tactile elements or dots. The system also extracts the hypertext link information from the retrieved page, and lists the included links in the index frame.
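Distinguishing a deliberate selection from an ordinary reading touch could be done by timing and force thresholds; the thresholds below are illustrative assumptions, not values from the patent:

```python
# Sketch with assumed thresholds: classify the last two taps on a link
# as a deliberate double-tap "click" rather than a reading touch.
DOUBLE_TAP_WINDOW = 0.4   # seconds between taps (assumed value)
MIN_CLICK_FORCE = 2.0     # pressure units (assumed value)

def is_double_tap(tap_times, tap_forces):
    """Return True if the last two taps form a deliberate double tap."""
    if len(tap_times) < 2:
        return False
    recent = tap_times[-1] - tap_times[-2] <= DOUBLE_TAP_WINDOW
    firm = min(tap_forces[-2:]) >= MIN_CLICK_FORCE
    return recent and firm
```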
 The index frame includes several symbols 52 corresponding to the common browsing commands of "back", "stop", and "forward." These may be used in the conventional manner, or in a way more useful to visually impaired users: to access a chronological history of the pages visited. A conventional back command cannot return the user to a deeper-level page reached via a parent page from which the user then accessed a different deeper-level page (i.e. the user may only back upward in the hierarchy toward the first page visited). The preferred embodiment permits the user to go "back" in sequence through all pages visited, even if going "back" takes the user to a lower level, or in a conceptually retrograde direction.
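The distinction can be sketched as a purely chronological history list rather than a hierarchical back stack (class and method names are illustrative):

```python
# Sketch: a strictly chronological back/forward history. Every visit is
# appended and never truncated, so "back" retraces the full sequence of
# pages visited, including deeper-level pages, in reverse order.
class ChronologicalHistory:
    def __init__(self):
        self._pages = []
        self._pos = -1

    def visit(self, url):
        self._pages.append(url)
        self._pos = len(self._pages) - 1

    def back(self):
        if self._pos > 0:
            self._pos -= 1
        return self._pages[self._pos]

    def forward(self):
        if self._pos < len(self._pages) - 1:
            self._pos += 1
        return self._pages[self._pos]
```

For example, after visiting a home page, a section page, a deep page under that section, and then a second section, "back" returns first to the deep page, a destination a conventional hierarchical back stack would skip.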
 The button frame 26 includes buttons 54 in the form of Braille text labels identifying their functions. Buttons may serve to select the type of information displayed in the index frame 24, such as “list”, which indicates the current preferred status in which hyperlinks are listed; “search”, which displays a search utility; “favorites”, which lists book-marked favorite web pages; “home”, and “help”, which have conventional usage. Other buttons may open features for controlling software settings, and operations such as printing and saving of files.
 All frames are bordered and divided from each other by boxes or lines 60 formed by raised dots in straight single rows. These lines formed from the matrix of tactile elements permit the user to know which frame is being read, and to ensure that information in different frames is not confused.
 While the above is discussed in terms of preferred and alternative embodiments, the invention is not intended to be so limited.
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US6925611||Jan 31, 2001||Aug 2, 2005||Microsoft Corporation||Navigational interface for mobile and wearable computers|
|US6967642 *||Jan 31, 2001||Nov 22, 2005||Microsoft Corporation||Input device with pattern and tactile feedback for computer input and control|
|US7461355||Oct 14, 2004||Dec 2, 2008||Microsoft Corporation||Navigational interface for mobile and wearable computers|
|US7707024||May 23, 2002||Apr 27, 2010||Microsoft Corporation||Method, system, and apparatus for converting currency values based upon semantically labeled strings|
|US7707496||May 9, 2002||Apr 27, 2010||Microsoft Corporation||Method, system, and apparatus for converting dates between calendars and languages based upon semantically labeled strings|
|US7711550||Apr 29, 2003||May 4, 2010||Microsoft Corporation||Methods and system for recognizing names in a computer-generated document and for providing helpful actions associated with recognized names|
|US7712024||Jul 16, 2001||May 4, 2010||Microsoft Corporation||Application program interfaces for semantically labeling strings and providing actions based on semantically labeled strings|
|US7716163||Jul 17, 2001||May 11, 2010||Microsoft Corporation||Method and system for defining semantic categories and actions|
|US7716676||Jun 25, 2002||May 11, 2010||Microsoft Corporation||System and method for issuing a message to a program|
|US7739588||Jun 27, 2003||Jun 15, 2010||Microsoft Corporation||Leveraging markup language data for semantically labeling text strings and data and for providing actions based on semantically labeled text strings and data|
|US7742048||May 23, 2002||Jun 22, 2010||Microsoft Corporation||Method, system, and apparatus for converting numbers based upon semantically labeled strings|
|US7770102||Jun 6, 2000||Aug 3, 2010||Microsoft Corporation||Method and system for semantically labeling strings and providing actions based on semantically labeled strings|
|US7778816||Apr 24, 2001||Aug 17, 2010||Microsoft Corporation||Method and system for applying input mode bias|
|US7783614||Feb 13, 2003||Aug 24, 2010||Microsoft Corporation||Linking elements of a document to corresponding fields, queries and/or procedures in a database|
|US7788590||Sep 26, 2005||Aug 31, 2010||Microsoft Corporation||Lightweight reference user interface|
|US7788602||Jul 16, 2001||Aug 31, 2010||Microsoft Corporation||Method and system for providing restricted actions for recognized semantic categories|
|US7817792||Sep 21, 2006||Oct 19, 2010||Microsoft Corporation||Hyperlink-based softphone call and management|
|US7827546||Dec 9, 2003||Nov 2, 2010||Microsoft Corporation||Mechanism for downloading software components from a remote source for use by a local software application|
|US7916002 *||Jun 30, 2006||Mar 29, 2011||Nokia Corporation||Haptic operative user interface input apparatus|
|US8201090 *||Nov 13, 2008||Jun 12, 2012||The Board Of Trustees Of The University Of Arkansas||User interface for software applications|
|US8341420 *||Dec 6, 2011||Dec 25, 2012||Armstrong, Quinton Co. LLC||Methods, systems, and computer program products for entering sensitive and padding data using user-defined criteria|
|US8570278||Oct 24, 2007||Oct 29, 2013||Apple Inc.||Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker|
|US8633907 *||Jul 6, 2010||Jan 21, 2014||Padmanabhan Mahalingam||Touch screen overlay for visually impaired persons|
|US8659555||Jun 24, 2008||Feb 25, 2014||Nokia Corporation||Method and apparatus for executing a feature using a tactile cue|
|US8661339||Sep 23, 2011||Feb 25, 2014||Apple Inc.||Devices, methods, and graphical user interfaces for document manipulation|
|US8677232||Sep 23, 2011||Mar 18, 2014||Apple Inc.||Devices, methods, and graphical user interfaces for document manipulation|
|US8719695||Sep 23, 2011||May 6, 2014||Apple Inc.||Devices, methods, and graphical user interfaces for document manipulation|
|US8756534 *||Sep 24, 2009||Jun 17, 2014||Apple Inc.||Methods and graphical user interfaces for editing on a multifunction device with a touch screen display|
|US8766786 *||Feb 4, 2008||Jul 1, 2014||Nokia Corporation||Device and method for providing tactile information|
|US8774679||Aug 22, 2012||Jul 8, 2014||Eastman Kodak Company||Electrographic tactile image printing system|
|US8849159||Aug 22, 2012||Sep 30, 2014||Eastman Kodak Company||Electrographic printing of tactile images|
|US8856682||May 11, 2010||Oct 7, 2014||AI Squared||Displaying a user interface in a dedicated display area|
|US8870367||May 2, 2012||Oct 28, 2014||Eastman Kodak Company||Printed image for visually-impaired person|
|US8898564 *||Feb 7, 2011||Nov 25, 2014||Immersion Corporation||Haptic effects with proximity sensing|
|US9092130||Sep 23, 2011||Jul 28, 2015||Apple Inc.||Devices, methods, and graphical user interfaces for document manipulation|
|US20040162833 *||Feb 13, 2003||Aug 19, 2004||Microsoft Corporation||Linking elements of a document to corresponding fields, queries and/or procedures in a database|
|US20040172584 *||Feb 28, 2003||Sep 2, 2004||Microsoft Corporation||Method and system for enhancing paste functionality of a computer software application|
|US20040207601 *||May 14, 2004||Oct 21, 2004||Microsoft Corporation||Input device with pattern and tactile feedback for computer input and control|
|US20040230666 *||May 14, 2003||Nov 18, 2004||Microsoft Corporation||Method, system, and computer-readable medium for communicating results to a data query in a computer network|
|US20040268237 *||Jun 27, 2003||Dec 30, 2004||Microsoft Corporation||Leveraging markup language data for semantically labeling text strings and data and for providing actions based on semantically labeled text strings and data|
|US20050182617 *||Feb 17, 2004||Aug 18, 2005||Microsoft Corporation||Methods and systems for providing automated actions on recognized text strings in a computer-generated document|
|US20090286211 *||Nov 19, 2009||Roche Diagnostics Operations, Inc.||Medical device for visually impaired users and users not visually impaired|
|US20100127999 *||Jan 28, 2010||May 27, 2010||Samsung Electronics Co., Ltd.||Apparatus and method of providing fingertip haptics of visual information using electro-active polymer for image display device|
|US20100235783 *||Sep 24, 2009||Sep 16, 2010||Bas Ording||Methods and Graphical User Interfaces for Editing on a Multifunction Device with a Touch Screen Display|
|US20100315212 *||Feb 4, 2008||Dec 16, 2010||Nokia Corporation||Device and method for providing tactile information|
|US20110111376 *||May 12, 2011||Apple Inc.||Braille Mirroring|
|US20110138277 *||Jun 9, 2011||Immersion Corporation||Haptic effects with proximity sensing|
|US20110283243 *||Nov 17, 2011||Al Squared||Dedicated on-screen closed caption display|
|US20120007809 *||Jul 6, 2010||Jan 12, 2012||Padmanabhan Mahalingam||Touch Screen Overlay for Visually Impaired Persons|
|US20120075200 *||Mar 29, 2012||Nokia Corporation||Touch Sensitive Input|
|US20120151218 *||Jun 14, 2012||Mona Singh||Methods, Systems, And Computer Program Products For Entering Sensitive And Padding Data Using User-Defined Criteria|
|US20130164717 *||Dec 18, 2012||Jun 27, 2013||Thomson Licensing||Braille display system and method for operating a refreshable braille display|
|EP1850305A2 *||Apr 23, 2007||Oct 31, 2007||Metec AG||Display device for tactile recordable display elements and display system with such a display device|
|EP2014037A1 *||Apr 17, 2007||Jan 14, 2009||Microsoft Corporation||Hover to call|
|WO2007104064A1 *||Mar 8, 2007||Sep 20, 2007||Hahn Werner||Braille reading device|
|WO2007130274A1||Apr 17, 2007||Nov 15, 2007||Microsoft Corp||Hover to call|
|WO2009097866A1||Feb 4, 2008||Aug 13, 2009||Nokia Corp||Device and method for providing tactile information|
|U.S. Classification||340/407.1, 341/23, 340/407.2, 340/965, 341/22, 340/4.12, 340/4.11|
|International Classification||H04M1/247, G09B21/00|
|Cooperative Classification||H04M1/2476, G09B21/005, H04M1/72561|
|European Classification||G09B21/00B3S, H04M1/247D2|
|Sep 5, 2000||AS||Assignment|
|Apr 3, 2006||FPAY||Fee payment|
Year of fee payment: 4
|Apr 1, 2010||FPAY||Fee payment|
Year of fee payment: 8
|Sep 22, 2011||AS||Assignment|
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD COMPANY;REEL/FRAME:026945/0699
Effective date: 20030131
|May 9, 2014||REMI||Maintenance fee reminder mailed|
|Oct 1, 2014||LAPS||Lapse for failure to pay maintenance fees|
|Nov 18, 2014||FP||Expired due to failure to pay maintenance fee|
Effective date: 20141001