US20010000433A1 - Ergonomic customizeable user/computer interface devices - Google Patents

Ergonomic customizeable user/computer interface devices

Info

Publication number
US20010000433A1
US20010000433A1 US09/732,112 US73211200A US2001000433A1 US 20010000433 A1 US20010000433 A1 US 20010000433A1 US 73211200 A US73211200 A US 73211200A US 2001000433 A1 US2001000433 A1 US 2001000433A1
Authority
US
United States
Prior art keywords
user
computer
signals
interface
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/732,112
Other versions
US6441770B2 (en)
Inventor
David Russell
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
David Russell
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by David Russell filed Critical David Russell
Priority to US09/732,112
Publication of US20010000433A1
Assigned to TRANSFORMING TECHNOLOGIES, LLC: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RUSSEL, DAVID C.
Assigned to TRANSFORMING TECHNOLOGIES, INC.: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TRANSFORMING TECHNOLOGIES, L.L.C.
Application granted
Publication of US6441770B2
Assigned to PRIVARIS, INC.: CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: TRANSFORMING TECHNOLOGIES, INC.
Assigned to PRIVARIS, INC.: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RUSSELL, DAVID C.
Assigned to SILICON VALLEY BANK: SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PRIVARIS, INC.
Assigned to HARBERT VENTURE PARTNERS, LLC: SECURITY AGREEMENT. Assignors: PRIVARIS, INC.
Anticipated expiration
Assigned to PRIVARIS, INC.: RELEASE. Assignors: SILICON VALLEY BANK
Assigned to APPLE INC.: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PRIVARIS INC.
Expired - Fee Related

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F4/00Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body 
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82Protecting input, output or interconnection devices
    • G06F21/83Protecting input, output or interconnection devices input devices, e.g. keyboards, mice or controllers thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • G06F3/04895Guidance during keyboard input operation, e.g. prompting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0331Finger worn pointing device

Definitions

  • the present invention relates to user/computer interface devices, specifically with respect to graphical interfaces. More specifically, the present invention relates to wireless transmissions from an ergonomic remote control device to a base device for control of computer functions and applications.
  • In graphical user interfaces (GUIs), the user can select an icon from a GUI display, using a pointing device, to activate the predetermined function or event associated with the icon.
  • Since GUIs first emerged, alternatives to the keyboard have proven highly desirable for optimum productivity in many applications. Accordingly, auxiliary or keyboard-alternative hardware such as light pens, joysticks, trackballs, touch pads, digitizing pads, and the “computer mouse” developed. These new GUI-oriented pointing devices quickly proved to be viable, timesaving alternatives to the keyboard for many types of computer input and control situations. In particular, the mouse has become the single most widely accepted keyboard-alternative input device.
  • the fundamental operating principle of the mouse relates to the rotation of a spherical trackball carried within the mouse.
  • the trackball which is partially exposed, freely rotates within the device and generates signals which correspond to pairs of x-axis and y-axis coordinates.
  • the mouse contains means to translate these coordinates into signals to which the attached computer is responsive. Accordingly, when the computer user moves the mouse device across a working surface adjacent to the computer, the cursor indicator on the display screen moves to the location pointed to by the computer user.
  • the computer user's operation of one or more buttons aboard the mouse effects other control functions of the computer and computer display, such as the selection of computer usage event options.
  • The mouse requires a prominent, smooth, flat, horizontal space on the user's desk.
  • A typical user's desk, however, is crowded and leaves little of the space required for mouse operation.
  • Most mouse devices are especially difficult to use when away from traditional office facilities, in mobile or restricted locations.
  • There are GUI devices which offer prophylaxis for users with physical impairments and repetitive stress injuries. While successful products serve a variety of needs for these users, the high cost and highly specific utility of many such products hinder their widespread acceptance.
  • Another mouse drawback is its simplex, unidirectional design and operation. No mouse currently implements two-way interaction between controlled computers and input devices. The lack of bidirectionality is better appreciated if one considers the many new applications and benefits of bidirectionality, such as roaming LAN interaction; security and alarms; mobile signaling and paging; and remote interactive applications.
  • There are also security needs which remain unmet by current input devices.
  • LAN (local area network) users have connectivity needs which extend beyond their own computers.
  • LANs were created to facilitate resource-sharing of limited resources among multiple users. LAN users often access and connect into one or more LANs, or other accessible computers or network environments. It has been estimated that more than half of all computers in business are attached to a LAN.
  • the underside of the mouse trackball is susceptible to the introduction of dirt, liquids, or other substances into the body cavity. This vulnerability can lead to equipment failure and shorter product life.
  • Another drawback of the mouse is that the user may find the “mouse method” of frequently moving his or her hand back and forth between the keyboard and the mouse distracting to the train of thought, time consuming, or inconvenient to optimal operational efficiency.
  • U.S. Pat. No. 4,550,250 to Mueller discloses an infrared graphic input device for a computer.
  • a remote infrared light source transmits user input commands to a detector device adjacent to the computer.
  • the device must operate within a dedicated horizontal, two-dimensional, smooth, flat surface.
  • the detector apparatus operates according to continuous tracking input principles and does not allow for any straying out of equipment detection boundaries.
  • U.S. Pat. No. 4,578,674 discloses a method and an apparatus for controlling the cursor position on a computer display screen.
  • This device uses both infrared and ultrasonic principles for determining the direction and the velocity of motion of a positioning device which is monitored by a control base detector.
  • the device requires a two-dimensional plane. To operate from a three-dimensional defined location, the user must ensure the emitter/detector front face of the positioning device is always directly facing the control base.
  • U.S. Pat. No. 4,628,541 to Beavers discloses an infrared, battery-powered keyboard input device for a microcomputer. This device offers the user additional freedom to operate a standard-style keyboard without hardwiring constraints. However, the keyboard is not portable to another computer unless the computer to which it is ported is a “mirror” microcomputer device. Hence, the infrared battery-operated keyboard likely requires the implementation of a separate mouse if “mouse-type” input commands or functionality/features are needed by the user or are required for optimal productivity.
  • the invention herein disclosed offers many distinct and unique capabilities, to serve a wide range of user needs.
  • the present invention can simplify access to computers, and can accelerate user and computer interaction—especially for GUI user/computer applications, and for mobile operating environments.
  • the present invention eliminates or reduces many drawbacks of many existing input devices and mouse-type products.
  • the invention relates to a method to improve computer accessibility, by simplifying user and computer interaction.
  • the apparatus of the present invention can provide very easy access to, and control of, graphic user interface-oriented computing environments, particularly for persons with mobility impairments and for persons with special mobility requirements.
  • a mobile, lightweight, ergonomically-shaped, customizeable, user/computer interface apparatus is attached onto the human forefinger, providing means for thumbtip interaction with a computer, via predetermined user control signals.
  • a hardwired or wireless signal transmission system receives user control signals and relays them to a base/computer interface apparatus, which detects, decodes, and converts user control signals into formats suitable for input to and processing by an interconnected controllable computer.
  • computer-generated or other external control signals can disable operation of a base/computer interface apparatus and a user/computer interface apparatus, when necessary for security.
  • the system and architecture of the invention provides networking of multiple user/computer and base/computer interface devices and other interface device combinations, as means for controlling multiple controllable computers, over at least one computer network.
  • the computer user can control any controllable computer event remotely, without the need for a dedicated, cleared, smooth, flat, horizontal, desktop surface or typical office facilities. Not requiring restrictive, immediate proximity to the controlled computer is a cardinal benefit of this invention and several preferred embodiments.
  • An object of the present invention is to provide a cordless, user/computer interface device operable from any three dimensional location sufficiently proximate to the base transceiver for signals to reach it.
  • Another object of the invention is to provide a computer input device which is operable from any location reasonably close to the computer being controlled, and which does not require a prominent, dedicated, cleared, smooth, flat, horizontal surface or other special surface upon which to run.
  • Another object of this invention is to provide an ergonomically shaped and ergonomically operable device to serve needs of users with physical impairments or handicaps. It is therefore an object of the present invention to provide a wireless GUI-oriented user/computer interface device which is easily attachable to the user's index finger, which can be comfortably “worn” for extended periods of time, and which can be very easily operated by thumbtip and/or forefinger pressure. A further related object is to provide a GUI-oriented user/computer interface device, operable without the need to move the user's hands away from the computer keyboard.
  • Another object is to provide a security option for GUI applications.
  • secured two-way authentication sequences can be used to control LANS, enterprise-wide networks, other network resources, other computing resources, and other controllable machinery.
  • a related object is to provide a secure, mobile, highly flexible GUI equipment design which allows the user to carry his or her own user/computer interface device from one location to another or from a desktop computer to a notebook or laptop computer, with equal facility.
  • Another object is to provide a highly flexible, customizeable GUI equipment design, which can provide multiple basic “personality operating environment” options, using multiple, different “personality modules” (i.e., different ROMs) which can be swapped in and out of device 10 , depending on user selection of the needed “personality module”.
  • Another object of the invention is to provide an easy-to-use method for operating GUI software.
  • Another object is to provide a user/computer interface system with very sensitive signal radiating and sensing means, allowing signal transmission and reception without rigorous aiming of the input device.
  • Another object is to provide a user/computer interface architecture which can be configured to provide for an interoperable computing environment, wherein a group or groups of computers can be controlled by one or more authorized users and authorized user/computer interface devices, depending on user and interface device privileges.
  • a related object of this invention is to provide a control unit for an enterprise-wide computer security system.
  • Another primary object of the present invention is to provide computer input and control with a device which is externally switchless, in one preferred embodiment.
  • FIG. 1A illustrates a first embodiment of the present invention.
  • FIG. 1B shows the user/computer interface device of FIG. 1A attached to the user's forefinger.
  • FIGS. 1C and 1D show the interface device attached to a support stand.
  • FIG. 2 is a block diagram illustrating a hardware implementation of the present invention.
  • FIG. 3 is a block diagram of basic functional modules of a user/computer interface device according to the present invention.
  • FIG. 4 is a schematic block diagram of the user/computer interface device shown in FIG. 3.
  • FIG. 5 is a block diagram of a computer-interconnected base/computer interface device according to the present invention.
  • FIG. 6 is a schematic block diagram of base/computer interface device shown in FIG. 5.
  • FIGS. 7A and 7B are top perspective views of first and second versions of a first embodiment of a user/computer interface device of the present invention.
  • FIGS. 8 and 9 are bottom perspective views of the device of FIG. 7A.
  • FIG. 10 illustrates devices attached to a user's left and right hand index finger.
  • FIG. 11A is a top perspective view of a second preferred embodiment of the user/computer interface device.
  • FIG. 11B is a top perspective close-up view of the device shown in FIG. 11A.
  • FIG. 12 shows examples of display screens of operational sequences of the device.
  • FIG. 13 is a flowchart showing one embodiment of the security logic of a security version of the device.
  • FIG. 14 shows a block diagram of an enterprise-wide security-oriented computer system.
  • FIG. 15 shows a block diagram of an extended, meshed, enterprise-wide security-oriented network.
  • FIGS. 16A, 16B, 16C, and 16D show general byte maps of user control signals in the form of message packets.
  • FIG. 17 shows an example of access and authorization privileges in a large, enterprise-wide implementation.
  • FIGS. 18A and 18B show top perspective close-up views of a third embodiment of the user/computer interface device.
  • FIGS. 19A and 19B show examples of the interface device operation.
  • FIG. 1A shows a first embodiment of the present invention.
  • a wireless battery-powered user/computer interface device 10 transmits infrared user control signals 12 through signal transmission system 14 , to a base/computer interface device 20 , which is interconnected into a computer 30 via a cable 28 .
  • Signal generating circuitry (hardware and firmware) is implemented using a variety of well-known transceiver components, depending on the desired signal transmission system (e.g., infrared, radio, acoustic, hardwired, etc.).
  • User control signals 12 are transmitted from a user/computer interface device 10 in response to a computer user's manual operation of switches mounted thereupon (see FIG. 7A for switch detail). After the user presses one or more switches on device 10 to initiate user control signals, infrared signals 12 are generated by an infrared generator comprising one or more light emitting diodes (not shown) and exit the interface device 10 through infrared lens 8 . Signals 12 propagate through free space and onto base/computer interface device 20 . After detection by device 20 , signals 12 are demodulated, decrypted, converted to computer-intelligible control signals and/or other control signals, and relayed into computer 30 .
  • Computer 30 is connected to display terminal 18 .
  • Computer 30 can also be optionally connected and/or networked with other interfaceable peripheral devices, one or more local area networks, or other centralized or distributed computers.
  • display screen 19 of display terminal 18 responds to user control signals 12 and other output/display signals from computer 30 , such that desired effects of signals 12 can be executed and displayed on the display screen 19 .
  • Base/computer interface device 20 includes an optional sonic receiver element 42 adapted to receive and transmit sonic signals from device 10 .
  • Device 20 further includes lens 21 behind which stands one or more signal detectors, such as phototransistors, adapted to detect infrared user control signals 12 relayed from device 10 , through wireless infrared signal transmission system 14 .
  • the base/computer interface device 20 may also include an access key panel 26 .
  • Key panel 26 is used to enter access and authorization codes so as to limit access to device 10 .
  • Key panel 26 is a locking device, to control access to device 20 , to ports 22 and 24 , and to interface devices 10 .
  • Panel 26 can include conventionally known locking hardware for restricting physical access to device 20 or device 10 .
  • Ports 22 and 24 on device 20 are used for recharging of batteries in the interface device 10 .
  • Device 10 includes female couplings 11 a (as shown in FIG. 8), for manual coupling onto male couplings 11 b to receive electrical charges for the rechargeable battery contained in device 10 .
  • Male couplings 11 b are located within ports 22 and 24 in device 20 .
  • either a stand-alone AC/DC transformer (not shown) or the computer's serial port can also be used for recharging the interface device 10 .
  • Display screen 19 inherently includes preprogrammed Cartesian (or other conventional) positioning areas to receive cursor coordinates provided by device 10 .
  • the ordinate axis 17 b and the abscissa axis 17 c are shown.
  • An origin 17 a is provided, through which a z-axis (not shown) passes to assist in generation of three-dimensional displays.
  • computer 30 has a floppy or other storage disk drive 32 which is adapted to receive a floppy or other suitable diskette package 35 containing digitally encoded instructions 33 for interfacing the software drivers of the present invention with the operating system software of the computer 30 .
  • a keyboard 300 is typically attached to the computer as a separate input device. However, keyboard use can be minimized or eliminated, for many important tasks, by use of the interface device 10 .
  • FIG. 1B shows a first embodiment of device 10 attached to a computer user's forefinger.
  • device 10 can be attached to a support stand 70 .
  • the support stand 70 is shown in FIG. 1D.
  • Device 10 is attached to an articulating support 72 of the support stand 70 .
  • Hooks 74 a - 74 d as shown in FIG. 1D affix device 10 firmly onto support 72 .
  • Device 10 can be moved freely about while attached onto support 72 .
  • Support 72 plugs, via a concavity 76 a , onto ball joint 76 b of support stand 70 .
  • Device 10 could also be directly accommodated onto ball joint 76 b without the articulating support 72 if an appropriate concavity 76 a is included.
  • Telescopic support stand members 70 a can be used to allow extension and contraction of stand 70 to different heights, to suit user preferences.
  • Support 72 can also be used to carry other electronic components associated with device 10 operation, including additional means for creating user control signals, such as positionally activated switches.
  • Support stand 70 can be used in at least two basic ways: 1) with articulating support 72 , or 2) without articulating support 72 .
  • the feature of travelling in more than one dimension at the same time is provided by the present invention.
  • the computer user can “travel” in two or three dimensions virtually simultaneously. This effect is achieved by liquid conductive switches or other position-activated switches (not shown) pointing in one virtual dimension (e.g., “x”) in combination with either one or two dimensions (e.g., “y” and “z”) pointed to by device 10 , using manually operable switches.
  • virtual “three dimensional” travel can be achieved, which can be helpful for many applications including CAX (i.e., CAD/CAM/CAE); gaming; robotics; education; virtual microscopy; multimedia; and other so-called “virtual reality” applications.
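  • As a rough illustration of this combined-axis idea, the following sketch (with invented names and step sizes; the patent does not specify an encoding) merges one position-activated axis with two thumbswitch-selected axes into a single three-dimensional cursor step.

```python
# Hypothetical sketch: combining one position-activated axis ("x") with two
# manually switched axes ("y" and "z") into a single 3-D cursor step.
# Names and the unit step size are illustrative assumptions only.

def three_d_step(tilt_x: int, thumb_y: int, thumb_z: int, step: int = 1):
    """Return an (x, y, z) cursor delta.

    tilt_x  -- -1/0/+1 from a position-activated (e.g., liquid conductive) switch
    thumb_y -- -1/0/+1 from the manually operable thumbswitch, y direction
    thumb_z -- -1/0/+1 from the manually operable thumbswitch, z direction
    """
    return (tilt_x * step, thumb_y * step, thumb_z * step)

# Example: device tilted "forward" while the thumb presses up and in.
print(three_d_step(tilt_x=1, thumb_y=1, thumb_z=-1))  # (1, 1, -1)
```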
  • Basic control elements include wireless user/computer interface device 10 , signal transmission system 14 , base/computer interface device 20 and controlled computer 30 .
  • Signal transmission system 14 shown includes bidirectionally operable communication channel 14 a between device 10 and device 20 ; channel 14 b within device 20 , and channel 14 c between device 20 and computer 30 .
  • FIG. 3 is a block diagram of the basic functional modules of user/computer interface device 10 , showing the main functional modules and the functional module interconnections. Typical hardware and electronic components are arranged to perform signal-generating, signal processing, signal-terminating, and signal-transmitting functions for user/computer interface device 10 .
  • Module 100 represents the totality of any possible number of switch arrangements implemented on any given embodiment of user/computer interface device 10 . Depending on the particular embodiment, these could correspond to manually operable switches shown as switches 2 , 3 , 1 a, 1 b, 1 c, 1 d, and 1 e of FIG. 7A or to any other feasible alternative arrangement of switching components. Alternatively, a version of module 100 can be provided which reports switch states of internal switches, based upon the position of positionally activated switches.
  • FIG. 4 illustrates switch hardware for four secondary thumbswitches 1 a, 1 b, 1 c, 1 d ; for master control thumbswitch 2 , for front switches 3 a - 3 c, and for mode switch 1 e .
  • Switch states of switches 1 a, 1 b, 1 c, 1 d, 1 e ; 2 , and 3 are sensed by switch position sensing module 100 and sensed switch states are communicated to microcontroller 110 .
  • module 110 represents a microprocessor or microcontroller for processing.
  • the microcontroller 110 detects the position and/or state of the control switches comprised in module 100 .
  • Module 110 also serves to encode this information in accordance with predetermined parameters and modulation plans, using the information stored in personality ROM module 118 and ROM (or EEPROM) module 116 .
  • the result of this process is a composite signal which is then outputted from module 110 in an organized intermediate signal format and fed into module 114 .
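  • A minimal sketch of this encoding step is shown below; the field layout is an assumption invented for illustration (the patent does not specify one), showing how sensed switch states and a personality-ROM selector might be folded into a single intermediate composite value before being fed toward module 114.

```python
# Assumed field layout, for illustration only: lower byte holds a bitfield of
# switch states sensed by module 100; upper byte holds a personality/encoding
# plan selector taken from ROM module 118.

SWITCH_NAMES = ["1a", "1b", "1c", "1d", "2", "3", "1e"]  # ordering is an assumption

def compose_intermediate(switch_states: dict, personality_id: int) -> int:
    """Pack boolean switch states and a personality code into one composite value."""
    bits = 0
    for i, name in enumerate(SWITCH_NAMES):
        if switch_states.get(name, False):
            bits |= 1 << i
    return ((personality_id & 0xFF) << 8) | bits

states = {"2": True, "1e": True}          # master thumbswitch and mode switch active
print(hex(compose_intermediate(states, personality_id=0x03)))  # 0x350
```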
  • module 110 is implemented with the use of a microprocessor or microcontroller integrated circuit chip, whose specific functional and operational characteristics depend on the specific type of embodiment being implemented. Integrated circuits of this type are well-known in the art.
  • the arrows between functional modules of user/computer interface device 10 represent individual or grouped conductive paths to relay signal intelligence and control signals between functional modules.
  • Personality module 118 is a ROM memory storage device, and module 116 is a ROM (or EEPROM) device. These devices store, in protected form, information which is used by module 110 to determine the encoding scheme and other information processing parameters.
  • personality module 118 contains application-specific, environment-oriented information and codes. The information and codes are used to determine the encoding and modulation scheme to be followed by module 110 , in accordance with the specific application selected by the user.
  • Module 118 is implemented as an interchangeable ROM cartridge, that can be easily inserted and removed by the user (See FIG. 7A, modules 6 a , 6 b , 6 c ). Different ROM cartridges contain encoding and modulation plans and other information corresponding to different software applications. The user needs only to insert the ROM cartridge into device 10 which corresponds to the software application (i.e., “personality operating environment”) to be used.
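  • The sketch below illustrates the cartridge-swap idea with a small, invented registry of personality modules; the module identifiers 6 a - 6 c follow FIG. 7A, but the plan contents are assumptions.

```python
# Hypothetical registry: each interchangeable ROM cartridge carries an
# application-specific encoding/modulation plan that module 110 consults.

PERSONALITY_MODULES = {
    "6a": {"application": "word processing", "modulation": "infrared pulse", "modes": 8},
    "6b": {"application": "CAD/CAM",         "modulation": "infrared pulse", "modes": 8},
    "6c": {"application": "security access", "modulation": "infrared pulse", "modes": 4},
}

def load_personality(cartridge: str) -> dict:
    """Return the encoding plan for the inserted cartridge, as module 110 would."""
    try:
        return PERSONALITY_MODULES[cartridge]
    except KeyError:
        raise ValueError(f"no personality module {cartridge!r} installed") from None

print(load_personality("6b")["application"])  # CAD/CAM
```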
  • ROM (or EEPROM) module 116 is implemented in device 10 for security-oriented applications.
  • Module 116 contains encryption security information, comprising one or more access and authorization security tables used by device 10 for limiting access to any enterprise resource by: 1) one or more user(s); 2) one or more user/computer interface device(s); 3) one or more application(s); 4) one or more file(s); 5) one or more system(s); or 6) one or more network or signal transmission system communication channel(s), within the auspices of an overall, enterprise-wide access and authorization privileges plan.
  • Access by users or devices to any given enterprise resource is either granted or denied, based upon the security clearance of the user or device or based on any other command and control information defined and customized into the access and authorization privileges plan of the specific enterprise, as administered by an authorized systems administrator.
  • Module 116 is not designed to be installed, deinstalled, serviced, or updated by the user, but is controlled by the system or security administrator.
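  • A simplified sketch of such an access and authorization table follows; the table entries and the decision rule (the user/device pair must be cleared for the resource) are illustrative assumptions consistent with the description above.

```python
# Illustrative only: an access/authorization table of the kind module 116 might
# hold, keyed by (user, device) and listing the enterprise resources each pair
# may reach. All identifiers are invented.

ACCESS_TABLE = {
    ("alice", "device-101"): {"LAN-1", "payroll-application"},
    ("bob",   "device-102"): {"LAN-1"},
}

def access_granted(user: str, device: str, resource: str) -> bool:
    """Grant access only if the user/device pair is cleared for the resource."""
    return resource in ACCESS_TABLE.get((user, device), set())

print(access_granted("alice", "device-101", "payroll-application"))  # True
print(access_granted("bob",   "device-102", "payroll-application"))  # False
```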
  • Modules 116 and 118 are optional; their use depends on the particular application and environment for which device 10 is adapted. Furthermore, the specific integrated circuit chips used to implement the functions of modules 116 and 118 also depend on the type of application, personality operating environment, or security plan being implemented.
  • the output of module 110 is a composite signal which is relayed into modulator/transceiver module 112 .
  • the signal path from module 110 to module 112 passes through module 114 , which acts as a kill switch. When switch 114 is open, the signal from module 110 cannot be input to module 112 , and output signaling is disabled.
  • the state of the signal path in kill switch 114 is controlled by an external signal for disabling device 10 .
  • An external kill signal is generated and transmitted by base/computer interface device 20 or other authorized signal source.
  • A kill signal, when transmitted, is processed by transceiver module 112 of device 10 and then relayed into kill switch 114 .
  • As illustrated in FIG. 4, AND gate 114 a requires two logical “1” inputs in order to continue passing the signal from module 110 to module 112 .
  • a kill signal sends a logical “0” input into AND gate 114 a , opening the kill switch, and thereby removing the path into module 112 .
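  • The gate behavior can be summarized in a few lines; this is only a logical sketch of AND gate 114 a as described above, not firmware from the patent.

```python
# Logical sketch of kill switch 114: AND gate 114a passes the module-110 signal
# only while the (negated) kill input remains at logical 1.

def kill_switch(signal_present: bool, kill_asserted: bool) -> bool:
    """Return True if the signal may pass on to modulator/transceiver module 112."""
    enable = not kill_asserted          # a received kill signal drives this input to 0
    return signal_present and enable    # AND gate 114a

print(kill_switch(signal_present=True, kill_asserted=False))  # True  - signal passes
print(kill_switch(signal_present=True, kill_asserted=True))   # False - device disabled
```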
  • Module 112 represents a modulator/transceiver device. This device is implemented as an electromagnetic wave modulator/transmitter (for infrared or radio waves), as an electronic transducer (for sonic or ultrasonic waves), or as a signal buffer/driver (for hard wired transmission). In the case of wireless embodiments using electromagnetic wave transmission methods, module 112 uses the information conveyed to it from module 110 to modulate an electromagnetic carrier wave, either infrared or radio. The modulated signal is then radiated by means of a radiating device suitable to the frequency of the signal being radiated, or transmitted. In the case of wireless embodiments using sonic or ultrasonic waves, module 112 comprises an electronic amplifier and a sonic or ultrasonic transducer, depending on the specific means of transmission.
  • Module 112 produces appropriate signals 12 which propagate through free space and/or air and are received by base/computer interface device 20 .
  • the power of the radiated signal or intensity of the acoustic waves is chosen such that the signal can be picked up by device 20 , while the distance between device 10 and device 20 is within the intended operating range.
  • Device 10 also includes data input/output ports 4 a and 4 b adapted to import data from, or export data to, devices interfaceable with device 10 .
  • Power for user/computer device 10 is provided through a battery 402 , adaptable to be recharged via charge and control circuit 404 contained within base/computer device 20 (FIG. 6). Charging voltage passes through male plug 11 b (FIG. 1A) which interconnects with female plug 11 a of device 10 , whenever device 10 is plugged into port 22 .
  • signals 12 transmitted to device 20 are outputs from device 10 . More specifically, signals 12 can comprise infrared, other electromagnetic radiation, optical and/or acoustic signals, depending on specific implementation requirements.
  • infrared signals 12 are received by, or transmitted out of, infrared transceiver module 210 .
  • sonic and/or optical transceiver devices can be implemented to carry sonic and/or optical signals into and out of transceiver 210 .
  • Transceiver 210 is connected to a microcontroller or microprocessor 212 to decode, or encode, signals 12 . Demodulation occurs in a reverse manner to modulation, as is common practice.
  • Information for encoding/decoding microcontroller 212 is provided by memory element 214 .
  • Memory element 214 can further comprise one or more ROM options as shown in FIG. 6.
  • ROM 216 includes a predetermined security table and related options.
  • ROM 218 includes a programmable security coding module option. Furthermore, a ROM or other suitable storage device 220 comprises means for modulation/demodulation of signals 12 in accordance with a preset modulation scheme implemented in user/computer interface device 10 . Information in storage device 220 is used to interpret the signals 12 from device 10 according to any implemented personality environment. As a result, after incoming signals 12 are received by transceiver 210 , they are supplied to the microcontroller 212 for subsequent decoding based upon information provided from memory 214 and memory 220 . The decoded signals are then outputted from microcontroller 212 to a computer interface device comprised in module 225 , such as a Universal Asynchronous Receiver/Transmitter (UART) or similarly functional device.
  • Module 225 consists of any conventionally known computer interface device adapted to receive all original switch states generated in user/computer interface device 10 , which are interpreted and stored in microcontroller 212 , and to transfer those states to a computer 30 .
  • Another example of module 225 is a shift register.
  • Contained within each such personality operating environment (such as security access, CAD/CAM, etc.) are a plurality of modes, selected by mode switch 1 e , each of which, in turn, controls the functions designated by selectable switches 1 a - 1 d.
  • one selected mode may be “input formatting”, when a user wishes to designate various input formats for the controlled computer.
  • the secondary control switches 1 a - 1 d designate different “input formatting” switch functions, such as coloring, shading, hatching, or providing standardized geometric figures.
  • a different setting of mode switch 1 e could involve an “output formatting” mode, with the color, style, and other “output formatting” functions being designated by the secondary control switches 1 a - 1 d.
  • the cable 28 connects the output of base transceiver 20 to the input of an appropriate input port located on the reverse face of computer 30 .
  • the computer 30 interprets the signal 242 .
  • the computer 30 invokes control over the display device 18 (FIG. 1A).
  • the computer can also invoke control over any implemented controllable peripheral device via a direct, indirect, or virtual network connection.
  • FIG. 7A illustrates a detailed perspective view of a first version of a first embodiment of the user/computer interface device 10 .
  • device 10 includes master control thumbswitch 2 , which controls positioning of the computer display cursor. Thumbswitch 2 can be very easily operated by the computer user's thumbtip, to control motion of the computer cursor in any direction (e.g., pressing locations 2 a , 2 b , 2 c , and 2 d respectively, runs the cursor up, right, down, and left) (see also FIG. 12).
  • Coordinates 17 a , 17 b , 17 c of FIG. 1A or other coordinates, (e.g., polar coordinates) can be manipulated. Three dimensional coordinates along a z-axis (not shown) are also available through the switch 2 when the input device 10 is configured with an appropriate three-dimensional personality module and/or mode switch 1 e selects three-dimensional operation.
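  • A minimal sketch of this directional mapping follows; the step size and the screen coordinate convention are assumptions, and only the four cardinal directions named above are shown.

```python
# Sketch of master thumbswitch 2: press locations 2a/2b/2c/2d move the cursor
# up/right/down/left. Screen y grows downward in this sketch (an assumption).

DIRECTION_DELTAS = {
    "2a": (0, -1),   # up
    "2b": (1, 0),    # right
    "2c": (0, 1),    # down
    "2d": (-1, 0),   # left
}

def move_cursor(position, press, step=5):
    """Return the new (x, y) cursor position after one thumbswitch press."""
    dx, dy = DIRECTION_DELTAS[press]
    x, y = position
    return (x + dx * step, y + dy * step)

print(move_cursor((100, 100), "2b"))  # (105, 100) - cursor steps right
```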
  • Device 10 also contains four adjacent, thumb-operable, secondary control switch elements, 1 a, 1 b, 1 c, and 1 d .
  • Each of the secondary switches respectively provides a different functional choice to the computer.
  • switches 1 a, 1 b, 1 c, and 1 d functions can be analogous to the function keys on a computer keyboard.
  • Other switches can be mounted upon the control surface 1 .
  • the other switches can vary in-number, depending upon the version of device 10 , the type of computer being interfaced, the applications being used, or computer environment being served.
  • Mode switch 1 e is a sliding switch, which slides from position 1e.1 to position 1e.8, as implemented in the user input device 10 .
  • the mode switch 1 e has significant operational implications in that it can be used to set the functional mode for the various functions represented by switches 1 a - 1 d .
  • each switch 1 a - 1 d can, in turn, change function eight times depending upon the setting of mode switch 1 e.
  • the changing of the mode switch setting 1 e thus significantly changes the operating characteristics of switches 1 a - 1 d and, in turn, the user input device 10 .
  • switch 1 e is easily operable by the thumb.
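  • The mode-dependent remapping can be pictured as a lookup table keyed by the position of switch 1 e ; the function names below are invented examples (compare the “input formatting” and “output formatting” modes discussed above).

```python
# Hedged sketch: each of the eight positions of mode switch 1e assigns a
# different function set to secondary switches 1a-1d. Function names are
# illustrative assumptions, not taken from the patent.

MODE_FUNCTIONS = {
    "1e.1": {"1a": "color", "1b": "shade", "1c": "hatch", "1d": "insert figure"},
    "1e.2": {"1a": "font color", "1b": "font style", "1c": "margins", "1d": "page size"},
    # positions 1e.3 through 1e.8 would carry further per-application assignments
}

def switch_function(mode: str, switch: str) -> str:
    """Return the function currently assigned to a secondary control switch."""
    return MODE_FUNCTIONS.get(mode, {}).get(switch, "unassigned")

print(switch_function("1e.1", "1c"))  # hatch
print(switch_function("1e.2", "1c"))  # margins
```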
  • On the front face of input device 10 is front switch 3 , which can be considered a “click switch.” This switch operates in an analogous fashion to selection switches typically available on “mouse-type” input devices. As illustrated, the front switch 3 can toggle upward or downward and click “on” so that data upon which the cursor rests is entered into the computer 30 . The capacity to perform these and other control actions allows the user to access a variety of control options in the computer 30 . In yet other embodiments, the front switch 3 can, from its center position, “click” directly inward one or more times to perform yet other “click” or multiple “click” functions.
  • a lens 8 or any other appropriately implemented signal transmissive means for transmitting predetermined selected infrared or other signal type 12 is provided.
  • an acoustic signal emission port 41 is illustrated.
  • Interface device 10 can implement an acoustic emission and detection option.
  • the acoustic port 41 in conjunction with acoustic signal generation means, can emit predetermined selected acoustic signals, when enabled.
  • port 41 requires a corresponding implementation of port 42 on base transceiver 20 of FIG. 1A.
  • Port 42 is an acoustic detector, and acoustic signals emitted from port 41 are detected therewith. Any known conventional acoustic transceiver hardware can be used.
  • personality module 6 a is shown loaded into input device 10 .
  • the personality module 6 a can be easily removed from the personality module cabinet 6 .
  • the module consists of, for example, a cartridge enclosed ROM having stored within it the various predetermined functions and data associated with the operating environment choice.
  • different primary “operating personalities” of the computer user input device 10 can be chosen by the computer user.
  • Alternative personality modules 6 b and 6 c can be selected by the computer user to implement different fundamental operating environments.
  • Other personality modules can also be used to replace installed personality module 6 a simply by removing module 6 a from cabinet 6 , and inserting either module 6 b , or module 6 c , or any other module.
  • Data port 4 a can also be used for a variety of data inputs and outputs. When used as a data input port (e.g., to rewrite an EEPROM for the purpose of updating security information), input port 4 a affords much additional flexibility to device 10 . Security updates through port 4 a , which change the access and authorization privileges of device 10 , can help achieve objectives of the invention. When used as a data export port, data port 4 a can be used to output data stored in device 10 which has been made accessible for export.
  • Device 10 does not require strapping to the user's hand, finger, wrist, etc., in order to operate properly. While strapping by means of strap 40 a and strap 40 b is useful, strapping does not affect basic functions of device 10 .
  • Strap 40 a and 40 b can be padded (not shown) to provide greater user comfort.
  • a padded lining contributes to the ergonomic design of device 10 —allowing the user to comfortably wear device 10 for prolonged periods.
  • Other strap designs or arrangements are contemplated.
  • Other attachment means can attach device 10 to the human hand, wrist, finger, etc. For example, a ring or ring-type attachment means is shown in FIG. 9.
  • Prior to attaching device 10 to the left or right forefinger, straps 40 a and 40 b are affixed into the side of device 10 to which the user will attach his or her forefinger.
  • Device 10 has symmetrical, arcuate-shaped surfaces 99 a and 99 b which accommodate either left forefinger ( 99 a ) or right forefinger ( 99 b ) attachment with equal facility, to suit user preference or immediate needs.
  • Strap 40 a and 40 b are attached to device 10 by attachment fittings.
  • “snap-in” fittings can be implemented on straps 40 a and 40 b , which can be snapped into complementary fitting receptacles or directly into concavities on device 10 .
  • FIG. 7B illustrates a second version of the first embodiment shown in FIG. 7A, which contains similar elements to those of FIG. 7A except for the inclusion of thumbswitch selector switch 2 z.
  • an alternative means for attaching the computer user input device 10 to the computer user's finger can be a releasably secured ring 50 .
  • Ring-type attachment means may be desirable for some computer user preferences.
  • Other strap attachment means can be provided, including leather. However, it appears that Velcro(R) straps with a padded lining best achieve the “illusion of weightlessness.”
  • FIG. 10 shows device 10 worn attached onto both of a user's forefingers.
  • a keyboard 300 can also be used jointly with device 10 for even greater flexibility in computer control.
  • FIG. 11A a second embodiment of a user/computer interface device 1100 is shown attached onto a user's finger.
  • the device is a simplified version of device 10 and includes only a thumbswitch 1102 and a thumbswitch selector switch 1104 .
  • FIG. 11B is a top perspective close-up view of the second embodiment shown in FIG. 11A.
  • the 1102 and 1104 switch arrangement operates in a similar fashion to thumbswitch 2 and front switch 3 of the first preferred embodiment, shown in FIG. 7A.
  • FIG. 12 illustrates control of a cursor on the display screen.
  • Display screen 19 inherently includes a preprogrammed Cartesian (or any other conventional or customized) positioning mechanism to receive cursor coordinates provided by computer 30 , in accordance with user control signals 12 , initiated and transmitted using device 10 .
  • origin 1217 a is provided as a primary point of reference.
  • FIG. 12 shows a series of different, sequential examples of cursor movement events.
  • the default position, 1217 a is shown in the center of the screen.
  • Screen 1204 a is always the beginning screen of a cursor reinitialization and movement sequence.
  • Default position 1217 a is always presented in any basic cursor reinitialization control sequence, unless an alternative default position is customized by the user.
  • Computer control events such as cursor reinitialization, movement, select functions, and “click and drag” functions, are accomplished by using one of the secondary thumbswitches 1 a - 1 d, in combination with master thumbswitch 2 and front switch 3 .
  • the specific functions assigned to the thumbswitches depend upon the setting of mode switch 1 e in combination with the personality module and with any application software used in computer 30 .
  • Successive screens show a progression of directional cursor moves using user control signals initiated by thumbswitch 2 .
  • For examples of “click and drag” action using both thumbswitch 2 and front switch 3 , see FIG. 20, which discusses usage of the present invention with “mapped keys” in a word processing application.
  • directional pressure on switch 2 moves the cursor up in the direction of direction 2 a ; cursor right, 2 b ; cursor down, 2 c ; and cursor left, 2 d .
  • Other directions are implemented with appropriate switching circuitry and/or driver or other application software.
  • FIG. 13 shows a typical security “operating personality” environment, wherein an “annunciator” and an “interrogator” are shown in a security-oriented dialogue.
  • Each user/computer interface device 10 is assigned a unique “device password.” In like manner, each computer user is assigned a unique password.
  • user passwords and device passwords form a composite password which is designated the “annunciator” password.
  • the composite password itself is a unique password specifically identifying both the computer user and the interface device to the security system.
  • Each of the one or more layers of security system logic is designated an “interrogator.” Both the user and device passwords must be authorized to gain computer access, i.e., to any enterprise resource, or to complete a transaction. If either password is not authorized to gain access or to complete the requested transaction, then the attempted transaction is determined “illegal” by the interrogator security logic. After any transaction is deemed “illegal,” alarm features are activated, including activation of the kill switch 114 of device 10 .
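  • The composite-password idea can be sketched as below; the combination rule shown (a hash over the user and device passwords) is an assumption, since the text only states that the two together form a unique “annunciator” password.

```python
# Sketch of the composite "annunciator" password: user password + device password
# yield one identifier that names both to the security system. The hashing rule
# and the sample credentials are illustrative assumptions.

import hashlib

def annunciator_id(user_password: str, device_password: str) -> str:
    """Derive a composite identifier from the user and device passwords."""
    return hashlib.sha256(f"{user_password}:{device_password}".encode()).hexdigest()

AUTHORIZED_IDS = {annunciator_id("alice-pass", "device-101-pass")}

def transaction_legal(user_password: str, device_password: str) -> bool:
    """Both the user and the device must be authorized, or the transaction is 'illegal'."""
    return annunciator_id(user_password, device_password) in AUTHORIZED_IDS

print(transaction_legal("alice-pass", "device-101-pass"))  # True
print(transaction_legal("alice-pass", "stolen-device"))    # False
```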
  • interrogator functions are started at step 1302 .
  • the interrogator begins a wait state loop at step 1304 , listening for any annunciator signal.
  • the interrogator is testing for the annunciator signal receipt from any annunciator desiring access. Until the answer is “yes,” the system loops, listening for input from any annunciator.
  • Annunciator signal generation capability is active in device 10 at step 1308 , given that device 10 contains a security personality module or other means to be responsive to manual input of a computer user's password.
  • the user inputs a password into device 10 , and the annunciator functions are enabled.
  • the correct user password enables the device 10 for transmission of an annunciator signal.
  • the annunciator signal is transmitted from device 10 to the cognizant interrogator.
  • An interrogator may be located in base/computer transceiver 20 which is interconnected into the individual computer (or other access point) being accessed. Alternatively, interrogators may be located at any access points to any enterprise resource.
  • the interrogator determines if annunciator input device 10 transmitted a composite ID along with the annunciator signal. If ID is received by step 1315 , it is examined, at step 1326 , for validity.
  • the interrogator generates a “Who Are You” inquiry, which is transmitted at step 1318 to the annunciating device.
  • the annunciator processes the “Who Are You” identification request at step 1322 . If ID was not sent, and a “Who Are You” signal is not received, then the annunciator signal is regenerated at step 1310 and transmitted (or retransmitted if it was missed) at step 1312 .
  • the annunciator (device 10 ) authentication/ID response to the interrogator “Who Are You” inquiry begins at step 1322 , after receipt of the “Who Are You” signal.
  • the authentication/ID is fetched from input device 10 's memory at step 1324 .
  • Once the ID has been fetched from memory, it is transmitted at step 1325 to the interrogator, which is waiting for this signal at step 1326 . If the signal is not received at step 1326 , at step 1328 the interrogator loops back N number of times to reinitiate the authentication/identification process. Once the N number of loops has been exceeded, however, an alarm routine is called at step 1330 .
  • the interrogator transmits the received signal to the security memory for comparison against the authentication/ID database (module 216 of FIG. 6).
  • the received authentication/ID is compared in the security database and an ID status message is then produced at step 1338 .
  • the authentication/ID received by the interrogator is determined to be either valid or invalid.
  • At step 1346 , the access enabling process is entered. Two events then occur. First, an annunciator enable signal is generated at step 1348 and the interrogator transmits the annunciator enable signal at step 1352 to the annunciator. Second, the transaction monitor routine is initiated at step 1347 to ensure that each attempted transaction is legal. On the annunciator side, the enable process is initiated at step 1354 with receipt of the interrogator-initiated annunciator enable signal. Before any enablement occurs, however, the system tests at step 1356 to determine whether more work is being done.
  • The system will loop back until all work is completed; once no more work is to be done (i.e., annunciator enablement is no longer required), the annunciator will end the session with a logoff transaction.
  • The device annunciator can automatically disable itself (unless another arrangement is made), and it can only be re-enabled by an authorized user repeating the entire security procedure.
  • If the ID is invalid, a record of the ID and the attempted transaction is made in the security memory 1336 and the alarm process is initiated at step 1344 .
  • interrogator functions can be configured by the system administrator or security administrator to be repetitive—i.e., the interrogator can be set to periodically request the annunciator/ID during the progress of any access session being made by the user/annunciator, in accordance with a policy of the computer institution or device being accessed.
  • Routine 1350 can disable annunciator operation at any point that the interface device annunciator steps beyond its access and authorization privileges.
  • If an alarm procedure is to be activated (steps 1370 and 1330 ), or an invalid ID is found at step 1340 , then the requested transaction and the IDs are recorded by the interrogator for security monitoring at step 1344 . The interrogator then determines whether to disable the device, starting at step 1380 . The ID and transaction information is again compared with the authorization and access privileges plan in the ROM 1336 . If the device is to be disabled, step 1390 , the kill signal is transmitted at step 1392 . Otherwise, a message as to the reasons that access is being denied is presented to the user, step 1396 , or the device is enabled, step 1346 .
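  • The annunciator/interrogator dialogue of FIG. 13 can be compressed into the following sketch; the retry count, data shapes, and identifiers are assumptions, and the flowchart's alarm recording and transaction monitoring are reduced to return values.

```python
# Compressed sketch of the FIG. 13 interrogator loop: receive an annunciator
# signal, ask "Who Are You" if no ID accompanied it, validate the ID against the
# security database, and either enable the annunciator or record the attempt and
# send a kill signal.

VALID_IDS = {"user-alice/device-101"}   # stands in for the module 216 database
MAX_RETRIES = 3                          # the flowchart's "N number of times"

def interrogate(annunciator_signal: dict) -> str:
    """Return 'enable', 'deny', or 'kill' for one annunciator access attempt."""
    composite_id = annunciator_signal.get("id")

    retries = 0
    while composite_id is None and retries < MAX_RETRIES:
        # Interrogator transmits "Who Are You"; annunciator fetches its ID from memory.
        composite_id = annunciator_signal.get("fetched_id")
        retries += 1

    if composite_id is None:
        return "kill"      # retries exceeded: alarm routine, device disabled
    if composite_id in VALID_IDS:
        return "enable"    # start transaction monitor, enable the annunciator
    return "deny"          # invalid ID: record the attempt, report the reason

print(interrogate({"id": "user-alice/device-101"}))        # enable
print(interrogate({"fetched_id": "user-eve/device-999"}))  # deny
print(interrogate({}))                                     # kill
```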
  • one or more individually authorized user/annunciator(s) are granted access to enterprise resources (PCs, networks, applications, etc.) by any “interrogator” at one or more levels—i.e., the total security system logic contained in input device(s), and/or base transceiver(s), and/or LAN network server(s) and/or centralized computer(s) and/or mainframe(s)—which interrogates any annunciator seeking access, to verify that both the user/annunciator and the device/annunciator are authorized access, and what level or levels of access are authorized.
  • the security module contained in base transceiver 20 which interacts with input device 10 is the interrogator.
  • access control begins by initiating enterprise-wide security system interrogator functions; this is typically done by the system administrator or security administrator. This occurs in similar fashion to initiating a local area network server and its client devices (e.g., as in Novell, 3COM, or other LANs). This can also occur similarly to initiating other types of centralized or distributed networks (e.g., SNA); teleprocessing systems (VTAM, TCAM, etc.); transaction processing applications (e.g., CICS); or other operating system, application, or system access methods.
  • the present invention adds the unique features of 1) an access-seeking user/annunciator which must access user/computer interface device 10 with a password; 2) after a satisfactory user access, device 10 then transmits its own composite annunciation signal; and 3) distributed or centralized security logic then authenticates that specific input device 10's annunciation signal and allows only the access authorized to that specific device 10 and that specific user.
  • In FIG. 14, groups of computers are shown arrayed within a computing institution or enterprise.
  • One or more properly authorized users (using one or more properly authorized devices 101 - 110 of the type of device 10 of FIG. 1A) in the local area network (LAN) shown in FIG. 14 can gain access to any computer on the LAN implementing the present invention, under the organizational auspices of an all-encompassing, enterprise-wide, security-oriented access and authorization privileges plan.
  • the access and authorization privileges aspects of the present invention are highly flexible, and can be implemented in a number of ways, to suit virtually all user needs.
  • a system administrator only needs to define users, define devices, define one or more levels of access, and define enterprise resources in order to develop an access and authorization privileges plan.
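As a rough illustration of how such a plan might be represented in software, the C sketch below encodes users, devices, resources, and access levels as a small lookup table. The table layout, the names, and the level scheme are invented for the sketch and are not prescribed by the patent.

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Invented access levels; an enterprise would define its own. */
enum access_level { LEVEL_NONE, LEVEL_LOCAL_LAN, LEVEL_ENTERPRISE };

/* One entry of a hypothetical access and authorization privileges plan:
 * which user, on which device, may reach which resource, at what level.  */
struct privilege {
    const char       *user;      /* defined users                            */
    const char       *device;    /* defined interface devices                */
    const char       *resource;  /* defined enterprise resources (LANs, ...) */
    enum access_level level;     /* defined level of access                  */
};

static const struct privilege plan[] = {
    { "alice", "device-0001",   "LAN100", LEVEL_LOCAL_LAN  },
    { "bob",   "device-0002",   "LAN200", LEVEL_LOCAL_LAN  },
    { "admin", "grandmaster-I", "any",    LEVEL_ENTERPRISE },
};

/* Grant access only when a matching plan entry exists. */
static bool plan_allows(const char *user, const char *device, const char *resource)
{
    for (size_t i = 0; i < sizeof plan / sizeof plan[0]; ++i) {
        if (strcmp(plan[i].user, user) != 0 || strcmp(plan[i].device, device) != 0)
            continue;
        if (plan[i].level == LEVEL_ENTERPRISE)          /* "grand master" reach */
            return true;
        if (strcmp(plan[i].resource, resource) == 0)
            return true;
    }
    return false;
}

int main(void)
{
    printf("%d\n", plan_allows("alice", "device-0001",   "LAN100")); /* 1 */
    printf("%d\n", plan_allows("alice", "device-0002",   "LAN100")); /* 0: wrong device     */
    printf("%d\n", plan_allows("admin", "grandmaster-I", "LAN600")); /* 1: enterprise level */
    return 0;
}
```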
  • An extended enterprise-wide computer control system is shown in FIG. 15.
  • Six LANs, each comprising twenty computers, are shown. All twenty computers "local" to each LAN are controlled by ten "local" user input devices associated with each respective LAN.
  • FIG. 15 thus illustrates an implementation of the present invention which has subdivided an enterprise-wide group of one hundred twenty computers into six LAN operating groups which each operate as separate LANs. Separately, shown to the left of the 120 computers, are four “grand master” input devices, labelled I, II, III, and IV. Each of these grand master input devices can access all 120 of the enterprise's computers. By virtue of their “grand master” status, each can operate on all six LANs shown.
  • the 120-computer, 6-LAN enterprise of FIG. 15 has defined its access and authorization privileges plan such that only the four grand master (I, II, III, and IV) input devices are authorized access to all 120 computers. While each LAN's group of ten input devices could be implemented to access one or more other LANs' computers, in this example, only "grand masters" access all six LAN groups' computers.
  • Access and authorization privileges plans can vary from simple network definitions, to advanced, meshed, layered network definitions. Advanced access and authorization plans can include definitions which control access to complex networks with inter-LAN gateways, access to other centralized or distributed computers such as mainframes, or any other enterprise resource definitions suitable to the enterprise's needs.
  • FIGS. 16A, 16B, and 16 C each show a table illustrating examples from different classes of user control signals. These signals are represented in message packets.
  • Device 10 using the personality modules, can transmit a plurality of distinct user control signals based upon different encoding sequences. Control signals are transmitted as bursts of coded pulses comprising one or more message packets. Each message packet includes a plurality of fields of encoded characters. The type of control signal determines the packet fields and organization. The format of the message packet is determined by the selected personality module.
  • FIG. 16A illustrates the message packet format for computer control using a high-level, security-oriented ROM personality module 118 (FIG. 3) or the security EEPROM 116 (FIG. 3).
  • Field 1 is a flag indicating the start of the message packet.
  • Fields 2 - 4 provide ID information designating the device and the user. This information is used to determine proper authorization privileges in the annunciator/interrogator dialog. (See FIG. 13)
  • the command/control information is contained in field 5 . This information relates to the setting of the switches on device 10 and provides the actual computer control signals. Finally, a stop flag indicates the end of the packet.
  • Security functions can be included with other processing information. If security is implemented, the ID information fields are included at the beginning of each message packet. The remaining function information is contained in the succeeding fields.
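The field order described above (start flag, ID fields, command/control field, stop flag) could be laid out as in the C sketch below. The one-byte field widths and the flag values are assumptions; the actual encoding is determined by the selected personality module.

```c
#include <stdint.h>
#include <stdio.h>

/* Assumed flag values and one-byte field widths; the patent gives the field
 * order but not the exact encoding, which the personality module selects.    */
#define START_FLAG 0x7E
#define STOP_FLAG  0x7F

struct security_packet {
    uint8_t start;        /* field 1: start-of-packet flag       */
    uint8_t device_id[2]; /* fields 2-3: device ID information   */
    uint8_t user_id;      /* field 4: user ID information        */
    uint8_t command;      /* field 5: switch settings / control  */
    uint8_t stop;         /* stop flag: end of packet            */
};

/* Build one packet from the current switch states. */
static struct security_packet make_packet(uint16_t device, uint8_t user, uint8_t switches)
{
    struct security_packet p = {
        .start     = START_FLAG,
        .device_id = { (uint8_t)(device >> 8), (uint8_t)(device & 0xFF) },
        .user_id   = user,
        .command   = switches,
        .stop      = STOP_FLAG,
    };
    return p;
}

int main(void)
{
    struct security_packet p = make_packet(0x0042, 7, 0x1A);
    const uint8_t *b = (const uint8_t *)&p;
    for (size_t i = 0; i < sizeof p; ++i)
        printf("%02X ", b[i]);
    printf("\n");
    return 0;
}
```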
  • FIG. 16B illustrates a message packet for a CAX personality module.
  • Fields 1 and 2 start the message and relate ID information similar to the security-oriented packet of FIG. 16A.
  • information defining the personality environment can be included in the beginning fields.
  • Field 3 contains design mode information, such as a manufacturing or electrical design mode. For example, the design mode can be determined by the setting of mode switch 1 e .
  • Fields 4 - 6 contain the basic command function data, such as “insert”, “delete”, and “create a point.” The command function data or user control signals would generally be designated by pressing switches 1 a - 1 d .
  • Some functions are defined in relation to the previous function performed. Therefore, fields 5 and 6 are used to provide the prior function state and change. Any additional information or control data required for a specific function is included in field 7 . An end message flag would also be included for this packet.
  • FIG. 16C illustrates other user control signals.
  • FIG. 16C. 1 shows functions used to initiate processing on a system. FIG. 16C. 2 illustrates cursor movement and an example format for cursor control, used after a specific computer or other resource has been successfully accessed.
  • the mode (i.e., the setting for switch 1 e) is referenced in field 5.
  • the indicated key meaning is included in field 7 (FIG. 16C. 4 ). The meaning is also affected by the mode referenced in field 5 .
  • the base/computer interface device 20 determines the type of packet and the relevant information contained in the packet. Appropriate control signals are generated and transmitted to the computer 30 to execute the functions.
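A minimal sketch of that dispatch step is shown below in C. The packet classes and tag values are hypothetical, standing in for whichever packet types the installed personality modules define.

```c
#include <stdio.h>

/* Hypothetical packet classes corresponding to the tables of FIGS. 16A-16C;
 * the tag values and payload layout are assumptions for illustration.        */
enum packet_kind { PKT_SECURITY, PKT_CAX, PKT_CURSOR };

struct packet {
    enum packet_kind kind;
    unsigned         payload;   /* stands in for the decoded field contents */
};

/* Base device 20: determine the packet type, then generate the corresponding
 * control signals for computer 30.                                            */
static void dispatch(const struct packet *p)
{
    switch (p->kind) {
    case PKT_SECURITY: printf("verify IDs, then forward command %u\n", p->payload); break;
    case PKT_CAX:      printf("CAX function %u for the current design mode\n", p->payload); break;
    case PKT_CURSOR:   printf("cursor control, direction code %u\n", p->payload); break;
    }
}

int main(void)
{
    struct packet pkts[] = { { PKT_SECURITY, 3 }, { PKT_CURSOR, 2 } };
    for (int i = 0; i < 2; ++i)
        dispatch(&pkts[i]);
    return 0;
}
```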
  • FIG. 16D illustrates a message packet for a control signal from the base/computer device 20 to the user/computer interface device 10 .
  • the bidirectionality of the signal transmission system allows for signals both to and from the base/computer interface device 20 .
  • the packet fields are used to transfer information for activating the kill switch due to improper access requests.
  • FIG. 17 shows a table, indicating an example of the general organization of a predetermined access and authorization privileges plan, developed for the purpose of safeguarding access to confidential data and restricting access to enterprise resources, as needed to satisfy a wide range of user requirements.
  • FIG. 17 shows an example of one possible enterprise-wide access and authorization privileges plan.
  • Enterprise resources 800 comprise resource categories R 1 through RN.
  • R 1 resources 810 comprise six separate LANs—LAN 100 through LAN 600 , as illustrated in FIG. 15.
  • R 2 resources 820 comprise mainframe resources accessible in enterprise-wide network 800 .
  • R 3 resources 830 comprise network resources, including communications channels used within enterprise resources 800 .
  • R 4 resources 840 comprise application resources which can include teleprocessing monitors, database applications, spreadsheet applications, network applications, etc.
  • R 5 resources 850 are not used, in this example, but can be made available at a future time.
  • R 6 resources 860 comprise the set of all user/computer interface devices 0001 through 0060 and grandmaster devices I, II, III and IV.
  • each authorized computer user using enterprise resources 800 is assigned a unique password identification.
  • a user password will allow the user to access any enterprise resource for which he/she is authorized access, to the extent that the user/device composite is allowed.
  • users are assigned passwords for the purpose of accessing at least one user/computer interface device; and, after accessing device 10 , for accessing one or more specific computers, applications, or any other enterprise resource, to which the user is authorized access.
  • computer users do not access the computer directly.
  • Users access user/computer interface devices, which in turn access computers using a combination user password and a device password.
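The C sketch below illustrates the idea of the composite credential: access is granted only when both the user password and the device password check out. The stored strings and plaintext comparison are purely illustrative; in the patent this information resides in the device EEPROM and the interrogator's privileges plan, and a practical system would not compare raw plaintext.

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Illustrative credentials only; a real system would not store or compare
 * raw plaintext passwords like this.                                        */
static const char authorized_user_password[]   = "user-secret";
static const char authorized_device_password[] = "device-0001-secret";

/* The "annunciator" credential is the composite of user and device passwords:
 * access is granted only when both halves are authorized.                     */
static bool annunciator_authorized(const char *user_pw, const char *device_pw)
{
    return strcmp(user_pw,   authorized_user_password)   == 0 &&
           strcmp(device_pw, authorized_device_password) == 0;
}

int main(void)
{
    printf("%d\n", annunciator_authorized("user-secret", "device-0001-secret")); /* 1 */
    printf("%d\n", annunciator_authorized("user-secret", "wrong-device"));       /* 0 */
    return 0;
}
```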
  • An access and authorization plan can be implemented in the operating system software; applications software; ROMs; and/or EEPROMs, depending on user needs and implementing means chosen.
  • the access and authorization information is stored in the electrically-erasable programmable read-only memory (“EEPROM”) 116 (FIG. 3) of device 10 .
  • the information can include unique identifying data for the device or user for use in security-oriented applications.
  • an EEPROM programmer device, in conjunction with specialized data input ports 4 a and/or 4 b, can be used to repeatedly change or update EEPROMs.
  • FIG. 18A shows a basic version of a third embodiment of device 10 .
  • FIG. 18B shows an advanced version. Both versions can be attached to the left or right forefinger.
  • the third embodiment is especially adapted for computer input and control needs of users with mobility impairments or other serious physical handicaps.
  • the versions shown in FIGS. 18A and 18B are specifically adapted for attachment to the user's right forefinger.
  • Devices of the third embodiment are explicitly designed to serve the market niche often referred to as the “assistive technologies.”
  • One arcuate-shaped surface 89 is available to accommodate the left or right forefinger. This is in contrast to the two arcuate-shaped surfaces 99 a and 99 b of the first embodiment (FIG. 7A).
  • Straps 40 a and 40 b operate to encircle the proximal phalange of the forefinger and affix it into arcuate-shaped surface 89 .
  • the third embodiment shows a different placement of personality module cabinet 6 , in FIG. 18B.
  • interchangeable ROM cartridges such as 6 a , 6 b , and 6 c of FIG. 7 are used therein.
  • FIG. 19A shows a “mapped keys” application for use in word processing in connection with the interface device of the present invention.
  • Device 10 can be used to move the cursor over one or more “expanding maps” of keyboard layouts. The user can then select “mapped keys” to spell out words without need of a keyboard.
  • the user is presented with successive expanding screens of letters and numbers in the format of the keyboard layout of the user's choice. An area of the displayed keyboard is selected according to the direction of cursor movement. Upon arriving at the chosen character, the user selects it. This seek-and-select procedure is repeated, to select a series of characters in succession until the desired word is spelled out.
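A toy version of this seek-and-select procedure is sketched below in C: the visible key range is repeatedly halved in the direction of cursor movement until a single key remains. The one-dimensional row, the 'l'/'r' move encoding, and the halving rule are simplifications made for the sketch; the patent describes zooming over a two-dimensional keyboard map.

```c
#include <stdio.h>
#include <string.h>

/* Seek-and-select sketch: zoom in on one row of a QWERTY layout by halving the
 * visible range, left ('l') or right ('r'), until a single key remains.        */
static char seek_and_select(const char *row, const char *moves)
{
    size_t lo = 0, hi = strlen(row);            /* visible range [lo, hi) */
    while (hi - lo > 1 && *moves) {
        size_t mid = (lo + hi) / 2;
        if (*moves++ == 'l') hi = mid;          /* zoom onto left half  */
        else                 lo = mid;          /* zoom onto right half */
    }
    return row[lo];
}

int main(void)
{
    /* Spell "mo" from two QWERTY rows with a few zoom steps each. */
    printf("%c",   seek_and_select("zxcvbnm",    "rrr"));    /* -> 'm' */
    printf("%c\n", seek_and_select("qwertyuiop", "rrrl"));   /* -> 'o' */
    return 0;
}
```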
  • the screens can display a standard QWERTY keyboard layout or any customized key layout can be used, to suit user preferences.
  • successive screens 1901 a , 1901 b, and 1901 c show expanding screens “zooming-in” on an area of a displayed keyboard layout, according to the direction of cursor movement.
  • screen 1901 d shows selection of the letter "M".
  • Screens 1902 a , 1902 b , and 1902 c show selection of the letter “O”.
  • a “Morse code wheel,” as illustrated in FIG. 19B can be operated in a manner similar to the word processing sequence.
  • the "wheel" screen comprises sectors, wherein each sector represents one or more dots or dashes.
  • To encode a series of characters, the user "seeks and selects" different dot or dash patterns, by moving the cursor from one sector to another to encode one or more alphanumeric characters. The user continues to select character encoding sequences to spell words. Spaces between words are selected, as needed. When the user reaches the end of the desired message, he/she selects the "xmit" sector, which automatically routes the encoded message train to a preselected destination. A sketch of this encoding step follows.
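The dot/dash-to-character step can be pictured with the small C sketch below. The partial Morse table, the representation of sector selections as strings of '.' and '-', and the xmit handling are illustrative assumptions only.

```c
#include <stdio.h>
#include <string.h>

/* A small Morse table; only a few letters are shown for illustration. */
struct morse_entry { const char *code; char letter; };
static const struct morse_entry table[] = {
    { ".-", 'A' }, { "-...", 'B' }, { "-.-.", 'C' }, { "--", 'M' },
    { "---", 'O' }, { "...", 'S' }, { "-", 'T' },
};

/* Decode one seek-and-select sequence of dot/dash sectors into a character.
 * Returns '?' if the pattern is not in the table.                            */
static char decode_sectors(const char *selected)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; ++i)
        if (strcmp(table[i].code, selected) == 0)
            return table[i].letter;
    return '?';
}

int main(void)
{
    /* The user selects sectors for "---" and "--", then the hypothetical "xmit" sector. */
    const char *selections[] = { "---", "--" };
    char message[8] = { 0 };
    for (size_t i = 0; i < 2; ++i)
        message[i] = decode_sectors(selections[i]);
    printf("encoded message routed on xmit: %s\n", message);   /* "OM" */
    return 0;
}
```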
  • the term "transceiver" has been used to illustrate that, according to the present invention, both a "transmitter" and a "receiver" are provided together in close proximity for two-way, or "duplex," operation (i.e., base transceiver 20 has both a transmitter and a receiver contained within the same electronic enclosure, as both transmission and reception of signals are provided therein).
  • the user input device 10 can also be designated as a “transceiver” given its duplex communications capability shown by “kill circuit” and other “receiver” functions.

Abstract

An ergonomic customizeable user/computer interface system for wireless computer control. A hand-attachable user interface device transmits control information upon activation of switches on the interface device. A base interface device receives the transmissions, decodes the information and provides control signals to the computer. The interface system allows for security authorization control and multiple computer or LAN operations with each user interface device. Greater functionality is provided by the use of personality modules in the user interface device for different modes of operation.

Description

  • 1. This is a continuation-in-part of application Ser. No. 07/440,771, filed Nov. 22, 1989, now pending.
  • BACKGROUND OF THE INVENTION
  • 2. 1. Field of the Invention
  • 3. The present invention relates to user/computer interface devices, specifically with respect to graphical interfaces. More specifically, the present invention relates to wireless transmissions from an ergonomic remote control device to a base device for control of computer functions and applications.
  • 4. 2. Description of the Art
  • 5. In recent years, the process of entering certain types of data and control inputs into computer based systems has been significantly simplified. Traditionally, data entry to a computer has been done with a standard computer keyboard. However, for many users, the keyboard proved to be insufficiently mobile and accessible, inconvenient, and time consuming.
  • 6. The user/computer interface has been simplified by "graphic user interfaces" (GUIs) and "pointing devices." The user can select an icon from a GUI display to activate the predetermined function or event associated with the icon.
  • 7. Since GUIs first emerged, alternatives to the keyboard have proven highly desirable for optimum productivity in many applications. Accordingly, auxiliary or keyboard alternative hardware such as light pens, joysticks, trackballs, touch pads, digitizing pads, and the “computer mouse” developed. These new GUI-oriented pointing devices quickly proved to be viable, timesaving alternatives to the keyboard for many types of computer input and control situations. In particular, the mouse has become the single most widely-accepted keyboard alternative input device.
  • 8. The fundamental operating principle of the mouse relates to the rotation of a spherical trackball carried within the mouse. When the mouse is moved over a flat surface, the trackball, which is partially exposed, freely rotates within the device and generates signals which correspond to pairs of x-axis and y-axis coordinates. The mouse contains means to translate these coordinates into signals to which the attached computer is responsive. Accordingly, when the computer user moves the mouse device across a working surface adjacent to the computer, the cursor indicator on the display screen moves to the location pointed to by the computer user. Also, the computer user's operation of one or more buttons aboard the mouse effects other control functions of the computer and computer display, such as the selection of computer usage event options.
  • 9. Notwithstanding the contributions of mouse products and other alternative input devices, many computer input and control needs remain unmet by the prior art. The mouse requires a prominent, smooth, flat, horizontal space on the user's desk. In practice, a typical user's desk is crowded and inhibits the space required for mouse operation. Most mouse devices are especially difficult to use when away from traditional office facilities, in mobile or restricted locations.
  • 10. Users who operate their computers while travelling, or who operate computers in non-office situations find few computer input products that specifically address the needs of laptop and notebook computing. Some mouse type devices have been developed for mobile users. However, the computer user must make special adjustments to clamp-on fittings to attach these products to the computer keyboard. Some of these products must first be physically clamped onto the computer for each work session, then must be physically unclamped, when the work session is over. Also, the computer user must move his/her hand back and forth from keyboard to the clamp-on product to operate it. Another main complaint made by many users and industry analysts is that users' thumbs quickly tire, operating the small trackballs provided on these products.
  • 11. One drawback of the mouse results from hardwired attachment to the computer. The connecting cord from the mouse to the computer is subject to the same “umbilical” problems associated with cords on any appliance which needs to move about, to operate according to design. Some wireless computer input devices exist, but their need for dedicated horizontal surfaces precludes many potential benefits of wirelessness.
  • 12. Users with physical impairments often find mouse products difficult to operate. Depending on the physical impairment, both mouse and keyboard computing can be difficult, painful, or impossible for impaired users. For users with arthritis, carpal tunnel syndrome, or tendonitis, mouse usage can be awkward and painful. There is a recognized need for GUI devices which offer prophylaxis for users with physical impairments and repetitive stress injuries. While successful products serve a variety of needs for these users, high costs and highly specific utility of many such products hinder their widespread acceptance.
  • 13. Technicians and professionals often have advanced or high-functionality needs. Many of these specialized needs are unmet by traditional desktop mouse-type products, or by products such as the aforementioned mobile computer input products. High costs and highly-specific utility of many such high functionality products also hinder their widespread use.
  • 14. Another mouse drawback is its simplex, unidirectional design and operation. No mouse currently implements two-way interaction between controlled computers and input devices. Lack of bidirectionality is better appreciated if one considers the many new applications and benefits of bidirectionality, such as roaming LAN interaction; security and alarms; mobile signaling and paging; and remote interactive applications.
  • 15. Computer users have local area network (LAN) and security needs which remain unmet by current input devices. LAN users have connectivity needs which extend beyond their own computer. LANs were created to facilitate resource-sharing of limited resources among multiple users. LAN users often access and connect into one or more LANs, or other accessible computers or network environments. It has been estimated that more than half of all computers in business are attached to a LAN.
  • 16. In addition, as the computer population grows, security grows more important. Computers increasingly store confidential data, and no mouse products are designed or equipped for individually-assignable security to add to a computing installation's security "shield".
  • 17. No shortage of LAN products or security products exists. However, no security-oriented, individually-assignable computer input and control products are available which allow LAN users to conveniently transport and securely operate personal GUI-oriented pointing devices in multiple LAN locations. "Security-oriented users" need to limit access to critical resources, including hardware, software, data and information, networks, etc. As LANs become more widespread, security becomes much more important, to ensure privacy.
  • 18. The underside of the mouse trackball is susceptible to the introduction of dirt, liquids, or other substances into the body cavity. This vulnerability can lead to equipment failure and shorter product life.
  • 19. Another drawback of the mouse is that the user may find the “mouse method” of frequently moving his or her hand back and forth from the keyboard to the mouse to be distracting to their train of thought, time consuming, or inconvenient to optimal operational efficiency.
  • 20. Several inventors have attempted to address some of these aforementioned drawbacks and problems.
  • 21. For example, U.S. Pat. No. 4,550,250 to Mueller discloses an infrared graphic input device for a computer. A remote infrared light source transmits user input commands to a detector device adjacent to the computer. The device must operate within a dedicated horizontal, two-dimensional, smooth, flat surface. The detector apparatus operates according to continuous tracking input principles and does not allow for any straying out of equipment detection boundaries.
  • 22. U.S. Pat. No. 4,578,674 discloses a method and an apparatus for controlling the cursor position on a computer display screen. This device uses both infrared and ultrasonic principles for determining the direction and the velocity of motion of a positioning device which is monitored by a control base detector. The device requires a two-dimensional plane. To operate from a three-dimensional defined location, the user must ensure the emitter/detector front face of the positioning device is always directly facing the control base.
  • 23. U.S. Pat. No. 4,628,541 to Beavers discloses an infrared battery powered keyboard input device for a microcomputer. This device offers the user additional freedom for operating a standard style keyboard without hardwiring constraints. However, the keyboard is not portable to another computer, unless the computer to which the keyboard is ported is a "mirror" microcomputer device. The infrared battery operated keyboard also likely requires the implementation of a separate mouse if "mouse-type" input commands or functionality/features are needed by the user or are required for optimal productivity.
  • SUMMARY OF THE INVENTION
  • 24. In view of the foregoing, it is apparent that there still exists a need in the art for a method, apparatus, system, and architecture which provides a more efficient and effective means to easily and conveniently accomplish user/computer interface. The present invention addresses and solves many or all of the aforementioned drawbacks, for many usage-specific applications and environmental contexts.
  • 25. Accordingly, the invention herein disclosed offers many distinct and unique capabilities, to serve a wide range of user needs. The present invention can simplify access to computers, and can accelerate user and computer interaction—especially for GUI user/computer applications, and for mobile operating environments. The present invention eliminates or reduces many drawbacks of many existing input devices and mouse-type products.
  • 26. The invention relates to a method to improve computer accessibility, by simplifying user and computer interaction. The apparatus of the present invention can provide very easy access to, and control of, graphic user interface-oriented computing environments, particularly for persons with mobility impairments and for persons with special mobility requirements. A mobile, lightweight, ergonomically-shaped, customizeable, user/computer interface apparatus is attached onto the human forefinger, providing means for thumbtip interaction with a computer, via predetermined user control signals. To couple user control signals from the user-attachable apparatus to a controllable computer, a hardwired or wireless signal transmission system receives user control signals and relays them to a base/computer interface apparatus, which detects, decodes, and converts user control signals into formats suitable for input to and processing by an interconnected controllable computer. In one preferred embodiment, computer-generated or other external control signals can disable operation of a base/computer interface apparatus and a user/computer interface apparatus, when necessary for security.
  • 27. The system and architecture of the invention provides networking of multiple user/computer and base/computer interface devices and other interface device combinations, as means for controlling multiple controllable computers, over at least one computer network.
  • 28. Using the present invention, the computer user can control any controllable computer event remotely, without the need for a dedicated, cleared, smooth, flat, horizontal, desktop surface or typical office facilities. Not requiring restrictive, immediate proximity to the controlled computer is a cardinal benefit of this invention and several preferred embodiments.
  • 29. An object of the present invention is to provide a cordless, user/computer interface device operable from any three dimensional location sufficiently proximate to the base transceiver for signals to reach it.
  • 30. Another object of the invention is to provide a computer input device which is operable from any location reasonably close to the computer being controlled, and which does not require a prominent, dedicated, cleared, smooth, flat, horizontal surface or other special surface upon which to run.
  • 31. Another object of this invention is to provide an ergonomically shaped and ergonomically operable device to serve needs of users with physical impairments or handicaps. It is therefore an object of the present invention to provide a wireless GUI-oriented user/computer interface device which is easily attachable to the user's index finger, which can be comfortably “worn” for extended periods of time, and which can be very easily operated by thumbtip and/or forefinger pressure. A further related object is to provide a GUI-oriented user/computer interface device, operable without the need to move the user's hands away from the computer keyboard.
  • 32. It is an object of the invention to provide a device not susceptible to dirt, liquids, or other foreign substances which can be introduced through its underbody, by eliminating the trackball and aperture, with a "contrarian" product design.
  • 33. Another object is to provide a security option for GUI applications. Given bidirectional functionality of this invention, secured two-way authentication sequences can be used to control LANs, enterprise-wide networks, other network resources, other computing resources, and other controllable machinery.
  • 34. A related object is to provide a secure, mobile, highly flexible GUI equipment design which allows the user to carry his or her own user/computer interface device from one location to another or from a desktop computer to a notebook or laptop computer, with equal facility.
  • 35. Another object is to provide a highly flexible, customizeable GUI equipment design, which can provide multiple basic “personality operating environment” options, using multiple, different “personality modules” (i.e., different ROMs) which can be swapped in and out of device 10, depending on user selection of the needed “personality module”.
  • 36. Another object of the invention is to provide an easy-to-use method for operating GUI software.
  • 37. Another object is to provide a user/computer interface system with very sensitive signal radiating and sensing means, allowing signal transmission and reception without rigorous aiming of the input device.
  • 38. Another object is to provide a user/computer interface architecture which can be configured to provide for an interoperable computing environment, wherein a group or groups of computers can be controlled by one or more authorized users and authorized user/computer interface devices, depending on user and interface device privileges. A related object of this invention is to provide a control unit for an enterprise-wide computer security system.
  • 39. Another primary object of the present invention is to provide computer input and control with a device which is externally switchless, in one preferred embodiment.
  • 40. It is another object of the present invention to provide a method for flexible computer control using wireless signal transmissions.
  • 41. Briefly described, these and other objects of the invention are accomplished with its method, apparatus, system, and architecture aspects by providing a wireless user/computer interface device adapted to communicate with a base/computer interface device, which is interconnected into a controllable computer equipped with driver software of the present invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • 42.FIG. 1A illustrates a first embodiment of the present invention.
  • 43.FIG. 1B shows the user/computer interface device of FIG. 1A attached to the user's forefinger.
  • 44.FIGS. 1C and 1D show the interface device attached to a support stand.
  • 45.FIG. 2 is a block diagram illustrating a hardware implementation of the present invention.
  • 46.FIG. 3 is a block diagram of basic functional modules of a user/computer interface device according to the present invention.
  • 47.FIG. 4 is a schematic block diagram of the user/computer interface device shown in FIG. 3.
  • 48.FIG. 5 is a block diagram of a computer-interconnected base/computer interface device according to the present invention.
  • 49.FIG. 6 is a schematic block diagram of base/computer interface device shown in FIG. 5.
  • 50.FIGS. 7A and 7B are top perspective views of first and second versions of a first embodiment of a user/computer interface device of the present invention.
  • 51.FIGS. 8 and 9 are bottom perspective views of the device of FIG. 7A.
  • 52.FIG. 10 illustrates devices attached to a user's left and right hand index finger.
  • 53.FIG. 11A is a top perspective view of a second preferred embodiment of the user/computer interface device.
  • 54.FIG. 11B is a top perspective close-up view of the device shown in FIG. 11A.
  • 55.FIG. 12 shows examples of display screens of operational sequences of the device.
  • 56.FIG. 13 is a flowchart showing one embodiment of the security logic of a security version of the device.
  • 57.FIG. 14 shows a block diagram of an enterprise-wide security-oriented computer system.
  • 58.FIG. 15 shows a block diagram of an extended, meshed, enterprise-wide security-oriented network.
  • 59.FIGS. 16A, 16B, 16C, and 16D show general byte maps of user control signals in the form of message packets.
  • 60.FIG. 17 shows an example of access and authorization privileges in a large, enterprise-wide implementation.
  • 61.FIGS. 18A and 18B show top perspective close-up views of a third embodiment of the user/computer interface device.
  • 62.FIGS. 19A and 19B show examples of the interface device operation.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • 63.FIG. 1A, wherein like reference numerals refer to like parts, shows a first embodiment of the present invention. A wireless battery-powered user/computer interface device 10 transmits infrared user control signals 12 through signal transmission system 14, to a base/computer interface device 20, which is interconnected into a computer 30 via a cable 28.
  • 64. Other electromagnetic and/or acoustic signal transmission systems can be used in the present invention. Further, a hardwired signal transmission system can also be desirable where remote wirelessness is not needed or wanted by the computer user. Hardwired versions are less costly to manufacture than wireless versions.
  • 65. Signal generating circuitry (hardware and firmware) is implemented using a variety of well-known transceiver components, depending on the desired signal transmission system (e.g., infrared, radio, acoustic, hardwired, etc.).
  • 66. User control signals 12 are transmitted from a user/computer interface device 10 in response to a computer user's manual operation of switches mounted thereupon (see FIG. 7A for switch detail). After the user presses one or more switches on device 10 to initiate user control signals, infrared signals 12 are generated by an infrared generator comprising one or more light emitting diodes (not shown) and exit the interface device 10 through infrared lens 8. Signals 12 propagate through free space and onto base/computer interface device 20. After detection by device 20, signals 12 are demodulated, decrypted, converted to computer-intelligible control signals and/or other control signals, and relayed into computer 30.
  • 67. Computer 30 is connected to display terminal 18. Computer 30 can also be optionally connected and/or networked with other interfaceable peripheral devices, one or more local area networks, or other centralized or distributed computers.
  • 68. In FIG. 1A, display screen 19 of display terminal 18 responds to user control signals 12 and other output/display signals from computer 30, such that desired effects of signals 12 can be executed and displayed on the display screen 19.
  • 69. Base/computer interface device 20 includes an optional sonic receiver element 42 adapted to receive and transmit sonic signals from device 10. Device 20 further includes lens 21 behind which stands one or more signal detectors, such as phototransistors, adapted to detect infrared user control signals 12 relayed from device 10, through wireless infrared signal transmission system 14.
  • 70. The base/computer interface device 20 may also include an access key panel 26. Key panel 26 is used to enter access and authorization codes so as to limit access to device 10. Key panel 26 is a locking device, to control access to device 20, to ports 22 and 24, and to interface devices 10. Panel 26 can include conventionally known locking hardware for restricting physical access to device 20 or device 10.
  • 71. Ports 22 and 24 on device 20 are used for recharging of batteries in the interface device 10. Device 10 includes female couplings 11 a (as shown in FIG. 8), for manual coupling onto male couplings 11 b to receive electrical charges for the rechargeable battery contained in device 10. Male couplings 11 b are located within ports 22 and 24 in device 20. Alternatively, either a stand-alone AC/DC transformer (not shown) or the computer's serial port can also be used for recharging the interface device 10.
  • 72. Display screen 19 inherently includes preprogrammed Cartesian (or other conventional) positioning areas to receive cursor coordinates provided by device 10. The ordinate axis 17 b and the abscissa axis 17 c are shown. In addition, origin 17 a is provided, through which a z-axis (not shown) is provided to assist in generation of three-dimensional displays.
  • 73. Additionally, computer 30 has a floppy or other storage disk drive 32 which is adapted to receive a floppy or other suitable diskette package 35 containing digitally encoded instructions 33 for interfacing the software drivers of the present invention with the operating system software of the computer 30. A keyboard 300 is typically attached to the computer as a separate input device. However, keyboard use can be minimized or eliminated, for many important tasks, by use of the interface device 10.
  • 74.FIG. 1B shows a first embodiment of device 10 attached to a computer user's forefinger.
  • 75. Alternatively, as illustrated in FIG. 1C, device 10 can be attached to a support stand 70. The support stand 70 is shown in FIG. 1D. Device 10 is attached to an articulating support 72 of the support stand 70. Hooks 74 a-74 d as shown in FIG. 1D affix device 10 firmly onto support 72. Device 10 can be moved freely about while attached onto support 72. Support 72 plugs a concavity 76 a onto ball joint 76 b of support stand 70. Device 10 could also be directly accommodated onto ball joint 76 b without the articulating support 72 if an appropriate concavity 76 a is included. Telescopic support stand members 70 a can be used to allow extension and contraction of stand 70 to different heights, to suit user preferences. Support 72 can also be used to carry other electronic components associated with device 10 operation, including additional means for creating user control signals, such as positionally activated switches.
  • 76. Support stand 70 can be used in at least two basic ways: 1) with articulating support 72, or 2) without articulating support 72. When device 10 is used attached to stand 72, the feature of travelling in more than one dimension at the same time is provided by the present invention. In combination with a three-dimensional (virtual) display, the computer user can “travel” in two or three dimensions virtually simultaneously. This effect is achieved by liquid conductive switches or other position-activated switches (not shown) pointing in one virtual dimension (e.g., “x”) in combination with either one or two dimensions (e.g., “y” and “z”) pointed to by device 10, using manually operable switches. In summary, virtual “three dimensional” travel can be achieved, which can be helpful for many applications including CAX (i.e., CAD/CAM/CAE); gaming; robotics; education; virtual microscopy; multimedia; and other so-called “virtual reality” applications.
  • 77. Basic Operation
  • 78. Referring now to FIG. 2, an overview of the basic computer interface system is shown. Basic control elements include wireless user/computer interface device 10, signal transmission system 14, base/computer interface device 20 and controlled computer 30. Signal transmission system 14 shown includes bidirectionally operable communication channel 14 a between device 10 and device 20; channel 14 b within device 20, and channel 14 c between device 20 and computer 30.
  • 79. It is well known in the art, that many different signal transmission options are available for communicating between user devices and computers. Thus, a variety of wireless and hardwired signal transmission systems are possible with the present invention. However, infrared signal transmission is the preferred mode and is used in discussing signal transmission.
  • 80.FIG. 3 is a block diagram of the basic functional modules of user/computer interface device 10, showing the main functional modules and the functional module interconnections. Typical hardware and electronic components are arranged to perform signal-generating, signal processing, signal-terminating, and signal-transmitting functions for user/computer interface device 10.
  • 81. Module 100 represents the totality of any possible number of switch arrangements implemented on any given embodiment of user/computer interface device 10. Depending on the particular embodiment, these could correspond to manually operable switches shown as switches 2, 3, 1 a, 1 b, 1 c, 1 d, and 1 e of FIG. 7A or to any other feasible alternative arrangement of switching components. Alternatively, a version of module 100 can be provided which reports switch states of internal switches, based upon the position of positionally activated switches.
  • 82.FIG. 4 illustrates switch hardware for the four secondary thumbswitches 1 a, 1 b, 1 c, and 1 d; for master control thumbswitch 2; for front switches 3 a-3 c; and for mode switch 1 e. (See FIG. 7A) Switch states of switches 1 a, 1 b, 1 c, 1 d, 1 e, 2, and 3 are sensed by switch position sensing module 100 and sensed switch states are communicated to microcontroller 110.
  • 83. Returning to FIG. 3, module 110 represents a microprocessor or microcontroller for processing. The microcontroller 110 detects the position and/or state of the control switches comprised in module 100. Module 110 also serves to encode this information in accordance with predetermined parameters and modulation plans, using the information stored in personality ROM module 118 and ROM (or EEPROM) module 116. The result of this process is a composite signal which is then outputted from module 110 in an organized intermediate signal format and fed into module 114. In general, module 110 is implemented with the use of a microprocessor or microcontroller integrated circuit chip, whose specific functional and operational characteristics depend on the specific type of embodiment being implemented. Integrated circuits of this type are well-known in the art.
  • 84. The arrows between functional modules of user/computer interface device 10, shown in FIG. 3, represent individual or grouped conductive paths to relay signal intelligence and control signals between functional modules.
  • 85. Personality module 118 is a ROM memory storage device, and module 116 is a ROM (or EEPROM) device. These devices store, in protected form, information which is used by module 110 to determine the encoding scheme and other information processing parameters. In general, personality module 118 contains application-specific, environment-oriented information and codes. The information and codes are used to determine the encoding and modulation scheme to be followed by module 110, in accordance with the specific application selected by the user. Module 118 is implemented as an interchangeable ROM cartridge, that can be easily inserted and removed by the user (See FIG. 7A, modules 6 a, 6 b, 6 c). Different ROM cartridges contain encoding and modulation plans and other information corresponding to different software applications. The user needs only to insert the ROM cartridge into device 10 which corresponds to the software application (i.e., “personality operating environment”) to be used.
  • 86. Also, ROM (or EEPROM) module 116 is implemented in device 10 for security-oriented applications. Module 116 contains encryption security information, comprising one or more access and authorization security tables, used by device 10, for limiting access to any enterprise resource by: 1) one or more user(s); 2) one or more user/computer interface device(s); 3) one or more application(s); 4) one or more file(s); 5) one or more system(s); or 6) one or more network or signal transmission system communication channel(s), within the auspices of an overall, enterprise-wide access and authorization privileges plan. Access by users or devices to any given enterprise resource is either granted or denied, based upon the security clearance of the user or device or based on any other command and control information defined and customized into the access and authorization privileges plan of the specific enterprise, as administered by an authorized systems administrator. Module 116 is not designed to be installed, deinstalled, serviced, or updated by the user, but is controlled by the system or security administrator.
  • 87. The actual implementation or presence of modules 116 and 118 on any given embodiment is optional and depends on the particular application and environment for which device 10 is adapted. Furthermore, the specific integrated circuit chips used to implement functions of modules 116 and 118 are also dependent on the type of application, personality operating environment, or security plan being implemented.
  • 88. The output of module 110 is a composite signal which is relayed into modulator/transceiver module 112. The signal path from module 110 to module 112 passes through module 114, which acts as a kill switch. When switch 114 is open, the signal from module 110 cannot be input to module 112, and output signaling is disabled. The state of the signal path in kill switch 114 is controlled by an external signal for disabling device 10. An external kill signal is generated and transmitted by base/computer interface device 20 or other authorized signal source. A kill signal, when transmitted, is processed by transceiver module 112 of device 10, then relayed into kill switch 114. As illustrated in FIG. 4, AND gate 114 a requires two logical “1” inputs, in order to continue passing the signal from 110 to module 112. A kill signal sends a logical “0” input into AND gate 114 a, opening the kill switch, and thereby removing the path into module 112.
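The gating behavior of AND gate 114 a can be pictured with the trivial C sketch below; treating the signal path as a single boolean value is, of course, a simplification of the hardware.

```c
#include <stdbool.h>
#include <stdio.h>

/* Kill-switch gating: the composite signal from module 110 reaches the
 * modulator/transceiver (module 112) only while the enable input is a
 * logical 1.  A received kill signal drives the enable input to 0.      */
static bool gate_114a(bool signal_from_110, bool enable)
{
    return signal_from_110 && enable;   /* two-input AND gate */
}

int main(void)
{
    bool enable = true;                          /* normal operation          */
    printf("%d\n", gate_114a(true, enable));     /* 1: signal passes to 112   */

    enable = false;                              /* kill signal received      */
    printf("%d\n", gate_114a(true, enable));     /* 0: output path is removed */
    return 0;
}
```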
  • 89. Module 112 represents a modulator/transceiver device. This device is implemented as an electromagnetic wave modulator/transmitter (for infrared or radio waves), as an electronic transducer (for sonic or ultrasonic waves), or as a signal buffer/driver (for hard wired transmission). In the case of wireless embodiments using electromagnetic wave transmission methods, module 112 uses the information conveyed to it from module 110 to modulate an electromagnetic carrier wave, either infrared or radio. The modulated signal is then radiated by means of a radiating device suitable to the frequency of the signal being radiated, or transmitted. In the case of wireless embodiments using sonic or ultrasonic waves, module 112 comprises an electronic amplifier and a sonic or ultrasonic transducer, depending on the specific means of transmission.
  • 90. Module 112 produces appropriate signals 12 which propagate through free space and/or air and are received by base/computer interface device 20. The power of the radiated signal or intensity of the acoustic waves is chosen such that the signal can be picked up by device 20, while the distance between device 10 and device 20 is within the intended operating range.
  • 91. Device 10 also includes data input/ output ports 4 a and 4 b adapted to import data from, or export data to, devices interfaceable with device 10.
  • 92. Power for user/computer device 10 is provided through a battery 402, adaptable to be recharged via charge and control circuit 404 contained within base/computer device 20 (FIG. 6). Charging voltage passes through male plug 11 b (FIG. 1A) which interconnects with female plug 11 a of device 10, whenever device 10 is plugged into port 22.
  • 93. Referring now to FIG. 5, a block diagram of the hardware arrangement of the base/computer interface device 20 is provided. Input signals 12 transmitted to device 20 are outputs from device 10. More specifically, signals 12 can comprise infrared, other electromagnetic radiation, optical and/or acoustic signals, depending on specific implementation requirements.
  • 94. In FIG. 5, infrared signals 12 are received by, or transmitted out of, infrared transceiver module 210. Alternatively, or additionally, sonic and/or optical transceiver devices (not shown) can be implemented to carry sonic and/or optical signals into and out of transceiver 210. Transceiver 210 is connected to a microcontroller or microprocessor 212 to decode, or encode, signals 12. Demodulation occurs in a reverse manner to modulation, as is common practice. Information for encoding/decoding by microcontroller 212 is provided by memory element 214. Memory element 214 can further comprise one or more ROM options as shown in FIG. 6. ROM 216 includes a predetermined security table and related options. ROM 218 includes a programmable security coding module option. Furthermore, a ROM or other suitable storage device 220 comprises means for modulation/demodulation of signals 12 in accordance with a preset modulation scheme implemented in user/computer interface device 10. Information in storage device 220 is used to interpret the signals 12 from device 10 according to any implemented personality environment. As a result, after incoming signals 12 are received by transceiver 210, they are supplied to the microcontroller 212 for subsequent decoding based upon information provided from memory 214 and memory 220. The decoded signals are then outputted from microcontroller 212 to a computer interface device comprised in module 225, such as a Universal Asynchronous Receiver/Transmitter (UART) or similarly functional device.
  • 95. Module 225 consists of any conventionally known computer interface device adapted to receive all original switch states generated in user/computer interface device 10, which are interpreted and stored in microcontroller 212, and to transfer those states to a computer 30. Another example of module 225 is a shift register.
  • 96. Depending on the personality module implemented (such as 6 a, 6 b, and 6 c of FIG. 7A), a plurality of different operating environments are possible. Contained within each such personality operating environment (such as security access, CAD/CAM, etc.) are a plurality of modes selected by mode switch 1 e each of which, in turn, controls the functions designated by selectable switches 1 a-1 d.
  • 97. For example, in a CAD/CAM personality operating environment, one selected mode may be "input formatting", when a user wishes to designate various input formats for the controlled computer. As a result, the secondary control switches 1 a-1 d designate different "input formatting" switch functions, such as coloring, shading, hatching, or providing standardized geometric figures. A different setting of mode switch 1 e could involve an "output formatting" mode, with the color, style, and other "output formatting" functions being designated by the secondary control switches 1 a-1 d.
  • 98. In general, different mode switch 1 e position settings and the correspondingly different functions of switches 1 a-1 d (of FIG. 7A) are available for each individual personality operating environment module. This adds great operating flexibility to the present invention.
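One way to picture this mode-dependent assignment is a simple lookup table, as in the C sketch below. The function names and the two example mode rows are invented for illustration and are not taken from the patent.

```c
#include <stdio.h>

/* Hypothetical function-name table for one personality module: mode switch 1e
 * selects the row, secondary switches 1a-1d select the column.  Only two of the
 * eight possible mode rows are filled in, purely for illustration.              */
static const char *functions[8][4] = {
    /* mode 1e.1, "input formatting"  */ { "color", "shade", "hatch", "geometric figure" },
    /* mode 1e.2, "output formatting" */ { "output color", "output style", "font", "scale" },
};

static const char *lookup(int mode, int secondary)   /* both zero-based */
{
    const char *f = functions[mode][secondary];
    return f ? f : "(unassigned in this sketch)";
}

int main(void)
{
    printf("mode 1, switch 1a: %s\n", lookup(0, 0));
    printf("mode 2, switch 1c: %s\n", lookup(1, 2));
    printf("mode 3, switch 1b: %s\n", lookup(2, 1));
    return 0;
}
```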
  • 99. As previously noted, the cable 28 connects the output of base transceiver 20 to the input of an appropriate input port located on the reverse face of computer 30. Upon receipt of the signal outputs from base transceiver 20, the computer 30 (via driver software) then interprets the signal 242. In response, the computer 30 invokes control over the display device 18 (FIG. 1A). The computer can also invoke control over any implemented controllable peripheral device via a direct, indirect, or virtual network connection.
  • 100.FIG. 7A illustrates a detailed perspective view of a first version of a first embodiment of the user/computer interface device 10. As noted earlier, device 10 includes master control thumbswitch 2, which controls positioning of the computer display cursor. Thumbswitch 2 can be very easily operated by the computer user's thumbtip, to control motion of the computer cursor in any direction (e.g., pressing locations 2 a, 2 b, 2 c, and 2 d respectively, runs the cursor up, right, down, and left) (see also FIG. 12). Coordinates 17 a, 17 b, 17 c of FIG. 1A or other coordinates, (e.g., polar coordinates) can be manipulated. Three dimensional coordinates along a z-axis (not shown) are also available through the switch 2 when the input device 10 is configured with an appropriate three-dimensional personality module and/or mode switch 1 e selects three-dimensional operation.
  • 101. Device 10 also contains four adjacent, thumb-operable, secondary control switch elements, 1 a, 1 b, 1 c, and 1 d. Each of the secondary switches respectively provides a different functional choice to the computer. In other words, the functions of switches 1 a, 1 b, 1 c, and 1 d can be analogous to the function keys on a computer keyboard. Other switches can be mounted upon the control surface 1. The other switches can vary in number, depending upon the version of device 10, the type of computer being interfaced, the applications being used, or the computer environment being served.
  • 102. Mode switch 1 e is a sliding switch, which slides from position 1 e.1 to position 1 e.8, as implemented in the user input device 10. As previously discussed, the mode switch 1 e has significant operational implications in that it can be used to set the functional mode for the various functions represented by switches 1 a-1 d. Thus, each switch 1 a-1 d can, in turn, change function eight times depending upon the setting of mode switch 1 e. The changing of the mode switch setting 1 e thus significantly changes the operating characteristics of switches 1 a-1 d and, in turn, the user input device 10. When device 10 is attached to the user's forefinger, switch 1 e is easily operable by the thumb.
  • 103. On the front face of input device 10 is front switch 3, which can be considered a "click switch." This switch operates in an analogous fashion to selection switches typically available on "mouse-type" input devices. As illustrated, the front switch 3 can toggle upward or downward and click "on" so that data upon which the cursor rests is entered into the computer 30. The capacity to perform these and other control actions allows the user to access a variety of control options in the computer 30. In yet other embodiments, the front switch 3 can, from its center position, "click" directly inward one or more times to perform yet other "click" or multiple "click" functions.
  • 104. Still referring to FIG. 7A, a lens 8 or any other appropriately implemented signal transmissive means for transmitting predetermined selected infrared or other signal type 12 is provided. At the end of lens 8, an acoustic signal emission port 41 is illustrated. Interface device 10 can implement an acoustic emission and detection option. The acoustic port 41, in conjunction with acoustic signal generation means, can emit predetermined selected acoustic signals, when enabled. When implemented, port 41 requires a corresponding implementation of port 42 on base transceiver 20 of FIG. 1A. Port 42 is an acoustic detector, and acoustic signals emitted from port 41 are detected therewith. Any known conventional acoustic transceiver hardware can be used.
  • 105. Referring still to FIG. 7A, personality module 6 a is shown loaded into input device 10. The personality module 6 a can be easily removed from the personality module cabinet 6. The module consists of, for example, a cartridge enclosed ROM having stored within it the various predetermined functions and data associated with the operating environment choice. Depending upon the personality module selected by the user, different primary “operating personalities” of the computer user input device 10 can be chosen by the computer user.
  • 106. Alternative personality modules 6 b and 6 c can be selected by the computer user to implement different fundamental operating environments. Other personality modules can also be used to replace installed personality module 6 a simply by removing module 6 a from cabinet 6, and inserting either module 6 b, or module 6 c, or any other module.
  • 107. Data port 4 a can also be used for a variety of data inputs and outputs. When used as a data input port (e.g., to rewrite an EEPROM, for the purpose of updating security information), input port 4 a affords much additional flexibility to device 10. It can be observed that security updates through port 4 a, changing the access and authorization privileges of device 10, can help achieve objectives of the invention. When used as a data export port, data port 4 a can be used to output device 10 stored data, which has been made accessible for export.
  • 108. Device 10 does not require strapping to the user's hand, finger, wrist, etc., in order to operate properly. While strapping by means of a strap 40 a and strap 40 b is useful, strapping does not affect basic functions of device 10. Straps 40 a and 40 b can be padded (not shown) to provide greater user comfort. A padded lining contributes to the ergonomic design of device 10—allowing the user to comfortably wear device 10 for prolonged periods. Other strap designs or arrangements are contemplated. Other attachment means can attach device 10 to the human hand, wrist, finger, etc. For example, a ring or ring-type attachment means is shown in FIG. 9.
  • 109. Prior to attaching device 10 to the left or right forefinger, straps 40 a and 40 b are affixed into the side of device 10 to which the user will attach his or her forefinger.
  • 110. An important feature of the first preferred embodiment of device 10 is “ambidexterity.” Device 10 has symmetrical, arcuate-shaped surfaces 99 a and 99 b which accommodate either left forefinger (99 a) or right forefinger (99 b) attachment with equal facility, to suit user preference or immediate needs.
  • 111. Straps 40 a and 40 b are attached to device 10 by attachment fittings. Alternatively, "snap-in" fittings can be implemented on straps 40 a and 40 b, which can be snapped into complementary fitting receptacles or directly into concavities on device 10.
  • 112. When straps 40 a and 40 b firmly encircle the forefinger phalange, affixing it into the left or right side arcuate-shaped surface 99 a or 99 b of device 10, strap padded lining helps to promote an “illusion of weightlessness”. This “illusion” can be provided, due to the padding and substantial construction of the straps. In reality, the light weight of device 10 is made to be perceived as “featherweight” given 1) the padding, 2) the firm encirclement of the phalange, 3) the staging of device 10 when affixed to the phalange according to design, and 4) the easy balance achieved on the user's hand, given the above. FIG. 7B illustrates a second version of the first embodiment shown in FIG. 7A which contains similar elements to that of FIG. 7A except for the inclusion of thumbswitch selector switch 2 z. As illustrated in FIG. 9, an alternative means for attaching the computer user input device 10 to the computer user's finger can be a releasably secured ring 50. Ring-type attachment means may be desirable for some computer user preferences. Other strap attachment means can be provided, including leather. However, it appears that Velcro(R) straps with a padded lining best achieve the “illusion of weightlessness.”
  • 113.FIG. 10 shows device 10 worn attached onto both of a user's forefingers. A keyboard 300 can also be used jointly with device 10 for even greater flexibility in computer control.
  • 114. Referring to FIG. 11A, a second embodiment of a user/computer interface device 1100 is shown attached onto a user's finger. The device is a simplified version of device 10 and includes only a thumbswitch 1102 and a thumbswitch selector switch 1104.
  • 115.FIG. 11B is a top perspective close-up view of the second embodiment shown in FIG. 11A. The arrangement of thumbswitch 1102 and selector switch 1104 operates in a fashion similar to that of thumbswitch 2 and front switch 3 of the first preferred embodiment, shown in FIG. 7A.
  • 116.FIG. 12 illustrates control of a cursor on the display screen. Display screen 19 inherently includes a preprogrammed Cartesian (or any other conventional or customized) positioning mechanism to receive cursor coordinates provided by computer 30, in accordance with user control signals 12, initiated and transmitted using device 10. As a primary point of reference, origin 1217 a is provided.
  • 117.FIG. 12 shows a series of different, sequential examples of cursor movement events. In screen 1204 a, the default position, 1217 a, is shown in the center of the screen. Screen 1204 a is always the beginning screen of a cursor reinitialization and movement sequence. Default position 1217 a is always presented in any basic cursor reinitialization control sequence, unless an alternative default position is customized by the user.
  • 118. Computer control events, such as cursor reinitialization, movement, select functions, and “click and drag” functions, are accomplished by using one of the secondary thumbswitches 1 a-1 d, in combination with master thumbswitch 2 and front switch 3. As discussed above, the specific functions assigned to the thumbswitches depend upon the mode switch 1 e, in combination with the personality module and with any application software used in computer 30.
  • 119. Successive screens show a progression of directional cursor moves using user control signals initiated by thumbswitch 2. (For examples of “click and drag” action, using both thumbswitch 2 and front switch 3, see FIG. 20, which discusses usage of the present invention with “mapped keys” in a word processing application.) For example, directional pressure on switch 2 moves the cursor up in direction 2 a; cursor right, 2 b; cursor down, 2 c; and cursor left, 2 d. Other directions are implemented with appropriate switching circuitry and/or driver or other application software.
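As a rough illustration of the directional mapping just described (the screen coordinates, step size, and function names are assumptions for the example only):

    # Directional pressure on thumbswitch 2 maps to a cursor delta; directions
    # 2a-2d correspond to cursor up, right, down and left respectively.
    DIRECTION_DELTAS = {
        "2a": (0, -1),   # cursor up
        "2b": (1, 0),    # cursor right
        "2c": (0, 1),    # cursor down
        "2d": (-1, 0),   # cursor left
    }

    def move_cursor(position, direction, step=10):
        """Return a new (x, y) cursor position after one directional press."""
        dx, dy = DIRECTION_DELTAS[direction]
        x, y = position
        return (x + dx * step, y + dy * step)

    origin = (640, 400)                   # e.g. a default position 1217a at screen center
    print(move_cursor(origin, "2a"))      # (640, 390): cursor moves up
    print(move_cursor(origin, "2b"))      # (650, 400): cursor moves right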
  • 120. Security Control
  • 121.FIG. 13 shows a typical security “operating personality” environment, wherein an “annunciator” and an “interrogator” are shown in a security-oriented dialogue.
  • 122. Each user/computer interface device 10 is assigned a unique “device password.” In like manner, each computer user is assigned a unique password. When combined, user passwords and device passwords form a composite password designated the “annunciator” password. The composite password is itself a unique password specifically identifying both the computer user and the interface device to the security system. Each of one or more layers of security system logic is designated an “interrogator.” Both the user password and the device password must be authorized in order to gain computer access, i.e., access to any enterprise resource, or to complete a transaction. If either password is not authorized to gain access or to complete the requested transaction, then the attempted transaction is determined “illegal” by the interrogator security logic. After any transaction is deemed “illegal,” alarm features are activated, including activation of the kill switch 114 of device 10.
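A minimal sketch of the composite “annunciator” password idea; combining the two passwords with a hash, and all password values shown, are assumptions made for the example, not the patent's encoding:

    import hashlib

    def composite_password(user_password, device_password):
        # Combine the user password and the device password into a single
        # composite identifying value (the hash is purely illustrative).
        return hashlib.sha256((user_password + ":" + device_password).encode()).hexdigest()

    AUTHORIZED_USERS = {"user-password-17"}
    AUTHORIZED_DEVICES = {"device-0042-secret"}

    def interrogate(user_password, device_password):
        # Both passwords must be authorized or the transaction is "illegal".
        if user_password in AUTHORIZED_USERS and device_password in AUTHORIZED_DEVICES:
            return "access granted"
        return "illegal transaction"      # alarm features, e.g. kill switch 114, would follow

    print(composite_password("user-password-17", "device-0042-secret")[:16] + "...")  # composite annunciator ID
    print(interrogate("user-password-17", "device-0042-secret"))   # access granted
    print(interrogate("intruder-guess", "device-0042-secret"))     # illegal transaction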
  • 123. Referring now to FIG. 13, interrogator functions are started at step 1302. The interrogator begins a wait state loop at step 1304, listening for any annunciator signal. At step 1306, the interrogator tests for receipt of an annunciator signal from any annunciator desiring access. Until the answer is “yes,” the system loops, listening for input from any annunciator.
  • 124. Annunciator signal generation capability is active in device 10 at step 1308, given that device 10 contains a security personality module or other means to be responsive to manual input of a computer user's password. At step 1310, the user inputs a password into device 10, and the annunciator functions are enabled.
  • 125. The correct user password enables the device 10 for transmission of an annunciator signal. At step 1312, the annunciator signal is transmitted from device 10 to the cognizant interrogator. An interrogator may be located in base/computer transceiver 20 which is interconnected into the individual computer (or other access point) being accessed. Alternatively, interrogators may be located at any access points to any enterprise resource.
  • 126. After an annunciator signal is received at step 1306, the process for validating and identifying the annunciator signal and identifying the user begins. At step 1315, the interrogator determines whether annunciator input device 10 transmitted a composite ID along with the annunciator signal. If an ID is received at step 1315, it is examined for validity at step 1326.
  • 127. However, if a complete composite ID is not received, at step 1316, the interrogator generates a “Who Are You” inquiry, which is transmitted at step 1318 to the annunciating device. Once the interrogator's “Who Are You” signal is received at step 1320, the annunciator processes the “Who Are You” identification request at step 1322. If ID was not sent, and a “Who Are You” signal is not received, then the annunciator signal is regenerated at step 1310 and transmitted (or retransmitted if it was missed) at step 1312.
  • 128. The annunciator (device 10) authentication/ID response to the interrogator “Who Are You” inquiry begins at step 1322, after receipt of the “Who Are You” signal. At steps 1323 and 1324, the authentication/ID is fetched from input device 10's memory.
  • 129. Once the ID has been fetched from memory, it is transmitted at step 1325 to the interrogator, which is waiting for this signal at step 1326. If the signal is not received at step 1326, at step 1328 the interrogator loops back N number of times to reinitiate the authentication/identification process. Once the N number of loops has been exceeded, however, an alarm routine is called at step 1330.
  • 130. If, on the other hand, the ID signal is received at step 1326, the authentication process begins at step 1332. At step 1334, the interrogator transmits the received signal to the security memory for comparison against the authentication/ID database (module 216 of FIG. 6). At step 1336, the received authentication/ID is compared in the security database, and an ID status message is then produced at step 1338. At step 1340, the authentication/ID received by the interrogator is determined to be either valid or invalid.
  • 131. If the ID is valid, then at step 1346 the access enabling process is entered. Two events then occur. First, an annunciator enable signal is generated at step 1348 and the interrogator transmits the annunciator enable signal at step 1352 to the annunciator. Second, the transaction monitor routine is initiated at step 1347 to ensure that each attempted transaction is legal. On the annunciator side, the enable process is initiated at step 1354 with receipt of the interrogator-initiated annunciator enable signal. Before any enablement occurs, however, the system tests at step 1356 to determine whether more work is being done.
  • 132. If the answer is yes, then the system will loop back until all work is completed; once no more work is to be done—i.e., annunciator enablement is no longer required—the annunciator will end the session with a logoff transaction. Immediately after logoff, the device annunciator can automatically disable itself (unless other arrangements are made), and it can only be re-enabled by an authorized user repeating the entire security procedure.
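The following sketch loosely mirrors the FIG. 13 dialogue described above; the database contents, retry limit, and return values are assumptions for illustration only:

    ID_DATABASE = {"device-0042:user-17"}     # stand-in for the authentication/ID database (module 216)
    MAX_RETRIES = 3                           # the "N number of times" of step 1328

    def annunciator_respond(stored_id):
        # Steps 1322-1325: fetch the authentication/ID from device memory and transmit it.
        return stored_id

    def interrogate(received_id, stored_id):
        attempts = 0
        while received_id is None and attempts < MAX_RETRIES:
            # Steps 1316-1318: generate and transmit a "Who Are You" inquiry.
            received_id = annunciator_respond(stored_id)
            attempts += 1
        if received_id is None:
            return "alarm"                    # step 1330: alarm routine after N failed loops
        # Steps 1334-1340: compare the received ID against the security database.
        if received_id in ID_DATABASE:
            return "enable"                   # step 1346: enable access and start the transaction monitor
        return "alarm"                        # step 1344: record the attempt and raise the alarm

    print(interrogate("device-0042:user-17", None))    # enable (composite ID sent with the signal)
    print(interrogate(None, "device-0042:user-17"))    # enable (ID obtained via "Who Are You")
    print(interrogate("device-9999:intruder", None))   # alarm (invalid ID)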
  • 133. If the ID is invalid, a record of the ID and the attempted transaction is made in the security memory 1336 and the alarm process is initiated, step 1344.
  • 134. An additional important aspect of the system is that interrogator functions can be configured by the system administrator or security administrator to be repetitive—i.e., the interrogator can be set to periodically request the annunciator/ID during the progress of any access session being made by the user/annunciator, in accordance with a policy of the computer institution or device being accessed.
  • 135. In the interrogator, the “continue annunciator enable?” routine is continuously occurring in the background at step 1350. With this routine, initiated at step 1347, the security system continuously tests whether each specific annunciator transaction being attempted should continue to be enabled or permitted within the context of the session. Routine 1350 can disable annunciator operation at any point that the interface device annunciator steps beyond its access and authorization privileges.
  • 136. If an alarm procedure is to be activated (steps 1370 and 1330, or an invalid ID at step 1340), then the requested transaction and the IDs are recorded by the interrogator for security monitoring at step 1344. The interrogator then determines whether to disable the device, starting at step 1380. The ID and transaction information is again compared with the authorization and access privileges plan in the ROM 1336. If the device is to be disabled, step 1390, the kill signal is transmitted at step 1392. Otherwise, a message as to the reasons that access is being denied is presented to the user, step 1396, or the device is enabled, step 1346.
  • 137. In enterprise-wide security-oriented configurations, one or more individually authorized user/annunciator(s) are granted access to enterprise resources (PCs, networks, applications, etc.) by any “interrogator” at one or more levels—i.e., the total security system logic contained in input device(s), and/or base transceiver(s), and/or LAN network server(s) and/or centralized computer(s) and/or mainframe(s)—which interrogates any annunciator seeking access, to verify that both the user/annunciator and the device/annunciator are authorized access, and what level or levels of access are authorized.
  • 138. In this example, the security module contained in base transceiver 20 which interacts with input device 10 is the interrogator. Any annunciator—i.e., any input device 10—operated by any user desiring access—any authorized user/annunciator—transmits an access request signal to the interrogator.
  • 139. In summary, access control begins by initiating enterprise-wide security system interrogator functions—this is typically done by the system administrator or security administrator. This occurs in similar fashion to initiating a local area network server and its client devices (e.g., as in Novell, 3COM, or other LANs). This can also occur similarly to initiating other types of centralized or distributed networks (e.g., SNA); teleprocessing systems (VTAM, TCAM, etc.); transaction processing applications (e.g., CICS); or other operating system, application, or system access method.
  • 140. The first step is to define all enterprise resources, and the access/authority levels of those resources, to the core security system, as is already known in the art of computer security. Then, all user/annunciators are defined to the security system to provide a complete, deterministic, closed system, accessible only to authorized devices and users in accordance with individually assigned access privileges under a defined access and authorization privileges plan.
  • 141. The present invention adds the unique features of 1) an access-seeking user/annunciator which must access user/computer interface device 10 with a password; 2) after a satisfactory user access, device 10 then transmits its own composite annunciation signal; and 3) distributed or centralized security logic then authenticates that specific input device 10's annunciation signal and allows only the access authorized to that specific device 10 and that specific user.
  • 142. Referring now to FIG. 14, groups of computers are shown arrayed within a computing institution or enterprise.
  • 143. One or more properly authorized users (using one or more properly authorized devices 101-110 of the type of device 10 of FIG. 1A) in the local area network (LAN) shown in FIG. 14 can gain access to any computer on the LAN implementing the present invention, under the organizational auspices of an all-encompassing, enterprise-wide, security-oriented access and authorization privileges plan.
  • 144. Two reasons for implementing such a method and apparatus approach are 1) the need for one or more levels of security; and 2) the need for controlled “user and device portability” to allow access by any authorized user to any authorized enterprise resource. The access and authorization privileges aspects of the present invention are highly flexible and can be implemented in a number of ways, to suit virtually all user needs. A system administrator only needs to define users, define devices, define one or more levels of access, and define enterprise resources in order to develop an access and authorization privileges plan.
  • 145. An extended enterprise-wide computer control system is shown in FIG. 15. Six LANs comprised of twenty computers each are shown. All twenty computers “local” to each LAN are controlled by ten “local” user input devices associated with each respective LAN. FIG. 15 thus illustrates an implementation of the present invention which has subdivided an enterprise-wide group of one hundred twenty computers into six LAN operating groups which each operate as separate LANs. Separately, shown to the left of the 120 computers, are four “grand master” input devices, labelled I, II, III, and IV. Each of these grand master input devices can access all 120 of the enterprise's computers. By virtue of their “grand master” status, each can operate on all six LANs shown.
  • 146. In summary, the 120-computer, 6-LAN enterprise of FIG. 15 has defined its access and authorization privileges plan such that only the four grandmaster (I, II, III, and IV) input devices are authorized access to all 120 computers. While each LAN's group of ten input devices could be implemented to access one or more other LANs' computers, in this example, only the “grandmasters” access all six LAN groups' computers.
  • 147. Access and authorization privileges plans can vary from simple network definitions, to advanced, meshed, layered network definitions. Advanced access and authorization plans can include definitions which control access to complex networks with inter-LAN gateways, access to other centralized or distributed computers such as mainframes, or any other enterprise resource definitions suitable to the enterprise's needs.
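One possible, purely illustrative, representation of such a plan as a lookup structure; the resource, user, and device names are assumptions for the example:

    PRIVILEGES_PLAN = {
        "LAN100":    {"users": {"alice", "bob"}, "devices": {"0001", "0002", "I"}},
        "LAN200":    {"users": {"carol"},        "devices": {"0011", "I"}},
        "MAINFRAME": {"users": {"alice"},        "devices": {"I", "II", "III", "IV"}},  # grandmasters only
    }

    def is_authorized(user, device, resource):
        # Access requires both an authorized user and an authorized device for the resource.
        entry = PRIVILEGES_PLAN.get(resource)
        if entry is None:
            return False                           # undefined resources are inaccessible
        return user in entry["users"] and device in entry["devices"]

    print(is_authorized("alice", "0001", "LAN100"))    # True
    print(is_authorized("alice", "0011", "LAN100"))    # False: device not authorized for LAN100
    print(is_authorized("alice", "II", "MAINFRAME"))   # True: grandmaster device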
  • 148. Packet Byte Maps
  • 149.FIGS. 16A, 16B, and 16C each show a table illustrating examples from different classes of user control signals. These signals are represented in message packets. Device 10, using the personality modules, can transmit a plurality of distinct user control signals based upon different encoding sequences. Control signals are transmitted as bursts of coded pulses comprising one or more message packets. Each message packet includes a plurality of fields of encoded characters. The type of control signal determines the packet fields and organization. The format of the message packet is determined by the selected personality module.
  • 150.FIG. 16A illustrates the message packet format for computer control when a high-level, security-oriented ROM personality module 118 (FIG. 3) is installed or when the security EEPROM 116 (FIG. 3) is used. Field 1 is a flag indicating the start of the message packet. Fields 2-4 provide ID information designating the device and the user. This information is used to determine proper authorization privileges in the annunciator/interrogator dialog. (See FIG. 13.) The command/control information is contained in field 5. This information relates to the setting of the switches on device 10 and provides the actual computer control signals. Finally, a stop flag indicates the end of the packet. Security functions can be included with other processing information. If security is implemented, the ID information fields are included at the beginning of each message packet. The remaining function information is contained in the succeeding fields.
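A rough sketch of such a packet layout; the delimiter characters, field widths, and field contents below are assumptions for illustration and do not reproduce the actual FIG. 16A encoding:

    START_FLAG, STOP_FLAG = "<", ">"

    def encode_packet(device_id, user_id, session_id, command):
        # Field 1: start flag | fields 2-4: identification | field 5: command/control | stop flag
        return START_FLAG + "|".join([device_id, user_id, session_id, command]) + STOP_FLAG

    def decode_packet(packet):
        if not (packet.startswith(START_FLAG) and packet.endswith(STOP_FLAG)):
            raise ValueError("malformed packet")
        device_id, user_id, session_id, command = packet[1:-1].split("|")
        return {"device": device_id, "user": user_id, "session": session_id, "command": command}

    pkt = encode_packet("0042", "user-17", "S001", "cursor-up")
    print(pkt)                   # <0042|user-17|S001|cursor-up>
    print(decode_packet(pkt))    # the base/computer interface 20 would act on the command field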
  • 151.FIG. 16B illustrates a message packet for a CAX personality module. Fields 1 and 2 start the message and relate ID information similar to the security-oriented packet of FIG. 16A. In addition, information defining the personality environment can be included in the beginning fields. Field 3 contains design mode information, such as a manufacturing or electrical design mode. For example, the design mode can be determined by the setting of mode switch 1 e. Fields 4-6 contain the basic command function data, such as “insert”, “delete”, and “create a point.” The command function data or user control signals would generally be designated by pressing switches 1 a-1 d. Some functions are defined in relation to the previous function performed. Therefore, fields 5 and 6 are used to provide the prior function state and change. Any additional information or control data required for a specific function is included in field 7. An end message flag would also be included for this packet.
  • 152.FIG. 16C illustrates other user control signals. FIG. 16C.1 shows functions used to initiate processing on a system. FIG. 16C.2 illustrates cursor movement and an example format for cursor control after a specific computer or other resource has been successfully accessed. For function processing, the mode, i.e., the setting of switch 1 e, is needed and is included in field 5. When function keys 1 a-1 d are pressed, the indicated key meaning is included in field 7 (FIG. 16C.4). The meaning is also affected by the mode referenced in field 5.
  • 153. For each message packet, the base/computer interface device 20 determines the type of packet and the relevant information contained in the packet. Appropriate control signals are generated and transmitted to the computer 30 to execute the functions.
  • 154.FIG. 16D illustrates a message packet for a control signal from the base/computer device 20 to the user/computer interface device 10. The bidirectionality of the signal transmission system allows for signals both to and from the base/computer interface device 20. The packet fields are used to transfer information for activating the kill switch due to improper access requests.
  • 155.FIG. 17 shows a table, indicating an example of the general organization of a predetermined access and authorization privileges plan, developed for the purpose of safeguarding access to confidential data and restricting access to enterprise resources, as needed to satisfy a wide range of user requirements.
  • 156.FIG. 17 shows an example of one possible enterprise-wide access and authorization privileges plan. Enterprise resources 800 comprise resource categories R1 through RN.
  • 157. R1 resources 810 comprise six separate LANs—LAN100 through LAN600, as illustrated in FIG. 15. R2 resources 820 comprise mainframe resources accessible in enterprise-wide network 800. R3 resources 830 comprise network resources, including communications channels used within enterprise resources 800. R4 resources 840 comprise application resources which can include teleprocessing monitors, database applications, spreadsheet applications, network applications, etc. R5 resources 850 are not used, in this example, but can be made available at a future time. R6 resources 860 comprise the set of all user/computer interface devices 0001 through 0060 and grandmaster devices I, II, III and IV.
  • 158. Separately, each authorized computer user using enterprise resources 800 is assigned a unique password identification. In combination with the password of any interface device 0001-0060 (shown as R6 resources 860), a user password will allow the user to access any enterprise resource for which he/she is authorized, to the extent that the user/device composite is allowed.
  • 159. To summarize, users are assigned passwords for the purpose of accessing at least one user/computer interface device; and, after accessing device 10, for accessing one or more specific computers, applications, or any other enterprise resource, to which the user is authorized access. With this method, computer users do not access the computer directly. Users access user/computer interface devices, which in turn access computers using a combination user password and a device password.
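A small sketch of this two-stage method (the user unlocks the device, and the device then presents the combined credential to the resource); the class structure and password values are illustrative assumptions:

    class InterfaceDevice:
        def __init__(self, device_password, expected_user_password):
            self._device_password = device_password
            self._expected_user_password = expected_user_password
            self.unlocked = False

        def enter_user_password(self, password):
            # Stage 1: the user accesses the interface device with a password.
            self.unlocked = (password == self._expected_user_password)
            return self.unlocked

        def request_access(self, resource):
            # Stage 2: the device, not the user, accesses the enterprise resource.
            if not self.unlocked:
                return "denied: device not enabled by user"
            return resource.grant(self._expected_user_password, self._device_password)

    class EnterpriseResource:
        def __init__(self, allowed_pairs):
            self._allowed_pairs = allowed_pairs      # authorized (user, device) password pairs

        def grant(self, user_password, device_password):
            ok = (user_password, device_password) in self._allowed_pairs
            return "access granted" if ok else "denied: pair not authorized"

    resource = EnterpriseResource({("u17-secret", "dev42-secret")})
    device = InterfaceDevice("dev42-secret", "u17-secret")
    print(device.request_access(resource))           # denied: device not enabled by user
    device.enter_user_password("u17-secret")
    print(device.request_access(resource))           # access granted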
  • 160. An access and authorization plan can be implemented in the operating system software; applications software; ROMs; and/or EEPROMs, depending on user needs and implementing means chosen.
  • 161. In one implementation, the access and authorization information is stored in the electrically-erasable programmable read-only memory (“EEPROM”) 116 (FIG. 3) of device 10. The information can include unique identifying data for the device or user for use in security-oriented applications. To reflect changes in the organization, an EEPROM programmer device, in conjunction with specialized data input ports 4 a and/or 4 b, can be used to repeatedly change or update EEPROMs.
  • 162.FIG. 18A shows a basic version of a third embodiment of device 10. FIG. 18B shows an advanced version. Both versions can be attached to the left or right forefinger.
  • 163. The third embodiment, comprising several versions, is especially adapted for the computer input and control needs of users with mobility impairments or other serious physical handicaps. The versions shown in FIGS. 18A and 18B are specifically adapted for attachment to the user's right forefinger. Devices of the third embodiment are explicitly designed to serve the market niche often referred to as the “assistive technologies.”
  • 164. Much of the discussion applicable to the first embodiment applies to the third embodiment, in terms of means to enable it. One arcuate-shaped surface 89 is available to accommodate the left or right forefinger. This is in contrast to the two arcuate-shaped surfaces 99 a and 99 b of the first embodiment (FIG. 7A).
  • 165. Straps 40 a and 40 b operate to encircle the proximal phalange of the forefinger and affix it into arcuate-shaped surface 89. Also, the third embodiment shows a different placement of personality module cabinet 6, in FIG. 18B. As in the first preferred embodiment, interchangeable ROM cartridges (such as 6 a, 6 b, and 6 c of FIG. 7) are used therein.
  • 166.FIG. 19A shows a “mapped keys” application for use in word processing in connection with the interface device of the present invention. Device 10 can be used to move the cursor over one or more “expanding maps” of keyboard layouts. The user can then select “mapped keys” to spell out words without need of a keyboard. In this application, the user is presented with successive expanding screens of letters and numbers in the format of the keyboard layout of the user's choice. An area of the displayed keyboard is selected according to the direction of cursor movement. Upon arriving at the chosen character, the user selects it. This seek-and-select procedure is repeated, to select a series of characters in succession until the desired word is spelled out.
  • 167. The screens can display a standard QWERTY keyboard layout or any customized key layout can be used, to suit user preferences.
  • 168. In FIG. 19A, successive screens 1901 a, 1901 b, and 1901 c show expanding screens “zooming-in” on an area of a displayed keyboard layout, according to the direction of cursor movement. Upon arriving at the chosen character, the user selects it, as in screen 1901 d, which shows selection of the letter “M”. Screens 1902 a, 1902 b, and 1902 c show selection of the letter “O”.
  • 169. As demonstrated, this seek-and-select procedure in a word processing application is repeated to select a series of characters and spaces in succession until the desired word is spelled out. Many other applications provide a portfolio of other utility, which can include similar seek-and-select sequences. For example, a “Morse code wheel,” as illustrated in FIG. 19B, can be operated in a manner similar to the word processing sequence. The “wheel” screen is comprised of sectors, wherein each sector represents one or more dots or dashes. To encode a series of characters, the user “seeks and selects” different dot or dash patterns, by moving the cursor from one sector to another to encode one or more alphanumeric characters. The user continues to select character encoding sequences to spell words. Spaces between words are selected as needed. When the user reaches the end of the desired message, he/she selects the “xmit” sector, which automatically routes the encoded message train to a preselected destination.
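A toy sketch of the “expanding map” seek-and-select idea; the one-row layout and the halving rule are simplifying assumptions, not the actual screen behavior of FIGS. 19A-19B:

    ROW = list("QWERTYUIOP")                  # one row of a QWERTY layout

    def zoom(region, direction):
        # Each directional move "zooms in" on half of the currently displayed region.
        mid = len(region) // 2
        return region[:mid] if direction == "left" else region[mid:]

    def seek_and_select(directions):
        region = ROW
        for d in directions:                  # successive zooming-in screens
            region = zoom(region, d)
        return region[0] if len(region) == 1 else region   # select once a single key remains

    print(seek_and_select(["right", "right", "right", "left"]))   # -> 'O'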
  • 170. Throughout this document, the term “transceiver” has been used to illustrate that according to the present invention, both a “transmitter” and a “receiver” are provided together in close proximity for two-way, or “duplex” operation (i.e., base transceiver 20 has both a transmitter and a receiver contained within the same electronic enclosure, as both transmission and reception of signals are provided therein). By like reasoning, the user input device 10 can also be designated as a “transceiver” given its duplex communications capability shown by “kill circuit” and other “receiver” functions.
  • 171. Although specific embodiments are specifically illustrated and described herein, it will be apparent that many modifications and variations of the present invention are possible in light of the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.

Claims (21)

What is claimed is:
1. A method for user and computer interface, comprising the steps of:
operating switches on at least one user interface device;
generating user control signals in response to operation of said switches;
transmitting said user control signals through at least one communications channel;
receiving said transmitted user control signals;
interpreting said transmitted signals;
transmitting said interpreted signals to at least one controllable computer;
controlling operation of said controllable computer in response to said interpreted signals.
2. The method of
claim 1
, further comprising the step of controlling a display connected to said computer in response to said interpreted signals.
3. A user and computer interface system, comprising:
an ergonomically shaped, finger attachable interface means for generating user signals, wherein said user signals include message packets;
coding means for encoding said message packets in the form of bursts of coded signals;
signal transmission means for transmitting said coded signals, said signal transmission means including at least one communication channel;
detecting means for receiving said bursts of coded signals;
conversion means for converting said bursts of coded signals received by said detecting means into computer processable signals; and
at least one computer for processing said computer processable signals.
4. The system of
claim 3
, wherein said at least one computer includes a graphic user interface.
5. The system of
claim 3
, further comprising display means for displaying images generated by said at least one computer in response to said interpreted signals.
6. The system of
claim 3
, further comprising a keypad means for entering user passwords and enabling at least one enterprise resource connected to said at least one computer.
7. The system of
claim 3
, wherein said interface means further comprises means for inputting and exporting data.
8. The system of
claim 3
, wherein said interface means includes manually operable element means for generating said user signals.
9. The system of
claim 3
, wherein said interface means includes at least one switch means for initiating generation of user signals based upon a position of said at least one switch means.
10. The system of
claim 3
, wherein said interface means further includes a mode switch means for altering said user signals based upon a position of said mode switch means and said at least one switch means.
11. The system of
claim 3
, further comprising:
kill transmission means for transmitting a kill signal from said computer; and
wherein said transmission means includes a kill switch for receiving said kill signal and deactivating transmission means in response to said kill signal.
12. The system of
claim 3
, wherein said interface means includes a ROM cabinet means adapted to receive at least one removably insertable ROM personality module, wherein said user signals are generated in relation to information contained in said personality module.
13. The system of
claim 3
, wherein said interface means includes attachment means for encircling and affixing said interface means onto the proximal or medial forefinger phalange of a user.
14. The system of
claim 3
, further comprising:
an ergonomically shaped structural envelope enclosing said interface means, said coding means and said transmission means;
wherein said structural envelope includes:
at least one arcuate shaped enclosure surface means adapted for receiving the proximal or medial phalange of the forefinger of said user; and
attachment means for encircling and affixing said proximal or medial forefinger phalange of said computer user into said at least one arcuate shaped enclosure surface means.
15. The system of
claim 3
, wherein said at least one communications channel includes electromagnetic radiation.
16. The system of
claim 3
, wherein said at least one communications channel includes infrared transmissions.
17. The system of
claim 3
, wherein said at least one communications channel includes acoustic transmissions.
18. A support apparatus for anchoring and supporting an ergonomic, forefinger attachable interface device when not attached to a forefinger, comprising:
a base;
a telescoping support, including at least two concentric tubular members; and
a rounded support fitting for attaching said interface means.
19. A security system in connection with a wireless user/computer interface device, comprising:
a user interface device for generating and transmitting computer control signals, including:
identifying means for transmitting a unique identifying signal, and
disabling means for disabling transmission of said computer control signals upon receipt of a kill signal; and
a receiver device for receiving said computer control signals, including:
receiving means for receiving said unique identifying signal;
comparison means for comparing said unique identifying signal with a predetermined set of authorized signals,
transmission means for transmitting said kill signal if said unique identifying signal does not match one of said predetermined set of authorized signals, and
control means for transferring said computer control signals to a computer if said unique identifying signal does match one of said predetermined set of authorized signals.
20. The system of
claim 19
, further comprising:
a plurality of user interface devices, wherein each interface device has a distinct identifying signal; and
a plurality of receiver devices respectively connected to a plurality of computers, wherein each receiver device includes a distinct predetermined set of authorized signals, such that each interface device can operate at least one of said receiver devices.
21. A method for security control for a user and computer interface, comprising the steps of:
assigning unique identifying passwords to each of a plurality of users;
assigning unique identifying passwords to each of a plurality of interface devices;
defining, for at least one enterprise resource, a set of user passwords and a set of interface device passwords to be allowed access to said enterprise resource;
transmitting to said at least one enterprise resource user control signals including a user password and a device password;
comparing said user and interface device passwords to said set of passwords to be allowed access to said enterprise resource;
operating said enterprise resource according to said user control signals if said user and interface device passwords are included in said set of passwords to be allowed access.
US09/732,112 1989-11-22 2000-12-07 Ergonomic customizeable user/computer interface devices Expired - Fee Related US6441770B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/732,112 US6441770B2 (en) 1989-11-22 2000-12-07 Ergonomic customizeable user/computer interface devices

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US44077189A 1989-11-22 1989-11-22
US07/879,374 US5481265A (en) 1989-11-22 1992-05-07 Ergonomic customizeable user/computer interface devices
US08/581,429 US5729220A (en) 1989-11-22 1995-12-29 Ergonomic customizable user/computer interface device
US09/037,061 US6201484B1 (en) 1989-11-22 1998-03-09 Ergonomic customizeable user/computer interface device
US09/732,112 US6441770B2 (en) 1989-11-22 2000-12-07 Ergonomic customizeable user/computer interface devices

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/037,061 Continuation US6201484B1 (en) 1989-11-22 1998-03-09 Ergonomic customizeable user/computer interface device

Publications (2)

Publication Number Publication Date
US20010000433A1 true US20010000433A1 (en) 2001-04-26
US6441770B2 US6441770B2 (en) 2002-08-27

Family

ID=23750112

Family Applications (4)

Application Number Title Priority Date Filing Date
US07/879,374 Expired - Lifetime US5481265A (en) 1989-11-22 1992-05-07 Ergonomic customizeable user/computer interface devices
US08/581,429 Expired - Lifetime US5729220A (en) 1989-11-22 1995-12-29 Ergonomic customizable user/computer interface device
US09/037,061 Expired - Lifetime US6201484B1 (en) 1989-11-22 1998-03-09 Ergonomic customizeable user/computer interface device
US09/732,112 Expired - Fee Related US6441770B2 (en) 1989-11-22 2000-12-07 Ergonomic customizeable user/computer interface devices

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US07/879,374 Expired - Lifetime US5481265A (en) 1989-11-22 1992-05-07 Ergonomic customizeable user/computer interface devices
US08/581,429 Expired - Lifetime US5729220A (en) 1989-11-22 1995-12-29 Ergonomic customizable user/computer interface device
US09/037,061 Expired - Lifetime US6201484B1 (en) 1989-11-22 1998-03-09 Ergonomic customizeable user/computer interface device

Country Status (5)

Country Link
US (4) US5481265A (en)
EP (1) EP0500794A4 (en)
JP (1) JPH05502130A (en)
AU (1) AU7788191A (en)
WO (1) WO1991007826A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003081414A1 (en) * 2002-03-25 2003-10-02 David Michael King Gui and support hardware for maintaining long-term personal access to the world
US20040017359A1 (en) * 2000-04-06 2004-01-29 Bohn David D. User interface utilizing a computer pointing device with infrared bridge
US6920557B2 (en) 2002-06-28 2005-07-19 Pitney Bowes Inc. System and method for wireless user interface for business machines
US7225262B2 (en) 2002-06-28 2007-05-29 Pitney Bowes Inc. System and method for selecting an external user interface using spatial information
US20070222759A1 (en) * 2006-03-23 2007-09-27 Barnes Cody C Computer pointing device
WO2008094383A1 (en) * 2007-01-29 2008-08-07 Fred Bassali Advanced vehicular universal transmitter using time domain with vehicle location logging system
US7649536B1 (en) * 2006-06-16 2010-01-19 Nvidia Corporation System, method, and computer program product for utilizing natural motions of a user to display intuitively correlated reactions
WO2016008041A1 (en) * 2014-07-15 2016-01-21 Synaptive Medical (Barbados) Inc. Finger controlled medical device interface
WO2018217436A1 (en) * 2017-05-26 2018-11-29 Covidien Lp Controller for imaging device

Families Citing this family (223)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7537167B1 (en) 1993-08-31 2009-05-26 Broadcom Corporation Modular, portable data processing terminal for use in a radio frequency communication network
EP0500794A4 (en) * 1989-11-22 1993-02-03 David C. Russell Computer control system
US7383038B2 (en) * 1990-01-18 2008-06-03 Broadcom Corporation Modular, portable data processing terminal for use in a radio frequency communication network
JPH0619978A (en) * 1992-06-30 1994-01-28 Sony Corp Reproducing and display device
US7098891B1 (en) * 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US5534865A (en) * 1992-11-20 1996-07-09 Kriegsman; Irving M. Ergonomic wireless remote control transmitter for use with consumer entertainment electronics appliance
EP0606704B1 (en) * 1993-01-12 1999-04-14 Lee S. Weinblatt Finger-mounted computer interface device
JPH075984A (en) * 1993-01-29 1995-01-10 At & T Global Inf Solutions Internatl Inc Mouse pointing device
JPH07284166A (en) * 1993-03-12 1995-10-27 Mitsubishi Electric Corp Remote controller
US7853254B2 (en) * 1993-08-31 2010-12-14 Broadcom Corp. Modular, portable data processing terminal for use in a radio frequency communication network
US5489922A (en) * 1993-12-08 1996-02-06 Hewlett-Packard Company Hand worn remote computer mouse
WO1995024714A1 (en) * 1994-03-11 1995-09-14 Elonex Technologies, Inc. Removable pointer device
EP0696014B1 (en) * 1994-07-28 2000-09-20 Hewlett-Packard Company Pressure sensitive input device wearable around a human finger
US6344845B1 (en) * 1994-07-29 2002-02-05 Sony Corporation Position inputting device and video signal processing apparatus
US6137476A (en) * 1994-08-25 2000-10-24 International Business Machines Corp. Data mouse
SE504846C2 (en) * 1994-09-28 1997-05-12 Jan G Faeger Control equipment with a movable control means
JPH08106372A (en) * 1994-10-07 1996-04-23 Ibm Japan Ltd Control method/device for object put on computer
US5638092A (en) * 1994-12-20 1997-06-10 Eng; Tommy K. Cursor control system
US5832296A (en) * 1995-04-26 1998-11-03 Interval Research Corp. Wearable context sensitive user interface for interacting with plurality of electronic devices of interest to the user
WO1996039679A1 (en) * 1995-06-06 1996-12-12 Mrigank Shekhar Pointer device
US5812371A (en) * 1995-07-25 1998-09-22 Compal Electronics, Inc. Orientation-adjustable infrared transceiver used in a notebook type computer
US5823782A (en) * 1995-12-29 1998-10-20 Tinkers & Chance Character recognition educational system
JP2001513918A (en) 1996-01-26 2001-09-04 オラング―オタング コンピューターズ インコーポレイテッド Key palette
US7470244B2 (en) * 1996-01-26 2008-12-30 Harrison Jr Shelton E Flexion-discouraging splint system, method and device
EP0789320A3 (en) * 1996-02-09 1998-10-21 Pegasus Technologies Ltd. Computer mouse and holder
US5699357A (en) * 1996-03-06 1997-12-16 Bbn Corporation Personal data network
GB2312040A (en) * 1996-04-13 1997-10-15 Xerox Corp A computer mouse
US6624824B1 (en) * 1996-04-30 2003-09-23 Sun Microsystems, Inc. Tilt-scrolling on the sunpad
WO1997043706A1 (en) * 1996-05-10 1997-11-20 Sony Corporation Method and device for transmitting key operation information and transmission-reception system
US5801918A (en) * 1996-07-12 1998-09-01 Hand Held Products, Inc. Ergonomic housing for a micro computer
US5815679A (en) * 1996-07-23 1998-09-29 Primax Electronics, Ltd. Interface device for controlling computer peripherals
US5854624A (en) * 1996-09-12 1998-12-29 Innovative Device Technologies, Inc. Pocket-sized user interface for internet browser terminals and the like
US6618039B1 (en) * 1996-09-12 2003-09-09 Gerry R. Grant Pocket-sized user interface for internet browser terminals and the like
US5935244A (en) * 1997-01-21 1999-08-10 Dell Usa, L.P. Detachable I/O device for computer data security
US5978379A (en) * 1997-01-23 1999-11-02 Gadzoox Networks, Inc. Fiber channel learning bridge, learning half bridge, and protocol
JP3564966B2 (en) 1997-09-19 2004-09-15 トヨタ自動車株式会社 Failure diagnosis device for exhaust gas purification device
US7209127B2 (en) * 1997-10-09 2007-04-24 Bowen James H Electronic sketch pad and auxiliary monitor
US5995085A (en) * 1997-10-09 1999-11-30 Bowen; James H. Electronic sketch pad and auxiliary monitor
GB2330646B (en) * 1997-10-23 2002-04-24 Nokia Mobile Phones Ltd Input device
US6037928A (en) 1997-11-13 2000-03-14 Imageworks Manufacturing, Inc. System and method for providing restrained, streamlined access to a computerized information source
JP4011165B2 (en) * 1997-11-21 2007-11-21 泰和 楊 Mouse with handle
US6677987B1 (en) * 1997-12-03 2004-01-13 8×8, Inc. Wireless user-interface arrangement and method
US6160540A (en) * 1998-01-12 2000-12-12 Xerox Company Zoomorphic computer user interface
AU2950599A (en) * 1998-04-06 1999-10-25 Ethertouch Limited Positioning a cursor on the display screen of a computer
US6424335B1 (en) * 1998-09-02 2002-07-23 Fujitsu Limited Notebook computer with detachable infrared multi-mode input device
US6392671B1 (en) * 1998-10-27 2002-05-21 Lawrence F. Glaser Computer pointing device having theme identification means
US7430171B2 (en) 1998-11-19 2008-09-30 Broadcom Corporation Fibre channel arbitrated loop bufferless switch circuitry to increase bandwidth without significant increase in cost
US6933919B1 (en) 1998-12-03 2005-08-23 Gateway Inc. Pointing device with storage
US9183306B2 (en) 1998-12-18 2015-11-10 Microsoft Technology Licensing, Llc Automated selection of appropriate information based on a computer user's context
US6801223B1 (en) 1998-12-18 2004-10-05 Tangis Corporation Managing interactions between computer users' context models
US6791580B1 (en) 1998-12-18 2004-09-14 Tangis Corporation Supplying notifications related to supply and consumption of user context data
US6513046B1 (en) 1999-12-15 2003-01-28 Tangis Corporation Storing and recalling information to augment human memories
US6295051B1 (en) * 1999-06-02 2001-09-25 International Business Machines Corporation Intelligent boundless computer mouse system
US6911855B2 (en) * 1999-06-28 2005-06-28 Broadcom Corporation Current-controlled CMOS circuit using higher voltage supply in low voltage CMOS process
US6424177B1 (en) * 1999-06-28 2002-07-23 Broadcom Corporation Universal single-ended parallel bus
KR20010004087A (en) * 1999-06-28 2001-01-15 박준일 Ring type touch pad
US6424194B1 (en) * 1999-06-28 2002-07-23 Broadcom Corporation Current-controlled CMOS logic family
US6897697B2 (en) * 1999-06-28 2005-05-24 Broadcom Corporation Current-controlled CMOS circuit using higher voltage supply in low voltage CMOS process
US6462732B2 (en) * 1999-07-28 2002-10-08 Michael Mehr Hand un-inhibiting cursor control device
ATE322711T1 (en) * 1999-08-25 2006-04-15 Swatch Ag CLOCK WITH A CONTACTLESS CONTROL DEVICE FOR A COMPUTER MOUSE
US6237879B1 (en) 1999-09-10 2001-05-29 Michael Budge Ergonomic comfort pads for portable or notebook computers
US6782245B1 (en) 1999-09-10 2004-08-24 Logitech Europe S.A. Wireless peripheral interface with universal serial bus port
US6664949B1 (en) * 1999-11-05 2003-12-16 International Business Machines Corporation Interoperable/heterogeneous environment keyboard
AU4137601A (en) 1999-11-30 2001-06-12 Barry Johnson Methods, systems, and apparatuses for secure interactions
US20020091843A1 (en) * 1999-12-21 2002-07-11 Vaid Rahul R. Wireless network adapter
US6937615B1 (en) 2000-02-18 2005-08-30 Logitech Europe S.A. Multi-purpose bridge for wireless communications
US6340899B1 (en) 2000-02-24 2002-01-22 Broadcom Corporation Current-controlled CMOS circuits with inductive broadbanding
US7137711B1 (en) 2000-03-21 2006-11-21 Leonard Reiffel Multi-user retro reflector data input
US6804232B1 (en) * 2000-03-27 2004-10-12 Bbnt Solutions Llc Personal area network with automatic attachment and detachment
WO2001090869A1 (en) * 2000-05-02 2001-11-29 Macri Vincent J Processing system for interactive, personal and idiosyncratic control of images and devices
JP2003532239A (en) * 2000-05-03 2003-10-28 レナード ライフェル Dual mode data drawing product
US6888898B1 (en) * 2000-05-25 2005-05-03 Logitech Europe S.A. Random code for device identification
US6552714B1 (en) * 2000-06-30 2003-04-22 Lyle A. Vust Portable pointing device
US7161578B1 (en) 2000-08-02 2007-01-09 Logitech Europe S.A. Universal presentation device
US6738044B2 (en) * 2000-08-07 2004-05-18 The Regents Of The University Of California Wireless, relative-motion computer input device
US7034803B1 (en) 2000-08-18 2006-04-25 Leonard Reiffel Cursor display privacy product
DE60134759D1 (en) * 2000-08-18 2008-08-21 Leonard Reiffel NOTE PICTURE DATA PRODUCT
CN100397469C (en) * 2000-08-18 2008-06-25 伦纳德·赖费尔 Cursor display privacy product
DE10040812C2 (en) * 2000-08-21 2002-10-10 Infineon Technologies Ag Device for cursor control
JP4362748B2 (en) * 2000-08-21 2009-11-11 ソニー株式会社 Information processing system, information processing apparatus and method, recording medium, and communication terminal apparatus
US7015833B1 (en) 2000-08-31 2006-03-21 Logitech Europe S.A. Multilink receiver for multiple cordless applications
US7058814B1 (en) 2000-09-28 2006-06-06 International Business Machines Corporation System and method for providing time-limited access to people, objects and services
US7792676B2 (en) * 2000-10-25 2010-09-07 Robert Glenn Klinefelter System, method, and apparatus for providing interpretive communication on a network
US6781570B1 (en) * 2000-11-09 2004-08-24 Logitech Europe S.A. Wireless optical input device
US7145549B1 (en) * 2000-11-27 2006-12-05 Intel Corporation Ring pointing device
JP4081373B2 (en) * 2000-12-15 2008-04-23 ライフェル レナード Coded data source converter by image
JP4103592B2 (en) 2000-12-15 2008-06-18 ライフェル レナード Multiple imaging devices, multiple data sources, and coded data source data input devices for multiple applications
WO2002049344A1 (en) * 2000-12-15 2002-06-20 Leonard Reiffel Imaged coded data source tracking product
US7203840B2 (en) * 2000-12-18 2007-04-10 Burlingtonspeech Limited Access control for interactive learning system
US7996321B2 (en) * 2000-12-18 2011-08-09 Burlington English Ltd. Method and apparatus for access control to language learning system
US7224801B2 (en) 2000-12-27 2007-05-29 Logitech Europe S.A. Wireless secure device
SE518567C2 (en) * 2001-01-03 2002-10-22 Digityper Ab A portable device for supplying control signals to a peripheral unit and the use of such a device
DE10101839A1 (en) * 2001-01-17 2002-07-18 Dirk Bertram Device for controlling position indicator on visual display is worn on operator's finger, has control element positioned to prevent unintentional operation, is protected against loss by holder
DE10103888C2 (en) * 2001-01-30 2002-11-28 Paul Bellendorf control device
CN1164983C (en) * 2001-03-29 2004-09-01 国际商业机器公司 Finger identifying keyboard
US20050024338A1 (en) * 2001-03-29 2005-02-03 International Business Machines Corporation Finger-identifying keyboard
US7061468B2 (en) 2001-04-10 2006-06-13 Logitech Europe S.A. Hybrid presentation controller and computer input device
US20040195327A1 (en) * 2001-04-19 2004-10-07 Leonard Reiffel Combined imaging coded data source data acquisition
EP1253509A1 (en) * 2001-04-27 2002-10-30 Jacques Andre Device for controlling a three-dimensional movement
WO2002089441A1 (en) * 2001-05-01 2002-11-07 Meta4Hand Inc. Wireless network computing
US20020163495A1 (en) * 2001-05-02 2002-11-07 Plamen Doynov Multi-functional ergonomic interface
US6864558B2 (en) * 2001-05-17 2005-03-08 Broadcom Corporation Layout technique for C3MOS inductive broadbanding
US7133021B2 (en) * 2001-06-09 2006-11-07 Coghan Iv Francis F Finger-fitting pointing device
JP4552366B2 (en) * 2001-07-09 2010-09-29 日本電気株式会社 Mobile portable terminal, position search system, position search method and program thereof
US9031880B2 (en) * 2001-07-10 2015-05-12 Iii Holdings 1, Llc Systems and methods for non-traditional payment using biometric data
KR100480770B1 (en) * 2001-07-12 2005-04-06 삼성전자주식회사 Method for pointing information in three-dimensional space
US7239636B2 (en) 2001-07-23 2007-07-03 Broadcom Corporation Multiple virtual channels for use in network devices
US20030025721A1 (en) * 2001-08-06 2003-02-06 Joshua Clapper Hand mounted ultrasonic position determining device and system
US20030030542A1 (en) * 2001-08-10 2003-02-13 Von Hoffmann Gerard PDA security system
US20040135766A1 (en) * 2001-08-15 2004-07-15 Leonard Reiffel Imaged toggled data input product
US6850224B2 (en) * 2001-08-27 2005-02-01 Carba Fire Technologies, Inc. Wearable ergonomic computer mouse
US20030048174A1 (en) * 2001-09-11 2003-03-13 Alcatel, Societe Anonyme Electronic device capable of wirelessly transmitting a password that can be used to unlock/lock a password protected electronic device
US6720948B2 (en) * 2001-10-11 2004-04-13 International Business Machines Corporation Method, program, and system for communicating between a pointing device and a host computer
US6624699B2 (en) * 2001-10-25 2003-09-23 Broadcom Corporation Current-controlled CMOS wideband data amplifier circuits
US6816151B2 (en) * 2001-11-09 2004-11-09 Terry L. Dellinger Hand-held trackball computer pointing device
US7421257B1 (en) 2001-11-30 2008-09-02 Stragent, Llc Receiver scheduling in ad hoc wireless networks
SE0104110D0 (en) * 2001-12-06 2001-12-06 Digityper Ab Pointing Device
US20030142069A1 (en) * 2002-01-25 2003-07-31 Gatto Frank P. Hand-held ergonomic computer interface device
US20030142065A1 (en) * 2002-01-28 2003-07-31 Kourosh Pahlavan Ring pointer device with inertial sensors
US20030160762A1 (en) * 2002-02-22 2003-08-28 Ho-Lung Lu Rechargeable wireless mouse
US7295555B2 (en) 2002-03-08 2007-11-13 Broadcom Corporation System and method for identifying upper layer protocol message boundaries
US6778380B2 (en) * 2002-05-06 2004-08-17 William P Murray, Jr. TV mute finger ring
SE523297C2 (en) * 2002-05-28 2004-04-06 Perific Ab A device for inputting control signals to a peripheral unit and a system including such a device
US20040059463A1 (en) * 2002-06-24 2004-03-25 Scriptpro Llc Active control center for use with an automatic dispensing system for prescriptions and the like
US6910601B2 (en) 2002-07-08 2005-06-28 Scriptpro Llc Collating unit for use with a control center cooperating with an automatic prescription or pharmaceutical dispensing system
DE10329028A1 (en) * 2002-07-11 2004-01-29 Ceram Tec Ag Innovative Ceramic Engineering Preparation of piezoelectric multi layer actuators for e.g. injection valves, provided with heat insulation formed by sintering thick coating mixture of inorganic material and organic binder
EP1543457A4 (en) * 2002-07-12 2009-03-25 Privaris Inc Personal authentication software and systems for travel privilege assignation and verification
JP4119187B2 (en) * 2002-07-17 2008-07-16 国立大学法人金沢大学 Input device
EP3547599A1 (en) 2002-08-06 2019-10-02 Apple Inc. Methods for secure enrollment and backup of personal identity credentials into electronic devices
US7411959B2 (en) 2002-08-30 2008-08-12 Broadcom Corporation System and method for handling out-of-order frames
US7934021B2 (en) 2002-08-29 2011-04-26 Broadcom Corporation System and method for network interfacing
US7346701B2 (en) 2002-08-30 2008-03-18 Broadcom Corporation System and method for TCP offload
US7313623B2 (en) 2002-08-30 2007-12-25 Broadcom Corporation System and method for TCP/IP offload independent of bandwidth delay product
US8180928B2 (en) 2002-08-30 2012-05-15 Broadcom Corporation Method and system for supporting read operations with CRC for iSCSI and iSCSI chimney
US20040078792A1 (en) * 2002-10-21 2004-04-22 Microsoft Corporation System and method for selectively deactivating auto-deploy functionality of a software input panel
WO2004044664A1 (en) * 2002-11-06 2004-05-27 Julius Lin Virtual workstation
US7774075B2 (en) * 2002-11-06 2010-08-10 Lin Julius J Y Audio-visual three-dimensional input/output
US6717075B1 (en) * 2003-01-08 2004-04-06 Hewlett-Packard Development Company, L.P. Method and apparatus for a multi-sided input device
US20040179692A1 (en) * 2003-03-11 2004-09-16 David Cheng Personal data entry and authentication device
US20060291797A1 (en) * 2003-05-27 2006-12-28 Leonard Reiffel Multi-imager multi-source multi-use coded data source data input product
CA2724292C (en) 2003-05-30 2014-09-30 Privaris, Inc. An in-circuit security system and methods for controlling access to and use of sensitive data
US7048183B2 (en) 2003-06-19 2006-05-23 Scriptpro Llc RFID rag and method of user verification
US20040256452A1 (en) * 2003-06-19 2004-12-23 Coughlin Michael E. RFID tag and method of user verification
US7230519B2 (en) * 2003-06-19 2007-06-12 Scriptpro Llc RFID tag and method of user verification
US7121427B2 (en) * 2003-07-22 2006-10-17 Scriptpro Llc Fork based transport storage system for pharmaceutical unit of use dispenser
US20050024334A1 (en) * 2003-08-01 2005-02-03 Fengsheng Li Computer mouse with bristles
US7100796B1 (en) 2003-08-08 2006-09-05 Scriptpro Llc Apparatus for dispensing vials
AU2003904317A0 (en) * 2003-08-13 2003-08-28 Securicom (Nsw) Pty Ltd Remote entry system
US20050102163A1 (en) * 2003-11-06 2005-05-12 Coughlin Michael E. Method and system for delivering prescriptions to remote locations for patient retrieval
US7170420B2 (en) * 2003-11-13 2007-01-30 James Phifer Ergonomic television remote control
WO2005086802A2 (en) 2004-03-08 2005-09-22 Proxense, Llc Linked account system using personal digital key (pdk-las)
DE102004013708A1 (en) * 2004-03-18 2005-10-27 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Computer-based interaction device for generating signals supplies signals to be received and evaluated by computer-based applications with two-dimensional/multi-dimensional image displays
US8232862B2 (en) 2004-05-17 2012-07-31 Assa Abloy Ab Biometrically authenticated portable access device
US20060005035A1 (en) * 2004-06-22 2006-01-05 Coughlin Michael E Keystroke input device for use with an RFID tag and user verification system
US7461759B2 (en) * 2004-07-22 2008-12-09 Scriptpro Llc Fork based transport storage system for pharmaceutical unit of use dispenser
US7451409B2 (en) * 2004-08-05 2008-11-11 Ixi Mobile (R & D), Ltd. Embedded user interface system and method for a mobile communication device
US7446753B2 (en) * 2004-09-10 2008-11-04 Hand Held Products, Inc. Hand held computer device
US7175381B2 (en) * 2004-11-23 2007-02-13 Scriptpro Llc Robotic arm for use with pharmaceutical unit of use transport and storage system
US20060164383A1 (en) * 2004-12-16 2006-07-27 Media Lab Europe (In Voluntary Liquidation) Remote controller ring for user interaction
AU2005319019A1 (en) 2004-12-20 2006-06-29 Proxense, Llc Biometric personal data key (PDK) authentication
US7244944B2 (en) * 2005-06-28 2007-07-17 Symbol Technologies, Inc Triggering system and method
US7362174B2 (en) * 2005-07-29 2008-04-22 Broadcom Corporation Current-controlled CMOS (C3MOS) wideband input data amplifier for reduced differential and common-mode reflection
US7598788B2 (en) * 2005-09-06 2009-10-06 Broadcom Corporation Current-controlled CMOS (C3MOS) fully differential integrated delay cell with variable delay and high bandwidth
US20070087838A1 (en) * 2005-09-12 2007-04-19 Jonathan Bradbury Video game media
US20070087837A1 (en) * 2005-09-12 2007-04-19 Jonathan Bradbury Video game consoles
US7883420B2 (en) * 2005-09-12 2011-02-08 Mattel, Inc. Video game systems
US8694435B1 (en) 2005-11-14 2014-04-08 American Express Travel Related Services Company, Inc. System and method for linking point of sale devices within a virtual network
FI118674B (en) * 2005-11-22 2008-02-15 Planmeca Oy Hardware in a dental environment and a method for controlling a hardware device
TWI315481B (en) * 2005-12-09 2009-10-01 Ind Tech Res Inst Wireless inertial input device
US7810504B2 (en) * 2005-12-28 2010-10-12 Depuy Products, Inc. System and method for wearable user interface in computer assisted surgery
US9113464B2 (en) 2006-01-06 2015-08-18 Proxense, Llc Dynamic cell size variation via wireless link parameter adjustment
US11206664B2 (en) 2006-01-06 2021-12-21 Proxense, Llc Wireless network synchronization of cells and client devices on a network
US7188045B1 (en) 2006-03-09 2007-03-06 Dean A. Cirielli Three-dimensional position and motion telemetry input
US7353134B2 (en) * 2006-03-09 2008-04-01 Dean A. Cirielli Three-dimensional position and motion telemetry input
TW200741509A (en) * 2006-04-19 2007-11-01 Kye Systems Corp Finger wearing type input device and input method thereof
US7904718B2 (en) 2006-05-05 2011-03-08 Proxense, Llc Personal digital key differentiation for secure transactions
US7748878B2 (en) * 2006-05-18 2010-07-06 Production Resource Group, Inc. Lighting control system with wireless network connection
KR100826872B1 (en) * 2006-08-30 2008-05-06 한국전자통신연구원 Wearable computer system and method controlling information/service in wearable computer system
US9233301B2 (en) 2006-09-07 2016-01-12 Rateze Remote Mgmt Llc Control of data presentation from multiple sources using a wireless home entertainment hub
US9386269B2 (en) * 2006-09-07 2016-07-05 Rateze Remote Mgmt Llc Presentation of data on multiple display devices using a wireless hub
US8935733B2 (en) * 2006-09-07 2015-01-13 Porto Vinci Ltd. Limited Liability Company Data presentation using a wireless home entertainment hub
US8607281B2 (en) 2006-09-07 2013-12-10 Porto Vinci Ltd. Limited Liability Company Control of data presentation in multiple zones using a wireless home entertainment hub
US8966545B2 (en) * 2006-09-07 2015-02-24 Porto Vinci Ltd. Limited Liability Company Connecting a legacy device into a home entertainment system using a wireless home entertainment hub
US9319741B2 (en) 2006-09-07 2016-04-19 Rateze Remote Mgmt Llc Finding devices in an entertainment system
US9269221B2 (en) 2006-11-13 2016-02-23 John J. Gobbi Configuration of interfaces for a location detection system and application
US9486703B2 (en) * 2007-01-31 2016-11-08 Broadcom Corporation Mobile communication device with game application for use in conjunction with a remote mobile communication device and methods for use therewith
BRPI0706343A2 (en) * 2007-05-10 2008-12-30 Andre Luis Cortes Hand-tied mouse for quick access
US20110210931A1 (en) * 2007-08-19 2011-09-01 Ringbow Ltd. Finger-worn device and interaction methods and communication methods
WO2009024971A2 (en) * 2007-08-19 2009-02-26 Saar Shai Finger-worn devices and related methods of use
US8659427B2 (en) * 2007-11-09 2014-02-25 Proxense, Llc Proximity-sensor supporting multiple application services
US8171528B1 (en) 2007-12-06 2012-05-01 Proxense, Llc Hybrid device having a personal digital key and receiver-decoder circuit and methods of use
US9251332B2 (en) 2007-12-19 2016-02-02 Proxense, Llc Security system and method for controlling access to computing resources
WO2009102979A2 (en) 2008-02-14 2009-08-20 Proxense, Llc Proximity-based healthcare management system with automatic access to private information
WO2009126732A2 (en) 2008-04-08 2009-10-15 Proxense, Llc Automated service-based order processing
JP2009290329A (en) * 2008-05-27 2009-12-10 Toshiba Corp Ip communication system, server unit, terminal device and authentication method
GB2465782B (en) 2008-11-28 2016-04-13 Univ Nottingham Trent Biometric identity verification
US10257191B2 (en) 2008-11-28 2019-04-09 Nottingham Trent University Biometric identity verification
US8289162B2 (en) * 2008-12-22 2012-10-16 Wimm Labs, Inc. Gesture-based user interface for a wearable portable device
CA2776877C (en) 2009-10-06 2017-07-18 Leonard Rudy Dueckman A method and an apparatus for controlling a machine using motion based signals and inputs
US8305251B2 (en) * 2010-02-09 2012-11-06 National Taiwan University Wireless remote control system
US20130328770A1 (en) * 2010-02-23 2013-12-12 Muv Interactive Ltd. System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US20120050164A1 (en) * 2010-02-24 2012-03-01 Martel Burton M Computer Mouse Input Device
US9418205B2 (en) 2010-03-15 2016-08-16 Proxense, Llc Proximity-based system for automatic application or data access and item tracking
US9310887B2 (en) 2010-05-06 2016-04-12 James W. Wieder Handheld and wearable remote-controllers
US8570273B1 (en) * 2010-05-20 2013-10-29 Lockheed Martin Corporation Input device configured to control a computing device
US9322974B1 (en) 2010-07-15 2016-04-26 Proxense, Llc. Proximity-based system for object tracking
US9335793B2 (en) * 2011-01-31 2016-05-10 Apple Inc. Cover attachment with flexible display
US8857716B1 (en) 2011-02-21 2014-10-14 Proxense, Llc Implementation of a proximity-based system for object tracking and automatic application initialization
US9007302B1 (en) 2011-11-11 2015-04-14 Benjamin D. Bandt-Horn Device and user interface for visualizing, navigating, and manipulating hierarchically structured information on host electronic devices
US8933912B2 (en) 2012-04-02 2015-01-13 Microsoft Corporation Touch sensitive user interface with three dimensional input sensor
CN105473021B (en) 2013-03-15 2018-01-19 智能专利有限责任公司 wearable device and related system
US9405898B2 (en) 2013-05-10 2016-08-02 Proxense, Llc Secure element as a digital pocket
CN203986449U (en) * 2014-04-28 2014-12-10 京东方科技集团股份有限公司 A projection ring
TWD171525S (en) * 2014-06-24 2015-11-01 鴻海精密工業股份有限公司 Ring reader
US20150204108A1 (en) * 2014-07-14 2015-07-23 Lear Corporation Key Fob Having Electrical Port Concealed by Removable Key
US9652038B2 (en) 2015-02-20 2017-05-16 Sony Interactive Entertainment Inc. Magnetic tracking of glove fingertips
US20160306421A1 (en) * 2015-04-16 2016-10-20 International Business Machines Corporation Finger-line based remote control
US11140171B1 (en) 2015-06-05 2021-10-05 Apple Inc. Establishing and verifying identity using action sequences while protecting user privacy
US10868672B1 (en) 2015-06-05 2020-12-15 Apple Inc. Establishing and verifying identity using biometrics while protecting user privacy
USD859412S1 (en) * 2017-08-18 2019-09-10 Practech, Inc. Wearable or handheld hybrid smart barcode scanner
US10698498B2 (en) 2017-11-30 2020-06-30 Komodo OpenLab Inc. Configurable device switching mechanism that enables seamless interactions with multiple devices
US20190187813A1 (en) * 2017-12-19 2019-06-20 North Inc. Wearable electronic devices having a multi-use single switch and methods of use thereof
US10579099B2 (en) * 2018-04-30 2020-03-03 Apple Inc. Expandable ring device

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4189712A (en) * 1977-11-09 1980-02-19 Lemelson Jerome H Switch and lock activating system and method
FR2448824A1 (en) * 1979-02-06 1980-09-05 Telediffusion Fse VIDEOTEX SYSTEM PROVIDED WITH INFORMATION ACCESS CONTROL MEANS
US4453161A (en) * 1980-02-15 1984-06-05 Lemelson Jerome H Switch activating system and method
US4458238A (en) * 1982-01-28 1984-07-03 Learn Dale H Hand held data entry unit
JPS59174938A (en) * 1983-03-25 1984-10-03 Canon Inc Key device for moving cursor
US4578674A (en) * 1983-04-20 1986-03-25 International Business Machines Corporation Method and apparatus for wireless cursor position control
DE3482904D1 (en) * 1983-05-06 1990-09-13 Seiko Instr Inc DATA STORAGE DISPLAY DEVICE, e.g. A WRISTWATCH.
US4628541A (en) * 1983-08-10 1986-12-09 International Business Machines Corporation Infra-red data communications system for coupling a battery powered data entry device to a microcomputer
JPS60225910A (en) * 1984-04-24 1985-11-11 Fuji Electric Co Ltd Operation monitor device
US4682159A (en) * 1984-06-20 1987-07-21 Personics Corporation Apparatus and method for controlling a cursor on a computer display
EP0170716B1 (en) * 1984-08-08 1990-10-24 Kabushiki Kaisha Toshiba Information medium
JPS6194134A (en) * 1984-10-13 1986-05-13 Naretsuji KK Radio mouse device
US4654648A (en) * 1984-12-17 1987-03-31 Herrington Richard A Wireless cursor control system
GB2173597A (en) * 1985-04-10 1986-10-15 Taylor Miller Limited Position sensing touch panel
US4722625A (en) * 1985-09-26 1988-02-02 Triune Automated Painting Systems Remote control device for powered painting system
US4763291A (en) * 1986-03-06 1988-08-09 Project Benjamin, Ltd. Remote display device for a microcomputer
DE3614744C2 (en) * 1986-04-30 1994-11-10 Koenig & Bauer Ag Device for controlling a rotary printing machine
US4808995A (en) * 1986-05-02 1989-02-28 Stanley Automatic Openers Accessory-expandable, radio-controlled, door operator with multiple security levels
US4823311A (en) * 1986-05-30 1989-04-18 Texas Instruments Incorporated Calculator keyboard with user definable function keys and with programmably alterable interactive labels for certain function keys
FR2602875B1 (en) * 1986-08-18 1989-02-17 Inst Francais Du Petrole METHOD AND DEVICE FOR INITIALIZING APPARATUS FOR DATA ACQUISITION AND IN PARTICULAR SEISMIC DATA
US4951249A (en) * 1986-10-24 1990-08-21 Harcom Security Systems Corp. Method and apparatus for controlled access to a computer system
ES2022557B3 (en) * 1986-11-20 1991-12-01 Ernst Peiniger Gmbh Unternehmen Fur Bautenschutz SAFETY INSTALLATION FOR A DEVICE OPERABLE BY A SERVICE PERSON
US4844475A (en) * 1986-12-30 1989-07-04 Mattel, Inc. Electronic interactive game apparatus in which an electronic station responds to play of a human
US4988981B1 (en) * 1987-03-17 1999-05-18 Vpl Newco Inc Computer data entry and manipulation apparatus and method
US4763993A (en) * 1987-04-30 1988-08-16 N-View Corporation Liquid crystal display for projection systems
FR2615985B1 (en) * 1987-05-26 1992-01-24 Cogema SYSTEM FOR IDENTIFYING INDIVIDUALS AUTHORIZED TO ACCESS A RESERVED AREA
JPS6453395A (en) * 1987-08-25 1989-03-01 Mitsubishi Electric Corp Semiconductor memory device
US4905001A (en) * 1987-10-08 1990-02-27 Penner Henry C Hand-held finger movement actuated communication devices and systems employing such devices
JPH01175057A (en) * 1987-12-28 1989-07-11 Toshiba Corp Dynamic control method for security
US4924216A (en) * 1988-02-12 1990-05-08 Acemore International Ltd. Joystick controller apparatus
US4922236A (en) * 1988-04-25 1990-05-01 Richard Heady Fiber optical mouse
US4954817A (en) * 1988-05-02 1990-09-04 Levine Neil A Finger worn graphic interface device
US4927987A (en) * 1989-02-24 1990-05-22 Kirchgessner Steven J Directional control device
US4961224A (en) * 1989-03-06 1990-10-02 Darby Yung Controlling access to network resources
US4972074A (en) * 1989-04-10 1990-11-20 Scott M. Wright Optical attenuator movement detection system
US5267181A (en) * 1989-11-03 1993-11-30 Handykey Corporation Cybernetic interface for a computer that uses a hand held chord keyboard
EP0500794A4 (en) * 1989-11-22 1993-02-03 David C. Russell Computer control system
US5581484A (en) * 1994-06-27 1996-12-03 Prince; Kevin R. Finger mounted computer input device
US5832296A (en) * 1995-04-26 1998-11-03 Interval Research Corp. Wearable context sensitive user interface for interacting with plurality of electronic devices of interest to the user

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040017359A1 (en) * 2000-04-06 2004-01-29 Bohn David D. User interface utilizing a computer pointing device with infrared bridge
GB2402591A (en) * 2002-03-25 2004-12-08 David Michael King GUI and support hardware for maintaining long-term personal access to the world
WO2003081414A1 (en) * 2002-03-25 2003-10-02 David Michael King GUI and support hardware for maintaining long-term personal access to the world
US7756988B2 (en) 2002-06-28 2010-07-13 Pitney Bowes Inc. System and method for selecting an external user interface using spatial information
US6920557B2 (en) 2002-06-28 2005-07-19 Pitney Bowes Inc. System and method for wireless user interface for business machines
US7225262B2 (en) 2002-06-28 2007-05-29 Pitney Bowes Inc. System and method for selecting an external user interface using spatial information
US20070208433A1 (en) * 2002-06-28 2007-09-06 Pitney Bowes Inc. System and Method for Selecting an External User Interface Using Spatial Information
US8806039B2 (en) 2002-06-28 2014-08-12 Pitney Bowes Inc. System and method for selecting an external user interface using spatial information
US20070222759A1 (en) * 2006-03-23 2007-09-27 Barnes Cody C Computer pointing device
US7649536B1 (en) * 2006-06-16 2010-01-19 Nvidia Corporation System, method, and computer program product for utilizing natural motions of a user to display intuitively correlated reactions
WO2008094383A1 (en) * 2007-01-29 2008-08-07 Fred Bassali Advanced vehicular universal transmitter using time domain with vehicle location logging system
WO2016008041A1 (en) * 2014-07-15 2016-01-21 Synaptive Medical (Barbados) Inc. Finger controlled medical device interface
GB2545117A (en) * 2014-07-15 2017-06-07 Synaptive Medical Barbados Inc Finger controlled medical device interface
US9974622B2 (en) 2014-07-15 2018-05-22 Synaptive Medical (Barbados) Inc. Finger controlled medical device interface
GB2545117B (en) * 2014-07-15 2020-10-14 Synaptive Medical Barbados Inc Finger controlled medical device interface
WO2018217436A1 (en) * 2017-05-26 2018-11-29 Covidien Lp Controller for imaging device
EP3629973A4 (en) * 2017-05-26 2021-01-27 Covidien LP Controller for imaging device
US11583356B2 (en) * 2017-05-26 2023-02-21 Covidien Lp Controller for imaging device

Also Published As

Publication number Publication date
EP0500794A4 (en) 1993-02-03
EP0500794A1 (en) 1992-09-02
US5729220A (en) 1998-03-17
US6441770B2 (en) 2002-08-27
JPH05502130A (en) 1993-04-15
US6201484B1 (en) 2001-03-13
AU7788191A (en) 1991-06-13
WO1991007826A1 (en) 1991-05-30
US5481265A (en) 1996-01-02

Similar Documents

Publication Publication Date Title
US6441770B2 (en) Ergonomic customizeable user/computer interface devices
US5841425A (en) Ambidextrous computer input device
US20020163495A1 (en) Multi-functional ergonomic interface
US20050141752A1 (en) Dynamically modifiable keyboard-style interface
KR102425307B1 (en) Data processing terminals and related methods in lock, intermediate, and unlock modes
US11223719B2 (en) Mobile communication terminals, their directional input units, and methods thereof
US8736420B2 (en) Methods, systems, and products for controlling devices
US6601129B1 (en) Interface device between PC and keyboard enabling switching of data
US7877692B2 (en) Accessible display system
CN108985034A (en) An unlocking method and terminal device
US20040098481A1 (en) Computer-user authentication system, method and program therefor
US20030090472A1 (en) Method of controlling function keys of a local computer for each corresponding program in a remote control apparatus
US20140040986A1 (en) Protocol to Prevent Replay Attacks on Secured Wireless Transactions
US20030132910A1 (en) Enhanced computer peripheral input device
US20040203480A1 (en) Configuration and management of human interface and other attached devices through bi-directional radio frequency link
WO1993004424A1 (en) Remote sensing computer pointer
US20070002017A1 (en) Device, system and method for wireless communication and cursor pointing
WO2005064439A2 (en) Dynamically modifiable virtual keyboard or virtual mouse interface
KR20050096558A (en) Wireless data input device capable of operating without battery
JPH11272409A (en) Input device capable of operating a plurality of terminals
JP2716983B2 (en) Patient monitoring remote control
KR100488447B1 (en) Wireless keyboard system
JPH10187322A (en) System for inputting data
JP2000242426A (en) Infrared mouse device
JPH0816492A (en) Information processor

Legal Events

Date Code Title Description
AS Assignment

Owner name: TRANSFORMING TECHNOLOGIES, LLC, VIRGINIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RUSSEL, DAVID C.;REEL/FRAME:012732/0291

Effective date: 20010516

AS Assignment

Owner name: TRANSFORMING TECHNOLOGIES, INC., VIRGINIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TRANSFORMING TECHNOLOGIES, L.L.C.;REEL/FRAME:012735/0930

Effective date: 20010529

AS Assignment

Owner name: PRIVARIS, INC., DELAWARE

Free format text: CHANGE OF NAME;ASSIGNOR:TRANSFORMING TECHNOLOGIES, INC.;REEL/FRAME:013467/0692

Effective date: 20021010

AS Assignment

Owner name: PRIVARIS, INC., VIRGINIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RUSSELL, DAVID C.;REEL/FRAME:016686/0183

Effective date: 20040806

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: SILICON VALLEY BANK, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:PRIVARIS, INC.;REEL/FRAME:020234/0001

Effective date: 20071108

AS Assignment

Owner name: HARBERT VENTURE PARTNERS, LLC, VIRGINIA

Free format text: SECURITY AGREEMENT;ASSIGNOR:PRIVARIS, INC.;REEL/FRAME:020092/0139

Effective date: 20071108

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

SULP Surcharge for late payment

Year of fee payment: 7

AS Assignment

Owner name: PRIVARIS, INC., VIRGINIA

Free format text: RELEASE;ASSIGNOR:SILICON VALLEY BANK;REEL/FRAME:025317/0924

Effective date: 20101123

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20140827

AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PRIVARIS INC.;REEL/FRAME:034648/0239

Effective date: 20141014