Publication numberUS20010032278 A1
Publication typeApplication
Application numberUS 09/780,316
Publication dateOct 18, 2001
Filing dateFeb 9, 2001
Priority dateOct 7, 1997
Also published asUS7853645, US20050114444, US20090030977, US20090063628, US20090082686, US20090157807, US20110301957, US20120005268, US20130304806, US20150312336
InventorsStephen Brown, David Brown
Original AssigneeBrown Stephen J., Brown David W.
External Links: USPTO, USPTO Assignment, Espacenet
Remote generation and distribution of command programs for programmable devices
US 20010032278 A1
Abstract
A control system for generating and distributing command programs for programmable devices. The control system comprises a server system, a remote system, a programmable device, and a communications system. The server system runs a controlling software program that generates command programs based on an application program. The remote system runs an industry standard browser program. The programmable device performs tasks based on command programs. The communications system allows the industry standard browser program and the device program to communicate with the controlling software program. An end user controls the controlling software program using the remote system to generate a desired command program based on a desired task and downloads the desired command program to the programmable device through the communications system.
Images(13)
Claims(1)
I claim:
1. A system for generating and distributing command programs for programmable devices, comprising:
a server system for running a controlling software program that generates command programs based on an application program;
a remote system for running an industry standard browser program;
a programmable device that performs tasks based on command programs;
a communications system that allows the industry standard browser program and the device program to communicate with the controlling software program; wherein
an end user can control the controlling software program using the remote system to generate a desired command program based on a desired task and download the desired command program to the programmable device through the communications system.
Description
    RELATED APPLICATIONS
  • [0001]
    This application claims priority to U.S. Provisional Patent Application Ser. No. 60/181,577, which was filed on Feb. 10, 2000.
  • TECHNICAL FIELD
  • [0002]
    The present invention relates to control systems for programmable devices and, more particularly, to the generation and distribution of control commands that control the operation of programmable devices.
  • BACKGROUND OF THE INVENTION
  • [0003]
    A wide variety of devices contain a combination of software and hardware that control the operation of the device. These devices will be referred to herein as programmable devices. Programmable devices include a wide variety of items such as toys, industrial motion control systems, exercise equipment, medical devices, household appliances, HVAC systems, and the like.
  • [0004]
    A common characteristic of such programmable devices is that they are programmed to perform a limited number of predetermined tasks. For example, a toy may be programmed to speak, move, or react to external stimulation in a predetermined manner. An industrial motion control system is programmed to assemble parts in a precise, repetitive manner. A household appliance may be programmed to perform one or more cooking or cleaning tasks. An HVAC system will be programmed to control a heating element and heat distribution systems to obtain a desired air temperature.
  • [0005]
    Some programmable devices contain means for allowing the end user to control the functionality of the system to a limited degree. In the context of a toy, the end user may operate a switch or joystick to select a manner of movement. An HVAC system will normally allow the end user to set the desired temperature. In most cases, however, the input of the end user is limited to changing variables or selecting from among a plurality of stand-alone programs.
  • [0006]
    Programmable devices thus take many forms but have certain common characteristics. A programmable device includes some form of memory for storing control commands that define a predetermined command program. The command program may accept input from the user or contain discrete sub-programs from which the end user may select, but the end user may not modify the command program.
  • [0007]
    A programmable device further comprises a processor capable of executing the command program and generating control signals. To reduce manufacturing costs, the processor is normally an inexpensive dedicated processor with relatively limited capabilities and resources.
  • [0008]
    A programmable device will also comprise control hardware that performs a desired task as defined by the control signals. The control hardware can be as simple as an LED or speaker that generates light or sound or as complicated as a multi-axis industrial motion control device that performs a complex welding procedure.
  • [0009]
    The varying degrees of technical skill possessed by the people involved in the design, manufacture, and use of a typical programmable device are particularly significant to the present invention. The user of a programmable device must be assumed to have little or no capability to create the command programs necessary to operate that device. Certainly a typical child using a toy will not have the skills necessary to create a command program for that toy. Even a highly trained technician operating an industrial motion control system will likely not have the skill to program the system to perform a desired task.
  • [0010]
    Accordingly, in this application the term “end user” will refer to a person who uses a programmable device but cannot be assumed to have the expertise to create a command program for that programmable device.
  • [0011]
    In contrast, the term “programmer” will be used herein to refer to a person having the expertise to create a command program for a particular programmable device. The skill level and background of the programmer will vary depending upon the specific programmable device; the term programmer is thus not intended to define a particular level of expertise, but is instead defined in relation to the specific programmable device.
  • [0012]
    With some programmable devices, the programmer has no direct contact with the end user. For example, a programmer of a toy or household appliance will typically not have direct contact with the end user. A programmer of an HVAC system or industrial motion control system may, on the other hand, have contact with the end user.
  • [0013]
    Without direct contact with the end user, the programmer must anticipate what task the end user will desire of the programmable device. Even with direct contact, the programmer may not fully comprehend the desired task, or the desired task may change after the command program has been created. In either case, obtaining the services of the programmer to modify the command program is likely to be difficult and expensive, if not impossible.
  • [0014]
    In general, while the end user may not be able to create a command program, the end user will be able to define the desired task. A technician operating an industrial motion control system will likely be able to observe that a change in the operation of the system will increase product yield or speed up the manufacturing process. Even a child might be able to determine that a doll that walks should also be able to jump.
  • [0015]
    The term “end user” may include any other person involved with a programmable device without the technical expertise to qualify as a programmer of that device. For example, a medical device may be used by a patient and controlled by a caregiver, neither of which would have the expertise to be considered a programmer; both the patient and the caregiver would be considered end users in the present application.
  • [0016]
    The purpose of the present invention is to facilitate the generation and distribution of command programs for programmable devices. In particular, the present invention is designed to allow an end user of a particular programmable device to define a desired task, interact with a remote computer over a communications network to generate a command program, and then download the command program into the programmable device over the communications network.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0017]
    FIG. 1 is a block diagram of a networked system according to a preferred embodiment of the invention.
  • [0018]
    FIG. 2 is a block diagram illustrating the interaction of the components of the system of FIG. 1.
  • [0019]
    FIG. 3 is a perspective view of a remotely programmable talking toy of the system of FIG. 1.
  • [0020]
    FIG. 4 is a block diagram illustrating the components of the talking toy of FIG. 3.
  • [0021]
    FIG. 5 is a script entry screen according to the preferred embodiment of the invention.
  • [0022]
    FIG. 6 is a listing of a sample script program according to the preferred embodiment of the invention.
  • [0023]
    FIG. 7 is a script assignment screen according to the preferred embodiment of the invention.
  • [0024]
    FIG. 8 is a flow chart illustrating the steps included in a software application executed by the server of FIG. 1 according to the preferred embodiment of the invention.
  • [0025]
    FIG. 9 is a flow chart illustrating the steps included in a control program executed by the talking toy of FIG. 3 according to the preferred embodiment of the invention.
  • [0026]
    FIG. 10 is a flow chart illustrating the steps included in the script program of FIG. 6.
  • [0027]
    FIG. 11 is a block diagram illustrating the interaction of the server of FIG. 1 with the talking toy of FIG. 3 according to a second embodiment of the invention.
  • [0028]
    FIG. 12 is a script entry screen according to the second embodiment of the invention.
  • [0029]
    FIG. 13 is a listing of a generic script program according to the second embodiment of the invention.
  • [0030]
    FIG. 14 is a listing of a custom script program according to the second embodiment of the invention.
  • [0031]
    FIG. 15 is a flow chart illustrating the steps included in a software application executed by the server of FIG. 1 according to the second embodiment of the invention.
  • [0032]
    FIG. 16 is a script entry screen according to an alternative embodiment of the invention.
  • [0033]
    FIG. 17 is a script entry screen according to another embodiment of the invention.
  • DETAILED DESCRIPTION
  • [0034]
    The present invention may be embodied in any programmable device. The present invention will be described below in the context of a toy that may be programmed to speak or move in a desired fashion. The present invention has application to other programmable devices, and the scope of the present invention should be determined by the claims appended hereto and not the following discussion.
  • [0035]
    The invention may be embodied as a networked system including one or more programmable toys that can be controlled to perform a desired task such as move and/or communicate messages to end users. In contrast to conventional programmable toys whose desired task is programmed during manufacture or through the insertion of external media, the programmable toys of the present invention are programmed remotely through the use of script programs. The script programs allow flexible and dynamic updating of the movement of or messages delivered by the toys, as well as convenient tailoring of toy movement and/or the communicated messages to the needs of particular end users.
  • [0036]
    In an exemplary embodiment of the invention disclosed below, the end users as described above are patients and healthcare providers, and the programmable toys are remotely programmed to encourage healthy behavior in the patients. The terms “patient” and “healthcare provider” will be used below interchangeably with the term “end user”.
  • [0037]
    In the present exemplary embodiment, the programmable toys may be programmed to encourage children to take their medicine or to tolerate difficult healthcare regimens. The encouragement can take the form of a request audibly delivered to the patient in the form of speech and feedback in the form of movement when the request is followed.
  • [0038]
    As generally discussed throughout this application, the system of the present invention is equally well suited for purposes other than healthcare, such as industrial motion control systems, exercise equipment, HVAC systems, advertising, home appliances, education, entertainment, or any other application which involves the control of programmable devices to perform a desired task for an end user.
  • [0039]
    The preferred embodiment of the invention is illustrated in FIGS. 1-7. Referring to FIG. 1, a networked system 16 includes a server 18 and a workstation 20 connected to server 18 through a communication network 24. Server 18 is preferably a world wide web server and communication network 24 is preferably the Internet. It will be apparent to one skilled in the art that server 18 may comprise a single stand-alone computer or multiple computers distributed throughout a network. Workstation 20 is preferably a personal computer, remote terminal, or web TV unit connected to server 18 via the Internet. Workstation 20 functions as a remote interface for entering in server 18 the end task to be performed for the benefit of the end user.
  • [0040]
    System 16 also includes first and second programmable toys 26 and 28. Each programmable toy interacts with a patient end user in accordance with script programs received from server 18. Each programmable toy is connected to server 18 through communication network 24, preferably the Internet. Alternatively, the programmable toys may be placed in communication with server 18 via wireless communication networks, cellular networks, telephone networks, or any other network which allows each programmable toy to exchange data with server 18. For clarity of illustration, only two programmable toys are shown in FIG. 1. It is to be understood that system 16 may include any number of programmable toys for communicating messages to any number of patient end users.
  • [0041]
    In general, a healthcare provider end user will operate the workstation 20, a programmer will design and operate the server 18, and the patient end user will use the toys 26 and 28.
  • [0042]
    FIG. 2 shows server 18, workstation 20, and programmable toy 26 in greater detail. Server 18 includes a database 30 for storing script programs 32. The script programs are executed by the programmable toys to communicate messages to the patients. Database 30 further includes a look-up table 34. Table 34 contains a list of the patients who are to receive messages and, for each patient end user, a unique identification code and a respective pointer to the script program assigned to that end user. Each programmable toy is designed to execute assigned script programs which it receives from server 18.
  • [0043]
    FIGS. 3-4 show the structure of each programmable toy according to the preferred embodiment. For clarity, only programmable toy 26 is illustrated since each programmable toy of the exemplary preferred embodiment has substantially identical structure to toy 26. Referring to FIG. 3, toy 26 is preferably embodied as a doll, such as a teddy bear. Alternatively, toy 26 may be embodied as an action figure, robot, or any other desired toy.
  • [0044]
    Toy 26 includes a modem jack 46 for connecting the toy to a telephone jack 22 through a connection cord 48. Toy 26 also includes first and second user control buttons 50 and 52. Button 50 is pressed to instruct the toy to execute a script program. Button 52 is pressed to instruct the toy to establish a communication link to the server and download a new script program. In alternative embodiments, the control buttons may be replaced by switches, keys, sensors, or any other type of interface suitable for receiving user input.
  • [0045]
    FIG. 4 is a schematic block diagram illustrating the internal components of toy 26. Toy 26 includes an audio processor chip 54, which is preferably an RSC-164 chip commercially available from Sensory Circuits Inc. of 1735 N. First Street, San Jose, Calif. 95112. Audio processor chip 54 has a microcontroller 56 for executing script programs received from the server. A memory 58 is connected to microcontroller 56. Memory 58 stores the end user's unique identification code, script programs received from the server, and a script interpreter used by microcontroller 56 to execute the script programs.
  • [0046]
    The script interpreter translates script commands into the native processor code of microcontroller 56. Specific techniques for translating and executing script commands in this manner are well known in the art. Memory 58 also stores a control program executed by microcontroller 56 to perform various control functions which are described in the operation section below. Memory 58 is preferably a non-volatile memory, such as a serial EEPROM.
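The interpretation step described above can be sketched in a few lines. The following Python sketch is purely illustrative: the actual interpreter runs as native code on the microcontroller, and the function and callback names here are invented, not taken from the patent. It parses one command per line in the SPEAK/RECOGNIZE/DELAY format described later in Table 1 and dispatches each to a handler.

```python
def interpret(script_text, speak, recognize, delay):
    """Execute a script program one command per line.

    Each line has the form 'COMMAND: arguments' per the script
    file format described in the specification; the speak,
    recognize, and delay callables stand in for the toy's
    speech-synthesis, speech-recognition, and timer hardware.
    """
    for line in script_text.splitlines():
        line = line.strip()
        if not line:
            continue
        command, _, argument = line.partition(":")
        argument = argument.strip()
        if command == "SPEAK":
            speak(argument)          # synthesize the words
        elif command == "RECOGNIZE":
            recognize(argument)      # block until the word is heard
        elif command == "DELAY":
            delay(float(argument))   # pause for t seconds
        else:
            raise ValueError(f"unknown script command: {command}")
```

The dispatch-per-line structure mirrors the one-command-per-line file format, so the interpreter needs no lookahead or parse tree.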
  • [0047]
    Toy 26 also includes a modem 85 which is connected between microcontroller 56 and modem jack 46. Modem 85 operates under the control of microcontroller 56 to establish communication links to the server through the communication network and to exchange data with the server. The data includes the end user's unique identification code which modem 85 transmits to the server, as well as assigned script programs which modem 85 receives from the server. Modem 85 is preferably a complete 28.8K modem commercially available from Cermetek, although any suitable modem may be used.
  • [0048]
    Toy 26 further includes a speaker 64 and a microphone 66. Audio processor chip 54 has built in speech synthesis functionality for audibly communicating messages and prompts to an end user through speaker 64. For speech synthesis, chip 54 includes a digital to analog converter (DAC) 60 and an amplifier 62. DAC 60 and amplifier 62 drive speaker 64 under the control of microcontroller 56 to communicate the messages and prompts.
  • [0049]
    Audio processor chip 54 also has built in speech recognition functionality for recognizing responses spoken into microphone 66. Audio signals received through microphone 66 are converted to electrical signals and sent to a preamp and gain control circuit 68. Circuit 68 is controlled by an automatic gain control circuit 70, which is in turn controlled by microcontroller 56. After being amplified by preamp 68, the electrical signals enter chip 54 and pass through a multiplexer 72 and an analog to digital converter (ADC) 74. The resulting digital signals pass through a digital logic circuit 76 and enter microcontroller 56 for speech recognition.
  • [0050]
    Audio processor chip 54 also includes a RAM 80 for short term memory storage and a ROM 82 which stores audio sounds for speech synthesis and programs executed by microcontroller 56 to perform speech recognition and speech synthesis. Chip 54 operates at a clock speed determined by a crystal 84. Chip 54 further includes a clock 78 which provides the current date and time to microcontroller 56. Microcontroller 56 is also connected to control buttons 50 and 52 to receive user input. Toy 26 is preferably powered by one or more batteries (not shown). Alternatively, the toy may be powered by a standard wall outlet. Both methods for supplying power to a toy are well known in the art.
  • [0051]
    The toy 26 further comprises a motion system 79 that receives control signals from the microcontroller 56. The motion system 79 can be similar to the motion system used in the “FURBY” doll; this system 79 allows the toy 26 to shake, move its hands and feet, and open and close its mouth and eyes. The motion system 79 is well-known in the art, but is conventionally preprogrammed at the factory for particular ranges and sequences of movement.
  • [0052]
    Referring again to FIG. 2, server 18 includes a controlling software application 36 which is executed by server 18 to perform the various functions described below. The controlling software application 36 may be a system for generating a sequence of control commands based on an application program for motion control systems such as is disclosed in U.S. Pat. Nos. 5,867,385 and 5,691,897 to Brown et al.
  • [0053]
    The controlling software application 36 includes a script generator 38 and a script assignor 40. Script generator 38 is designed to generate script programs 32 from script information entered through workstation 20. The script programs 32 are a specific type of command program such as those typically executed by programmable devices. The script programs 32 contain the information necessary for the microcontroller 56 to cause the toy 26 to perform a desired task.
  • [0054]
    The script information is entered through a script entry screen 42. In the preferred embodiment, script entry screen 42 is implemented as a web page on server 18. Workstation 20 includes a web browser for accessing the web page to enter the script information.
  • [0055]
    FIG. 5 illustrates a sample script entry screen 42 as it appears on workstation 20. Screen 42 includes a script name field 86 for specifying the name of a script program to be generated. Screen 42 also includes entry fields 88 for entering information defining the desired task, such as a message containing instructions from the healthcare provider end user to be communicated to the patient end user and a movement to be performed when the patient end user complies with the instructions.
  • [0056]
    FIG. 5 illustrates an exemplary set of statements which encourage the end user to comply with his or her diabetes care regimen. However, it is to be understood that any type of desired task may be entered in screen 42, including movement, sounds, or other messages such as advertisements, educational messages, and entertainment messages. Screen 42 further includes a CREATE SCRIPT button 90 for instructing the script generator to generate a script program from the information entered in screen 42. Screen 42 also includes a CANCEL button 92 for canceling the information entered.
  • [0057]
    In the preferred embodiment, each script program created by the script generator conforms to the standard file format used on UNIX systems. In this format, each command is listed in upper case and followed by a colon. Every line in the script program is terminated by a linefeed character {LF}, and only one command is placed on each line. The last character in the script program is a UNIX end-of-file character {EOF}. Table 1 shows an exemplary listing of the script commands used in the preferred embodiment of the invention.
    TABLE 1
    SCRIPT COMMANDS
    Command                  Description
    SPEAK: {words} {LF}      Synthesize the words following the SPEAK command.
    RECOGNIZE: {word} {LF}   Recognize the word following the RECOGNIZE command.
    DELAY: t {LF}            Wait a period of seconds specified by time parameter t.
  • [0058]
    The script commands illustrated in Table 1 are representative of the preferred embodiment and are not intended to limit the scope of the invention. After consideration of the ensuing description, it will be apparent to one skilled in the art that many other suitable scripting languages and sets of script commands may be used to implement the invention.
  • [0059]
    Script generator 38 preferably stores a script program template which it uses to create each script program. To generate a script program, script generator 38 inserts into the template the information entered in screen 42. For example, FIG. 6 illustrates a sample script program created by the script generator from the script information shown in FIG. 5. The script program includes speech commands to synthesize the phrases or statements entered in fields 88. The steps included in the script program are also shown in the flow chart of FIG. 10 and will be discussed in the operation section below.
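A minimal sketch of this template-filling step, under the assumption that each entered statement becomes a SPEAK command separated by DELAY commands (as the operation section describes); the function name and parameters are hypothetical, not from the patent.

```python
def generate_script(statements, pause_seconds=5):
    """Build a script program from the statements entered in the
    script entry screen.

    Each statement becomes a SPEAK command; consecutive statements
    are separated by DELAY commands so the listener has time to
    absorb each one. The result follows the one-command-per-line,
    linefeed-terminated format of Table 1.
    """
    lines = []
    for i, statement in enumerate(statements):
        if i > 0:
            lines.append(f"DELAY: {pause_seconds}")
        lines.append(f"SPEAK: {statement}")
    return "\n".join(lines) + "\n"
```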
  • [0060]
    Referring again to FIG. 2, script assignor 40 assigns script programs 32 to the patient end users. Script programs 32 are assigned in accordance with script assignment information entered through workstation 20. The script assignment information is entered through a script assignment screen 44, which is preferably implemented as a web page on server 18.
  • [0061]
    FIG. 7 illustrates a sample script assignment screen 44 as it appears on workstation 20. Screen 44 includes check boxes 94 for selecting a script program to be assigned and check boxes 96 for selecting the patient end users to whom the script program is to be assigned. Screen 44 also includes an ASSIGN SCRIPT button 100 for entering the assignments. When button 100 is pressed, the script assignor creates and stores for each patient end user selected in check boxes 96 a respective pointer to the script program selected in check boxes 94. Each pointer is stored in the look-up table of the database. Screen 44 further includes an ADD SCRIPT button 98 for adding a new script program and a DELETE SCRIPT button 102 for deleting a script program.
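The assignment bookkeeping described above amounts to maintaining a look-up table of pointers. The sketch below models it with plain Python dictionaries; the class and method names are invented for illustration, and a script name stands in for the database pointer.

```python
class ScriptAssignor:
    """Toy model of the script assignor and look-up table 34."""

    def __init__(self):
        self.scripts = {}        # script name -> script program text
        self.lookup_table = {}   # user id code -> assigned script name

    def add_script(self, name, program):
        # corresponds to creating a script via the ADD SCRIPT button
        self.scripts[name] = program

    def assign(self, script_name, user_ids):
        # store, for each selected end user, a pointer (here, the
        # script name) to the selected script program
        for uid in user_ids:
            self.lookup_table[uid] = script_name

    def script_for(self, user_id):
        # follow the stored pointer to the assigned program
        return self.scripts[self.lookup_table[user_id]]
```

Because each user holds a pointer rather than a copy, reassigning a user or editing a script takes effect on the user's next download without touching other entries.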
  • [0062]
    The operation of the preferred embodiment is illustrated in FIGS. 1-10. FIG. 8 is a flow chart illustrating the steps included in the software application executed by server 18. In step 202, server 18 determines if new script information has been entered through script entry screen 42. If new script information has not been entered, server 18 proceeds to step 206. If new script information has been entered, server 18 proceeds to step 204.
  • [0063]
    In the preferred embodiment, the script information is entered in server 18 by one or more healthcare provider end users, such as a physician or case manager assigned to the patient, as generally discussed above. Of course, any person desiring to communicate with the end users may be granted access to server 18 to create and assign script programs.
  • [0064]
    Further, it is to be understood that the system may include any number of remote interfaces for entering script generation and script assignment information in server 18. In a toy created for entertainment rather than healthcare purposes, a child may log on to the server 18, custom design a script program, and download the program into the toy.
  • [0065]
    As shown in FIG. 5, the script information specifies a desired task, such as a message containing a set of statements or phrases, to be communicated to one or more patient end users. The desired task may further comprise movements selected and/or entered in a similar manner.
  • [0066]
    In step 204, the script generator 38 generates a script program from the information entered in screen 42. The script program is stored in database 30. Steps 202 and 204 are preferably repeated to generate multiple script programs, e.g., a script program for diabetes patients, a script program for asthma patients, etc. Each script program corresponds to a respective one of the sets of statements entered through script entry screen 42. In step 206, the server 18 determines if new script assignment information has been entered through assignment screen 44. If new script assignment information has not been entered, server 18 proceeds to step 210. If new script assignment information has been entered, server 18 proceeds to step 208.
  • [0067]
    As shown in FIG. 7, the script assignment information is entered by the healthcare provider end user by selecting a desired script program through check boxes 94, selecting the patient end users to whom the selected script program is to be assigned through check boxes 96, and pressing the ASSIGN SCRIPT button 100. When button 100 is pressed, script assignor 40 creates for each end user selected in check boxes 96 a respective pointer to the script program selected in check boxes 94. In step 208, each pointer is stored in look-up table 34 of database 30. In step 210, server 18 determines if any one of the programmable toys is remotely connected to the server.
  • [0068]
    Each patient end user is preferably provided with his or her own programmable toy, which has the end user's unique identification code stored therein. Each patient end user is thus uniquely associated with a respective one of the programmable toys. If none of the programmable toys is connected, server 18 returns to step 202. If a programmable toy is connected, server 18 receives from the programmable toy the patient end user's unique identification code and uses it to retrieve from table 34 the pointer to the script program assigned to that end user. In step 214, server 18 retrieves the assigned script program from database 30. In step 216, server 18 transmits the assigned script program to the patient end user's programmable toy through communication network 24. Following step 216, the server returns to step 202.
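The connection-handling sequence just described (receive the identification code, follow the stored pointer, retrieve the program, transmit it) can be sketched with the transport abstracted away as simple callables. All names here are illustrative, not from the patent.

```python
def handle_toy_connection(receive_id, lookup_table, database, transmit):
    """Serve one connected toy.

    receive_id   -- callable returning the identification code the
                    toy transmits on connection
    lookup_table -- maps id code -> pointer (here, a database key)
    database     -- maps pointer -> stored script program
    transmit     -- callable that downloads the script to the toy
    """
    user_id = receive_id()           # code sent by the toy
    pointer = lookup_table[user_id]  # pointer stored at assignment time
    script = database[pointer]       # the assigned script program
    transmit(script)                 # download it to the toy
    return script
```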
  • [0069]
    Each programmable toy is initially programmed with its user's unique identification code, the script interpreter used by the toy to interpret and execute script program commands, and a control program executed by the toy to control its overall operation. The initial programming may be achieved during manufacture or during an initial connection to server 18. FIG. 9 illustrates the steps included in the control program executed by microcontroller 56 of programmable toy 26.
  • [0070]
    In step 302, microcontroller 56 determines if any user input has been received. In the preferred embodiment, user input is received through control buttons 50 and 52. Control button 50 is pressed to instruct the programmable toy to move and/or speak, and control button 52 is pressed to instruct the toy to connect to the server and download a new script program. If no user input is received for a predetermined period of time, such as two minutes, toy 26 enters sleep mode in step 304. The sleep mode conserves battery power while the toy is not in use. Following step 304, microcontroller 56 returns to step 302 and awaits user input.
  • [0071]
    If user input has been received, microcontroller 56 determines if the input is a task request, step 306. If the user has pressed control button 50, microcontroller 56 executes the script program last received from the server, step 308. The steps included in a sample script program are shown in the flow chart of FIG. 10 and will be discussed below. Following step 308, microcontroller 56 returns to step 302 and awaits new user input.
  • [0072]
    If the user presses control button 52 requesting a connection to the server, microcontroller 56 attempts to establish a communication link to the server through modem 85 and communication network 24, step 310. In step 312, microcontroller determines if the connection was successful. If the connection failed, the user is prompted to connect toy 26 to telephone jack 22 in step 314. Microcontroller 56 preferably prompts the user by synthesizing the phrase “PLEASE CONNECT ME TO THE TELEPHONE JACK USING THE CONNECTION CORD AND SAY ‘DONE’ WHEN YOU HAVE FINISHED.”
  • [0073]
    In step 316, microcontroller 56 waits until the appropriate reply is received through microphone 66. Upon recognizing the reply ‘DONE’, microcontroller 56 repeats step 310 to get a successful connection to the server. Once a successful connection is established, microcontroller 56 transmits the unique identification code stored in memory 58 to server 18 in step 318.
  • [0074]
    In step 320, microcontroller 56 receives a new script program from the server through communication network 24 and modem 85. The new script program is stored in memory 58 for subsequent execution by microcontroller 56. Following step 320, microcontroller 56 returns to step 302 and awaits new user input. FIG. 10 is a flow chart illustrating the steps included in a sample script program executed by microcontroller 56. In step 402, microcontroller 56 prompts the user by synthesizing through speaker 64 “SAY ‘OK’ WHEN YOU ARE READY”. In step 404, microcontroller 56 waits until a reply to the prompt is received through the microphone 66. When the reply ‘OK’ is recognized, microcontroller 56 proceeds to step 406. If no reply is received within a predetermined period of time, such as two minutes, toy 26 preferably enters sleep mode until it is reactivated by pressing one of the control buttons.
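    The control-program loop of FIG. 9 (steps 302-320) may be sketched in Python. This is only an illustrative model, not the patent's firmware; the class, method, and button names are hypothetical, and the modem and server are represented by stand-in objects supplied by the caller.

```python
SLEEP_TIMEOUT = 120  # seconds of inactivity before sleep mode (step 304)

class ToyController:
    """Hypothetical sketch of the control loop in FIG. 9 (steps 302-320)."""

    def __init__(self, toy_id, server, modem):
        self.toy_id = toy_id   # unique identification code (step 318)
        self.server = server
        self.modem = modem
        self.script = None     # script program last downloaded (step 320)
        self.asleep = False

    def on_input(self, button):
        """Steps 302/306: dispatch a control-button press."""
        self.asleep = False
        if button == "task":       # control button 50: run stored script (step 308)
            return self.run_script()
        if button == "connect":    # control button 52: fetch new script (steps 310-320)
            return self.download_script()

    def run_script(self):
        """Step 308: hand the stored script to the script interpreter."""
        return list(self.script) if self.script is not None else []

    def download_script(self):
        while not self.modem.connect():   # steps 310-314: retry until connected
            self.modem.prompt_user("PLEASE CONNECT ME TO THE TELEPHONE JACK ...")
        self.script = self.server.fetch(self.toy_id)  # steps 318-320
        return self.script

    def idle(self, seconds):
        """Step 304: enter sleep mode after a period with no input."""
        if seconds >= SLEEP_TIMEOUT:
            self.asleep = True
```

    In use, the toy would call `on_input` from its button-interrupt handler and `idle` from a timer; both are assumptions about structure the patent leaves open.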
  • [0075]
    In step 406, microcontroller 56 executes successive speech commands to synthesize through speaker 64 the phrases or statements specified in the script program. Referring again to FIG. 6, the speech commands are preferably separated by delay commands which instruct microcontroller 56 to pause for a number of seconds between statements. The number of seconds is selected to allow the user sufficient time to absorb each statement. Alternatively, the user may be prompted to acknowledge each statement before a subsequent statement is synthesized. For example, the script program may include commands which instruct microcontroller 56 to synthesize the phrase “SAY ‘OK’ WHEN YOU ARE READY TO HEAR THE NEXT STATEMENT.” Upon recognizing the reply ‘OK’, microcontroller 56 proceeds to the next speech command in the script program. Movement commands are processed for execution by the motion system in a similar manner.
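    The alternation of speech commands and delay commands in step 406 can be illustrated with a toy interpreter. The command names `SPEAK` and `DELAY` are hypothetical stand-ins; the patent does not fix a concrete scripting syntax.

```python
def run_script(commands, speaker, pause):
    """Hypothetical interpreter for step 406.

    `speaker(text)` synthesizes a phrase through the speaker; `pause(seconds)`
    waits between statements so the user can absorb each one.
    """
    for cmd, arg in commands:
        if cmd == "SPEAK":
            speaker(arg)   # speech command: synthesize the statement
        elif cmd == "DELAY":
            pause(arg)     # delay command: pause between statements
        else:
            raise ValueError(f"unknown script command: {cmd}")

# Example script mirroring FIG. 10: prompt, pause, then the step 408 reminder.
sample_script = [
    ("SPEAK", "SAY 'OK' WHEN YOU ARE READY"),
    ("DELAY", 5),
    ("SPEAK", "PLEASE CONNECT ME TO THE TELEPHONE JACK TO GET NEW MESSAGES."),
]
```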
  • [0076]
    In step 408, the user is reminded to connect toy 26 to telephone jack 22 to download a new script program. Microcontroller 56 synthesizes through speaker 64 “PLEASE CONNECT ME TO THE TELEPHONE JACK TO GET NEW MESSAGES.” Following step 408, the script program ends.
  • [0077]
    One advantage of the system of the present invention is that it allows each programmable toy to be programmed remotely through the use of script programs. This allows the task performed by each programmable toy to be tailored to the specific needs of a specific end user or group of end users. Moreover, each script program may be easily created, assigned, and downloaded by simply accessing a server through a communication network, such as the Internet. Thus, the invention provides a powerful, convenient, and inexpensive system for communicating messages to a large number of end users.
  • [0078]
    FIGS. 11-15 illustrate a second embodiment of the invention in which messages are further customized to each patient end user by merging personal data with the script programs, much like a standard mail merge application. Referring to FIG. 11, personal data relating to each patient end user is preferably stored in look-up table 34 of database 30. By way of example, the data may include each patient end user's name, the name of each patient end user's medication or disease, or any other desired data. As in the preferred embodiment, database 30 also stores generic script programs 31 created by script generator 38.
  • [0079]
    In the second embodiment, server 18 includes a data merge program 41 for merging the data stored in table 34 with generic script programs 31. Data merge program 41 is designed to retrieve selected data from table 34 and to insert the data into statements in generic script programs 31, thus creating custom script programs 33. Each custom script program contains a message which is customized to a patient end user. For example, the message may be customized with the patient end user's name, medication name, disease name, etc.
  • [0080]
    The operation of the second embodiment is illustrated in FIGS. 11-15. It is similar to the operation of the preferred embodiment except that server 18 transmits custom script programs to each programmable toy rather than generic script programs. FIG. 15 is a flow chart illustrating the steps included in a software application executed by server 18 according to the second embodiment.
  • [0081]
    In step 502, server 18 determines if new script information has been entered through script entry screen 42. If new script information has not been entered, server 18 proceeds to step 506. If new script information has been entered, server 18 proceeds to step 504. As shown in FIG. 12, the script information specifies a message, such as a set of statements or phrases, to be communicated to the patient end users. Each statement preferably includes one or more insert commands specifying data from table 34 to be inserted into the statement. The insert commands instruct data merge program 41 to retrieve the specified data from database 30 and to insert the data into the statement. For example, the first statement shown in FIG. 12 includes insert commands instructing the data merge program to insert a patient name and a medication name into the statement.
  • [0082]
    Following entry of the statements and insert commands, CREATE SCRIPT button 90 is pressed. When button 90 is pressed, script generator 38 generates a generic script program from the information entered in screen 42, step 504. A sample generic script program is illustrated in FIG. 13. The generic script program includes speech commands to synthesize the statements entered in fields 88. Each statement preferably includes one or more insert commands specifying data to be inserted into the script program. The generic script program is stored in database 30.
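    The generation of a generic script program containing insert commands (step 504) can be sketched as follows. The `{{FIELD}}` placeholder syntax is an assumption for illustration; the patent does not specify how insert commands are written.

```python
import re

def make_generic_script(statements):
    """Hypothetical sketch of script generator 38 (step 504): wrap each
    statement entered in fields 88 in a speech command. Insert commands
    are written as {{FIELD}} placeholders (an assumed syntax)."""
    return [("SPEAK", s) for s in statements]

def insert_fields(statement):
    """List the insert commands (placeholder field names) in one statement."""
    return re.findall(r"\{\{(\w+)\}\}", statement)

generic = make_generic_script([
    "HELLO {{PATIENT_NAME}}, REMEMBER TO TAKE YOUR {{MEDICATION_NAME}}.",
    "SAY 'OK' WHEN YOU ARE READY TO HEAR THE NEXT STATEMENT.",
])
```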
  • [0083]
    In step 506, server 18 determines if new script assignment information has been entered through assignment screen 44. If new script assignment information has not been entered, server 18 proceeds to step 512. If new script assignment information has been entered, server 18 proceeds to step 508. As shown in FIG. 7, the script assignment information is entered by selecting a desired script program through check boxes 94, selecting the patient end users to whom the selected script program is to be assigned through check boxes 96, and pressing the ASSIGN SCRIPT button 100.
  • [0084]
    When button 100 is pressed, data merge program 41 creates a custom script program for each patient end user selected in check boxes 96, step 508. Each custom script program is preferably created by using the selected generic script program as a template. For each patient end user selected, data merge program 41 retrieves from database 30 the data specified in the insert commands. Next, data merge program 41 inserts the data into the appropriate statements in the generic script program to create a custom script program for the patient end user.
  • [0085]
    For example, FIG. 14 illustrates a custom script program created from the generic script program of FIG. 13. Each custom script program is stored in database 30.
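    The merge step (step 508) can be sketched in the same placeholder notation. The `{{FIELD}}` syntax and the shape of the look-up-table rows are assumptions; the patent describes only that data merge program 41 retrieves the specified data and inserts it into the statements.

```python
import re

def merge_script(generic_script, patient_data):
    """Hypothetical sketch of data merge program 41 (step 508): fill each
    {{FIELD}} insert command with the patient end user's data from
    look-up table 34, producing a custom script program."""
    custom = []
    for cmd, text in generic_script:
        filled = re.sub(r"\{\{(\w+)\}\}",
                        lambda m: patient_data[m.group(1)], text)
        custom.append((cmd, filled))
    return custom

table_34 = {  # assumed shape of the per-patient rows in look-up table 34
    "ID123": {"PATIENT_NAME": "ALICE", "MEDICATION_NAME": "INSULIN"},
}

generic = [("SPEAK", "HELLO {{PATIENT_NAME}}, TAKE YOUR {{MEDICATION_NAME}}.")]
custom = merge_script(generic, table_34["ID123"])
```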
  • [0086]
    As each custom script program is generated for a patient end user, script assignor 40 assigns the custom script program to the patient end user, step 510. This is preferably accomplished by creating a pointer to the custom script program and storing the pointer with the patient end user's unique identification code in table 34. In step 512, server 18 determines if any one of the programmable toys is remotely connected to the server. If a programmable toy is connected, server 18 receives from the programmable toy the patient end user's unique identification code in step 514.
  • [0087]
    Server 18 uses the received identification code to retrieve from table 34 the pointer to the custom script program assigned to the patient end user. In step 516, server 18 retrieves the custom script program from database 30. In step 518, server 18 transmits the custom script program to the patient end user's programmable toy. The programmable toy receives and executes the script program in the same manner described in the preferred embodiment. The remaining operation of the second embodiment is analogous to the operation of the preferred embodiment described above.
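    The assignment and retrieval steps (510-518) amount to a two-level lookup: identification code to pointer, pointer to script. A minimal sketch, with hypothetical names for the server, its tables, and the pointer format:

```python
class ScriptServer:
    """Hypothetical sketch of steps 510-518: store a pointer to each custom
    script program keyed by the patient end user's unique identification
    code, then follow that pointer when the toy connects."""

    def __init__(self):
        self.database_30 = {}  # script storage: pointer -> script program
        self.table_34 = {}     # identification code -> pointer (step 510)

    def assign(self, user_id, script):
        pointer = f"script-{len(self.database_30)}"  # assumed pointer format
        self.database_30[pointer] = script
        self.table_34[user_id] = pointer

    def on_toy_connect(self, user_id):
        """Steps 514-518: receive the identification code, retrieve the
        assigned custom script program, and transmit it to the toy."""
        pointer = self.table_34[user_id]
        return self.database_30[pointer]
```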
  • [0088]
    Although it is presently preferred to generate a custom script program for each patient end user as soon as script assignment information is received for the patient end user, it is also possible to wait until the patient end user's programmable toy connects to the server before generating the custom script program. This is accomplished by creating and storing a pointer to the generic script program assigned to the patient end user, as previously described in the preferred embodiment. When the patient end user's programmable toy connects to the server, the data merge program creates a custom script program for the patient end user from the generic script program assigned to the patient end user. The custom script program is then transmitted to the patient end user's programmable toy for execution.
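    The deferred variant described above simply moves the merge inside the connection handler. A hypothetical sketch, reusing an assumed `{{FIELD}}` placeholder syntax and dictionary-shaped tables:

```python
import re

def merge_on_connect(user_id, table_34, scripts_31, data_34):
    """Hypothetical sketch of the deferred variant: the stored pointer
    names a *generic* script program, and the custom script is merged
    from it only when the toy connects, then transmitted."""
    generic = scripts_31[table_34[user_id]]  # follow pointer to generic script
    row = data_34[user_id]                   # personal data for this user
    return [(cmd, re.sub(r"\{\{(\w+)\}\}", lambda m: row[m.group(1)], text))
            for cmd, text in generic]
```

    The trade-off is storage versus connect-time work: the eager approach stores one custom script per user, while this variant stores one generic script and merges on demand.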
  • [0089]
    Although the first and second embodiments focus on healthcare applications, the system of the present invention may be used for any messaging application. For example, the system is particularly well suited for advertising. In a third embodiment of the invention, an advertising service is provided with a remote interface to the server for creating and assigning script programs which contain advertising messages. As shown in FIG. 16, each advertising message may be conveniently entered through script entry screen 42, like the health-related messages of the preferred embodiment. The operation of the third embodiment is analogous to the operation of the preferred embodiment, except that the talking toys communicate advertising messages rather than health-related messages.
  • [0090]
    Of course, the system of the present invention has many other applications. Typically, the user of each programmable toy is a child. In a fourth embodiment of the invention, the child's parent or guardian is provided with a remote interface to the server for creating and assigning script programs which contain messages for the child. As shown in FIG. 17, each message may be conveniently entered through script entry screen 42. The operation of the fourth embodiment is analogous to the operation of the preferred embodiment, except that script information is entered in the server by a parent or guardian rather than a healthcare provider.
  • [0091]
    Alternatively, the child may be provided with a remote interface to the server to create and assign his or her own script programs. It should also be noted that script programs may be generated from information received from multiple sources, such as a healthcare provider, an advertiser, and a parent. In a fifth embodiment of the invention, the script entry screen includes a respective section for each of the sources to enter a message to be communicated. Each of the sources is provided with a remote interface to the server and a password for accessing the script entry screen. After each source has entered one or more messages in the server, a script program is generated which contains a combination of health-related messages, advertisements, educational messages, or entertainment messages. The remaining operation of the fifth embodiment is analogous to the operation of the preferred embodiment described above.
  • [0092]
    Although the above description contains many specificities, these should not be construed as limitations on the scope of the invention but merely as illustrations of some of the presently preferred embodiments. Many other embodiments of the invention are possible. For example, the scripting language and script commands shown are representative of the preferred embodiment. It will be apparent to one skilled in the art that many other scripting languages and specific script commands may be used to implement the invention.
  • [0093]
    Moreover, the programmable device need not be embodied as a doll. The toys may be embodied as action figures, robots, or any other type of toy. Further, each programmable toy need not include a control button for triggering speech output. In alternative embodiments, speech is triggered by other mechanisms, such as voice prompts, the absence of the user's voice, position sensitive sensors, switches, or the like. Specific techniques for triggering speech in a programmable toy are well known in the art. In addition, the system of the present invention is not limited to healthcare applications.
  • [0094]
    The system may be used in any application which involves the communication of messages, including advertising, education, or entertainment. Of course, various combinations of these applications are also possible. For example, messages from multiple sources may be combined to generate script programs which contain a combination of health-related messages, advertisements, or educational messages. Further, the system may include any number of remote interfaces for entering and assigning script programs, and any number of programmable toys for delivering messages.
  • [0095]
    More generally, the programmable device need not be a toy.
  • [0096]
    For example, the programmable device may be a handheld computing device adapted to control and monitor an athlete end user's performance during athletic training. The device may monitor the athlete end user's heart rate and lactate levels. The device may be remotely programmed by a trainer end user or automated system with instructions for training for a specific athletic event based on data collected by the handheld device.
  • [0097]
    The programmable device may be a household appliance such as a refrigerator that monitors its contents. A remote end user such as a service that delivers groceries could reprogram the refrigerator as necessary to reflect delivered goods.
  • [0098]
    The programmable device may also be an industrial motion control system such as a robotic welding machine. An engineer in charge of the product being welded may change its design and remotely generate a new command program for the robotic welding machine in response to the design changes.
  • [0099]
    The programmable device may be an HVAC system that communicates with a remote weather monitoring system that generates a new command program for the HVAC system in response to changes in the weather.
  • [0100]
    The embodiments of the present invention disclosed above are merely an illustrative, and not exhaustive, list of the environments in which the present invention may be applied. Therefore, the scope of the invention should be determined not by the examples given, but by the appended claims and their legal equivalents.
US8473021Jul 31, 2009Jun 25, 2013Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8473220Jan 23, 2012Jun 25, 2013Abbott Diabetes Care Inc.Method and device for early signal attenuation detection using blood glucose measurements
US8480580Apr 19, 2007Jul 9, 2013Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8483974Nov 20, 2009Jul 9, 2013Abbott Diabetes Care Inc.Method and system for transferring analyte test data
US8512239Apr 20, 2009Aug 20, 2013Abbott Diabetes Care Inc.Glucose measuring device for use in personal area network
US8560250Aug 18, 2010Oct 15, 2013Abbott LaboratoriesMethod and system for transferring analyte test data
US8585591Jul 10, 2010Nov 19, 2013Abbott Diabetes Care Inc.Method and system for providing basal profile modification in analyte monitoring and management systems
US8593109Nov 3, 2009Nov 26, 2013Abbott Diabetes Care Inc.Method and system for powering an electronic device
US8593287Jul 20, 2012Nov 26, 2013Abbott Diabetes Care Inc.Analyte monitoring system and methods
US8597189Mar 3, 2009Dec 3, 2013Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8597575Jul 23, 2012Dec 3, 2013Abbott Diabetes Care Inc.Analyte monitoring devices and methods therefor
US8612159Feb 16, 2004Dec 17, 2013Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8617071Jun 21, 2007Dec 31, 2013Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8622903May 25, 2012Jan 7, 2014Abbott Diabetes Care Inc.Continuous glucose monitoring system and methods of use
US8622906Dec 21, 2009Jan 7, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8641619Dec 21, 2009Feb 4, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8647269Apr 20, 2009Feb 11, 2014Abbott Diabetes Care Inc.Glucose measuring device for use in personal area network
US8649841Apr 3, 2007Feb 11, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8652043Jul 20, 2012Feb 18, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8660627Mar 17, 2009Feb 25, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8665091Jun 30, 2009Mar 4, 2014Abbott Diabetes Care Inc.Method and device for determining elapsed sensor life
US8666469Nov 16, 2007Mar 4, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8668645Jan 3, 2003Mar 11, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8670815Apr 30, 2007Mar 11, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8672844Feb 27, 2004Mar 18, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8676513Jun 21, 2013Mar 18, 2014Abbott Diabetes Care Inc.Method and device for early signal attenuation detection using blood glucose measurements
US8682598Aug 27, 2009Mar 25, 2014Abbott LaboratoriesMethod and system for transferring analyte test data
US8688188Jun 30, 2009Apr 1, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8732188Feb 15, 2008May 20, 2014Abbott Diabetes Care Inc.Method and system for providing contextual based medication dosage determination
US8734346Apr 30, 2007May 27, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8734348Mar 17, 2009May 27, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8738109Mar 3, 2009May 27, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8744545Mar 3, 2009Jun 3, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8765059Oct 27, 2010Jul 1, 2014Abbott Diabetes Care Inc.Blood glucose tracking apparatus
US8771183Feb 16, 2005Jul 8, 2014Abbott Diabetes Care Inc.Method and system for providing data communication in continuous glucose monitoring and management system
US8774887Mar 24, 2007Jul 8, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8795022Jul 18, 2008Aug 5, 2014Hydrae LimitedInteracting toys
US8827761Jan 21, 2009Sep 9, 2014Hydrae LimitedInteracting toys
US8840553Feb 26, 2009Sep 23, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8880137Apr 18, 2003Nov 4, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8915850Mar 28, 2014Dec 23, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8920319Dec 28, 2012Dec 30, 2014Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8930203Feb 3, 2010Jan 6, 2015Abbott Diabetes Care Inc.Multi-function analyte test device and methods therefor
US8933664Nov 25, 2013Jan 13, 2015Abbott Diabetes Care Inc.Method and system for powering an electronic device
US8974386Nov 1, 2005Mar 10, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US8993331Aug 31, 2010Mar 31, 2015Abbott Diabetes Care Inc.Analyte monitoring system and methods for managing power and noise
US9000929Nov 22, 2013Apr 7, 2015Abbott Diabetes Care Inc.Analyte monitoring system and methods
US9011331Dec 29, 2004Apr 21, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9011332Oct 30, 2007Apr 21, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9014773Mar 7, 2007Apr 21, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9035767May 30, 2013May 19, 2015Abbott Diabetes Care Inc.Analyte monitoring system and methods
US9039975Dec 2, 2013May 26, 2015Abbott Diabetes Care Inc.Analyte monitoring devices and methods therefor
US9042953Mar 2, 2007May 26, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9066694Apr 3, 2007Jun 30, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9066695Apr 12, 2007Jun 30, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9066697Oct 27, 2011Jun 30, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9066709Mar 17, 2014Jun 30, 2015Abbott Diabetes Care Inc.Method and device for early signal attenuation detection using blood glucose measurements
US9072477Jun 21, 2007Jul 7, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9078607Jun 17, 2013Jul 14, 2015Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9095290Feb 27, 2012Aug 4, 2015Abbott Diabetes Care Inc.Method and apparatus for providing rolling data in communication systems
US9177456Jun 10, 2013Nov 3, 2015Abbott Diabetes Care Inc.Analyte monitoring system and methods
US9226701Apr 28, 2010Jan 5, 2016Abbott Diabetes Care Inc.Error detection in critical repeating data in a wireless sensor system
US9314195Aug 31, 2010Apr 19, 2016Abbott Diabetes Care Inc.Analyte signal processing device and methods
US9314198Apr 3, 2015Apr 19, 2016Abbott Diabetes Care Inc.Analyte monitoring system and methods
US9320461Sep 29, 2010Apr 26, 2016Abbott Diabetes Care Inc.Method and apparatus for providing notification function in analyte monitoring systems
US9323898Nov 15, 2013Apr 26, 2016Abbott Diabetes Care Inc.Method and system for providing basal profile modification in analyte monitoring and management systems
US9326714Jun 29, 2010May 3, 2016Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9326716Dec 5, 2014May 3, 2016Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9380971Dec 5, 2014Jul 5, 2016Abbott Diabetes Care Inc.Method and system for powering an electronic device
US9477811Jun 23, 2005Oct 25, 2016Abbott Diabetes Care Inc.Blood glucose tracking apparatus and methods
US9498159Oct 30, 2007Nov 22, 2016Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9574914Mar 3, 2014Feb 21, 2017Abbott Diabetes Care Inc.Method and device for determining elapsed sensor life
US9610034Nov 9, 2015Apr 4, 2017Abbott Diabetes Care Inc.Analyte monitoring device and methods of use
US9625413May 19, 2015Apr 18, 2017Abbott Diabetes Care Inc.Analyte monitoring devices and methods therefor
US9649057May 11, 2015May 16, 2017Abbott Diabetes Care Inc.Analyte monitoring system and methods
US9669162Mar 16, 2016Jun 6, 2017Abbott Diabetes Care Inc.Method and system for providing basal profile modification in analyte monitoring and management systems
US20020066095 *Jan 8, 2001May 30, 2002Yueh-O YuProcess and device for updating personalized products
US20030114256 *Dec 18, 2001Jun 19, 2003Mathog David RossMethod and device for introducing state changes into athletic activities
US20030233432 *Jun 18, 2002Dec 18, 2003John DavisWeb-based interface for building management systems
US20040043373 *Sep 4, 2002Mar 4, 2004Kaiserman Jeffrey M.System for providing computer-assisted development
US20050195076 *Feb 7, 2005Sep 8, 2005Caretouch Communications, Inc.Intelligent message delivery system
US20050195077 *Feb 23, 2005Sep 8, 2005Caretouch Communications, Inc.Communication of long term care information
US20060117324 *Dec 4, 2003Jun 1, 2006Koninklijke Philips Electronics N.V.System and method for controlling a robot
US20070038726 *Aug 15, 2005Feb 15, 2007Microsoft CorporationQuick deploy of content
US20070050431 *Aug 26, 2005Mar 1, 2007Microsoft CorporationDeploying content between networks
US20080139080 *Jul 25, 2007Jun 12, 2008Zheng Yu BrianInteractive Toy System and Methods
US20080214089 *Mar 1, 2007Sep 4, 2008Geraldine VermacGet well toy
US20090157199 *Oct 2, 2008Jun 18, 2009Brown David WMotion Control Systems
US20090157807 *Feb 23, 2009Jun 18, 2009Brown Stephen JSystem and/or method for generating a script relating to a medical task involving motion with a device
US20110143631 *Jan 21, 2009Jun 16, 2011Steven LipmanInteracting toys
US20110169832 *Jan 11, 2011Jul 14, 2011Roy-G-Biv Corporation3D Motion Interface Systems and Methods
US20160042020 *Jun 19, 2015Feb 11, 2016Triplay, Inc.Using Mote-Associated Indexes
DE102011121668A1 * | Dec 20, 2011 | Jun 20, 2013 | Fresenius Medical Care Deutschland GmbH | Method and device for preparing medical treatment apparatuses (original title in German: "Verfahren und Vorrichtung zur Vorbereitung von medizinischen Behandlungsvorrichtungen")
WO2004056537A2 * | Dec 4, 2003 | Jul 8, 2004 | Koninklijke Philips Electronics N.V. | System and method for controlling a robot
WO2004056537A3 * | Dec 4, 2003 | Oct 21, 2004 | Yasser Alsafadi | System and method for controlling a robot
WO2005081802A2 * | Feb 10, 2005 | Sep 9, 2005 | Caretouch Communications, Inc. | Intelligent message delivery system
WO2005081802A3 * | Feb 10, 2005 | Dec 6, 2007 | Caretouch Communications Inc | Intelligent message delivery system
WO2017028571A1 * | May 17, 2016 | Feb 23, 2017 | Smart Kiddo Education Limited | An education system using connected toys
Classifications
U.S. Classification: 718/100
International Classification: G09B5/14, A63H3/28
Cooperative Classification: A61B34/25, A61B2034/258, H04L67/1078, H04L67/42, A61B2560/0493, G06F17/30312, A63H3/28, A63H2200/00, G09B5/14
European Classification: A63H3/28, G09B5/14
Legal Events
Date: May 21, 2001 | Code: AS | Event: Assignment
Owner name: ROY-G-BIV CORPORATION, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BROWN, STEPHEN J.;BROWN, DAVID W.;REEL/FRAME:011817/0759
Effective date: 20010425