US 20080277477 A1
A system and method for recalibrating a bar code scanner or other optical system is described. In some cases, the system dynamically adjusts the focal length of a lens assembly based on an open-loop process, such as one based on a measured distance between the lens assembly and an imaged object. In some cases, the system dynamically adjusts the lens assembly in order to provide auto focusing or zoom adjustment.
1. A method of dynamically recalibrating a bar code reader, the method comprising:
receiving a recalibration signal or an indication of an error in reading an image of a machine readable code imaged by the bar code reader using a lens assembly calibrated at a first focal length, wherein the bar code reader employs open-loop auto focus controls;
measuring a distance between the machine readable code and the lens assembly in the bar code reader, wherein the distance is measured with a range finder carried by the bar code reader;
determining a second focal length of the lens assembly that provides a better image of the machine readable code; and
recalibrating the open-loop auto focus controls of the bar code reader using the determined second focal length and the measured distance.
2. The method of
3. The method of
4. The method of
5. The method of
measuring a second distance between the machine readable code and the lens assembly in the bar code reader, wherein the second distance is measured from a different position with respect to the machine readable code; and
determining a third focal length of the lens assembly that provides a better image of the machine readable code than the first focal length;
wherein recalibrating the bar code reader comprises using at least in part the third focal length and the measured second distance.
6. The method of
7. The method of
8. The method of
9. The method of
10. The method of
11. The method of
12. A hand-held device configured to take an image of an object, comprising:
a lens assembly having an adjustable focal length;
a range finder, wherein the range finder determines a distance between the hand-held device and a position on a selected object;
an operation monitoring component, wherein the operation monitoring component receives an adjustment signal and initiates an adjustment of the focal length; and
a dynamic adjustment component coupled to the lens assembly and range finder, wherein the dynamic adjustment component adjusts the focal length of the lens assembly based on information received from the range finder and based on the received signal from the operation monitoring component.
13. The device of
14. The device of
a bottom plate, wherein the bottom plate includes a substrate layer, one or more electrodes, and a hydrophobic layer that provides a surface for the liquid; and
a top plate, wherein the top plate includes a substrate layer and an electrode;
wherein the bottom plate and top plate form a cavity that contains the liquid.
15. The device of
an adjustable lens component containing the liquid; and
one or more additional lens components configured to provide a substantially infinite focus distance.
16. The device of
17. The device of
18. The device of
19. The device of
a focus optimization component, wherein the focus optimization component determines an optimum focus of the lens assembly by analyzing an image of the selected object;
wherein the dynamic adjustment component adjusts the focal length of the lens assembly at least in part based on information received from the focus optimization component.
20. The device of
a focus optimization component, wherein the focus optimization component determines an optimum focus of the lens assembly;
wherein the dynamic adjustment component adjusts the focal length of the lens assembly at least in part based on information received from the focus optimization component.
21. The device of
22. The device of
23. The device of
24. The device of
25. A computer memory within an imaging device and containing a data structure, the data structure used to calibrate the imaging device, the data structure comprising:
one or more entries relating a distance between an object to be imaged by the device and a lens of the device with a focal length to be applied to the lens of the device, wherein the distance is measured by an open-loop control process and the focal length is determined by a closed-loop process.
26. The computer memory of
27. A system for dynamically recalibrating a bar code reader, the system comprising:
means for receiving an indication of an error in reading an image of a machine readable code imaged by the bar code reader using a lens assembly calibrated at a focal length;
means for measuring a first distance and a second distance, wherein the first distance is a distance between a first position on a machine readable code and a lens assembly in the bar code reader and the second distance is a distance between a second position on a machine readable code and the lens assembly;
means for determining an alternate focal length of the lens assembly that provides an adjusted image of the machine readable code; and
means for recalibrating the bar code reader using the determined alternate focal length and the measured distances.
28. A machine readable symbol reader configured to image a machine readable symbol, comprising:
a lens assembly having an adjustable focal length;
an open-loop measurement component, wherein the open-loop measurement component determines a distance between the machine readable symbol reader and a position on a machine readable symbol;
a focus adjustment component, wherein the focus adjustment component determines an adjusted focus of the lens assembly; and
a dynamic adjustment component coupled to the lens assembly and open-loop measurement component, wherein the dynamic adjustment component adjusts the focal length of the lens assembly based on one or more determined distances and based on one or more focus adjustments.
29. The reader of
This application is related to commonly-assigned U.S. patent application Ser. No. 11/040,485, filed on Jan. 20, 2005, entitled AUTOFOCUS BARCODE SCANNER AND THE LIKE EMPLOYING MICROFLUIDIC LENS, and commonly-assigned U.S. patent application Ser. No. (attorney docket No. 110418335US), filed concurrently herewith, entitled TEMPERATURE COMPENSATED AUTO FOCUS CONTROL FOR A MICROFLUIDIC LENS, SUCH AS AUTO FOCUS CONTROL FOR A MICROFLUIDIC LENS OF A BAR CODE SCANNER, both of which are hereby incorporated by reference.
Closed-loop systems generally employ a feedback component that assesses the operation of the system and modifies aspects of the system based on the operational assessment. One example of such a system is a typical bar code scanner having an auto focus control system. Being closed-loop, the auto focus control system maintains or modifies the focus of optical components by analyzing images captured by the system. These systems often require long response times in refocusing a lens system, as many control and/or measurement cycles are performed during the image analysis in order to accurately determine the correct focus measurement.
Currently, bar code scanners and other machine-readable symbol imagers utilize a variety of lens actuator systems to provide auto focus control. These scanners often have problems related to the speed of adjusting optical components (as described) and to the accuracy of measurement (e.g., open-loop scanners without feedback components). These and other problems exist with respect to providing auto focus control in bar code scanners.
Described in detail below is a system of providing auto focus or other control for a lens system in a bar code scanner or other machine readable symbol imaging device using an open-loop control mechanism. In some examples, the system employs a lens having an electrowetting component, and actuates the lens using the electrowetting component. The system dynamically compensates for errors in focusing the device to an object, such as errors due to aging of the lens or of the device, environmental changes, drift, and other errors. Using a range finder that determines the distance between an object and the device, the system is able to determine a correction to apply to the focus of the lens in a single step. For example, the system may relate a measured distance with a current focused image and correct the focus of the image for that distance.
In some examples, the system may use the open-loop data taken from the range finder to calibrate or correct any focus transfer functions that focus or defocus the lens. The focus or defocus transfer functions may use parametric equations or look-up-tables to relate a determined error correction to a focus command.
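To make the look-up-table form of the transfer function concrete, the sketch below stores distance-to-focus calibration points, interpolates linearly between them, and replaces a point when a recalibration produces a corrected command. The class name, units, and interface are illustrative assumptions only; the application does not specify them.

```python
from bisect import bisect_left

class FocusTransferFunction:
    """Hypothetical distance-to-focus-command transfer function
    backed by a look-up table of calibration points."""

    def __init__(self, entries):
        # entries: (distance_cm, focus_command) calibration pairs
        self.entries = sorted(entries)

    def focus_command(self, distance_cm):
        """Interpolate a focus command for a measured distance,
        clamping outside the calibrated range."""
        dists = [d for d, _ in self.entries]
        i = bisect_left(dists, distance_cm)
        if i == 0:
            return self.entries[0][1]
        if i == len(self.entries):
            return self.entries[-1][1]
        (d0, f0), (d1, f1) = self.entries[i - 1], self.entries[i]
        t = (distance_cm - d0) / (d1 - d0)
        return f0 + t * (f1 - f0)

    def recalibrate(self, distance_cm, corrected_command):
        """Refresh the table with a corrected point for one distance."""
        self.entries = [(d, f) for d, f in self.entries
                        if d != distance_cm]
        self.entries.append((distance_cm, corrected_command))
        self.entries.sort()
```

A parametric-equation transfer function would replace the table with a fitted formula, but the refresh-on-recalibration idea is the same.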
In some examples, the system triggers calibration (or, recalibration) upon decoding errors or reading errors. The system may receive user input and trigger the calibration based on a user's request, or periodically or occasionally recalibrate, refresh, or change the transfer functions of the device (without being prompted by decoding errors and/or user input).
Various examples of the technology will now be described. The following description provides specific details for a thorough understanding and enabling description of these examples. One skilled in the art will understand, however, that the technology may be practiced without many of these details. Additionally, some well-known structures or functions may not be shown or described in detail, so as to avoid unnecessarily obscuring the relevant description of the various examples.
The terminology used in the description presented below is intended to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with a detailed description of certain specific examples of the technology. Certain terms may even be emphasized below; however, any terminology intended to be interpreted in any restricted manner will be overtly and specifically defined as such in this Detailed Description section.
Aspects of the technology can be embodied in a special purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions explained in detail herein. Aspects of the technology can also be practiced in distributed computing environments where tasks or modules are performed by remote processing devices, which are linked through a communication network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Aspects of the technology may be stored or distributed on computer-readable media, including magnetically or optically readable computer disks, as microcode on semiconductor memory, nanotechnology memory, organic or optical memory, or other portable data storage media. Indeed, computer-implemented instructions, data structures, screen displays, and other data under aspects of the technology may be distributed over the Internet or over other networks (including wireless networks), on a propagated signal on a propagation medium (e.g., an electromagnetic wave(s), a sound wave, etc.) over a period of time, or may be provided on any analog or digital network (packet switched, circuit switched, or other scheme). Those skilled in the relevant art will recognize that portions of the technology reside on a server computer, while corresponding portions reside on a client computer, such as a hand-held scanning device.
The reader 100 may include a light source 130 to illuminate an object, and may include a range finder 140 to detect distances between the reader 100 and an object. The system may use information derived from the range finder to assist in focus control or other modifications.
The reader 100 may control components and/or the flow or processing of information or data between components using one or more processors 150 in communication with memory 156, such as ROM or RAM (and instructions or data contained therein), and the other components via a bus 152. Memory 156 may contain data structures or other files or applications that provide information related to actuation of the lens assembly. For example, memory 156 may contain one or more look-up-tables 170 or parametric equations 180 that relate open-loop information (such as information measured by the range finder 140) with system information, such as focus or image information. The device 100 may use the look-up-tables 170 or parametric equations 180 stored in memory 156 as transfer functions that control the actuator of the lens assembly, and may adjust, change, or refresh those transfer functions.
Components of the system may receive energy via power component 158. Additionally, the system may receive or transmit information or data to other modules, remote computing devices, and so on via communication component 154. Communication component 154 may be any wired or wireless component capable of communicating data to and from the reader 100. Examples include a wireless radio frequency transmitter (such as an RFID transmitter), an infrared transmitter, or a hard-wired cable, such as a USB cable. The reader 100 may include other additional components 160, 162 not explicitly described herein, such as additional microprocessor components, removable memory components (flash memory components, smart cards, hard drives), and other components.
Additionally, reader 100 may include a temperature sensor 170 and/or other environmental or other sensors 180. For example, other environmental sensors may include humidity sensors, light sensors, pressure sensors, motion sensors, and so on. Other sensors may include an acceleration sensor (such as an accelerometer) that senses movement of the device, such as when a user shakes the device, inverts the device, and so on.
The temperature sensor 170 may interact with the lens assembly 120 (and associated actuator system) via the bus 152. The temperature sensor 170 may be any of a number of different sensors, including resistance thermometers, thermistors, thermocouples, silicon bandgap temperature sensors, and other electrical or mechanical sensors. Further details with respect to the temperature sensor are discussed in commonly-assigned U.S. patent application Ser. No. ______ (attorney docket No. 110418335US), filed concurrently herewith, entitled TEMPERATURE COMPENSATED AUTO FOCUS CONTROL FOR A MICROFLUIDIC LENS, SUCH AS AUTO FOCUS CONTROL FOR A MICROFLUIDIC LENS OF A BAR CODE SCANNER.
As described herein, the system may perform auto focusing of the lens assembly 120 using a microfluidic lens actuator. Referring to
The bottom plate 240 may include a substrate 242, a plurality of electrodes 244 a, 244 b, a dielectric layer 246 that overlays the electrodes, and a hydrophobic layer 248 that provides an inner surface of bottom plate 240 in forming cavity 210. In some cases, the entire bottom plate 240 is transparent, although in some cases only parts of the bottom plate 240 may be transparent. For example, the bottom plate may be formed of glass for the substrate, indium tin oxide (ITO) for the electrodes, and a fluoropolymer for the hydrophobic layer. Other materials and configurations are of course possible.
The top plate 250 may include a substrate 252 (formed of glass or other transparent materials), and an electrode 254 (formed of indium tin oxide). As with the bottom plate 240, in some cases the top plate is formed of transparent materials and in some cases the top plate 250 may be only partially transparent.
Applying a voltage V to the electrodes (244 a, 244 b of the bottom plate and 254 of the top plate) causes a first potential to be applied to the first liquid 230 and a second potential to be applied to the second liquid 235. Under the principles of electrowetting, the applied voltage causes the hydrophobic layer to become effectively less hydrophobic with respect to the first liquid, and liquid 230 may change shape, moving from shape 230 b to shape 230 a. That is, a contact angle Θa between the liquid as shape 230 a and the hydrophobic layer 248 is much smaller than a contact angle Θb between the liquid as shape 230 b and the layer 248.
Using these principles, a simple application of voltage to the lens assembly electrodes changes the shape of liquid 230, effectively changing the focus of the lens assembly. Thus liquid 230 acts as the lens, and the system applies a voltage to the liquid to modify the lens and accurately focus an image of an object to the optical sensor 110 using liquid 230 as the lens. Further details with respect to the lens assembly 120 may be found in commonly-assigned U.S. patent application Ser. No. 11/040,485, filed on Jan. 20, 2005, entitled AUTOFOCUS BARCODE SCANNER AND THE LIKE EMPLOYING MICROFLUIDIC LENS.
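The voltage-to-shape behavior described above is commonly modeled with the Lippmann-Young relation, cos θ(V) = cos θ0 + εrε0V²/(2γd). The sketch below uses that standard relation; the constants and parameter values are illustrative assumptions and do not come from this application.

```python
import math

def contact_angle(voltage, theta0_deg, eps_r, d, gamma):
    """Droplet contact angle (degrees) at an applied voltage,
    per the Lippmann-Young electrowetting relation.

    theta0_deg: contact angle with no voltage applied
    eps_r:      relative permittivity of the dielectric layer
    d:          dielectric layer thickness (m)
    gamma:      interfacial tension between the two liquids (N/m)
    """
    eps0 = 8.854e-12  # vacuum permittivity (F/m)
    cos_theta = (math.cos(math.radians(theta0_deg))
                 + eps_r * eps0 * voltage ** 2 / (2.0 * gamma * d))
    # Clamp for numerical safety (saturation is ignored in this model).
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))
```

With illustrative values (θ0 = 120 degrees, εr = 3, d = 1 µm, γ = 0.04 N/m), the contact angle falls as voltage rises, which is the shape change from 230 b toward 230 a described above.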
For example, in order to accurately read a bar code or other machine readable symbol (an object), the system may require an accurate or clear image of the bar code to be placed on the optical sensor. Using the Gaussian Lens Equation:

1/f = 1/p + 1/p′
(where f is the focal length of the lens assembly, p is the lens to image distance, and p′ is the lens to object distance), the image distance, or p, depends on an accurate focal length of the lens assembly, as the only other variable is the lens to object distance. Thus, modifying the focal length f of the first liquid (in effect, changing the curvature of the liquid) using the electrowetting principles described above allows the system to modify the image distance p, enabling the system to place the image onto the optical sensor 110 with sufficient accuracy. Therefore, because the system may rely on the liquid lens for focusing, the system should be able to compensate for factors that affect the microfluidic lens assembly 120, as the microfluidic lens controls the focal length of the lens.
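As a numerical check of the relationship above, a short helper can solve the Gaussian Lens Equation for the image distance p given a focal length f and an object distance p′ (any units, so long as they match):

```python
def image_distance(focal_length, object_distance):
    """Solve 1/f = 1/p + 1/p' for the lens-to-image distance p,
    given focal length f and lens-to-object distance p'."""
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)
```

For example, with f = 10 and p′ = 30 the image forms at p = 15; as p′ grows very large, p approaches f, which is the "approximately infinite best focus distance" case discussed below.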
In some examples, in addition to a microfluidic lens component, the lens assembly may contain a number of stacked lens components (such as stacked transparent plastic lenses, glass lenses, Fresnel diffractive components, and so on) that, together with the liquid lens, have an approximately infinite best focus distance. These lens components may provide an initial optical power for the lens assembly. The system then uses the microfluidic lens component to shorten the focal length of the lens assembly (in some cases to 10 centimeters or smaller). Thus, the assembly provides the system with high optical power using the stacked lens components and accurate focusing using the microfluidic lens component.
Alternatively, or additionally, the system may employ other optical components when focusing the lens assembly. In some cases, the system may use a translational optical stage, nematic liquid lens, deformable mirror, and so on.
As described herein, the system may perform auto focusing of the lens based on open-loop information received from the range finder 140. Referring to
In some cases, the length (c) of the emitted beam 310 to the object, the length (b) of the reflected beam 312 to the lens assembly 120, and the length (a) between the lens assembly 120 and the range finder (140) form the sides of a triangle having opposite angles C, B, and A, respectively. Using the Law of Sines and some known values based on device specifications, the length b may be computed and the distance from the spot 322 on the object 320 to the lens assembly 120 may be determined.
For example, if the following values are known:
C = 90 degrees - tan⁻¹(h/t) = 83 degrees
A = 180 degrees - C - B = 12 degrees
The value for b may be found using the Law of Sines, b/sin(B) = a/sin(A), so that b = a sin(B)/sin(A), with B = 180 degrees - C - A = 85 degrees.
Thus, using the range finder and triangulation, the system is able to determine the distance from the spot 322 on the object 320 to the lens assembly 120 using an open-loop process. Given the measured distance, the system may be able to calibrate the lens assembly (or correct a zoom value) by relating the distance to an image produced at that distance by the object or by a section of the object. Further details will be discussed herein.
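The triangulation step above can be sketched numerically. Using the example angles (C = 83 degrees, A = 12 degrees, hence B = 180 - C - A = 85 degrees) and assuming the baseline length a between the lens assembly and the range finder is known from the device geometry (the value used below is hypothetical):

```python
import math

def reflected_beam_length(a, angle_a_deg, angle_b_deg):
    """Length b of the reflected-beam side of the triangle via the
    Law of Sines: b / sin(B) = a / sin(A)."""
    return (a * math.sin(math.radians(angle_b_deg))
            / math.sin(math.radians(angle_a_deg)))

# Example: a 2 cm baseline with A = 12 degrees and B = 85 degrees
# places the spot roughly 9.6 cm from the lens assembly.
b = reflected_beam_length(2.0, 12.0, 85.0)
```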
Of course, the system may employ other methods to determine the distance using open-loop methods. The system may transmit other types of waves, such as sound waves, electromagnetic waves, infrared waves, and so on.
The system may dynamically measure the distance between an object and the device, and, therefore, dynamically calibrate or recalibrate transfer functions that drive the focus actuator of the lens assembly. Referring to
In step 410, the system may optionally receive an indication of a decoding error in imaging an object, such as in reading a barcode, or an image acquisition query. For example, the system may receive a number of error messages indicating a failure to decode an image of an object. The system may also periodically or occasionally initiate a recalibration process. In step 420, the system initiates an open-loop control process, such as a process to determine the distance from the lens assembly of the device to the object. The system may initiate the open-loop control process after receiving a single decoding error message, or after receiving a predetermined number of error messages, such as a number above a threshold number of messages. Additionally, the system may receive a manual indication from a user of the device to initiate the routine 400.
In step 430, the system determines any error using the open-loop process. For example, the system uses the range finder to determine the distance to the object, and relates the distance to a predetermined better focus position of the lens actuator of the lens assembly. In step 440, the system corrects the focus of the device, such as by correlating the measured distance with the better focus position and updating the command or commands that control the focusing and defocusing of the lens assembly.
In some cases, should routine 400 not correct the focus to an acceptable level, the system may proceed with other processes to correct the focus, such as closed-loop processes that compare images received by the device.
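The flow of routine 400 might be sketched as follows. The error threshold, the dictionary-based transfer table, and the callable interfaces are all hypothetical stand-ins for the components the application describes.

```python
# Step 410: recalibrate only after this many decode errors
# (the threshold value is an illustrative assumption).
ERROR_THRESHOLD = 3

def maybe_recalibrate(decode_errors, measure_distance, best_focus_for,
                      transfer_table):
    """Steps 410-440 of routine 400, sketched under assumptions.

    decode_errors:    count of decode-error messages received (step 410)
    measure_distance: callable returning the range-finder distance (step 420)
    best_focus_for:   callable mapping a distance to a predetermined
                      better focus position (step 430)
    transfer_table:   dict of distance -> focus command, updated in place
                      (step 440)
    """
    if decode_errors < ERROR_THRESHOLD:
        return False
    distance = measure_distance()                        # step 420
    transfer_table[distance] = best_focus_for(distance)  # steps 430-440
    return True
```

A manual user request or a periodic timer would simply bypass the error-count check and run the same steps 420 through 440.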
The system may use the open-loop processes described herein in collaboration with passive focusing algorithms in order to recalibrate the device. Referring to
In step 530, the system determines a better or adjusted focus position (such as the best focus position). In some cases, the system uses passive focus algorithms that analyze the image content, such as by maximizing the Sum Modulus Difference of the image. Further details with respect to determining the better focus position may be found in commonly-assigned U.S. patent application Ser. No. 11/040,485, filed on Jan. 20, 2005, entitled AUTOFOCUS BARCODE SCANNER AND THE LIKE EMPLOYING MICROFLUIDIC LENS.
The system may move the lens assembly forwards and backwards (effectively changing the focal length of the lens assembly) to determine the best focus position from the images received due to the movement. The system then compares the different received images, such as comparing amplitudes of the images and determining the peak of the amplitudes. As described herein, the system may only image a few positions of an object and perform the analysis using imaged positions of the object.
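A minimal version of the Sum Modulus Difference metric mentioned above, with a peak search over candidate lens positions, might look like this (pure-Python, list-of-lists grayscale images; purely illustrative, not the application's implementation):

```python
def sum_modulus_difference(image):
    """Sum of absolute differences between horizontally and vertically
    adjacent pixels; sharper images yield larger values."""
    rows, cols = len(image), len(image[0])
    smd = 0
    for y in range(rows):
        for x in range(cols):
            if x + 1 < cols:
                smd += abs(image[y][x] - image[y][x + 1])
            if y + 1 < rows:
                smd += abs(image[y][x] - image[y + 1][x])
    return smd

def best_focus(images_by_position):
    """Pick the lens position whose image maximizes the metric,
    i.e. the peak of the amplitudes described above."""
    return max(images_by_position,
               key=lambda p: sum_modulus_difference(images_by_position[p]))
```

A high-contrast (in-focus) image scores higher than a flat (defocused) one, so sweeping the lens and taking the argmax approximates the best focus position.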
In step 540, the system dynamically recalibrates the distance to focus transfer function using a determined adjusted focus position and the measured distance. For example, the system refreshes look-up-tables, parametric equations, or other algorithms or mathematical functions that control actuation of the focus of the lens assembly.
In some cases, the system may perform the recalibration routine multiple times, in order to determine an acceptable adjusted focus position for a number of different distances between the object and the lens assembly, and recalibrate the device based on the multiple determinations. The system may perform multiple recalibration routines due to the degree of error (such as drift), due to the equations and/or look-up-tables controlling the focusing of the device, due to expected or future calibrations, to increase the accuracy of the recalibration, and for other reasons.
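One way the multiple-distance recalibration could work is to refit a parametric transfer function to all of the measured (distance, adjusted focus) pairs at once. The sketch below assumes a focus-versus-inverse-distance linear form fitted by ordinary least squares; that functional form is an assumption, not something this application specifies.

```python
def fit_transfer_function(pairs):
    """Fit focus = m * (1/distance) + c to measured pairs by ordinary
    least squares, returning a callable transfer function.

    pairs: list of (distance, adjusted_focus) measurements from
           several recalibration passes.
    """
    xs = [1.0 / d for d, _ in pairs]
    ys = [f for _, f in pairs]
    n = len(pairs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    c = mean_y - m * mean_x
    return lambda distance: m / distance + c
```

Fitting over several distances averages out single-measurement noise, which is one reason the text gives for repeating the routine.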
Additionally, in some examples the system may use dynamic refocusing with other open-loop processes, such as with the temperature focus compensation process described in commonly-assigned U.S. patent application Ser. No. ______ (attorney docket No. 110418335US), filed concurrently herewith, entitled TEMPERATURE COMPENSATED AUTO FOCUS CONTROL FOR A MICROFLUIDIC LENS, SUCH AS AUTO FOCUS CONTROL FOR A MICROFLUIDIC LENS OF A BAR CODE SCANNER. For example, the system may use the open-loop temperature or other environmental compensation processes to determine the better focus position, and use the range finder to determine the measured distance. Thus, the system may dynamically recalibrate the device using only open-loop processes.
The system may attempt to recalibrate the device based on numerous indicators. For example, the system may begin recalibration after receiving error indicators, or may recalibrate periodically. For example, the system may recalibrate the device based on time indicators (e.g., once a month, every day), every time the device is powered on or powered off, or based on user input indicators (the user presses a recalibrate button, a non-savvy user shakes the device in an effort to "fix" the device and triggers an acceleration sensor, and so on).
Additionally, the system may alert the user when the device is in recalibration mode, such as by blinking a light, providing information to the user via a user interface, beeping or performing other audible indicators, and so on.
In addition to bar code and other symbol readers, the system may employ the methods described herein to adjust a zoom value of other imaging devices, such as the zoom value for a camera, camcorder, and so on.
Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise," "comprising," and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to." As used herein, the terms "connected," "coupled," or any variant thereof, mean any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words "herein," "above," "below," and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word "or," in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
The above detailed description of embodiments of the technology is not intended to be exhaustive or to limit the technology to the precise form disclosed above. While specific embodiments of, and examples for, the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternatives or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times.
The teachings of the technology provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various embodiments described above can be combined to provide further embodiments.
Any patents and applications and other references noted above, including any that may be listed in accompanying filing papers, are incorporated herein by reference. Aspects of the technology can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the technology.
These and other changes can be made to the technology in light of the above Detailed Description. While the above description describes certain embodiments of the technology, and describes the best mode contemplated, no matter how detailed the above appears in text, the technology can be practiced in many ways. Details of the data collection and processing system may vary considerably in its implementation details, while still being encompassed by the technology disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the technology should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the technology with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the technology to the specific embodiments disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the technology encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the technology under the claims.
While certain aspects of the technology are presented below in certain claim forms, the inventors contemplate the various aspects of the technology in any number of claim forms. For example, while only one aspect of the technology is recited as embodied in a computer-readable medium, other aspects may likewise be embodied in a computer-readable medium. Accordingly, the inventors reserve the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the technology.