US20100149211A1 - System and method for cropping and annotating images on a touch sensitive display device - Google Patents
- Publication number
- US20100149211A1 (U.S. application Ser. No. 12/507,039)
- Authority
- US
- United States
- Prior art keywords
- image
- point
- rectangle
- user
- recited
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0637—Strategic management or analysis, e.g. setting a goal or target of an organisation; Planning actions based on goals; Analysis or evaluation of effectiveness of goals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/103—Workflow collaboration or project management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/018—Certifying business or products
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P80/00—Climate change mitigation technologies for sector-wide applications
- Y02P80/20—Climate change mitigation technologies for sector-wide applications using renewable energy
Definitions
- the present invention is generally related to user-interfaces for touch-sensitive displays. More specifically, the instant invention relates to a system and method for capturing, cropping, and annotating images on a touch sensitive display device or other handheld device.
- Cropping and annotating images is important in graphic user interfaces (GUIs), both for manipulating images in graphics applications such as Adobe Photoshop®, Microsoft Paint®, and the like, and also for cropping and annotating images for insertion into textual documents such as Adobe Portable Document Format (PDF)®, Microsoft Word®, and the like.
- One prior method of image cropping and annotating is shown in FIG. 1 .
- Window 1 ( 101 ) shows an image of a plant 108 , which could be any image or photograph, previously stored in memory or taken live right before the cropping/annotating operation.
- a mouse 103 is used to click at a point on the screen 102 (shown as dashed cross hair 102 ), perform a drag operation 110 (shown as dashed line 110 ) while holding down the mouse button to another point on the screen 106 (shown as solid cross hair 106 ), and then release the mouse button at the point 106 .
- Points 102 and 106 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the crop area.
- a user hits, selects, or otherwise operates a menu bar, selecting the crop/annotate operation, which then either crops or annotates the image using the LLH 102 and URH 106 points to define the bounding box.
- This operation is cumbersome, requires multiple mouse operations, and is furthermore generally only usable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in a handheld or other field device, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable.
- Applications of the present invention include digital cameras, digital video cameras, phones with built-in cameras, phones with built-in display devices, such as the Apple iPhone®, and the like.
- the present invention may be used to provide a simple and convenient method to crop and annotate images in situations and locations where such ease is important and/or necessary.
- one concrete application of the present invention is related to supplying a convenient user interface for a handheld device used for industrial inspection and maintenance compliance systems, as described in related U.S. Ser. No. 12/489,313.
- the present invention allows an easy mechanism for on-site inspectors to quickly crop and annotate images in the field to substantiate problems found during an inspection.
- the present invention is a system and method for cropping and annotating images on a touch sensitive display device or other handheld device.
- One embodiment of the present invention is a method for cropping images, including the steps of (a) displaying an image of the image file to be cropped; (b) receiving a first input from a user designating a first point in the image defining a corner of a crop rectangle, e.g., a lower left hand corner; (c) receiving a second input from the user designating a second point in the image defining the opposite corner of the crop rectangle, e.g. an upper right hand corner; and (d) cropping the image to the crop rectangle defined by the two corners when the second input is released.
- Another embodiment of the present invention is the method described above also including the step of displaying on the image a location of the first point.
- Another embodiment of the present invention is the method described above also including the step of displaying a rectangle corresponding to the crop rectangle of the image before cropping the image.
- Another embodiment of the present invention is the method described above where if a user does not immediately release the second input, allowing the user to drag the second point to visually edit a shape and a size of the crop rectangle.
- Another embodiment of the present invention is the method described above where if a user drags the second point near an edge of the displayed image, and the image is larger than the displayed portion, then scrolling the displayed portion to show area of the image in a direction of the dragged point.
- Another embodiment of the present invention is the method described above also including the step of displaying the cropped image in the display area in place of the original image.
- Another embodiment of the present invention is the method described above also including the step of scaling the cropped image to fill the entire display area.
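The crop embodiments above can be sketched in code. The sketch below is illustrative only (the function names and the row-major image representation are assumptions, not taken from the patent); it shows how two tapped points, designating any pair of opposite corners, can be normalized into a crop rectangle and applied:

```python
def crop_rect_from_taps(p1, p2):
    """Normalize two tapped corner points (any pair of opposite corners,
    e.g. lower-left and upper-right) into a crop rectangle given as
    (left, top, width, height) in image coordinates."""
    (x1, y1), (x2, y2) = p1, p2
    left, top = min(x1, x2), min(y1, y2)
    width, height = abs(x2 - x1), abs(y2 - y1)
    return (left, top, width, height)

def crop(image_rows, rect):
    """Crop a row-major pixel grid to the rectangle; a stand-in for the
    device's actual image library."""
    left, top, width, height = rect
    return [row[left:left + width] for row in image_rows[top:top + height]]
```

Because the corners are normalized with `min`/`abs`, the user may tap the two corners in either order, which matches the claim language that the two points merely define "opposite" corners.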
- Yet another embodiment of the present invention is a method of annotating an image (where annotating an image includes superimposing one or more geometrical shapes on top of the image), the method including the steps of (a) displaying an image of the image file to be annotated; (b) receiving a first input from a user designating a first point in the image defining a corner of an annotation rectangle, e.g. the lower left hand corner; (c) receiving a second input from the user designating an opposite corner of the annotation rectangle, e.g., the upper right hand corner; and (d) annotating the image in the annotation rectangle defined by the two corners when the second input is released.
- Another embodiment of the present invention is the method described above also including the step of displaying on the image a location of the first point.
- Another embodiment of the present invention is the method described above also including the step of displaying a shape corresponding to the annotation of the image before annotating the image.
- Another embodiment of the present invention is the method described above where if a user does not immediately release the second input, allowing the user to drag the second point to visually show a shape and a size of the annotation area.
- Another embodiment of the present invention is the method described above where if a user drags the second point near an edge of the displayed image, and the image is larger than the displayed portion, then scrolling the displayed portion to show area of the image in a direction of the dragged point.
- Another embodiment of the present invention is the method described above also including the step of displaying the annotated image in the display area in place of the original image.
- Another embodiment of the present invention is the method described above also including the step of receiving a third input representing a type of shape and a characteristic of the shape corresponding to the annotation.
- Another embodiment of the present invention is the method described above where the shape is, but is not limited to, a line, a rectangle, an ellipse, or a circle.
- Another embodiment of the present invention is the method described above where the characteristics of the shape include, but are not limited to, a line type, a line width, and a line color.
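The annotation embodiments above (a shape type, a bounding rectangle from two taps, and characteristics such as line type, width, and color) can be sketched as a small record type. This is a minimal illustration; the class and field names are assumptions, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """One annotation, stored by its bounding box and drawing style."""
    shape: str           # "line", "rectangle", "ellipse", or "circle"
    rect: tuple          # (left, top, width, height) from the two taps
    line_type: str = "solid"
    line_width: int = 2
    line_color: str = "red"

def make_annotation(shape, p1, p2, **style):
    """Build an annotation from the two tapped corner points, which may
    designate any pair of opposite corners."""
    (x1, y1), (x2, y2) = p1, p2
    rect = (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))
    return Annotation(shape, rect, **style)
```

The optional `**style` keywords correspond to the "third input" embodiment: the shape type and characteristics arrive separately from the two corner taps.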
- the present invention also includes a related system by which the method of capturing, cropping, and annotating an image could be carried out.
- a system could be implemented as a computer system, embodied in a handheld device.
- the system may include integrated or separate hardware components for taking of media samples and means for receiving touch input.
- FIG. 1 shows a prior art method of cropping an image in a desktop or other like environment using a mouse or other like peripheral device
- FIG. 2 shows a flowchart for cropping an image using a simplified process on a handheld device, in accordance with one embodiment of the present invention
- FIG. 3A shows a first step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention
- FIG. 3B shows a second step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention
- FIG. 3C shows a third step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention.
- FIG. 4 shows a prior art method of annotating an image in a desktop or other like environment using a mouse or other like peripheral device
- FIG. 5 shows a flowchart for annotating an image using a simplified process on a handheld device, in accordance with another embodiment of the present invention
- FIG. 6A shows a first step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention
- FIG. 6B shows a second step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention
- FIG. 6C shows a third step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention.
- FIG. 7 shows a flowchart of another process of another embodiment of the present invention showing one aspect of one possible workflow using the principles of the present invention
- FIG. 8 is an illustration of a multi-functional handheld device, in which some of the software and hardware components of the present invention could reside, in accordance with yet another embodiment of the present invention.
- FIG. 9 is an illustration of one of many possible use-cases of the present invention in relation to carrying out an industrial inspection operation on a wind-farm or other energy project or other like site;
- FIG. 10 is an illustration of yet another possible use-case of the present invention in relation to a hand-held device with a camera and a touch-sensitive display, such as an Apple iPhone® or other like device; and
- FIG. 11 is an illustration of yet another possible use-case of the present invention in relation to a hand-held camera with a touch-sensitive display.
- the present invention generally pertains to a system and method for capturing, cropping, and annotating images on a touch sensitive display or other handheld device.
- the interface could have, but is not limited to, the following components. Any subsets of the following components are also within the scope of this invention.
- the user can choose to crop the image as follows:
- the user can choose to annotate the image as follows:
- the invention may be used in an industrial inspection compliance system with which various methods can be carried out to the effect of assisting in an inspection and providing the means for compliance verification of a proper inspection.
- an inspection may represent the process of checking a physical component for safety, security or business reasons, doing the same for compliance with industrial standards and guidelines, or a maintenance operation on a physical component for those same reasons.
- These methods can generally be best executed by a multi-function handheld device, carried to and used in the physical proximity of an inspection component by the inspector. Examples of multi-function handheld devices include the Apple iPhone®, the Psion Teklogix Workabout Pro®, the Motorola MC-75®, and the like, but the present invention is not limited to such devices as shown or described here.
- One embodiment of the inspection compliance method includes the steps of scanning unique machine-readable tags deployed at logical inspection points defined by the inspector, and assigning a timestamp to the scanning operation; taking media samples of logical inspection points defined by the inspector, and assigning a timestamp to the media sample capturing operation; reporting of sub-optimal conditions of the unique machine-readable tags deployed at logical inspection points if its condition warrants such a declaration; associating a media sample with a corresponding scan of a unique machine-readable tag; and annotating a media sample in such ways that substantiate statements of an industrial component passing inspection, or in such ways that substantiate statements of problems found with the industrial component. See U.S. Ser. No. 12/489,313 for more details of an example of an industrial inspection compliance system to which the present invention may be applied.
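The compliance steps above (timestamped tag scans, timestamped media samples, flagging sub-optimal tag condition, and associating samples with a scan) could be recorded with structures along the following lines. This is a sketch under assumptions; the class and field names are illustrative, not taken from the patent or from U.S. Ser. No. 12/489,313:

```python
import time
from dataclasses import dataclass, field

@dataclass
class TagScan:
    """A scan of a unique machine-readable tag at an inspection point."""
    tag_id: str
    timestamp: float = field(default_factory=time.time)
    condition_ok: bool = True   # set False to report a sub-optimal tag

@dataclass
class MediaSample:
    """A media capture (photo, etc.) with its own timestamp."""
    path: str
    timestamp: float = field(default_factory=time.time)
    annotations: list = field(default_factory=list)

@dataclass
class InspectionPoint:
    """Associates media samples with the corresponding tag scan."""
    scan: TagScan
    samples: list = field(default_factory=list)

    def attach(self, sample):
        self.samples.append(sample)
```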
- FIG. 1 shows a prior art method of cropping an image in a desktop or other like environment using a mouse or other like peripheral device.
- Window 1 ( 101 ) shows an image of a plant 108 , which could be any image or photograph, previously stored in memory or taken live right before the cropping operation.
- a mouse 103 is used to click at a point on the screen 102 (shown as dashed cross hair 102 ), perform a drag operation 110 (shown as dashed line 110 ) while holding down the mouse button to another point on the screen 106 (shown as solid cross hair 106 ), and then release the mouse button at the point 106 .
- Points 102 and 106 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the crop area.
- a user hits, selects, or otherwise operates a menu bar, selecting the crop operation, which then crops the image using the LLH 102 and URH 106 points to define the bounding box.
- This operation is cumbersome, requires multiple mouse operations, and is furthermore generally only usable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use on a handheld or other field device, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable.
- FIG. 2 shows a flowchart for cropping an image using a simplified process on a handheld device, in accordance with one embodiment of the present invention.
- Process 200 begins at step 202 , where an image is either retrieved from memory, captured in real-time via an image capture device, or in some other way provided to the process 200 .
- the image is displayed on the touch sensitive display or other display of the handheld device.
- the user may then click or tap (using a finger, a stylus, a mouse, or other device) at an LLH location where the crop is to begin.
- in step 208 , the user may click or tap (using the finger, the stylus, the mouse, or other device) at a URH location where the crop is to end.
- in step 210 , the image is cropped between the LLH location and the URH location.
- in step 212 , the cropped image is displayed for the user's confirmation. At this point (not shown), the user may cancel, undo, or accept the crop. The process ends in step 214 .
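The flow of process 200 amounts to a two-state tap handler: the first tap records one corner, the second tap completes the rectangle and applies the crop. A minimal sketch follows; the class name and the callbacks standing in for the device's UI and image code are assumptions, not from the patent:

```python
class TwoTapCropper:
    """Collects two taps and then applies the crop.  `show_marker` is
    called on the first tap (the on-screen cross hair); `apply_crop`
    is called with the finished rectangle after the second tap."""
    def __init__(self, apply_crop, show_marker):
        self.apply_crop = apply_crop
        self.show_marker = show_marker
        self.first_point = None

    def on_tap(self, x, y):
        if self.first_point is None:
            self.first_point = (x, y)          # first tap: one corner
            self.show_marker(x, y)
            return None
        (x1, y1), (x2, y2) = self.first_point, (x, y)  # second tap
        self.first_point = None                # ready for the next crop
        rect = (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))
        return self.apply_crop(rect)           # crop and display result
```

Keeping the handler stateless between crop operations means the same object can serve repeated crop gestures, matching the workflow of FIG. 7 where the user crops more than once.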
- FIG. 3A shows the first step in the process for cropping the image on the handheld device, in accordance with the embodiment described in relation to FIG. 2 .
- Screen 302 shows a screen or touch-sensitive area of a handheld device.
- Window 2 ( 304 ) shows one of many windows showing the image the user desires to crop.
- the user uses his or her hand 308 (or stylus, mouse, or other device) to click or tap at point 306 (shown as solid cross hair 306 ).
- the position of the click or tap 306 represents a LLH corner of the crop boundary.
- FIG. 3B shows the second step in the process for cropping the image on the handheld device, with point 306 now shown as a dashed cross hair 306 .
- the user may move his or her hand 308 (or stylus, mouse, or other device), shown as dashed motion line 310 , to another location 312 (shown as solid cross hair 312 ) and click or tap a second time.
- the second tap at location 312 represents the URH corner of the crop boundary.
- FIG. 3C shows the third and final step in the process for cropping the image on the handheld device.
- the crop operation is performed in the background, and the updated (cropped) image is displayed for the user's confirmation, with point 306 and point 312 now both shown as dashed cross hairs.
- a user of the present invention may implement a crop operation with very little hand motion and very little input into the device, which is highly desirable, or even mandatory, when operating in the field, for example, during an inspection operation.
- FIG. 4 shows a prior art method of annotating an image in a desktop or other like environment using a mouse or other like peripheral device.
- Window 1 ( 401 ) shows an image of a plant 408 , which could be any image or photograph, previously stored in memory or taken live right before the cropping/annotating operation.
- a mouse 403 is used to click at a point on the screen 402 (shown as dashed cross hair 402 ), perform a drag operation 410 (shown as dashed line 410 ) while holding down the mouse button to another point on the screen 406 (shown as solid cross hair 406 ), and finally release the mouse button at the point 406 .
- Points 402 and 406 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the annotation area.
- a user hits, selects, or otherwise operates a menu bar, selecting the proper annotate operation (circle, oval, rectangle, arrow, line, etc.), which then annotates the image using the LLH 402 and URH 406 points to define the bounding box.
- This operation is cumbersome, requires multiple mouse operations, and is furthermore generally only usable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use on handheld or other field devices, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable.
- FIG. 5 shows a flowchart for annotating an image using a simplified process on a handheld device, in accordance with another embodiment of the present invention.
- Process 500 begins at step 502 , where an image is either retrieved from memory, captured in real-time via an image capture device, or in some other way provided to the process 500 .
- the image is displayed on the touch sensitive display or other display of the handheld device.
- the user may then click or tap (using a finger, a stylus, a mouse, or other device) at an LLH location where the annotation is to begin.
- in step 508 , the user may click or tap (using the finger, the stylus, the mouse, or other device) at a URH location where the annotation is to end.
- in step 510 , the image is annotated between the LLH location and the URH location.
- in step 512 , the annotated image is displayed for the user's confirmation. At this point (not shown), the user may cancel, undo, or accept the annotation. The process ends in step 514 .
- FIG. 6A shows the first step in the process for annotating the image on the handheld device, in accordance with the embodiment described in relation to FIG. 5 .
- Screen 602 shows a screen or touch-sensitive area of a handheld device.
- Window 2 ( 604 ) shows one of many windows showing the image the user desires to annotate.
- the user uses his or her hand 608 (or stylus, mouse, or other device) to click or tap at point 606 (shown as solid cross hair 606 ).
- the position of the click or tap 606 represents the LLH corner of the annotation boundary.
- FIG. 6B shows the second step in the process for annotating the image on the handheld device, with point 606 now shown as a dashed cross hair 606 .
- the user may move his or her hand 608 (or stylus, mouse, or other device), shown as dashed motion line 610 , to another location 612 (shown as solid cross hair 612 ) and click or tap a second time.
- the second tap at location 612 represents the URH corner of the annotation boundary.
- FIG. 6C shows the third and final step in the process for annotating the image on the handheld device.
- the updated (annotated) image is displayed for the user's confirmation, with point 606 and point 612 now both shown as dashed cross hairs.
- a user of the present invention may implement an annotation operation with very little hand motion and very little input into the device, which is highly desirable, or even mandatory, when operating in the field, for example, during an inspection operation.
- FIG. 7 shows a flowchart of another process of another embodiment of the present invention showing one aspect of one possible workflow using the principles of the present invention.
- Process 700 begins at step 702 , where the user of a handheld device edits an image on the device.
- the user opens the image for viewing and then at step 706 , the user makes an annotation on the image in the spirit of process 500 shown in FIG. 5 .
- the user then proceeds to step 708 , where he or she crops the image using steps described in process 200 shown in FIG. 2 .
- the user now sees only a sub-area of the original image on the screen, as a result of step 212 of FIG. 2 whereby the cropped area is displayed to take up the full screen area of the device.
- at this point, the user decides that the annotation he or she made in step 706 was not correct, so he or she reverses that annotation at the click of an UNDO button in step 710 .
- in step 712 , the user reverses the crop he or she made in step 708 , at which point the image shown to the user looks exactly as it did in step 704 .
- in step 714 , the user crops the image in a different fashion from the crop made previously in step 708 , but once again using the steps described in process 200 shown in FIG. 2 .
- in steps 716 and 718 , the user makes two consecutive annotations in the spirit of process 500 shown in FIG. 5 .
- the user is then satisfied with the edits he or she has made and ends the process at step 720 .
- the result of the series of actions illustrated in FIG. 7 may be stored by storing a reference to the original image, storing a final crop rectangle by reference to a LLH and an URH corner, and storing a list of annotations which are also stored by reference to a LLH and an URH corner along with type information, such as annotation type, annotation color, etc. to overlay on the cropped image.
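The storage scheme described above, a reference to the original image plus a crop rectangle and a list of annotations stored by their corners and type information, also makes the UNDO steps of FIG. 7 cheap, since each edit is just a list entry and the original image is never modified. A sketch, with class and field names assumed for illustration:

```python
class EditedImage:
    """Non-destructive edit record: stores a reference to the original
    image, the final crop rectangle, and a list of annotations; undo
    pops the most recent edit off a history list."""
    def __init__(self, original_ref):
        self.original_ref = original_ref  # path or id of the source image
        self.crop_rect = None             # (LLH, URH) corner pair, or None
        self.annotations = []             # (rect, type_info) entries
        self._history = []                # edits in order, for undo

    def set_crop(self, llh, urh):
        self._history.append(("crop", self.crop_rect))
        self.crop_rect = (llh, urh)

    def add_annotation(self, rect, type_info):
        self._history.append(("annotate", None))
        self.annotations.append((rect, type_info))

    def undo(self):
        kind, prev = self._history.pop()
        if kind == "crop":
            self.crop_rect = prev         # restore the previous crop
        else:
            self.annotations.pop()        # drop the latest annotation
```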
- FIG. 8 is an illustration of a multi-functional handheld device 800 , in which some of the software and hardware components of the present invention could reside, in accordance with yet another embodiment of the present invention.
- the handheld device 800 contains a screen or display 802 , which may be a touch-sensitive display, for displaying an image to be cropped and/or annotated with overlaid objects.
- the handheld 800 also contains a toolbar 806 that contains iconographic buttons for each function that a user may execute during a process of taking and editing an image.
- Some possible selectable actions include, but are not limited to, from top to bottom and left to right, “take a picture” 804 (first row, far left), undo, redo, zoom-in, zoom-out (first row, far right), delete/cancel (second row, far left), annotate with an arrow, annotate with a circle, annotate with a rectangle, annotate with a line, and crop (second row, far right).
- the illustrative user interface 800 is but one of many possible illustrative embodiments of the present invention.
- One of ordinary skill in the art would appreciate that any other configuration of objects in a user interface, as well as any possible extensions to the set of functions presented in the user interface 800 , are all within the spirit and scope of the present invention.
- FIG. 9 is an illustration of one of many possible use-cases of the present invention in relation to carrying out an industrial inspection operation on a wind-farm or other energy project or other like site.
- FIG. 9 shows an inspector carrying out an inspection of wind turbine 902 and wind turbine 904 .
- the inspector 906 is standing next to the tower and foundation sections of wind turbine 904 .
- the inspector 906 is using an industrial inspection handheld device 908 .
- Inspector 906 is more specifically in the process of using the industrial inspection handheld device 908 , even more specifically having an embedded RFID reader, to scan RFID tag 912 on tower section of wind turbine 904 , via radio frequency communication channel 910 .
- the inspector may take a picture of the potential problem area, and then proceed to crop and annotate the problem area using the methods described in the present application. Since the inspector is in the field, the present invention is particularly suitable to helping the inspector complete the inspection in a timely, accurate, and cost effective manner.
- FIG. 9 is but one of many possible illustrative embodiments of the usage of the present invention.
- the present invention may likewise be applied to renewable energy systems and distributed energy systems, including wind turbines, solar photovoltaic plants, solar thermal plants, co-generation plants, biomass-fueled power plants, carbon sequestration projects, enhanced oil recovery systems, and the like.
- FIG. 10 is an illustration of yet another possible use-case of the present invention in relation to a hand-held device with a camera and a touch-sensitive display, such as an Apple iPhone® 1000 or other like device.
- Users of an Apple iPhone® 1000 may wish to crop and/or annotate an image either taken by the iPhone® 1000 or received from another user, or in some other way obtained on the iPhone® 1000 .
- the present invention is particularly suitable for use with an iPhone®, since an iPhone® as currently practiced does not contain a useful or easy mechanism for cropping or annotating images.
- FIG. 11 is an illustration of yet another possible use-case of the present invention in relation to a hand-held camera with a touch-sensitive display 1100 .
- Users of a digital camera 1100 may wish to crop and/or annotate an image taken by the digital camera 1100 .
- the present invention is particularly suitable for use with a digital camera, especially a digital camera as shown in FIG. 11 with a touch-sensitive display device 1100 , since digital cameras as currently practiced do not contain a useful or easy mechanism for cropping or annotating images on-site and instead require uploading the images to a computer for further desktop processing to crop and annotate the images.
Abstract
The present invention is a system and method for cropping and annotating images on a touch sensitive display device or other handheld device, including the following steps: (a) displaying an image of the image file to be cropped/annotated; (b) receiving a first input from a user designating a first point in the image defining a corner of a crop/annotation rectangle; (c) receiving a second input from the user designating a second point in the image defining an opposite corner of the crop/annotation rectangle; and (d) cropping and/or annotating the image from the first point to the second point of the crop/annotation rectangle. The present invention may be used in digital cameras, Apple iPhones®, hand-held devices that inspectors may use to annotate photographs taken to substantiate statements of problems found during industrial inspections, and for other purposes.
Description
- This application claims priority from provisional application Ser. No. 61/122,632, filed on Dec. 15, 2008, and entitled “A system, method and apparatus for inspections and compliance verification of industrial equipment using a handheld device,” the entirety of which is hereby incorporated by reference herein. This application is related to co-pending application Ser. No. 12/489,313, filed on Jun. 22, 2009, and entitled “A system and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device,” the entirety of which is hereby incorporated by reference herein.
- The present invention is generally related to user-interfaces for touch-sensitive displays. More specifically, the instant invention relates to a system and method for capturing, cropping, and annotating images on a touch sensitive display device or other handheld device.
- Cropping and annotating images is important in graphic user interfaces (GUIs), both for manipulating images in graphics applications such as Adobe Photoshop®, Microsoft Paint®, and the like, and also for cropping and annotating images for insertion into textual documents such as Adobe Portable Document Format (PDF)®, Microsoft Word®, and the like.
- Multiple end-use applications require cropping and annotating images, including reference manuals, encyclopedias, educational texts, inspection reports, and the like. For example, U.S. Ser. No. 12/489,313, filed on Jun. 22, 2009, entitled “A system and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device,” describes a method for carrying out an inspection on a piece of industrial equipment and generating inspection reports in the field. An inspector out in the field carrying out an inspection operation needs a convenient, quick, and accurate method to crop and annotate images taken in the field.
- One prior method of image cropping and annotating is shown in
FIG. 1. Window 1 (101) shows an image of a plant 108, which could be any image or photograph, previously stored in memory or taken live right before the cropping/annotating operation. A mouse 103 is used to click at a point on the screen 102 (shown as dashed cross hair 102), then to perform a drag operation 110 (shown as dashed line 110) while holding down the mouse button to another point on the screen 106 (shown as solid cross hair 106), and then to release the mouse button at the point 106. Points LLH 102 and URH 106 define the bounding box. This operation is cumbersome and requires multiple mouse operations, and furthermore is generally only useable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in a handheld or other field device, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable. - Applications of the present invention include digital cameras, digital video cameras, phones with built-in cameras, phones with built-in display devices, such as the Apple iPhone®, and the like. In general, the present invention may be used to provide a simple and convenient method to crop and annotate images in situations and locations where such ease is important and/or necessary.
- For example, one concrete application of the present invention is related to supplying a convenient user interface for a handheld device used for industrial inspection and maintenance compliance systems, as described in related U.S. Ser. No. 12/489,313. The present invention allows an easy mechanism for on-site inspectors to quickly crop and annotate images in the field to substantiate problems found during an inspection.
- One of ordinary skill in the art will find many useful applications of the present invention in which a convenient and easy way is needed to either crop or annotate images on a touch-sensitive display or other hand-held device.
- It is against this background that various embodiments of the present invention were developed.
- The present invention is a system and method for cropping and annotating images on a touch sensitive display device or other handheld device.
- One embodiment of the present invention is a method for cropping images, including the steps of (a) displaying an image of the image file to be cropped; (b) receiving a first input from a user designating a first point in the image defining a corner of a crop rectangle, e.g., a lower left hand corner; (c) receiving a second input from the user designating a second point in the image defining the opposite corner of the crop rectangle, e.g. an upper right hand corner; and (d) cropping the image to the crop rectangle defined by the two corners when the second input is released.
- Another embodiment of the present invention is the method described above also including the step of displaying on the image a location of the first point.
- Another embodiment of the present invention is the method described above also including the step of displaying a rectangle corresponding to the crop rectangle of the image before cropping the image.
- Another embodiment of the present invention is the method described above where if a user does not immediately release the second input, allowing the user to drag the second point to visually edit a shape and a size of the crop rectangle.
- Another embodiment of the present invention is the method described above where if a user drags the second point near an edge of the displayed image, and the image is larger than the displayed portion, then scrolling the displayed portion to show area of the image in a direction of the dragged point.
- Another embodiment of the present invention is the method described above also including the step of displaying the cropped image in the display area in place of the original image.
- Another embodiment of the present invention is the method described above also including the step of scaling the cropped image to fill the entire display area.
- Yet another embodiment of the present invention is a method of annotating an image (where annotating an image includes superimposing one or more geometrical shapes on top of the image), the method including the steps of (a) displaying an image of the image file to be annotated; (b) receiving a first input from a user designating a first point in the image defining a corner of an annotation rectangle, e.g. the lower left hand corner; (c) receiving a second input from the user designating an opposite corner of the annotation rectangle, e.g., the upper right hand corner; and (d) annotating the image in the annotation rectangle defined by the two corners when the second input is released.
- Another embodiment of the present invention is the method described above also including the step of displaying on the image a location of the first point.
- Another embodiment of the present invention is the method described above also including the step of displaying a shape corresponding to the annotation of the image before annotating the image.
- Another embodiment of the present invention is the method described above where if a user does not immediately release the second input, allowing the user to drag the second point to visually show a shape and a size of the annotation area.
- Another embodiment of the present invention is the method described above where if a user drags the second point near an edge of the displayed image, and the image is larger than the displayed portion, then scrolling the displayed portion to show area of the image in a direction of the dragged point.
- Another embodiment of the present invention is the method described above also including the step of displaying the annotated image in the display area in place of the original image.
- Another embodiment of the present invention is the method described above also including the step of receiving a third input representing a type of shape and a characteristic of the shape corresponding to the annotation.
- Another embodiment of the present invention is the method described above where the shape is, but is not limited to, a line, a rectangle, an ellipse, or a circle. Another embodiment of the present invention is the method described above where the characteristics of the shape include, but are not limited to, a line type, a line width, and a line color.
- The present invention also includes a related system by which the method of capturing, cropping, and annotating an image could be carried out. Such a system could be implemented as a computer system, embodied in a handheld device. The system may include integrated or separate hardware components for taking of media samples and means for receiving touch input.
- The invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
- FIG. 1 shows a prior art method of cropping an image in a desktop or other like environment using a mouse or other like peripheral device;
- FIG. 2 shows a flowchart for cropping an image using a simplified process on a handheld device, in accordance with one embodiment of the present invention;
- FIG. 3A shows a first step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention;
- FIG. 3B shows a second step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention;
- FIG. 3C shows a third step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention;
- FIG. 4 shows a prior art method of annotating an image in a desktop or other like environment using a mouse or other like peripheral device;
- FIG. 5 shows a flowchart for annotating an image using a simplified process on a handheld device, in accordance with another embodiment of the present invention;
- FIG. 6A shows a first step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention;
- FIG. 6B shows a second step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention;
- FIG. 6C shows a third step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention;
- FIG. 7 shows a flowchart of another process of another embodiment of the present invention showing one aspect of one possible workflow using the principles of the present invention;
- FIG. 8 is an illustration of a multi-functional handheld device, in which some of the software and hardware components of the present invention could reside, in accordance with yet another embodiment of the present invention;
- FIG. 9 is an illustration of one of many possible use-cases of the present invention in relation to carrying out an industrial inspection operation on a wind-farm or other energy project or other like site;
- FIG. 10 is an illustration of yet another possible use-case of the present invention in relation to a hand-held device with a camera and a touch-sensitive display, such as an Apple iPhone® or other like device; and
- FIG. 11 is an illustration of yet another possible use-case of the present invention in relation to a hand-held camera with a touch-sensitive display.
- The present invention generally pertains to a system and method for capturing, cropping, and annotating images on a touch sensitive display or other handheld device.
- The interface according to the principles of the present invention could have, but is not limited to, the following components. Any subsets of the following components are also within the scope of this invention. After a user captures an initial image, it is stored and displayed. No actions of the user will modify the initial image, allowing all edits to be undone or re-applied against the original image.
- The user can choose to crop the image as follows:
- 1. The user can click once on a point in the image, displaying a point where the click occurred, and then click again at another point in the image;
- 2. A rectangle with the two clicks at opposite corners is displayed. When the user releases the second click, including immediately releasing it, this rectangle becomes the new crop rectangle;
- 3. If the user does not immediately release the second click, they can drag the point to visually edit the shape and size of the rectangle;
- 4. If the point is dragged near the edge of the displayed image and the image is larger than the displayed portion, then the displayed portion will scroll to show the areas in the direction of the dragged point; and
- 5. Once selected, the new crop rectangle becomes the area displayed. The image within the selected rectangle can be scaled to the size of the viewport.
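The two-tap crop selection in steps 1-5 above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the function names (`normalize_rect`, `crop_rect`) and the row-major list-of-rows pixel representation are inventions of the example, and the sketch normalizes the two taps so the user may tap the corners in either order.

```python
def normalize_rect(p1, p2):
    """Return (left, top, right, bottom) from two opposite-corner taps,
    regardless of which corner the user tapped first."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

def crop_rect(pixels, p1, p2):
    """Crop a row-major 2D pixel array to the rectangle spanned by two taps."""
    left, top, right, bottom = normalize_rect(p1, p2)
    return [row[left:right + 1] for row in pixels[top:bottom + 1]]

# Example: a 5x5 "image" of values 0..24; taps at (1, 1) and (3, 2).
image = [[y * 5 + x for x in range(5)] for y in range(5)]
cropped = crop_rect(image, (1, 1), (3, 2))
```

Scaling the selected rectangle to the viewport (step 5) would then be an independent resampling pass over `cropped`.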
- The user can choose to annotate the image as follows:
- 1. The user can choose a type of an annotation shape (e.g., line, rectangle, ellipse), and characteristics of the annotation shape such as line type (e.g., dashed), line width, and line color, etc.;
- 2. The user can click once on a point in the image, displaying a point where the click occurred, and then click again at another point in the image;
- 3. An annotation shape of the appropriate type is displayed over the image with the two clicks at opposite corners of the shape's bounding rectangle. When the user releases the second click, including immediately releasing it, this shape and its location on the image are saved;
- 4. If the user does not immediately release the second click, they can drag the point to visually edit the shape and its size; and
- 5. If the point is dragged near the edge of the displayed image and the image is larger than the displayed portion, then the displayed portion will scroll to show the areas in the direction of the dragged point, and the portion of the shape in the displayed area will be shown.
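The annotation steps above can likewise be sketched as a record built from the shape choice and the two taps. The `Annotation` class and `make_annotation` helper are assumptions of this example; the key point, consistent with the text, is that the shape, its characteristics, and its bounding rectangle are saved as data over the image rather than burned into the pixels.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    shape: str        # e.g. "line", "rectangle", "ellipse"
    line_type: str    # e.g. "solid", "dashed"
    line_width: int
    line_color: str
    bounds: tuple     # (left, top, right, bottom) bounding rectangle

def make_annotation(shape, p1, p2, line_type="solid", line_width=1, line_color="red"):
    """Build an annotation whose bounding rectangle spans the two taps,
    in either tap order."""
    (x1, y1), (x2, y2) = p1, p2
    bounds = (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
    return Annotation(shape, line_type, line_width, line_color, bounds)

# A dashed ellipse whose bounding box spans taps at (40, 10) and (10, 30).
note = make_annotation("ellipse", (40, 10), (10, 30), line_type="dashed")
```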
- The invention may be used in an industrial inspection compliance system with which various methods can be carried out to the effect of assisting in an inspection and providing the means for compliance verification of a proper inspection. For the purposes of the text describing this invention, an inspection may represent the process of checking a physical component for safety, security, or business reasons, doing the same for compliance with industrial standards and guidelines, or a maintenance operation on a physical component for those same reasons. These methods can generally be best executed by a multi-function handheld device, carried to and used in the physical proximity of an inspection component by the inspector. Examples of multi-function handheld devices include the Apple iPhone®, the Psion Teklogix Workabout Pro®, the Motorola MC-75®, and the like, but the present invention is not limited to such devices as shown or described here. One embodiment of the inspection compliance method includes the steps of scanning unique machine-readable tags deployed at logical inspection points defined by the inspector, and assigning a timestamp to the scanning operation; taking media samples of logical inspection points defined by the inspector, and assigning a timestamp to the media sample capturing operation; reporting of sub-optimal conditions of the unique machine-readable tags deployed at logical inspection points if their condition warrants such a declaration; associating a media sample with a corresponding scan of a unique machine-readable tag; and annotating a media sample in such ways that substantiate statements of an industrial component passing inspection, or in such ways that substantiate statements of problems found with the industrial component. See U.S. Ser. No. 12/489,313 for more details of an example of an industrial inspection compliance system to which the present invention may be applied.
- The invention is discussed below with reference to
FIGS. 1-11. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes, as the invention extends beyond these limited embodiments. -
FIG. 1 shows a prior art method of cropping an image in a desktop or other like environment using a mouse or other like peripheral device. Window 1 (101) shows an image of a plant 108, which could be any image or photograph, previously stored in memory or taken live right before the cropping operation. A mouse 103 is used to click at a point on the screen 102 (shown as dashed cross hair 102), then to perform a drag operation 110 (shown as dashed line 110) while holding down the mouse button to another point on the screen 106 (shown as solid cross hair 106), and then to release the mouse button at the point 106. Points LLH 102 and URH 106 define the bounding box. This operation is cumbersome and requires multiple mouse operations, and furthermore is generally only useable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in a handheld or other field device, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable. - In order to solve the inherent limitations in the prior art method described in
FIG. 1, the inventors have invented a novel method, system, and apparatus to facilitate on-site image cropping. FIG. 2 shows a flowchart for cropping an image using a simplified process on a handheld device, in accordance with one embodiment of the present invention. Process 200 begins at step 202, where an image is either retrieved from memory, captured in real-time via an image capture device, or in some other way provided to the process 200. In step 204, the image is displayed on the touch sensitive display or other display of the handheld device. In step 206, the user may click or tap (using a finger, a stylus, a mouse, or other device) at an LLH location where the crop is to begin. In step 208, the user may click or tap (using the finger, the stylus, the mouse, or other device) at a URH location where the crop is to end. In step 210, the image is cropped between the LLH location and the URH location. Finally, in step 212, the cropped image is displayed for the user's confirmation. At this point (not shown), the user may cancel, undo, or accept the crop. The process ends in step 214. - The process described in
FIG. 2 is more particularly illustrated in relation to FIGS. 3A-3C. FIG. 3A shows the first step in the process for cropping the image on the handheld device, in accordance with the embodiment described in relation to FIG. 2. Screen 302 shows a screen or touch-sensitive area of a handheld device. Window 2 (304) shows one of many windows showing the image the user desires to crop. The user uses his or her hand 308 (or stylus, mouse, or other device) to click or tap at point 306 (shown as solid cross hair 306). The position of the click or tap 306 represents an LLH corner of the crop boundary. -
FIG. 3B shows the second step in the process for cropping the image on the handheld device. After clicking or tapping on point 306 (now shown as a dashed cross hair 306), the user may move his or her hand 308 (or stylus, mouse, or other device), shown as dashed motion line 310, to another location 312 (shown as solid cross hair 312) and click or tap a second time. The second tap at location 312 represents a URH corner of the crop boundary. -
FIG. 3C shows the third and final step in the process for cropping the image on the handheld device. After indicating a completion of a crop operation, such as by removing hand 308, the crop operation is performed in the background, and an updated or cropped image is displayed for the user's confirmation. Point 306 (shown as a dashed cross hair 306) and point 312 (now also shown as dashed cross hair 312) represent an LLH corner and a URH corner, respectively, of the crop boundary. - Therefore, as shown in FIGS. 2 and 3A-3C, a user of the present invention may implement a crop operation with very little hand motion and very little input into the device, which is highly desirable, or even mandatory, when operating in the field, for example, during an inspection operation.
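The edge-scrolling behavior described earlier (scrolling the displayed portion when the dragged point nears an edge of a view smaller than the image) can be sketched as follows. The viewport model, the `margin` and `step` parameters, and the function name are all assumptions made for illustration, not details from the specification.

```python
def scroll_viewport(view_x, view_y, view_w, view_h, img_w, img_h,
                    drag, margin=10, step=5):
    """When a dragged point nears the edge of the displayed portion and the
    image is larger than that portion, scroll the view toward the drag."""
    dx, dy = drag
    if img_w > view_w:                       # only scroll if there is hidden area
        if dx < view_x + margin:
            view_x = max(0, view_x - step)
        elif dx > view_x + view_w - margin:
            view_x = min(img_w - view_w, view_x + step)
    if img_h > view_h:
        if dy < view_y + margin:
            view_y = max(0, view_y - step)
        elif dy > view_y + view_h - margin:
            view_y = min(img_h - view_h, view_y + step)
    return view_x, view_y

# Dragging near the right edge of a 100-px-wide viewport over a 300-px image
# nudges the view rightward; a drag in the middle leaves it unchanged.
new_x, new_y = scroll_viewport(0, 0, 100, 100, 300, 100, drag=(95, 50))
```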
- Now turning to annotation of images,
FIG. 4 shows a prior art method of annotating an image in a desktop or other like environment using a mouse or other like peripheral device. Window 1 (401) shows an image of a plant 408, which could be any image or photograph, previously stored in memory or taken live right before the cropping/annotating operation. A mouse 403 is used to click at a point on the screen 402 (shown as dashed cross hair 402), then to perform a drag operation 410 (shown as dashed line 410) while holding down the mouse button to another point on the screen 406 (shown as solid cross hair 406), and finally to release the mouse button at the point 406. Points LLH 402 and URH 406 define the bounding box. This operation is cumbersome and requires multiple mouse operations, and furthermore is generally only useable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in handheld or other field devices, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable. - In order to solve the inherent limitations in the prior art method described in
FIG. 4, the inventors have invented a novel method, system, and apparatus to facilitate on-site image annotation. FIG. 5 shows a flowchart for annotating an image using a simplified process on a handheld device, in accordance with another embodiment of the present invention. Process 500 begins at step 502, where an image is either retrieved from memory, captured in real-time via an image capture device, or in some other way provided to the process 500. In step 504, the image is displayed on the touch sensitive display or other display of the handheld device. In step 506, the user may click or tap (using a finger, a stylus, a mouse, or other device) at an LLH location where the annotation is to begin. In step 508, the user may click or tap (using the finger, the stylus, the mouse, or other device) at a URH location where the annotation is to end. In step 510, the image is annotated between the LLH location and the URH location. Finally, in step 512, the annotated image is displayed for the user's confirmation. At this point (not shown), the user may cancel, undo, or accept the annotation. The process ends in step 514. - The process described in
FIG. 5 is more particularly illustrated in relation to FIGS. 6A-6C. FIG. 6A shows the first step in the process for annotating the image on the handheld device, in accordance with the embodiment described in relation to FIG. 5. Screen 602 shows a screen or touch-sensitive area of a handheld device. Window 2 (604) shows one of many windows showing the image the user desires to annotate. The user uses his or her hand 608 (or stylus, mouse, or other device) to click or tap at point 606 (shown as solid cross hair 606). The position of the click or tap 606 represents an LLH corner of the annotation boundary. -
FIG. 6B shows the second step in the process for annotating the image on the handheld device. After clicking or tapping on point 606 (now shown as a dashed cross hair 606), the user may move his or her hand 608 (or stylus, mouse, or other device), shown as dashed motion line 610, to another location 612 (shown as solid cross hair 612) and click or tap a second time. The second tap at location 612 represents a URH corner of the annotation boundary. -
FIG. 6C shows the third and final step in the process for annotating the image on the handheld device. After indicating a completion of an annotation operation, such as by removing hand 608, an updated or annotated image is displayed for the user's confirmation. Point 606 (shown as a dashed cross hair 606) and point 612 (now also shown as dashed cross hair 612) represent an LLH corner and a URH corner, respectively, of the annotation boundary. - Therefore, as shown in FIGS. 5 and 6A-6C, a user of the present invention may implement an annotation operation with very little hand motion and very little input into the device, which is highly desirable, or even mandatory, when operating in the field, for example, during an inspection operation.
-
FIG. 7 shows a flowchart of another process of another embodiment of the present invention showing one aspect of one possible workflow using the principles of the present invention. Process 700 begins at step 702, where the user of a handheld device edits an image on the device. First, at step 704, the user opens the image for viewing, and then at step 706, the user makes an annotation on the image in the spirit of process 500 shown in FIG. 5. The user then proceeds to step 708, where he or she crops the image using the steps described in process 200 shown in FIG. 2. Then, after cropping the image, the user sees only a sub-area of the original image on the screen, as a result of step 212 of FIG. 2, whereby the cropped area is displayed to take up the full screen area of the device. At this point, the user decides that the annotation he or she made in step 706 was not correct, so he or she reverses that annotation at the click of an UNDO button in step 710. Then, in step 712, the user reverses the crop he or she made in step 708, by which point the image shown to the user looks exactly as it did in step 704. The user then crops the image in step 714 in a different fashion from the one he or she made previously in step 708, but once again using the steps described in process 200 shown in FIG. 2. Then, in the subsequent steps, the user makes one or more further annotations using process 500 shown in FIG. 5. The user is then satisfied with the edits he or she has made and ends the process at step 720. - The result of the series of actions illustrated in
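The undo/redo behavior in the workflow above follows from keeping the original image untouched and tracking edits as a list. The sketch below is one possible illustration of that idea; the class and its method names are assumptions of the example, not part of the specification.

```python
class EditHistory:
    """Nondestructive edit history: the original image is never modified;
    the current view is recomputed from the surviving edit list."""
    def __init__(self):
        self.edits = []        # applied edits (crops, annotations), in order
        self.redo_stack = []   # edits undone and available for redo

    def apply(self, edit):
        self.edits.append(edit)
        self.redo_stack.clear()   # a new edit invalidates the redo chain

    def undo(self):
        if self.edits:
            self.redo_stack.append(self.edits.pop())

    def redo(self):
        if self.redo_stack:
            self.edits.append(self.redo_stack.pop())

# Mirroring FIG. 7: annotate, crop, undo both, then crop differently.
h = EditHistory()
h.apply(("annotate", "circle"))
h.apply(("crop", (10, 10, 50, 50)))
h.undo()   # reverse the crop
h.undo()   # reverse the annotation; the view matches the original again
h.apply(("crop", (0, 0, 30, 30)))   # a different crop, as in step 714
```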
FIG. 7 may be stored by storing a reference to the original image, storing a final crop rectangle by reference to an LLH and a URH corner, and storing a list of annotations, each of which is also stored by reference to an LLH and a URH corner along with type information, such as annotation type, annotation color, etc., to overlay on the cropped image. -
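The storage-by-reference scheme just described might be serialized as a small structured record, for example as JSON. The field names, the example filename, and the coordinate values below are purely illustrative assumptions; the substantive point, taken from the text, is that only a reference to the original image, the crop corners, and the annotation list are persisted.

```python
import json

# Hypothetical serialized form of an edited image: the original file is
# referenced, never rewritten; the crop and annotations are stored as data.
# (LLH here has the larger y because screen y typically grows downward.)
edited = {
    "original": "IMG_0042.jpg",             # reference to the untouched image
    "crop": {"llh": [10, 120], "urh": [200, 20]},
    "annotations": [
        {"type": "ellipse", "color": "red", "line": "dashed",
         "llh": [30, 90], "urh": [80, 60]},
    ],
}

blob = json.dumps(edited)      # what might be persisted on the device
restored = json.loads(blob)    # what a viewer would reload and re-render
```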
FIG. 8 is an illustration of a multi-functional handheld device 800, in which some of the software and hardware components of the present invention could reside, in accordance with yet another embodiment of the present invention. The handheld device 800 contains a screen or display 802, which may be a touch-sensitive display, for displaying an image to be cropped and/or annotated with overlaid objects. The handheld device 800 also contains a toolbar 806 that contains iconographic buttons for each function that a user may execute during a process of taking and editing an image. Some possible selectable actions include, but are not limited to, from top to bottom and left to right: "take a picture" 804 (first row, far left), undo, redo, zoom-in, zoom-out (first row, far right), delete/cancel (second row, far left), annotate with an arrow, annotate with a circle, annotate with a rectangle, annotate with a line, and crop (second row, far right). For example, if button 804 is pressed, the software activates the handheld device's digital camera and places the captured image in display screen 802. - The
illustrative user interface 800 is but one of many possible illustrative embodiments of the present invention. One of ordinary skill in the art would appreciate that any other configuration of objects in a user interface, as well as any possible extensions to the set of functions presented in the user interface 800, are all within the spirit and scope of the present invention. -
FIG. 9 is an illustration of one of many possible use-cases of the present invention in relation to carrying out an industrial inspection operation on a wind-farm or other energy project or other like site. FIG. 9 shows an inspector carrying out an inspection of wind turbine 902 and wind turbine 904. The inspector 906 is standing next to the tower and foundation sections of wind turbine 904. The inspector 906 is using an industrial inspection handheld device 908. Inspector 906 is more specifically in the process of using the industrial inspection handheld device 908, even more specifically having an embedded RFID reader, to scan RFID tag 912 on the tower section of wind turbine 904, via radio frequency communication channel 910. Since inspector 906 is within proximity of the inspected component, he is able to successfully scan the RFID tag 912 because it is within the range of radio frequency communication channel 910. If the inspector recognizes a potential problem with the foundation section of the wind turbine 904, the inspector may take a picture of the potential problem area, and then proceed to crop and annotate the problem area using the methods described in the present application. Since the inspector is in the field, the present invention is particularly suitable for helping the inspector complete the inspection in a timely, accurate, and cost-effective manner. - The illustration shown in
FIG. 9 is but one of many possible illustrative embodiments of the usage of the present invention. One of ordinary skill in the art would appreciate that many possible uses of the present invention are all within the spirit and scope of the present invention, including, but not limited to, renewable energy systems and distributed energy systems, including wind turbines, solar photovoltaic, solar thermal plants, co-generation plants, biomass-fueled power plants, carbon sequestration projects, enhanced oil recovery systems, and the like. -
FIG. 10 is an illustration of yet another possible use-case of the present invention in relation to a hand-held device with a camera and a touch-sensitive display, such as an Apple iPhone® 1000 or other like device. Users of an Apple iPhone® 1000 may wish to crop and/or annotate an image either taken by the iPhone® 1000, received from another user, or in some other way obtained on the iPhone® 1000. The present invention is particularly suitable for use with an iPhone®, since an iPhone® as currently practiced does not contain a useful or easy mechanism for cropping or annotating images. -
FIG. 11 is an illustration of yet another possible use-case of the present invention in relation to a hand-held camera with a touch-sensitive display 1100. Users of a digital camera 1100 may wish to crop and/or annotate an image taken by the digital camera 1100. The present invention is particularly suitable for use with a digital camera, especially a digital camera as shown in FIG. 11 with a touch-sensitive display device 1100, since digital cameras as currently practiced do not contain a useful or easy mechanism for cropping or annotating images on-site and instead require uploading the images to a computer for further desktop processing to crop and annotate the images. - While the methods disclosed herein have been described and shown with reference to particular operations performed in a particular order, it will be understood that these operations may be combined, sub-divided, or re-ordered to form equivalent methods without departing from the teachings of the present invention. Accordingly, unless specifically indicated herein, the order and grouping of the operations is not a limitation of the present invention.
- While the invention has been particularly shown and described with reference to embodiments thereof, it will be understood by those skilled in the art that various other changes in the form and details may be made without departing from the spirit and scope of the invention.
Claims (22)
1. A method for cropping an image file, comprising the steps of:
displaying an image of the image file to be cropped in a display area;
receiving a first input from a user designating a first point in the image defining a corner of a crop rectangle;
receiving a second input from the user designating a second point in the image defining an opposite corner of the crop rectangle; and
cropping the image to the crop rectangle defined by the first point and the second point to create a cropped image when the second input is released.
2. The method as recited in claim 1, further comprising:
displaying on the image a location of the first point.
3. The method as recited in claim 1, further comprising:
displaying a rectangle overlaid over the image corresponding to the crop rectangle before cropping the image.
4. The method as recited in claim 1, wherein if the user does not immediately release the second input, allowing the user to drag the second point to visually edit a shape and a size of the crop rectangle.
5. The method as recited in claim 1, wherein if the user drags the second point near an edge of a displayed portion and the image is larger than the displayed portion, then scrolling the displayed portion to show a portion of the image in a direction of the dragged point.
6. The method as recited in claim 1, further comprising:
displaying the cropped image in the display area in place of the original image.
7. The method as recited in claim 6, further comprising:
scaling the cropped image to fill the entire display area.
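The two-point crop of claims 1 through 7 can be sketched in code. The following is a minimal illustration using the Pillow library; the function name, the corner normalization, and the optional scaling step (claim 7) are assumptions for illustration, not part of the claims:

```python
from PIL import Image

def crop_between_points(image, first_point, second_point, display_size=None):
    """Crop `image` to the rectangle defined by two opposite corner points.

    The user may drag in any direction, so the two corners are normalized
    (min/max) before cropping.  If `display_size` is given, the cropped
    image is scaled to fill that area, as in claim 7.
    """
    (x1, y1), (x2, y2) = first_point, second_point
    left, right = sorted((x1, x2))
    top, bottom = sorted((y1, y2))
    cropped = image.crop((left, top, right, bottom))
    if display_size is not None:
        cropped = cropped.resize(display_size)
    return cropped

# Usage: the second corner precedes the first, yet the crop still succeeds.
img = Image.new("RGB", (100, 100), "white")
result = crop_between_points(img, (70, 80), (10, 20))
print(result.size)  # (60, 60)
```

Normalizing the corners first is what lets the first input land on any corner of the intended rectangle and the second input on the opposite one.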
8. A method of annotating an image file, comprising the steps of:
displaying an image of the image file to be annotated in a display area;
receiving a first input from a user designating a first point in the image defining a corner of an annotation rectangle;
receiving a second input from the user designating a second point in the image defining an opposite corner of the annotation rectangle; and
annotating the image from the first point to the second point of the annotation rectangle to create an annotated image when the second input is released.
9. The method as recited in claim 8, further comprising:
displaying on the image a location of the first point.
10. The method as recited in claim 8, further comprising:
displaying a shape corresponding to the annotation rectangle of the image before annotating the image.
11. The method as recited in claim 8, wherein if the user does not immediately release the second input, allowing the user to drag the second point to visually show a shape and a size of the annotation rectangle.
12. The method as recited in claim 8, wherein if the user drags the second point near an edge of a displayed portion and the image is larger than the displayed portion, then scrolling the displayed portion to show a portion of the image in a direction of the dragged point.
13. The method as recited in claim 8, further comprising:
displaying the annotated image in the display area in place of the original image.
14. The method as recited in claim 8, further comprising:
receiving a third input representing a type of shape and a characteristic of the shape corresponding to the annotation rectangle.
15. The method as recited in claim 14, wherein the shape is selected from the group consisting of a line, a rectangle, an ellipse, and a circle.
16. The method as recited in claim 14, wherein the characteristic of the shape is selected from the group consisting of a line type, a line width, and a line color.
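The annotation method of claims 8 through 16 can likewise be sketched with Pillow's ImageDraw module. The shape types and characteristics follow claims 15 and 16; the function name, defaults, and dispatch logic are illustrative assumptions:

```python
from PIL import Image, ImageDraw

def annotate_between_points(image, first_point, second_point,
                            shape="rectangle", color="red", width=3):
    """Draw an annotation into the bounding box defined by two opposite
    corner points (claims 8 and 14-16).  `shape` selects the shape type
    and `color`/`width` are its characteristics (claims 15-16)."""
    (x1, y1), (x2, y2) = first_point, second_point
    left, right = sorted((x1, x2))
    top, bottom = sorted((y1, y2))
    annotated = image.copy()  # annotate a copy; the original image survives
    draw = ImageDraw.Draw(annotated)
    if shape == "line":
        draw.line([first_point, second_point], fill=color, width=width)
    elif shape == "ellipse":
        draw.ellipse([left, top, right, bottom], outline=color, width=width)
    else:
        # rectangle; a circle is an ellipse drawn in a square bounding box
        draw.rectangle([left, top, right, bottom], outline=color, width=width)
    return annotated
```

Drawing into a copy mirrors claim 13: the annotated image can replace the original in the display area while the underlying image file remains available for undo.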
17. A touch-sensitive hand-held system having a capability of cropping and annotating an image file, comprising:
at least one processor; and
at least one or more memories, operatively coupled to the processor, and containing program code, which when executed causes the processor to execute a process comprising the steps of:
displaying an image of the image file to be cropped or annotated in a display area;
receiving a first input from a user designating a first point in the image defining a corner of a crop rectangle;
receiving a second input from the user designating a second point in the image defining an opposite corner of the crop rectangle;
cropping the image from the first point to the second point of the crop rectangle when the second input is released to form a cropped image;
displaying the cropped image in the display area in place of the original image;
receiving a third input from the user designating a third point in the cropped image defining a corner of an annotation rectangle;
receiving a fourth input from the user designating a fourth point in the cropped image defining an opposite corner of the annotation rectangle; and
annotating the cropped image from the third point to the fourth point of the annotation rectangle when the fourth input is released to form an annotated cropped image.
18. The system as recited in claim 17, further containing program code, which when executed causes the processor to execute a process further comprising the step of:
displaying on the image a location of the first point and a location of the third point.
19. The system as recited in claim 17, further containing program code, which when executed causes the processor to execute a process further comprising the steps of:
displaying a rectangle overlaid over the image corresponding to the crop rectangle before cropping the image; and
displaying a shape overlaid over the image corresponding to the annotation rectangle before annotating the image.
20. The system as recited in claim 17, wherein if the user does not immediately release the second input or the fourth input, allowing the user to drag the second point or the fourth point to visually show a shape and a size of the crop rectangle or the annotation rectangle.
21. The system as recited in claim 17, wherein if the user drags the second point or the fourth point near an edge of a displayed portion and the image is larger than the displayed portion, then scrolling the displayed portion to show a portion of the image in a direction of the dragged point.
22. The system as recited in claim 17, further containing program code, which when executed causes the processor to execute a process further comprising the step of:
displaying the annotated cropped image in the display area in place of the cropped image.
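The edge-scrolling behavior recited in claims 5, 12, and 21 can be sketched as a viewport update: when the dragged point nears a viewport edge and the image extends past the viewport in that direction, the viewport origin shifts toward the drag. A minimal pure-Python sketch; all names and the margin/step values are illustrative assumptions:

```python
def scroll_if_near_edge(drag_point, view_origin, view_size, image_size,
                        margin=10, step=5):
    """Return the new viewport origin after one drag update.

    If `drag_point` (in image coordinates) is within `margin` pixels of a
    viewport edge and the image extends beyond the viewport on that side,
    the origin moves `step` pixels in the direction of the dragged point,
    clamped so the viewport never leaves the image (claims 5, 12, 21).
    """
    x, y = drag_point
    ox, oy = view_origin
    vw, vh = view_size
    iw, ih = image_size
    if x <= ox + margin and ox > 0:
        ox = max(0, ox - step)                 # scroll left
    elif x >= ox + vw - margin and ox + vw < iw:
        ox = min(iw - vw, ox + step)           # scroll right
    if y <= oy + margin and oy > 0:
        oy = max(0, oy - step)                 # scroll up
    elif y >= oy + vh - margin and oy + vh < ih:
        oy = min(ih - vh, oy + step)           # scroll down
    return (ox, oy)
```

In an interactive loop this function would run on every drag event, so holding the dragged point near an edge scrolls continuously until the image boundary is reached.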
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/507,039 US20100149211A1 (en) | 2008-12-15 | 2009-07-21 | System and method for cropping and annotating images on a touch sensitive display device |
US12/767,077 US20100194781A1 (en) | 2008-12-15 | 2010-04-26 | System and method for cropping and annotating images on a touch sensitive display device |
PCT/US2010/039472 WO2011005513A1 (en) | 2009-06-22 | 2010-06-22 | An inspection compliance system using a handheld device, and related methods for facilitating inspections and maintenance operations |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12263208P | 2008-12-15 | 2008-12-15 | |
US12/507,039 US20100149211A1 (en) | 2008-12-15 | 2009-07-21 | System and method for cropping and annotating images on a touch sensitive display device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/767,077 Continuation US20100194781A1 (en) | 2008-12-15 | 2010-04-26 | System and method for cropping and annotating images on a touch sensitive display device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100149211A1 true US20100149211A1 (en) | 2010-06-17 |
Family
ID=42239966
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/489,313 Abandoned US20100153168A1 (en) | 2008-12-15 | 2009-06-22 | System and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device |
US12/507,039 Abandoned US20100149211A1 (en) | 2008-12-15 | 2009-07-21 | System and method for cropping and annotating images on a touch sensitive display device |
US12/507,071 Expired - Fee Related US8032830B2 (en) | 2008-12-15 | 2009-07-22 | System and method for generating quotations from a reference document on a touch sensitive display device |
US12/758,134 Abandoned US20100185549A1 (en) | 2008-12-15 | 2010-04-12 | System and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device |
US12/767,077 Abandoned US20100194781A1 (en) | 2008-12-15 | 2010-04-26 | System and method for cropping and annotating images on a touch sensitive display device |
US12/832,903 Expired - Fee Related US7971140B2 (en) | 2008-12-15 | 2010-07-08 | System and method for generating quotations from a reference document on a touch sensitive display device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/489,313 Abandoned US20100153168A1 (en) | 2008-12-15 | 2009-06-22 | System and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/507,071 Expired - Fee Related US8032830B2 (en) | 2008-12-15 | 2009-07-22 | System and method for generating quotations from a reference document on a touch sensitive display device |
US12/758,134 Abandoned US20100185549A1 (en) | 2008-12-15 | 2010-04-12 | System and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device |
US12/767,077 Abandoned US20100194781A1 (en) | 2008-12-15 | 2010-04-26 | System and method for cropping and annotating images on a touch sensitive display device |
US12/832,903 Expired - Fee Related US7971140B2 (en) | 2008-12-15 | 2010-07-08 | System and method for generating quotations from a reference document on a touch sensitive display device |
Country Status (1)
Country | Link |
---|---|
US (6) | US20100153168A1 (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100171712A1 (en) * | 2009-01-05 | 2010-07-08 | Cieplinski Avi E | Device, Method, and Graphical User Interface for Manipulating a User Interface Object |
US20120023435A1 (en) * | 2010-07-23 | 2012-01-26 | Adolph Johannes Kneppers | Method for Inspecting a Physical Asset |
US20120162228A1 (en) * | 2010-12-24 | 2012-06-28 | Sony Corporation | Information processor, image data optimization method and program |
US20120210200A1 (en) * | 2011-02-10 | 2012-08-16 | Kelly Berger | System, method, and touch screen graphical user interface for managing photos and creating photo books |
US20130024805A1 (en) * | 2011-07-19 | 2013-01-24 | Seunghee In | Mobile terminal and control method of mobile terminal |
US20130063369A1 (en) * | 2011-09-14 | 2013-03-14 | Verizon Patent And Licensing Inc. | Method and apparatus for media rendering services using gesture and/or voice control |
US20130083176A1 (en) * | 2010-05-31 | 2013-04-04 | Pfu Limited | Overhead scanner device, image processing method, and computer-readable recording medium |
WO2013109244A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Three dimensional contriver tool for modeling with multi-touch devices |
WO2013109245A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Dynamic creation and modeling of solid models |
WO2013109246A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Gestures and tools for creating and editing solid models |
US8671361B2 (en) * | 2012-05-24 | 2014-03-11 | Blackberry Limited | Presentation of image on display screen with combination crop and rotation and with auto-resizing of crop field |
US8860675B2 (en) | 2011-07-12 | 2014-10-14 | Autodesk, Inc. | Drawing aid system for multi-touch devices |
US8860726B2 (en) | 2011-04-12 | 2014-10-14 | Autodesk, Inc. | Transform manipulator control |
US8902222B2 (en) | 2012-01-16 | 2014-12-02 | Autodesk, Inc. | Three dimensional contriver tool for modeling with multi-touch devices |
US8947429B2 (en) | 2011-04-12 | 2015-02-03 | Autodesk, Inc. | Gestures and tools for creating and editing solid models |
US9182882B2 (en) | 2011-04-12 | 2015-11-10 | Autodesk, Inc. | Dynamic creation and modeling of solid models |
GB2554121A (en) * | 2016-06-27 | 2018-03-28 | Moletest Ltd | Image processing |
US10564826B2 (en) | 2009-09-22 | 2020-02-18 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10904426B2 (en) | 2006-09-06 | 2021-01-26 | Apple Inc. | Portable electronic device for photo management |
US11307737B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | Media browsing user interface with intelligently selected representative media items |
US11334229B2 (en) | 2009-09-22 | 2022-05-17 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US11446548B2 (en) | 2020-02-14 | 2022-09-20 | Apple Inc. | User interfaces for workout content |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090048949A1 (en) * | 2007-08-16 | 2009-02-19 | Facility Audit Solutions, Llc | System and method for managing photographs from site audits of facilities |
US8013738B2 (en) | 2007-10-04 | 2011-09-06 | Kd Secure, Llc | Hierarchical storage manager (HSM) for intelligent storage of large volumes of data |
WO2009045218A1 (en) | 2007-10-04 | 2009-04-09 | Donovan John J | A video surveillance, storage, and alerting system having network management, hierarchical data storage, video tip processing, and vehicle plate analysis |
TWI497357B (en) * | 2009-04-23 | 2015-08-21 | Waltop Int Corp | Multi-touch pad control method |
JP5607726B2 (en) * | 2009-08-21 | 2014-10-15 | トムソン ライセンシング | Method, apparatus, and program for adjusting parameters on user interface screen |
US8799775B2 (en) * | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for displaying emphasis animations for an electronic document in a presentation mode |
JP5656585B2 (en) * | 2010-02-17 | 2015-01-21 | キヤノン株式会社 | Document creation support apparatus, document creation support method, and program |
US7793850B1 (en) * | 2010-03-14 | 2010-09-14 | Kd Secure Llc | System and method used for configuration of an inspection compliance tool with machine readable tags and their associations to inspected components |
JP5459031B2 (en) * | 2010-04-13 | 2014-04-02 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US8376450B1 (en) * | 2010-08-13 | 2013-02-19 | Kodiak Innovations, LLC | Apparatus and method for mounting an aerodynamic add-on device onto a transport vehicle |
US20120185787A1 (en) * | 2011-01-13 | 2012-07-19 | Microsoft Corporation | User interface interaction behavior based on insertion point |
MY185001A (en) * | 2011-06-06 | 2021-04-30 | Paramit Corp | Computer directed assembly method and system for manufacturing |
US20130009785A1 (en) * | 2011-07-07 | 2013-01-10 | Finn Clayton L | Visual and Audio Warning System Including Test Ledger for Automated Door |
US20130021138A1 (en) * | 2011-07-20 | 2013-01-24 | GM Global Technology Operations LLC | Method of evaluating structural integrity of a vehicle component with radio frequency identification tags and system for same |
US20130086933A1 (en) * | 2011-10-07 | 2013-04-11 | Colleen M. Holtkamp | Controller for a medical products storage system |
TWI475473B (en) * | 2012-02-17 | 2015-03-01 | Mitac Int Corp | Method for generating split screen according to a touch gesture |
US20130227457A1 (en) * | 2012-02-24 | 2013-08-29 | Samsung Electronics Co. Ltd. | Method and device for generating captured image for display windows |
KR102304700B1 (en) * | 2012-02-24 | 2021-09-28 | 삼성전자주식회사 | Method and device for generating capture image for display windows |
US9299019B2 (en) * | 2012-03-14 | 2016-03-29 | Trimble Navigation Limited | Systems for data collection |
CN102662525A (en) * | 2012-04-27 | 2012-09-12 | 上海量明科技发展有限公司 | Method and terminal for carrying out screenshot operation through touch screen |
US20220335385A1 (en) * | 2012-06-07 | 2022-10-20 | Procore Technologies, Inc. | System and method for systematic presentation and ordering of documents based on triggers |
US20170024695A1 (en) * | 2013-12-24 | 2017-01-26 | Scott Gerard Wolfe, Jr. | System and method for systematic presentation and ordering of documents based on triggers |
US20140172684A1 (en) * | 2012-12-14 | 2014-06-19 | Scott Gerard Wolfe, Jr. | System and Method to Utilize Presumptions, Database Information, and/or User Defaults to Calculate Construction Lien, Notice, Bond Claim, and Other Construction Document Deadlines and Requirements |
US20170186105A1 (en) * | 2012-08-30 | 2017-06-29 | Smith and Turner Development LLC | Method to inspect equipment |
US9705835B2 (en) * | 2012-11-02 | 2017-07-11 | Pandexio, Inc. | Collaboration management systems |
US10325298B2 (en) * | 2013-01-22 | 2019-06-18 | General Electric Company | Systems and methods for a non-destructive testing ecosystem |
US20140281895A1 (en) * | 2013-03-15 | 2014-09-18 | Kah Seng Tay | Techniques for embedding quotes of content |
US20140344077A1 (en) * | 2013-03-15 | 2014-11-20 | Contact Marketing Services, Inc. | Used industrial equipment sales application suites, systems, and related apparatus and methods |
CN110490458A (en) | 2013-03-20 | 2019-11-22 | 生活时间品牌公司 | A kind of moving mass control inspection system |
US20140330731A1 (en) * | 2013-05-06 | 2014-11-06 | RDH Environmental Services, LLC | System and method for managing backflow prevention assembly test data |
US9652460B1 (en) | 2013-05-10 | 2017-05-16 | FotoIN Mobile Corporation | Mobile media information capture and management methods and systems |
CN104729714A (en) * | 2013-12-18 | 2015-06-24 | 上海宝钢工业技术服务有限公司 | Spot inspection instrument device based on Android phone platform |
US10417763B2 (en) | 2014-07-25 | 2019-09-17 | Samsung Electronics Co., Ltd. | Image processing apparatus, image processing method, x-ray imaging apparatus and control method thereof |
US10672089B2 (en) * | 2014-08-19 | 2020-06-02 | Bert L. Howe & Associates, Inc. | Inspection system and related methods |
US10671275B2 (en) | 2014-09-04 | 2020-06-02 | Apple Inc. | User interfaces for improving single-handed operation of devices |
US11562448B2 (en) * | 2015-04-27 | 2023-01-24 | First Advantage Corporation | Device and method for performing validation and authentication of a physical structure or physical object |
US9329762B1 (en) * | 2015-06-02 | 2016-05-03 | Interactive Memories, Inc. | Methods and systems for reversing editing operations in media-rich projects |
CN106325663B (en) * | 2015-06-27 | 2019-09-17 | 南昌欧菲光科技有限公司 | Mobile terminal and its screenshotss method |
CN105151436B (en) * | 2015-09-03 | 2017-03-22 | 温州智信机电科技有限公司 | Spark plug sheath sleeving machine with function of material detection, and reliable working |
US11805170B2 (en) * | 2015-10-10 | 2023-10-31 | David Sean Capps | Fire service and equipment inspection test and maintenance system |
US20220188955A1 (en) * | 2015-10-10 | 2022-06-16 | David Sean Capps | Fire Service and Equipment Inspection Test and Maintenance System and Method |
WO2017069979A1 (en) | 2015-10-19 | 2017-04-27 | University Of North Texas | Dynamic reverse gas stack model for portable chemical detection devices to locate threat and point-of-source from effluent streams |
DE102015221313A1 (en) * | 2015-10-30 | 2017-05-04 | Siemens Aktiengesellschaft | System and procedure for the maintenance of a plant |
US11095502B2 (en) * | 2017-11-03 | 2021-08-17 | Otis Elevator Company | Adhoc protocol for commissioning connected devices in the field |
CN108205412B (en) * | 2017-11-09 | 2019-10-11 | 中兴通讯股份有限公司 | A kind of method and apparatus for realizing screenshotss |
US10977874B2 (en) * | 2018-06-11 | 2021-04-13 | International Business Machines Corporation | Cognitive learning for vehicle sensor monitoring and problem detection |
WO2020027813A1 (en) * | 2018-07-31 | 2020-02-06 | Hewlett-Packard Development Company, L.P. | Cropping portions of images |
US11294556B1 (en) * | 2021-01-05 | 2022-04-05 | Adobe Inc. | Editing digital images using multi-panel graphical user interfaces |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6304271B1 (en) * | 1999-02-05 | 2001-10-16 | Sony Corporation | Apparatus and method for cropping an image in a zooming graphical user interface |
US20020172498A1 (en) * | 2001-05-18 | 2002-11-21 | Pentax Precision Instrument Corp. | Computer-based video recording and management system for medical diagnostic equipment |
US20040056869A1 (en) * | 2002-07-16 | 2004-03-25 | Zeenat Jetha | Using detail-in-context lenses for accurate digital image cropping and measurement |
US20040095375A1 (en) * | 2002-05-10 | 2004-05-20 | Burmester Christopher Paul | Method of and apparatus for interactive specification of manufactured products customized with digital media |
US20040128613A1 (en) * | 2002-10-21 | 2004-07-01 | Sinisi John P. | System and method for mobile data collection |
US20060107302A1 (en) * | 2004-11-12 | 2006-05-18 | Opentv, Inc. | Communicating primary content streams and secondary content streams including targeted advertising to a remote unit |
US20100171826A1 (en) * | 2006-04-12 | 2010-07-08 | Store Eyes, Inc. | Method for measuring retail display and compliance |
Family Cites Families (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69104770T2 (en) * | 1990-01-11 | 1995-03-16 | Toshiba Kawasaki Kk | DEVICE TO SUPPORT INSTALLATION INSPECTION. |
US6262732B1 (en) * | 1993-10-25 | 2001-07-17 | Scansoft, Inc. | Method and apparatus for managing and navigating within stacks of document pages |
US7028044B2 (en) | 1994-12-22 | 2006-04-11 | University Of Utah Research Foundation | Highlighting quoted passages in a hypertext system |
US5732230A (en) * | 1995-05-19 | 1998-03-24 | Richo Company Ltd. | Computer user interface for manipulating image fragments using drag, drop and merge operations |
US6665490B2 (en) * | 1998-04-01 | 2003-12-16 | Xerox Corporation | Obtaining and using data associating annotating activities with portions of recordings |
US6283759B1 (en) * | 1998-11-13 | 2001-09-04 | R. J. Price | System for demonstrating compliance with standards |
US20020023109A1 (en) * | 1999-12-30 | 2002-02-21 | Lederer Donald A. | System and method for ensuring compliance with regulations |
US20010047283A1 (en) * | 2000-02-01 | 2001-11-29 | Melick Bruce D. | Electronic system for identification, recording, storing, and retrieving material handling equipment records and certifications |
US20020025085A1 (en) * | 2000-04-19 | 2002-02-28 | Ipads.Com, Inc. | Computer-controlled system and method for generating a customized imprinted item |
US7038714B1 (en) * | 2000-05-16 | 2006-05-02 | Eastman Kodak Company | Printing system and method having a digital printer that uses a digital camera image display |
AU2001288956A1 (en) * | 2000-09-07 | 2002-03-22 | Praeses Corporation | System and method for an online jurisdiction manager |
US8198986B2 (en) * | 2001-11-13 | 2012-06-12 | Ron Craik | System and method for storing and retrieving equipment inspection and maintenance data |
US9760235B2 (en) * | 2001-06-12 | 2017-09-12 | Callahan Cellular L.L.C. | Lens-defined adjustment of displays |
US6772098B1 (en) * | 2001-07-11 | 2004-08-03 | General Electric Company | Systems and methods for managing inspections |
US6587768B2 (en) * | 2001-08-08 | 2003-07-01 | Meritor Heavy Vehicle Technology, Llc | Vehicle inspection and maintenance system |
US6671646B2 (en) * | 2001-09-11 | 2003-12-30 | Zonar Compliance Systems, Llc | System and process to ensure performance of mandated safety and maintenance inspections |
US8400296B2 (en) * | 2001-09-11 | 2013-03-19 | Zonar Systems, Inc. | Method and apparatus to automate data collection during a mandatory inspection |
US7557696B2 (en) * | 2001-09-11 | 2009-07-07 | Zonar Systems, Inc. | System and process to record inspection compliance data |
US20030069716A1 (en) * | 2001-10-09 | 2003-04-10 | Martinez David Frederick | System & method for performing field inspection |
US20030131011A1 (en) * | 2002-01-04 | 2003-07-10 | Argent Regulatory Services, L.L.C. | Online regulatory compliance system and method for facilitating compliance |
CA2394268A1 (en) * | 2002-02-14 | 2003-08-14 | Beyond Compliance Inc. | A compliance management system |
US20050228688A1 (en) * | 2002-02-14 | 2005-10-13 | Beyond Compliance Inc. | A compliance management system |
US20030229858A1 (en) * | 2002-06-06 | 2003-12-11 | International Business Machines Corporation | Method and apparatus for providing source information from an object originating from a first document and inserted into a second document |
US8120624B2 (en) * | 2002-07-16 | 2012-02-21 | Noregin Assets N.V. L.L.C. | Detail-in-context lenses for digital image cropping, measurement and online maps |
US6859757B2 (en) * | 2002-07-31 | 2005-02-22 | Sap Aktiengesellschaft | Complex article tagging with maintenance related information |
US20040177326A1 (en) * | 2002-10-21 | 2004-09-09 | Bibko Peter N. | Internet/intranet software system to audit and manage compliance |
US7356393B1 (en) * | 2002-11-18 | 2008-04-08 | Turfcentric, Inc. | Integrated system for routine maintenance of mechanized equipment |
US20050065842A1 (en) * | 2003-07-28 | 2005-03-24 | Richard Summers | System and method for coordinating product inspection, repair and product maintenance |
US20050111699A1 (en) * | 2003-11-24 | 2005-05-26 | Emil Gran | Suite of parking regulation control systems |
US7536278B2 (en) * | 2004-05-27 | 2009-05-19 | International Electronic Machines Corporation | Inspection method, system, and program product |
US7454050B2 (en) * | 2004-06-18 | 2008-11-18 | Csi Technology, Inc. | Method of automating a thermographic inspection process |
US7743348B2 (en) * | 2004-06-30 | 2010-06-22 | Microsoft Corporation | Using physical objects to adjust attributes of an interactive display application |
US20060132291A1 (en) * | 2004-11-17 | 2006-06-22 | Dourney Charles Jr | Automated vehicle check-in inspection method and system with digital image archiving |
US20060132836A1 (en) * | 2004-12-21 | 2006-06-22 | Coyne Christopher R | Method and apparatus for re-sizing image data |
US7834876B2 (en) * | 2004-12-28 | 2010-11-16 | The Mathworks, Inc. | Providing graphic generating capabilities for a model based development process |
US20060218492A1 (en) * | 2005-03-22 | 2006-09-28 | Andrade Jose O | Copy and paste with citation attributes |
US7869944B2 (en) * | 2005-04-18 | 2011-01-11 | Roof Express, Llc | Systems and methods for recording and reporting data collected from a remote location |
US20060235741A1 (en) * | 2005-04-18 | 2006-10-19 | Dataforensics, Llc | Systems and methods for monitoring and reporting |
US7324905B2 (en) * | 2005-05-11 | 2008-01-29 | Robert James Droubie | Apparatus, system and method for automating an interactive inspection process |
US7357571B2 (en) * | 2005-07-01 | 2008-04-15 | Predictive Service, Llc | Infrared inspection and reporting process |
US20070027704A1 (en) * | 2005-07-28 | 2007-02-01 | Simplikate Systems, L.L.C. | System and method for community association violation tracking and processing |
US20070050177A1 (en) * | 2005-08-25 | 2007-03-01 | Kirkland James R Jr | Methods and Systems for Accessing Code Information |
US7851758B1 (en) * | 2005-09-29 | 2010-12-14 | Flir Systems, Inc. | Portable multi-function inspection systems and methods |
US20070097089A1 (en) * | 2005-10-31 | 2007-05-03 | Battles Amy E | Imaging device control using touch pad |
US8368749B2 (en) * | 2006-03-27 | 2013-02-05 | Ge Inspection Technologies Lp | Article inspection apparatus |
WO2008054847A2 (en) * | 2006-04-03 | 2008-05-08 | 3M Innovative Properties Company | Vehicle inspection using radio frequency identification (rfid) |
JP4781883B2 (en) * | 2006-04-04 | 2011-09-28 | 株式会社日立製作所 | Information management method and information management system |
US8230362B2 (en) * | 2006-05-31 | 2012-07-24 | Manheim Investments, Inc. | Computer-assisted and/or enabled systems, methods, techniques, services and user interfaces for conducting motor vehicle and other inspections |
US20070288859A1 (en) * | 2006-06-07 | 2007-12-13 | Siemens Communications, Inc. | Method and apparatus for selective forwarding of e-mail and document content |
US20080021717A1 (en) * | 2006-06-08 | 2008-01-24 | Db Industries, Inc. | Method of Facilitating Controlled Flow of Information for Safety Equipment Items and Database Related Thereto |
US20080021905A1 (en) * | 2006-06-08 | 2008-01-24 | Db Industries, Inc. | Direct Data Input for Database for Safety Equipment Items and Method |
US20080021718A1 (en) * | 2006-06-08 | 2008-01-24 | Db Industries, Inc. | Centralized Database of Information Related to Inspection of Safety Equipment Items Inspection and Method |
US20080052377A1 (en) * | 2006-07-11 | 2008-02-28 | Robert Light | Web-Based User-Dependent Customer Service Interaction with Co-Browsing |
US8564544B2 (en) * | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US8106856B2 (en) * | 2006-09-06 | 2012-01-31 | Apple Inc. | Portable electronic device for photo management |
GB2452644B (en) * | 2006-10-10 | 2009-09-16 | Promethean Ltd | Automatic tool docking |
US8396280B2 (en) * | 2006-11-29 | 2013-03-12 | Honeywell International Inc. | Apparatus and method for inspecting assets in a processing or other environment |
US7865872B2 (en) * | 2006-12-01 | 2011-01-04 | Murex S.A.S. | Producer graph oriented programming framework with undo, redo, and abort execution support |
US7860268B2 (en) * | 2006-12-13 | 2010-12-28 | Graphic Security Systems Corporation | Object authentication using encoded images digitally stored on the object |
US11126966B2 (en) * | 2007-01-23 | 2021-09-21 | Tegris, Inc. | Systems and methods for a web based inspection compliance registry and communication tool |
KR101420419B1 (en) * | 2007-04-20 | 2014-07-30 | 엘지전자 주식회사 | Electronic Device And Method Of Editing Data Using the Same And Mobile Communication Terminal |
US20080275714A1 (en) * | 2007-05-01 | 2008-11-06 | David Frederick Martinez | Computerized requirement management system |
JP4904223B2 (en) * | 2007-08-23 | 2012-03-28 | キヤノン株式会社 | Image processing apparatus and image processing method |
JP2009104268A (en) * | 2007-10-22 | 2009-05-14 | Hitachi Displays Ltd | Coordinate detection device and operation method using touch panel |
US8224020B2 (en) * | 2007-11-29 | 2012-07-17 | Kabushiki Kaisha Toshiba | Appearance inspection apparatus, appearance inspection system, and appearance inspection appearance |
US8571747B2 (en) * | 2007-12-06 | 2013-10-29 | The Boeing Company | System and method for managing aircraft maintenance |
TW200935278A (en) * | 2008-02-04 | 2009-08-16 | E Lead Electronic Co Ltd | A cursor control system and method thereof |
WO2009117419A2 (en) * | 2008-03-17 | 2009-09-24 | Worcester Polytechnic Institute | Virtual interactive system for ultrasound training |
US20090271243A1 (en) * | 2008-04-25 | 2009-10-29 | Btsafety Llc. | System and method of providing product quality and safety |
US8526767B2 (en) * | 2008-05-01 | 2013-09-03 | Atmel Corporation | Gesture recognition |
US8866698B2 (en) * | 2008-10-01 | 2014-10-21 | Pleiades Publishing Ltd. | Multi-display handheld device and supporting system |
2009
- 2009-06-22 US US12/489,313 patent/US20100153168A1/en not_active Abandoned
- 2009-07-21 US US12/507,039 patent/US20100149211A1/en not_active Abandoned
- 2009-07-22 US US12/507,071 patent/US8032830B2/en not_active Expired - Fee Related
2010
- 2010-04-12 US US12/758,134 patent/US20100185549A1/en not_active Abandoned
- 2010-04-26 US US12/767,077 patent/US20100194781A1/en not_active Abandoned
- 2010-07-08 US US12/832,903 patent/US7971140B2/en not_active Expired - Fee Related
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11601584B2 (en) | 2006-09-06 | 2023-03-07 | Apple Inc. | Portable electronic device for photo management |
US10904426B2 (en) | 2006-09-06 | 2021-01-26 | Apple Inc. | Portable electronic device for photo management |
US8957865B2 (en) * | 2009-01-05 | 2015-02-17 | Apple Inc. | Device, method, and graphical user interface for manipulating a user interface object |
US20100171712A1 (en) * | 2009-01-05 | 2010-07-08 | Cieplinski Avi E | Device, Method, and Graphical User Interface for Manipulating a User Interface Object |
US11334229B2 (en) | 2009-09-22 | 2022-05-17 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10788965B2 (en) | 2009-09-22 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US10564826B2 (en) | 2009-09-22 | 2020-02-18 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20130083176A1 (en) * | 2010-05-31 | 2013-04-04 | Pfu Limited | Overhead scanner device, image processing method, and computer-readable recording medium |
US9064290B2 (en) * | 2010-07-23 | 2015-06-23 | Jkads Llc | Method for inspecting a physical asset |
US20120023435A1 (en) * | 2010-07-23 | 2012-01-26 | Adolph Johannes Kneppers | Method for Inspecting a Physical Asset |
US20120162228A1 (en) * | 2010-12-24 | 2012-06-28 | Sony Corporation | Information processor, image data optimization method and program |
US9710598B2 (en) * | 2010-12-24 | 2017-07-18 | Sony Corporation | Information processor, image data optimization method and program |
US20120210200A1 (en) * | 2011-02-10 | 2012-08-16 | Kelly Berger | System, method, and touch screen graphical user interface for managing photos and creating photo books |
US8860726B2 (en) | 2011-04-12 | 2014-10-14 | Autodesk, Inc. | Transform manipulator control |
US8947429B2 (en) | 2011-04-12 | 2015-02-03 | Autodesk, Inc. | Gestures and tools for creating and editing solid models |
US9182882B2 (en) | 2011-04-12 | 2015-11-10 | Autodesk, Inc. | Dynamic creation and modeling of solid models |
US8860675B2 (en) | 2011-07-12 | 2014-10-14 | Autodesk, Inc. | Drawing aid system for multi-touch devices |
KR101859100B1 (en) * | 2011-07-19 | 2018-05-17 | 엘지전자 주식회사 | Mobile device and control method for the same |
US9240218B2 (en) * | 2011-07-19 | 2016-01-19 | Lg Electronics Inc. | Mobile terminal and control method of mobile terminal |
US20130024805A1 (en) * | 2011-07-19 | 2013-01-24 | Seunghee In | Mobile terminal and control method of mobile terminal |
US20130063369A1 (en) * | 2011-09-14 | 2013-03-14 | Verizon Patent And Licensing Inc. | Method and apparatus for media rendering services using gesture and/or voice control |
WO2013109246A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Gestures and tools for creating and editing solid models |
WO2013109245A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Dynamic creation and modeling of solid models |
WO2013109244A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Three dimensional contriver tool for modeling with multi-touch devices |
US8902222B2 (en) | 2012-01-16 | 2014-12-02 | Autodesk, Inc. | Three dimensional contriver tool for modeling with multi-touch devices |
US8671361B2 (en) * | 2012-05-24 | 2014-03-11 | Blackberry Limited | Presentation of image on display screen with combination crop and rotation and with auto-resizing of crop field |
US9201566B2 (en) * | 2012-05-24 | 2015-12-01 | Blackberry Limited | Presentation of image on display screen with combination crop and rotation and with auto-resizing of crop field |
GB2554121A (en) * | 2016-06-27 | 2018-03-28 | Moletest Ltd | Image processing |
US11307737B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | Media browsing user interface with intelligently selected representative media items |
US11625153B2 (en) | 2019-05-06 | 2023-04-11 | Apple Inc. | Media browsing user interface with intelligently selected representative media items |
US11947778B2 (en) | 2019-05-06 | 2024-04-02 | Apple Inc. | Media browsing user interface with intelligently selected representative media items |
US11446548B2 (en) | 2020-02-14 | 2022-09-20 | Apple Inc. | User interfaces for workout content |
US11452915B2 (en) | 2020-02-14 | 2022-09-27 | Apple Inc. | User interfaces for workout content |
US11564103B2 (en) | 2020-02-14 | 2023-01-24 | Apple Inc. | User interfaces for workout content |
US11611883B2 (en) | 2020-02-14 | 2023-03-21 | Apple Inc. | User interfaces for workout content |
US11638158B2 (en) | 2020-02-14 | 2023-04-25 | Apple Inc. | User interfaces for workout content |
US11716629B2 (en) | 2020-02-14 | 2023-08-01 | Apple Inc. | User interfaces for workout content |
Also Published As
Publication number | Publication date |
---|---|
US20100194781A1 (en) | 2010-08-05 |
US20100269029A1 (en) | 2010-10-21 |
US8032830B2 (en) | 2011-10-04 |
US20100153833A1 (en) | 2010-06-17 |
US7971140B2 (en) | 2011-06-28 |
US20100153168A1 (en) | 2010-06-17 |
US20100185549A1 (en) | 2010-07-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20100149211A1 (en) | System and method for cropping and annotating images on a touch sensitive display device | |
EP3058512B1 (en) | Organizing digital notes on a user interface | |
KR102013331B1 (en) | Terminal device and method for synthesizing a dual image in device having a dual camera | |
US8583637B2 (en) | Coarse-to-fine navigation through paginated documents retrieved by a text search engine | |
US20120046071A1 (en) | Smartphone-based user interfaces, such as for browsing print media | |
US20160259766A1 (en) | Ink experience for images | |
EP3100208B1 (en) | Note capture and recognition with manual assist | |
US20070130529A1 (en) | Automatic generation of user interface descriptions through sketching | |
JP2007322847A (en) | Image display method and device, and program | |
JP2014092902A (en) | Electronic apparatus and handwritten document processing method | |
KR20110074166A (en) | Method for generating digital contents | |
JP4021249B2 (en) | Information processing apparatus and information processing method | |
JP6100013B2 (en) | Electronic device and handwritten document processing method | |
US20150098653A1 (en) | Method, electronic device and storage medium | |
JP5807433B2 (en) | Computer apparatus, electronic pen system, and program | |
JP2016085512A (en) | Electronic equipment, method, and program | |
CN105027053A (en) | Electronic device, display method, and program | |
JP5705060B2 (en) | Display device for input support device, input support device, information display method for input support device, and information display program for input support device | |
US20170310920A1 (en) | Method and system for rendering documents with depth camera for telepresence | |
US20150149894A1 (en) | Electronic device, method and storage medium | |
CN113407144B (en) | Display control method and device | |
CN113835598A (en) | Information acquisition method and device and electronic equipment | |
CN113515221A (en) | Picture and character comparison display method and device based on electronic document | |
JP6000069B2 (en) | Inspection system | |
CN112702524A (en) | Image generation method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: KD SECURE LLC, FLORIDA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HO, ALBERT, MR;SIEGEL, MARC, MR;TOSSING, CHRISTOPHER, MR;SIGNING DATES FROM 20110111 TO 20110119;REEL/FRAME:025875/0568 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: SECURENET SOLUTIONS GROUP, LLC, FLORIDA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KD SECURE, LLC;REEL/FRAME:043815/0074 Effective date: 20171009 |