|Publication number||US6269457 B1|
|Application number||US 09/585,527|
|Publication date||Jul 31, 2001|
|Filing date||Jun 1, 2000|
|Priority date||Jun 1, 2000|
|Original Assignee||Testing Technologies, Inc.|
1. Field of Invention
This invention relates to Hardware, Network, and/or Software technology and the regression and verification of the accuracy of that technology; more particularly, to a method for coordinating and managing regression and verification procedures, including full regression against established generational baselines, in order to provide error, deviation, recommendation, and acceptance status information.
2. Description of Prior Art
Over many years of technology development a general methodology for testing and implementing technology has evolved:
Unit Test—Each individual technology, whether it be hardware, network, and/or software, is tested by its creator to demonstrate that it accurately and completely includes all the capability that was requested in the system requirements specification document, and that each technology item can be executed accordingly.
System Test—A group of hardware, network, and/or software technologies is collectively exercised by technical staff to demonstrate that the group accurately and completely includes all the capability which was requested in the system requirement specification documents, and that the group can demonstrate accurate and complete processing as was requested. An Information Technologist testing organization typically conducts these tests.
Inter-System Test—An entire platform of hardware, network, and/or software technologies is collectively exercised to demonstrate that the platform accurately and completely includes all the capability which was requested in the system requirement specification documents. An Information Technologist testing organization typically conducts the inter-system tests.
System requirement specifications are prepared by members of technical organizations after they interpret the original business user customer requirement request and determine the technology and computer languages, machines, networks, routers, switches, and other hardware and network components that will be used to support the business user request.
User Acceptance Test—A support platform [or part of it] is collectively exercised to demonstrate that it accurately and completely includes all the capabilities that the business user customer requested in their original business user requirements request initially presented to the Information Technology organization. These tests are typically conducted by, or for, business user customers, and they are not technology focused; rather, they are exercised to verify that the business needs, transaction processing, and reporting exactly meet the business user's needs as requested.
Usually the Unit, System, and Inter-System tests are expected to be completed by the Information Technology organization before it delivers the technology to the business user customer for the User Acceptance Test. All these tests should be correctly completed prior to implementing the new or enhanced technology into the operating production environment. On many occasions, however, the business user customer is unable to verify that the technology fulfills their original user request, generally due to an inability to conduct the acceptance verification process, timing constraints, late delivery, or a lack of available and/or capable staff. Because the business user customer has in many instances been unable to test the technology, it may be implemented into their production environment with errors that are detected only after the production implementation. In a large number of, if not all, situations, these errors would have been identified during a full regression and user acceptance test.
Testing and the verification of accuracy and completeness of technology development are estimated to cost about 60 percent of the technology development budget. Inter-System and especially User Acceptance Testing, the more expensive and time-consuming areas, are often shortened or eliminated in order to meet budget and delivery timeframes. Significant costs for later corrective actions are borne by the same business user customer, and these costs and the delays associated with corrective actions are incurred only after customer dissatisfaction, transaction errors, and business delays have already occurred.
Inventors have created various testing techniques to address the Unit, System, and Inter-System testing processes. These techniques appear to indicate that their use could improve testing; however, these techniques principally support technologists who are familiar with the complicated details required to build the hardware, network, and software technologies themselves. Importantly, these improvements test and verify the system specifications created by technologists after interpreting the original business user customer request, and therefore the technology developed, even if fully tested, may still not fulfill the original business user customer requirement. In many cases these differences in interpretation are responsible for errors in the technology developed.
Business user acceptance verification processes in the User Acceptance Test are expected to affirm that the technology to be implemented delivers the business user customer's requested technology. In order to meet the intent of user acceptance testing noted above, however, methods require improvements to verify technology accuracy, meet business user customer needs, improve business processes, and reduce dissatisfaction, business expense, and business delays.
U.S. Pat. No. 5,233,611 to Trienthafyllos, Shield, and IBM (1993) discloses automated function testing of application programs utilizing a test case program which collects and executes commands and transactions. This method, however, does not include regression testing, and it does not test hardware or network technologies. Nor does the method specifically support user acceptance by the non-technical business user customer community.
U.S. Pat. No. 5,513,315 to Tierney, Cross, and Microsoft (1996) shows a system and method for automatically testing software using a deterministic acceptance test and random command sequence selections with a results analyzer. This method uses specifically predetermined commands with a log file and tracker file to execute technologist oriented code level testing. Again, there is no regression component; nor are hardware and network elements tested. Evaluative intelligence is not included, and the method does not support user acceptance by the non-technical business user customer community.
U.S. Pat. No. 5,892,947 to DeLong, Carl, and Sun Microsystems (1999) is a test support tool which produces software test programs from logical descriptions and cause-effect graphs. This method generates code level test cases, as well. This method does not support regression testing, does not test hardware and network elements, does not include evaluative intelligence, and does not support user acceptance by the non-technical business user customer community.
U.S. Pat. No. 5,805,795 to Whitten and Sun Microsystems (1998) shows a method and computer program product for generating tests with optimized test cases and a selection method. Again, this method does not support regression testing and does not test hardware and network elements. The invention does not include evaluative intelligence and does not support user acceptance by the non-technical business user customer community.
U.S. Pat. No. 5,913,023 to Szermer and Siemens Corporate Research (1999) is a method for automated generation of tests for software. Again, this method does not support regression testing and does not test hardware and network elements. The invention does not include evaluative intelligence and does not support user acceptance by the non-technical business user customer community.
U.S. Pat. No. 6,002,869 to Hinckley and Novell, Inc. (1999) shows a system and method for automatically testing software programs. This method enables the functional testing of various code of a software program. Even though claim 4 in this invention states the method enables a test history of a test procedure, this is at a very detailed program level and is not regression testing. Also, hardware and network components are not tested. Evaluative intelligence is not included, and the method does not support the non-technical business user customer community.
U.S. Pat. No. 6,014,760 to Silva et al. and Hewlett-Packard Company discloses a scheduling method and apparatus for a distributed automated testing system. This invention schedules and executes software tests. It does not test hardware and network components. The invention does not add evaluative intelligence, does not control regression, nor does it support users or manage test conditions.
U.S. Pat. No. 5,500,941 to Gil, and Ericsson, S. A. shows an optimum functional test method to determine the quality of a software system embedded in a large electronic system using usage concepts modeled as Markov chains. Code level test cases are generated internally, and are not those that would be selected by the business user customer. Regression testing is not covered here, nor are hardware and network components tested. The invention does not add evaluative intelligence and does not support the non-technical business user customer community.
U.S. Pat. No. 5,870,539 to Schaffer and Sun Microsystems (1999) discloses a method for generalized windows application install testing for use with an automated test tool. This invention tests windows software applications only, and not hardware or network components. Regression testing is not included, evaluative intelligence is not included, and the process does not support user acceptance for the non-technical business user customer community.
None of the above methods indicates the capability of performing a regression process to verify continuing process accuracy and integrity over time.
None of the above methods indicates the capability of clearly identifying errors, deviations, and specifically related recommendations for corrective action over time, the most important components of a verification process.
None of the above methods indicates they reduce time and costs when using their method.
None of the above methods indicates the ability to easily support the non-technical business user customer.
None of the above methods indicates the ability to customize [their methods] to support various different technology, hardware, network, and/or software components.
A method is needed to support non-technical business users and help them ensure their technology is correct and supports their business plans.
This invention is a holistic method that manages hardware, network, and/or software regression and verification tasks and provides extensive evaluative acceptance results to business user customer communities. This method utilizes various techniques to build customizable components that support various specific business customer environments. Full regression and evaluative intelligence to established generational baselines and timeslices are major innovative capabilities of this method, greatly enhancing the ability of business user customers to manage their business, make informed management decisions, and reduce time and costs associated with implementing new technology.
The present invention offers distinct advantages to business user customers. Use of the customized components of this method enables the business user customer to timely and cost-effectively review their technology platforms utilizing the important regression capability to verify whether technology platforms continue to meet critical business needs and retain previously implemented capabilities over time. The components produce easy to read and use documentation that indicates generational platform information, particularly errors, deviations, recommendations, and acceptance management reports and metrics.
Accordingly, the main object of this technology regression and verification acceptance invention is to enable a [non-technical] business user customer to ensure that the technology they requested from their Information Technology (IT) organization was accurately and completely developed. This method provides the ability to assess currently requested technology, and iterations that were previously requested over time. Several objects and advantages of this invention are:
(a) to provide a method for maintaining and managing a generational set of baselines and timeslices 11, enabling verification of continuing processing integrity over time. The significant advantage of these generational baselines and timeslices 11 of this invention is that they form the basis and support mechanism for full regression review and evaluation.
(b) to provide evaluative and comparative information to a business user customer documenting the level of accuracy and completeness of their technology support platform. An important advantage of this component of the invention is that key constructs 12 and platform tasks 18 are related to the generational regression baselines and timeslices 11 and theme 17, which relate error and deviation information 14, recommendations 15, metrics 16, and results reports 7 in a way that is meaningful to the business user customer community.
(c) to disclose errors or deviations 14 in a non-technical and customized way to make it easy for the business user customer to understand what has happened. The advantage of this component of the invention is that it enables the identification of unanticipated differences which can then be classified and presented to the business user customer in customized selected ways.
(d) to provide recommendations 15 for corrective action related to errors and deviations 14 that recommend how business user customers might proceed and assist them and their Information Technologist partners in addressing and correcting inconsistencies. The great advantage of the recommendation process is that it develops recommended actions and improvements based upon preparatory work contained in defined criteria 30, thus including the business user customer fully in the process.
(e) to serve the business user customer in a timely, cost effective, and non-technical way. Accordingly, information is presented to business user customers in association with processing information they are familiar with: their existing production operation information contained in reports, files, on screens, and in documents. Depending on defined criteria, this information can be prepared in timeslices, by process, and/or by error and deviation. Metrics 16 can also be presented in these selected groupings. The methods of this invention can demonstrate time and cost savings and track them with customizable selected metrics.
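The generational baselines and timeslices 11 described in object (a) above can be illustrated with a minimal sketch. The `Timeslice` and `BaselineStore` names, and the dictionary-of-values layout, are assumptions for illustration only; the patent does not prescribe an implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Timeslice:
    label: str    # e.g. "2000-Q4 month-end" (hypothetical label)
    values: dict  # key construct name -> observed value

@dataclass
class BaselineStore:
    # Ordered oldest -> newest, forming the generational set of baselines.
    generations: list = field(default_factory=list)

    def add_generation(self, ts: Timeslice) -> None:
        self.generations.append(ts)

    def regress(self, current: Timeslice) -> dict:
        """Compare the current timeslice against every stored generation,
        returning (expected, actual) deviations keyed by generation label."""
        report = {}
        for ts in self.generations:
            diffs = {k: (v, current.values.get(k))
                     for k, v in ts.values.items()
                     if current.values.get(k) != v}
            if diffs:
                report[ts.label] = diffs
        return report

store = BaselineStore()
store.add_generation(Timeslice("gen-1", {"month_end_total": 100}))
store.add_generation(Timeslice("gen-2", {"month_end_total": 100, "accounts": 5}))
deviations = store.regress(Timeslice("current", {"month_end_total": 90, "accounts": 5}))
```

Because every generation is retained, a single regression pass can show whether a previously implemented capability was lost at any point in time, which is the full-regression property the object describes.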
FIG. 1 is a high level view of the component activities of this invention's regression and verification processes including control, performance, regression, evaluation, classification, and tabulation.
FIG. 2 is a high level view of the Environmental Information of this invention. These are the sub-components of FIG. 1, Step 2B.
FIG. 3 is a high level view of the Verification Information of this invention. These are sublevel components of FIG. 1, Step 2A.
FIG. 4 is a high level view of the Recommendations and Reporting Functions of this invention. These items contain the acceptance information used to make key business decisions.
Drawing Cross Reference
FIG. 1 Step 1 Control Mechanism
FIG. 1 Step 2 Performance Mechanism
FIG. 1 Step 3 Regression Evaluator
FIG. 1 Step 4 Results and Regression Classification
FIG. 1 Step 5 Results and Regression Tabulation
FIG. 4 Step 6 Results and Recommendation
FIG. 4 Step 7 Results Reports
FIG. 1 Step 10 Storage Area, Refer to FIG. 3
FIG. 3 Step 11 Regression Baselines and Timeslices
FIG. 3 Step 12 Key Constructs
FIG. 3 Step 13 Platforms Exercised
FIG. 3 Step 14 Errors and Deviations
FIG. 3 Step 15 Recommendations
FIG. 3 Step 16 Metrics
FIG. 3 Step 17 Theme
FIG. 3 Step 18 Platform Tasks
FIG. 1 Step 20 Environmental Information, Refer to FIG. 2
FIG. 2 Step 21 Hardware
FIG. 2 Step 22 Network
FIG. 2 Step 23 Software
FIG. 3 Step 30 Defined Criteria
The figures described below constitute one way the invention's components might be constructed. The customizable nature of this method and its components enables the various components to be assembled as required to support individual business user environments, and various customized versions could utilize the components in selected manners.
In FIG. 1, Step 1, Control Mechanism 1 is the coordinating and management component, and it provides interfaces with various other components as they are selected to be processed by FIG. 3, Step 30, Defined Criteria 30. Control Mechanism 1 also provides access to FIG. 3, Step 18, Platform Tasks 18, and to FIG. 3, Step 11, Regression Baselines and Timeslices 11 information. It also provides access to FIG. 2, Step 21, Hardware Information 21; FIG. 2, Step 22, Network Information 22; and FIG. 2, Step 23, Software Information 23.
Control Mechanism 1 and all other components may provide access to FIG. 1, Step 10, Storage Area 10, shown in further detail in FIG. 3, which stores information related to various processes. Control information, however, once prepared, is managed by Control Mechanism 1 and provided to each component as required.
Control mechanism 1 provides the methods to order and aggregate information from components in FIGS. 1, 2, 3, and 4 of the drawings, particularly from the entry of Defined Criteria 30. It provides the method to build a unique FIG. 3 Step 17 Regression and Verification Theme 17. Control Mechanism 1 provides the completed Theme 17 to FIG. 1, Step 2, Performance Mechanism 2 for processing.
Performance Mechanism 2 and Theme 17 [from Control Mechanism 1], including pointers to Task(s) 18, Defined Criteria 30, and Regression Baselines and Timeslices 11, provide FIG. 3, Step 19, Platform 19 to be exercised and FIG. 3, Step 13, Platforms Exercised 13. Platform 19 could include new or enhanced hardware 21, network 22, and/or software 23 components. Performance Mechanism 2 provides the method to execute the business user technology processes and then provides the method to return selected information to Storage Area 10, associated with various FIG. 3, Step 12, Key Constructs 12, as directed by Theme 17, for use by FIG. 1, Step 3, Regression Evaluator 3.
Regression Evaluator 3 provides the method to conduct a variety of operations using information from Performance Mechanism 2 and Key Constructs 12 as directed by Theme 17. Evaluator 3 provides mechanisms to review performance information and compare it to information in Regression Baselines and timeslices 11. Every timeslice comparison may generate errors and deviations 14 that will be classified, tabulated and reported by following components. This mechanism provides Evaluator 3, FIG. 3 Step 14 Errors and Deviations 14 to Storage Area 10, which provides the mechanism to associate the information with various Key Constructs 12 and Regression Baselines and timeslices 11. Regression Baselines and Timeslices 11 also provides mechanisms to support generational logs of Errors and Deviations 14.
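The comparison performed by Regression Evaluator 3 might be sketched as follows. The distinction drawn here — a finding is an "error" when it violates a per-construct predicate taken from defined criteria, and otherwise a "deviation" — is an interpretive assumption, as are all function and field names.

```python
def evaluate(baseline: dict, observed: dict, criteria: dict) -> list:
    """Compare observed performance values against one baseline timeslice.

    criteria maps a key construct name to an optional predicate; a mismatch
    whose observed value fails its predicate is logged as an error, any
    other mismatch as a deviation. (Hypothetical criterion format.)
    """
    findings = []
    for key, expected in baseline.items():
        actual = observed.get(key)
        if actual == expected:
            continue  # matches the baseline; nothing to record
        check = criteria.get(key)
        kind = "error" if check and not check(actual) else "deviation"
        findings.append({"construct": key, "kind": kind,
                         "expected": expected, "actual": actual})
    return findings

baseline = {"row_count": 1000, "balance": 250.0}
observed = {"row_count": 998, "balance": -5.0}
criteria = {"balance": lambda v: v is not None and v >= 0}
findings = evaluate(baseline, observed, criteria)
```

Each finding carries enough context (construct, expected, actual) for the classification and tabulation steps that follow to associate it with key constructs and timeslices.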
FIG. 1 Step 4 Results and Regression Classification 4 provides the method to access Regression Baselines and Timeslices 11, Errors and Deviations 14, and Key constructs 12. This process provides mechanisms to match, categorize, and classify information created by Performance Mechanism 2, Regression Evaluator 3, theme 17, and with Regression Baselines and Timeslices 11. This classification information is stored with Regression Baselines and Timeslices 11 for use by the FIG. 1 Step 5 Results and Regression Tabulation 5 process.
Tabulation 5 provides the mechanism to access Regression Baselines and Timeslices 11 that were updated by Evaluator 3 and Classification 4. Tabulation 5 provides mechanisms to identify and count classification 4 information using theme 17 and store the counts in FIG. 3 Step 16 Regression Metrics 16.
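The counting performed by Tabulation 5 can be sketched with a simple counter over classified findings; the record layout and category names below are invented for the example.

```python
from collections import Counter

def tabulate(classified: list) -> dict:
    """Count classified errors and deviations, grouped by (kind, category),
    yielding the totals that would be stored as metrics (16)."""
    return dict(Counter((rec["kind"], rec["category"]) for rec in classified))

classified = [
    {"kind": "error", "category": "month-end"},
    {"kind": "deviation", "category": "month-end"},
    {"kind": "error", "category": "month-end"},
    {"kind": "deviation", "category": "quarter-end"},
]
metrics = tabulate(classified)
```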
FIG. 4, Step 6, Results and Recommendations 6 provides the mechanism to prepare the actions and improvements of FIG. 3, Step 15, Recommendations 15, and then store them by timeslice and Key Constructs 12 in Regression Baselines and Timeslices 11. The wording of the recommendations, actions, and improvements was previously identified, entered into, and maintained as part of Defined Criteria 30. This step prepares pointers to selected wording based upon Theme 17.
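The pointer-to-predefined-wording scheme of Results and Recommendations 6 might look like the following sketch; the message table, rule keys, and function names are hypothetical.

```python
# Hypothetical pre-entered wording, standing in for Defined Criteria (30).
DEFINED_CRITERIA_MESSAGES = {
    "R-001": "Re-run the month-end batch and compare totals before acceptance.",
    "R-002": "Escalate negative balances to the IT partner for correction.",
}

def recommend(findings: list, pointer_rules: dict) -> list:
    """Attach to each finding a pointer into the predefined wording, plus
    the wording itself, rather than generating free text at run time."""
    out = []
    for f in findings:
        ptr = pointer_rules.get((f["kind"], f["construct"]))
        out.append({**f, "recommendation": ptr,
                    "wording": DEFINED_CRITERIA_MESSAGES.get(ptr)})
    return out

rules = {("error", "balance"): "R-002", ("deviation", "row_count"): "R-001"}
recs = recommend([{"kind": "error", "construct": "balance"}], rules)
```

Storing only pointers keeps the recommendation text under the business user customer's control in the defined criteria, which matches the step's description of previously identified wording.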
FIG. 4, Step 7 Results Reports 7 provides formatted reporting information. It provides mechanisms to store the reporting detail information itself in Regression Baselines and Timeslices 11 by timeslice and Key Constructs 12 according to Theme 17 and to maintain complete regression information on all activities. Results Reports 7 then provides mechanisms to print selected reports according to Theme 17.
Operation of the technology regression and verification acceptance method is further described with reference to FIGS. 1 through 4 and the reference numerals indicated above.
The present invention is a method of managing and building a collection of instructions to perform acceptance and regression verifications of various combinations of hardware, network and/or software elements, depending upon the particular environment in which it is used. Each of the acceptance and regression verification components and comparative algorithms may be developed with a variety of techniques in order to perform the selected methodology alone or be inter-connected in order to process the acceptance and regression verification based upon defined criteria 30 and theme 17.
Information in defined criteria 30 is incorporated in theme 17 and directs the unique processing which takes place in this method's components, defining the particular holistic universe that will be reviewed, including environmental hardware, network, and/or software. Defined criteria 30 and theme 17 include instructions for managing the handling of Regression Baselines and Timeslices 11.
The operation of the components of this invention is based upon comparative algorithms and pointer creation, which utilize information prepared by exercising the business user customer's technology platform. Subsequent information is maintained in the regression baselines and timeslices 11, Key constructs 12, and Theme 17.
When the components of this invention exercise the hardware, network, and software in the unique customer environment, selected activities are managed and guided by Theme 17.
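One way Defined Criteria 30 might be folded into a Theme 17 that carries per-component instructions, as the preceding paragraphs describe, is sketched below; the field names and structure are assumptions, not the invention's actual format.

```python
def build_theme(defined_criteria: dict) -> dict:
    """Control Mechanism (1) style assembly: pick the selected components
    and attach to each the slice of the defined criteria it needs.
    (Field names 'scope', 'timeslices', 'components' are illustrative.)"""
    components = defined_criteria.get("components", [])
    return {
        "scope": defined_criteria.get("scope", "full-platform"),
        "timeslices": defined_criteria.get("timeslices", []),
        # Each selected component receives its own instruction block,
        # defaulting to empty if the criteria did not customize it.
        "instructions": {c: defined_criteria.get(c, {}) for c in components},
    }

criteria = {
    "scope": "billing",
    "timeslices": ["month-end"],
    "components": ["evaluator", "tabulation"],
    "evaluator": {"compare": "all-generations"},
}
theme = build_theme(criteria)
```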
The method of this invention thus enables the business user customer to ensure that the results they see are exact reflections of the operations of their individual unique production technology platforms. Further, they are able to retain and review regression details in customizable reports as selected.
Presentation of Results Reports 7 management reports to the business user customer enables them to make important management decisions as to the acceptance of the technology enhancements incorporated in the Theme 17 execution. The business user customer may decide to implement recommendations 15 provided by this acceptance and regression method. The business user customer may make determinations based upon the results reports 7 and metrics 16 provided.
Internal information is used by the present invention to manage unique [business user] environments which will be evaluated during processing using the instructions from Defined Criteria 30 and subsequent Theme 17, to provide business user customer acceptance status. Information includes:
(a) Defined Criteria 30, including high level instructions identifying the business user customer's environments and scope of the selected area to be evaluated. This information is managed by Control Mechanism 1, maintained in Storage Area 10 and used to create Theme 17. Defined Criteria 30 also contains customer defined error messages and customized pointer criteria.
(b) Theme 17, including pointers to hardware 21, network 22, and software 23 environmental information, and regression baseline and timeslice 11 information. Theme 17 carries instructions to every selected component and Task 18, enabling Performance Mechanism 2 to process. The type of classification selected is indicated in Theme 17. It relates information such as errors and deviations to pointers and selected reporting categories. Theme 17 also contains tabulation and classification instructions based upon defined criteria 30 which cause information prepared in the classification 4 and tabulation 5 components to be associated with key constructs 12.
(c) Key constructs 12 contain customized business user customer selected structure and instructions for their management.
(d) Regression Baselines and Timeslices 11 contain business information associated with a specific business user customer's operating environment at various points in time. Information includes details which have been selected to represent normal business processing, with special attention to month-end, quarter-end, and year-end time periods. This business information reflects the condition of information supporting every timeslice selected to be managed. Other information prepared during the performance of this invention is uniquely related to its selected timeslice, its Key Constructs 12, Errors and Deviations 14, Recommendations 15, Metrics 16, Hardware 21, Network 22, Software 23, and Platforms Exercised 13.
(e) Errors and Deviations 14 information consists of pointers to both the causative business information and timeslice, and the particular Defined Criteria 30 message chosen by the business user customer according to Theme 17.
(f) Reporting 7 information consists of pointers to areas to be reported based upon Defined Criteria 30 and Theme 17.
During the operation of the components of this invention, Control Mechanism 1 establishes working parameters for the specific component execution by establishing Theme 17.
Once Control Mechanism 1 prepares Theme 17, each specified component obtains and uses its parts of Defined Criteria 30 [now] from theme 17 to limit or expand the scope of the operation being conducted and the type of results that are expected.
Performance Mechanism 2 initiates the business user customer's technology platform 19. When the business user customer's processes are completed, it returns process execution information to storage area 10 as directed by theme 17, pointers, and Platforms Exercised 13.
Regression Evaluator 3, using instructions from Theme 17, compares information between selected baselines and timeslices 11 and prepares pointer information that identify errors and deviations 14 and selected information in Key Constructs 12.
Results and Regression Classification 4, using instructions from Theme 17, associates the errors and deviations 14 with selected baselines and timeslices 11 using pointers.
Results and Regression Tabulation 5, using instructions from Theme 17, counts the occurrences of the errors and deviations 14 and associates the counts with selected categories using selected pointers.
Results and Recommendations 6 creates pointers from each error or deviation to the appropriate selected messages maintained in defined criteria 30.
Results Reports 7, using Theme 17, selected instructions, and pointers, formats selected information and prints reports. If selected, this output could be on paper, to disk, or to other selected media.
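The sequence of steps just described can be condensed into one minimal end-to-end sketch; every function here is a simplified stand-in for the corresponding mechanism, not the patented implementation, and all names and sample data are invented.

```python
from collections import Counter

def run_acceptance_cycle(baseline, exercise_platform, categorize):
    """Chain the steps: performance (2), evaluation (3), classification (4),
    tabulation (5), and a combined recommendation/report stage (6-7)."""
    observed = exercise_platform()                          # step 2
    deviations = [(k, v, observed.get(k))                   # step 3
                  for k, v in baseline.items() if observed.get(k) != v]
    classified = [(categorize(k), k, exp, act)              # step 4
                  for k, exp, act in deviations]
    metrics = dict(Counter(cat for cat, *_ in classified))  # step 5
    report_lines = [f"[{cat}] {k}: expected {exp}, got {act}"  # steps 6-7
                    for cat, k, exp, act in classified]
    return metrics, report_lines

metrics, report = run_acceptance_cycle(
    baseline={"invoices": 10, "total": 500},
    exercise_platform=lambda: {"invoices": 10, "total": 480},
    categorize=lambda key: "financial" if key == "total" else "volume",
)
```

A control mechanism building a theme from defined criteria would, in this sketch, simply choose which baseline, platform callable, and categorization rule to pass in.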
Accordingly, it can be seen that this method and its components are a unique and innovative way to verify information technology enhancements in support of business user customers. This includes providing them timely, cost-effective, complete regression and verification review with acceptance status reporting information they can use to make informed acceptance decisions.
Although the description above contains many specifics, these should not be construed as limiting the scope of the method but as merely providing illustrations of some of the presently preferred embodiments of this method.
Thus the scope of the invention should be determined by the claims and their legal equivalents, rather than by the examples given.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5233611 *||Aug 20, 1990||Aug 3, 1993||International Business Machines Corporation||Automated function testing of application programs|
|US5500941 *||Jul 6, 1994||Mar 19, 1996||Ericsson, S.A.||Optimum functional test method to determine the quality of a software system embedded in a large electronic system|
|US5513315 *||Dec 22, 1992||Apr 30, 1996||Microsoft Corporation||System and method for automatic testing of computer software|
|US5539877 *||Jun 27, 1994||Jul 23, 1996||International Business Machines Corporation||Problem determination method for local area network systems|
|US5553235 *||May 1, 1995||Sep 3, 1996||International Business Machines Corporation||System and method for maintaining performance data in a data processing system|
|US5673387 *||Aug 12, 1996||Sep 30, 1997||Lucent Technologies Inc.||System and method for selecting test units to be re-run in software regression testing|
|US5805795 *||Jan 5, 1996||Sep 8, 1998||Sun Microsystems, Inc.||Method and computer program product for generating a computer program product test that includes an optimized set of computer program product test cases, and method for selecting same|
|US5871539 *||Feb 22, 1996||Feb 16, 1999||Biomedical Engineering Trust I||Fixed bearing joint endoprosthesis|
|US5892947 *||Jul 1, 1996||Apr 6, 1999||Sun Microsystems, Inc.||Test support tool system and method|
|US5913023 *||Jun 30, 1997||Jun 15, 1999||Siemens Corporate Research, Inc.||Method for automated generation of tests for software|
|US6002869 *||Feb 26, 1997||Dec 14, 1999||Novell, Inc.||System and method for automatically testing software programs|
|US6014760 *||Sep 22, 1997||Jan 11, 2000||Hewlett-Packard Company||Scheduling method and apparatus for a distributed automated testing system|
|US6061643 *||Jul 7, 1998||May 9, 2000||Tenfold Corporation||Method for defining durable data for regression testing|
|1||*||Lee, Michelle, "Algorithmic Analysis of the Impacts of Changes to Object-Oriented Software," IEEE, 0-7695-0774-3/00, copyright 2000.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US6434714 *||Feb 4, 1999||Aug 13, 2002||Sun Microsystems, Inc.||Methods, systems, and articles of manufacture for analyzing performance of application programs|
|US6546359||Apr 24, 2000||Apr 8, 2003||Sun Microsystems, Inc.||Method and apparatus for multiplexing hardware performance indicators|
|US6647546||May 3, 2000||Nov 11, 2003||Sun Microsystems, Inc.||Avoiding gather and scatter when calling Fortran 77 code from Fortran 90 code|
|US6802057||May 3, 2000||Oct 5, 2004||Sun Microsystems, Inc.||Automatic generation of fortran 90 interfaces to fortran 77 code|
|US6823478 *||Sep 12, 2000||Nov 23, 2004||Microsoft Corporation||System and method for automating the testing of software processing environment changes|
|US6895533 *||Mar 21, 2002||May 17, 2005||Hewlett-Packard Development Company, L.P.||Method and system for assessing availability of complex electronic systems, including computer systems|
|US6907547 *||Feb 15, 2002||Jun 14, 2005||International Business Machines Corporation||Test tool and methods for testing a computer function employing a multi-system testcase|
|US6910107||Aug 23, 2000||Jun 21, 2005||Sun Microsystems, Inc.||Method and apparatus for invalidation of data in computer systems|
|US6986130||Jul 28, 2000||Jan 10, 2006||Sun Microsystems, Inc.||Methods and apparatus for compiling computer programs using partial function inlining|
|US7143073 *||Apr 4, 2002||Nov 28, 2006||Broadcom Corporation||Method of generating a test suite|
|US7320090 *||Jun 9, 2004||Jan 15, 2008||International Business Machines Corporation||Methods, systems, and media for generating a regression suite database|
|US7406681||Oct 12, 2000||Jul 29, 2008||Sun Microsystems, Inc.||Automatic conversion of source code from 32-bit to 64-bit|
|US7613953 *||May 27, 2003||Nov 3, 2009||Oracle International Corporation||Method of converting a regression test script of an automated testing tool into a function|
|US7694181 *||Dec 12, 2005||Apr 6, 2010||Archivas, Inc.||Automated software testing framework|
|US7711992||Jul 23, 2008||May 4, 2010||International Business Machines Corporation||Generating a regression suite database|
|US7729891 *||Jun 6, 2005||Jun 1, 2010||International Business Machines Corporation||Probabilistic regression suites for functional verification|
|US9619600||Sep 19, 2014||Apr 11, 2017||Mentor Graphics Corporation||Third party component debugging for integrated circuit design|
|US9703579 *||May 1, 2013||Jul 11, 2017||Mentor Graphics Corporation||Debug environment for a multi user hardware assisted verification system|
|US20030065980 *||Feb 15, 2002||Apr 3, 2003||International Business Machines Corporation||Test tool and methods for testing a computer function employing a multi-system testcase|
|US20030182599 *||Mar 21, 2002||Sep 25, 2003||Gray William M.||Method and system for assessing availability of complex electronic systems, including computer systems|
|US20030191985 *||Apr 4, 2002||Oct 9, 2003||Broadcom Corporation||Method of generating a test suite|
|US20050283664 *||Jun 9, 2004||Dec 22, 2005||International Business Machines Corporation||Methods, systems, and media for generating a regression suite database|
|US20060168565 *||Jan 24, 2005||Jul 27, 2006||International Business Machines Corporation||Method and system for change classification|
|US20070010975 *||Jun 6, 2005||Jan 11, 2007||International Business Machines Corporation||Probabilistic regression suites for functional verification|
|US20070234293 *||Dec 12, 2005||Oct 4, 2007||Archivas, Inc.||Automated software testing framework|
|US20080065931 *||Nov 7, 2007||Mar 13, 2008||International Business Machines Corporation||Methods, Systems, and Media for Generating a Regression Suite Database|
|US20080307263 *||Jul 23, 2008||Dec 11, 2008||International Business Machines Corporation||Systems and media for generating a regression suite database|
|US20090070742 *||May 27, 2003||Mar 12, 2009||Venkata Subbarao Voruganti||Method of converting a regression test script of an automated testing tool into a function|
|US20140032204 *||May 1, 2013||Jan 30, 2014||Mentor Graphics Corporation||Partitionless Multi User Support For Hardware Assisted Verification|
|U.S. Classification||714/38.1, 714/E11.208, 714/48, 714/E11.148|
|International Classification||G06F11/22, G06F11/36|
|Cooperative Classification||G06F11/2273, G06F11/3688, G06F11/3672|
|European Classification||G06F11/36T2, G06F11/36T2E, G06F11/22M|
|Apr 23, 2001||AS||Assignment|
Owner name: TESTING TECHNOLOGIES, INC., NEW JERSEY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LANE, HARRIET;REEL/FRAME:011790/0450
Effective date: 20010420
|Feb 16, 2005||REMI||Maintenance fee reminder mailed|
|Aug 1, 2005||LAPS||Lapse for failure to pay maintenance fees|
|Sep 27, 2005||FP||Expired due to failure to pay maintenance fee|
Effective date: 20050731