US20060026054A1 - Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations

Info

Publication number
US20060026054A1
Authority
US
United States
Prior art keywords
automated computing
customer
information
instructions
target level
Prior art date
Legal status
Abandoned
Application number
US10/900,959
Inventor
Miles Barel
Sandra Carter
James Crosskey
Leslie Ernest
David Evans
Lori Ford
Ronald Liles
Dwight Spence
Albert Swett
Current Assignee
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date
Filing date
Publication date
Application filed by International Business Machines Corp
Priority to US10/900,959
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. Assignment of assignors interest (see document for details). Assignors: EVANS, DAVID HOWARD; CARTER, SANDRA; BAREL, MILES A.; FORD, LORI LYNN; ERNEST, LESLIE MARK; CROSSKEY, JAMES P.; SWETT, ALBERT L.; LILES, RONALD C.; SPENCE, DWIGHT
Publication of US20060026054A1
Priority to US12/131,611 (US8019640B2)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063 Operations research, analysis or management
    • G06Q 10/0637 Strategic management or analysis, e.g. setting a goal or target of an organisation; Planning actions based on goals; Analysis or evaluation of effectiveness of goals
    • G06Q 10/06375 Prediction of business process outcome or impact based on a proposed change
    • G06Q 10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0201 Market modelling; Market analysis; Collecting market data
    • G06Q 30/0203 Market surveys; Market polls

Definitions

  • A self-configuring environment can dynamically configure itself on the fly and can adapt itself to the deployment of new components or changes with minimal human intervention.
  • A self-healing IT environment can detect improper operation of systems, transactions, and business processes, and then initiate corrective action without disrupting users or services.
  • A self-optimizing IT environment addresses the complexity of managing system performance; it can learn from experience and proactively tune itself in the context of an overall business objective.
  • A self-protecting IT environment can allow the right people to access the right data at the right time, and can automatically take the appropriate actions to make itself less vulnerable to attacks on its runtime infrastructure and on its business data.
  • Display 420 presents a review of the automation maturity levels in accordance with an exemplary embodiment of the present invention: basic, managed, predictive, adaptive, and autonomic. A user may navigate display 420 to view a description of each maturity level to prepare for the assessment survey and the subsequent results.
  • Display 430 presents information concerning the automation assessment categories: problem management, availability management, security management, solution deployment, user administration, and performance and capacity management. The automation assessment tool of the present invention uses the scale of automation maturity levels to assess the client's on-demand preparedness in each of these assessment categories. A user may navigate display 430 to view a description of each assessment category to prepare for the assessment survey and the subsequent results.
  • FIG. 5 illustrates an example display presenting an automation assessment survey in accordance with an exemplary embodiment of the present invention.
  • Display 500 presents survey questions for automation assessment.
  • Sets of questions may be presented for availability management, performance and capacity management, security management, user administration, solution deployment, and problem management, for example.
  • Each question may include a set of multiple-choice answers that are selectable using a set of radio buttons, as depicted in the illustrated example. The user may navigate the assessment categories using tabs or the like.
  • The automation assessment tool of the present invention may present sets of survey questions for other aspects of automation. For example, a similar display may be used to present survey questions for server and operating system provisioning, and separate displays may be used to present survey questions for skill sets, automation technology, or security.
  • FIGS. 6A-6C are example displays illustrating results of automation assessment in accordance with an exemplary embodiment of the present invention. More particularly, with reference to FIG. 6A , display 600 includes a “spider web” graphical representation of automation assessment. Maturity levels are represented radially and assessment categories are represented as spokes. The maturity level for each assessment category is indicated as a point at the intersection of the radial maturity level and the spoke of the assessment category. These points are connected to form a polygon. Ideally, the polygon should fill as much of the graph as possible.
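  • The patent does not spell out how the chart is computed, but the vertex geometry is simple: category k of n sits on a spoke at angle 2πk/n, and the assessed maturity level sets the radius along that spoke. The sketch below is hypothetical, assuming a 0-4 encoding of the maturity levels and illustrative names; it computes only the polygon vertices, and rendering is omitted.

```java
// Geometry for the "spider web" plot of FIG. 6A: each assessment category is a
// spoke, and the maturity level (encoded 0-4) fixes the vertex radius.
class SpiderWeb {
    static double[][] vertices(int[] levels) { // levels[k] in 0..4, one per category
        int n = levels.length;
        double[][] points = new double[n][2];
        for (int k = 0; k < n; k++) {
            double angle = 2 * Math.PI * k / n;
            double radius = levels[k] + 1;     // +1 keeps the lowest level off the origin
            points[k][0] = radius * Math.cos(angle); // x coordinate of vertex k
            points[k][1] = radius * Math.sin(angle); // y coordinate of vertex k
        }
        return points;
    }
}
```

Connecting the returned points in order, and back to the first, yields the polygon described above.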
  • FIG. 6B illustrates an example display presenting a graphical representation of an automation capabilities profile.
  • Display 610 presents an assessment of automation capabilities.
  • The automation capabilities include process assessment, technology assessment, and skills readiness.
  • FIG. 6C illustrates an example display presenting a graphical representation of a provisioning profile.
  • Display 620 presents an assessment of provisioning.
  • Provisioning includes server and operating system provisioning, identity provisioning, storage provisioning, application provisioning, and network provisioning.
  • FIGS. 7A and 7B are example displays illustrating solutions and recommendations in accordance with an exemplary embodiment of the present invention. More particularly, with reference to FIG. 7A , display 700 presents exemplary solutions for availability management based on a customer's answers to the assessment survey for availability management. Turning to FIG. 7B , display 710 presents exemplary recommendations for problem management based on the customer's answers to the assessment survey for problem management.
  • FIG. 8 is an example display illustrating estimated financial benefits with automated computing in accordance with an exemplary embodiment of the present invention.
  • Display 800 presents a graph including a curve that estimates future IT spending over time based on current automation capabilities and a curve that estimates future IT spending over time based on a target level of on-demand preparedness.
  • The graph depicted in FIG. 8 serves to illustrate to the customer the financial benefit of being on-demand ready.
  • FIG. 9 is a flowchart illustrating the operation of an automation assessment tool in accordance with an exemplary embodiment of the present invention. Operation begins and the automation assessment tool presents educational material about autonomic computing (block 902 ). As discussed above, the educational material may provide information about automation including information about automation fundamentals, autonomic self-managing capabilities, automation maturity levels, and automation assessment categories.
  • The survey may include sets of questions for various aspects of automated computing, including, for example, a number of predetermined assessment categories.
  • The automation assessment tool ranks aspects of automated computing based on a scale of maturity levels (block 906).
  • The assessment tool determines solutions and recommendations to achieve a target level of automated computing (block 908) and determines operational efficiency savings for the target level of automated computing (block 910).
  • The tool then presents the solutions, recommendations, and efficiency savings output to the customer (block 912) and operation ends.
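  • Read as code, the flowchart of FIG. 9 reduces to a straight-line sequence of module calls. The sketch below is hypothetical; the interface names and signatures are stand-ins for the tool's modules, not APIs defined by the patent.

```java
import java.util.List;
import java.util.Map;

// FIG. 9 as straight-line code: each statement mirrors one flowchart block.
interface MediaPlayerModule { void presentEducationalMaterial(); }          // block 902
interface SurveyModuleApi { Map<String, Integer> conductSurvey(); }         // survey step
interface AnalysisModuleApi {
    Map<String, Integer> rankMaturity(Map<String, Integer> answers);        // block 906
    List<String> recommend(Map<String, Integer> answers, int targetLevel);  // block 908
    double estimateSavings(Map<String, Integer> answers, int targetLevel);  // block 910
}
interface OutputDevice { void present(Object... results); }                 // block 912

class AssessmentFlow {
    static void run(MediaPlayerModule media, SurveyModuleApi survey,
                    AnalysisModuleApi analysis, OutputDevice display, int targetLevel) {
        media.presentEducationalMaterial();                          // educate the customer
        Map<String, Integer> answers = survey.conductSurvey();       // collect survey answers
        Map<String, Integer> ranks = analysis.rankMaturity(answers); // rank on the maturity scale
        List<String> recommendations = analysis.recommend(answers, targetLevel);
        double savings = analysis.estimateSavings(answers, targetLevel);
        display.present(ranks, recommendations, savings);            // present results; operation ends
    }
}
```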
  • The present invention solves the disadvantages of the prior art by providing an automation assessment tool that defines autonomic technology, processes, organization, and skill sets that apply to autonomic computing.
  • The automation assessment tool provides educational material about autonomic computing and a scale used to measure on-demand preparedness.
  • The automation assessment tool presents a survey and collects answers to the survey questions.
  • The automation assessment tool determines solutions and recommendations to achieve a target level of on-demand preparedness.
  • The present invention provides a unique scale of maturity levels for assessing automated computing.
  • The assessment tool of the present invention is capable of applying specific technology to each level of automated computing and automates the business-level process of automated computing sales and marketing consultation.
  • The present invention also overcomes the complexities of automated computing faced by customers and the sales force by providing a tool that guides the operator through educational materials and survey questions and automatically generates solutions and recommendations.
  • The automation assessment tool of the present invention may also be implemented to navigate through databases of skill sets, organizational information, existing technology, processes, etc., to collect on-demand readiness information, rather than using a question-and-answer survey.
  • The assessment tool may also be applied to corporate education assessment and may extend the virtual engagement process of stand-alone electronic sales.

Abstract

An automation assessment tool is provided that defines autonomic technology, processes, organization, and skill sets that apply to autonomic computing. The automation assessment tool provides educational material about autonomic computing and a scale of maturity levels, which is used to assess on-demand preparedness. The automation assessment tool presents a survey and collects answers to the survey questions. The automation assessment tool then determines solutions and recommendations to achieve a target level of on-demand preparedness.

Description

    BACKGROUND OF THE INVENTION
  • 1. Technical Field
  • The present invention relates to data processing and, in particular, to autonomic computing. Still more particularly, the present invention provides a method, apparatus, and program product for implementing an automation computing evaluation scale to generate recommendations.
  • 2. Description of Related Art
  • An on-demand business is an enterprise whose business processes, when integrated end-to-end across the company with key partners, suppliers, and customers, can respond with speed to any customer, market opportunity, or external threat. When an enterprise endeavors to be on-demand ready, one goal is to increase the sophistication of its automation by embedding autonomic capabilities and technologies. An enterprise's autonomic capability may range from basic, where analysis and problem solving are performed manually, to autonomic, where computer systems and networks may configure themselves to changing conditions, for example, and are self-healing in the event of failure with minimal human intervention.
  • Autonomic computing can help to overcome the barrier of infrastructure complexity. The core benefits of autonomic computing are improved resiliency, the ability to deploy new capabilities more rapidly, and increased return on IT investments. In a rapidly changing market, the ability to react quickly is a competitive advantage. The bottom line: advanced automation using autonomic technology allows companies to focus on business, not on infrastructure. Therefore, it may be a goal of an on-demand business to improve its levels of automation by incorporating autonomic computing technologies.
  • It is also a goal of a company providing automated computing technology and services to assess the autonomic computing capabilities of customers. There are white papers on autonomic computing problem determination and definitions of what an autonomic computing system does. However, there are no clear benchmarks that enable such an assessment, and no tools exist for determining recommendations that may help customers become more capable in automated computing.
  • SUMMARY OF THE INVENTION
  • The present invention recognizes the disadvantages of the prior art and provides an automation assessment tool that defines autonomic technology, processes, organization, and skill sets that apply to autonomic computing. The automation assessment tool provides educational material about autonomic computing and a scale of maturity levels, which is used to assess on-demand preparedness. The automation assessment tool presents a survey and collects answers to the survey questions. The automation assessment tool then determines solutions and recommendations to achieve a target level of on-demand preparedness.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The novel features believed characteristic of the invention are set forth in the appended claims. The invention itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings, wherein:
  • FIG. 1 is a pictorial representation of a data processing system in which exemplary aspects of the present invention may be implemented;
  • FIG. 2 is a block diagram of a data processing system in which exemplary embodiments of the present invention may be implemented;
  • FIG. 3 is a block diagram illustrating an automation assessment tool in accordance with an exemplary embodiment of the present invention;
  • FIGS. 4A-4D illustrate example presentation material presented by an automation assessment tool in accordance with an exemplary embodiment of the present invention;
  • FIG. 5 illustrates an example display presenting an automation assessment survey in accordance with an exemplary embodiment of the present invention;
  • FIGS. 6A-6C are example displays illustrating results of automation assessment in accordance with an exemplary embodiment of the present invention;
  • FIGS. 7A and 7B are example displays illustrating solutions and recommendations in accordance with an exemplary embodiment of the present invention;
  • FIG. 8 is an example display illustrating estimated financial benefits with automated computing in accordance with an exemplary embodiment of the present invention; and
  • FIG. 9 is a flowchart illustrating the operation of an automation assessment tool in accordance with an exemplary embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • The present invention provides a method, apparatus and computer program product for implementing an automation computing evaluation scale to generate recommendations. The data processing device may be a stand-alone computing device or may be a distributed data processing system in which multiple computing devices are utilized to perform various aspects of the present invention. Therefore, the following FIGS. 1 and 2 are provided as exemplary diagrams of data processing environments in which exemplary aspects of the present invention may be implemented. It should be appreciated that FIGS. 1 and 2 are only exemplary and are not intended to assert or imply any limitation with regard to the environments in which the present invention may be implemented. Many modifications to the depicted environments may be made without departing from the spirit and scope of the present invention.
  • With reference now to the figures and in particular with reference to FIG. 1, a pictorial representation of a data processing system in which exemplary aspects of the present invention may be implemented is depicted. A mobile computer 100 is depicted which includes system unit 102, video display terminal 104, keyboard 106, storage devices 108, which may include floppy drives and other types of permanent and removable storage media, and pointer device 110. Additional input devices may be included with mobile computer 100, such as, for example, a mouse, joystick, touch screen, trackball, microphone, and the like. Mobile computer 100 may be implemented using any suitable computer, such as an IBM ThinkPad® computer, which is a product of International Business Machines Corporation, located in Armonk, N.Y. Computer 100 also preferably includes a graphical user interface (GUI) that may be implemented by means of systems software residing in computer readable media in operation within computer 100.
  • With reference now to FIG. 2, a block diagram of a data processing system is shown in which exemplary embodiments of the present invention may be implemented. Data processing system 200 is an example of a mobile computer, such as computer 100 in FIG. 1, in which code or instructions implementing the processes of the present invention may be located. In the depicted example, data processing system 200 employs a hub architecture including a north bridge and memory controller hub (MCH) 208 and a south bridge and input/output (I/O) controller hub (ICH) 210. Processor 202, main memory 204, and graphics processor 218 are connected to MCH 208. Graphics processor 218 may be connected to the MCH through an accelerated graphics port (AGP), for example.
  • In the depicted example, local area network (LAN) adapter 212, audio adapter 216, keyboard and mouse adapter 220, modem 222, read only memory (ROM) 224, hard disk drive (HDD) 226, CD-ROM drive 230, universal serial bus (USB) ports and other communications ports 232, and PCI/PCIe devices 234 may be connected to ICH 210. PCI/PCIe devices may include, for example, Ethernet adapters, add-in cards, PC cards for notebook computers, etc. PCI uses a card bus controller, while PCIe does not. ROM 224 may be, for example, a flash basic input/output system (BIOS). Hard disk drive 226 and CD-ROM drive 230 may use, for example, an integrated drive electronics (IDE) or serial advanced technology attachment (SATA) interface. A super I/O (SIO) device 236 may be connected to ICH 210.
  • Docking interface 240 may also be connected to the ICH. Data processing system 200 may be a mobile computing device, such as a laptop computer or handheld computer. Docking interface 240 provides port replication to allow the data processing system to easily connect to a keyboard, pointing device, monitor, printer, speakers, etc. The docking interface allows the mobile computing device to operate as a desktop computer with the more immobile peripheral devices.
  • An operating system runs on processor 202 and is used to coordinate and provide control of various components within data processing system 200 in FIG. 2. The operating system may be a commercially available operating system such as Windows XP™, which is available from Microsoft Corporation. An object oriented programming system, such as the Java™ programming system, may run in conjunction with the operating system and provide calls to the operating system from Java programs or applications executing on data processing system 200. "JAVA" is a trademark of Sun Microsystems, Inc. Instructions for the operating system, the object-oriented programming system, and applications or programs are located on storage devices, such as hard disk drive 226, and may be loaded into main memory 204 for execution by processor 202. The processes of the present invention are performed by processor 202 using computer-implemented instructions, which may be located in a memory such as, for example, main memory 204, ROM 224, or in one or more peripheral devices 226 and 230.
  • Those of ordinary skill in the art will appreciate that the hardware in FIG. 2 may vary depending on the implementation. Other internal hardware or peripheral devices, such as flash memory, equivalent non-volatile memory, or optical disk drives and the like, may be used in addition to or in place of the hardware depicted in FIG. 2. Also, the processes of the present invention may be applied to a multiprocessor data processing system.
  • For example, data processing system 200 may be a personal digital assistant (PDA), which is configured with flash memory to provide non-volatile memory for storing operating system files and/or user-generated data. The depicted example in FIG. 2 and above-described examples are not meant to imply architectural limitations. For example, data processing system 200 also may be a tablet computer or telephone device in addition to taking the form of a PDA.
  • In accordance with exemplary embodiments of the present invention, an automation assessment tool is provided to assess a client's current information technology (IT) environment to determine on-demand readiness. The automation assessment tool defines autonomic technology, processes, organization, and skill sets that apply to autonomic computing. The automation assessment tool provides educational material about autonomic computing and a scale used to measure on-demand preparedness. The automation assessment tool presents a survey and collects answers to the survey questions. The automation assessment tool then determines solutions and recommendations to achieve a target level of on-demand preparedness.
  • FIG. 3 is a block diagram illustrating an automation assessment tool in accordance with an exemplary embodiment of the present invention. Automation assessment tool 310 includes media player 312, survey module 314, and analysis module 316. Media player 312 presents educational presentation material 302 to a customer via an output device, such as display 322.
  • Presentation material 302 may provide information about automation including information about automation fundamentals, autonomic self-managing capabilities, automation maturity levels, and automation assessment categories. The information provided in presentation material 302 serves to educate the customer generally about automation and, more specifically, about the manner in which automation will be assessed by automation assessment tool 310. Media player 312 may be, for example, a web browser, video player, or presentation graphics application program. In one exemplary embodiment, media player 312 may be a Flash® player from Macromedia, Inc.
  • Survey module 314 presents survey questions 304 to an operator and receives answers to the questions. A sales representative of a company that provides automated computing technology and services may conduct the survey and enter answers provided by a customer. Survey module 314 stores survey answers 324 for subsequent inspection and for use by analysis module 316.
  • To illustrate the operation of survey module 314, an example assessment survey for availability management may include the following questions:
      • How would you characterize your current availability processes?
      • How have you leveraged technology to enable your availability management process?
      • How would you define the availability management skill level of your current staff?
        An example assessment survey for performance and capacity management may include the following questions:
      • How would you characterize your current performance and capacity management processes?
      • How have you leveraged technology to enable your performance and capacity management processes?
      • How would you define the performance and capacity planning skill level of your current staff?
        An example survey for security management may include the following questions:
      • How would you characterize your current security management processes?
      • How have you leveraged technology to enable your security management processes?
      • How would you define the security management skill level of your current staff?
        An example survey for user administration may include the following questions:
      • How would you characterize your current user administration processes?
      • How have you leveraged technology to enable your user administration process?
      • How would you define the user administration skill level of your current staff?
        An example assessment survey for solution deployment may include the following questions:
      • How would you characterize your current solution deployment processes?
      • How have you leveraged technology to enable your solution deployment processes?
      • How would you define the solution deployment skill level of your current staff?
        An example assessment survey for problem management may include the following questions:
      • How would you characterize your current problem management processes?
      • How have you leveraged technology to enable your problem resolution processes?
      • How would you define the problem determination skill level of your current staff?
        The survey questions above are merely exemplary; they may be modified depending upon the implementation. For example, more or fewer questions may be provided.
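  • To make the survey flow concrete, the following sketch shows one hypothetical way survey module 314 could represent questions and record a customer's selections. The class names, the choice labels, and the decision to store answers as choice indices are assumptions, not details given in the patent.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical survey model: a question carries fixed multiple-choice answers,
// and the module records the index of the choice the operator selects.
class SurveyQuestion {
    final String text;
    final List<String> choices;

    SurveyQuestion(String text, List<String> choices) {
        this.text = text;
        this.choices = choices;
    }
}

class SurveyModule {
    // Answers keyed by question text; the value is the selected choice index.
    private final Map<String, Integer> answers = new LinkedHashMap<>();

    void recordAnswer(SurveyQuestion question, int choiceIndex) {
        if (choiceIndex < 0 || choiceIndex >= question.choices.size()) {
            throw new IllegalArgumentException("choice out of range");
        }
        answers.put(question.text, choiceIndex);
    }

    Map<String, Integer> getAnswers() {
        return answers; // retained for inspection and for the analysis module
    }

    public static void main(String[] args) {
        // Choice labels named after the maturity levels are an assumption.
        SurveyQuestion question = new SurveyQuestion(
                "How would you characterize your current availability processes?",
                Arrays.asList("Basic", "Managed", "Predictive", "Adaptive", "Autonomic"));
        SurveyModule module = new SurveyModule();
        module.recordAnswer(question, 1); // operator selects "Managed"
        System.out.println(module.getAnswers());
    }
}
```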
  • Analysis module 316 analyzes the automation capabilities of the customer based on survey answers 324. Automation capabilities of an enterprise include, for example, the ability to be self-configuring, the ability to be self-healing, the ability to be self-optimizing, and the ability to be self-protecting. Across these four automation capabilities, there are several key operational areas where one can assess automation maturity. These operational areas are used as automation assessment categories in accordance with an exemplary embodiment of the present invention. The automation assessment categories may include, for example, problem management, availability management, security management, solution deployment, user administration, and performance and capacity management.
  • Problem management is the act of identifying, isolating, and resolving issues that might negatively impact IT service delivery. Availability management is the act of ensuring that required IT services are available, as needed, to ensure business continuity. Security management is the act of securing critical business resources and data against attacks and unauthorized access from both external and internal threats. Solution deployment is the act of planning, testing, distributing, installing, and validating the deployment of new IT solutions, including the IT infrastructure elements, in a manner that is the least disruptive to operational services. The ability to roll back to a prior functioning environment if a change is unsuccessful is also necessary. User administration is the act of managing the full lifecycle of a user's access to company resources, such as adding, deleting, and changing access to resources based on business policies and job function. Performance and capacity management is the act of monitoring and managing system performance to adequately meet the throughput and response time requirements associated with operational business needs.
  • Analysis module 316 ranks the various aspects of the customer's on-demand readiness based on a scale of maturity levels. The maturity levels may include, for example, basic, managed, predictive, adaptive, and autonomic. For example, analysis module 316 may rank each of the key operational areas based on this scale. In addition, analysis module 316 may optionally rank technology, processes, and skill sets based on this scale.
  • The basic maturity level indicates that the customer uses manual analysis and problem solving. In a real-world scenario, transaction response times may slow during key transactions. To diagnose this problem within the basic maturity level, multiple product experts may analyze product-specific events and logs. The basic maturity level requires extensive, highly skilled IT staff. A benefit of this maturity level is that basic requirements are addressed.
  • The managed maturity level indicates that the customer uses centralized tools and performs manual actions. In a real-world scenario, the IT staff uses tools to look at transaction response data and event data from multiple products to help them make a decision. In the managed maturity level, the IT staff analyzes data and takes actions. Benefits of the managed maturity level include greater system awareness and improved productivity.
  • The predictive maturity level indicates that the customer monitors, correlates data, and recommends action. In a real-world scenario, transaction trend analysis data (symptoms) is stored in a central database where this data is used to predict events and to recommend actions. Technology-analysis correlates symptoms with recommended actions. The IT staff approves and initiates actions. The predictive maturity level enables reduced dependency on deep IT skills and faster and better decision-making.
  • The adaptive maturity level indicates that the customer uses a system that monitors, correlates data, and takes actions. In a real-world scenario, when a problem occurs with a transaction, a particular symptom is matched to a recommended action, and the system takes the action. The IT staff manages performance against service level agreements. The adaptive maturity level allows balanced human-to-system interaction and increases IT agility and resiliency.
  • The autonomic maturity level indicates dynamic business policy based management. In a real-world scenario, action is taken based on business policy, for example, giving preference for key transactions over less important ones, or performing an action (like a reboot) during a non-critical time. The IT staff focuses on business needs. Business policy drives IT management. The autonomic maturity level increases business agility and resiliency.
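  • To make the ranking step concrete, the following sketch maps survey answers onto the five-level scale. The aggregation rule shown (a category is ranked by its weakest answer) and all names are illustrative assumptions; the patent does not prescribe a particular algorithm.

```java
import java.util.EnumMap;
import java.util.Map;

// The five maturity levels of the scale, in ascending order of automation.
enum MaturityLevel { BASIC, MANAGED, PREDICTIVE, ADAPTIVE, AUTONOMIC }

// The six assessment categories named in the text.
enum Category {
    PROBLEM_MANAGEMENT, AVAILABILITY_MANAGEMENT, SECURITY_MANAGEMENT,
    SOLUTION_DEPLOYMENT, USER_ADMINISTRATION, PERFORMANCE_AND_CAPACITY_MANAGEMENT
}

class MaturityRanking {
    // Treat each answer's choice index (0-4) as an ordinal position on the scale
    // and rank a category by its weakest (lowest) answer.
    static Map<Category, MaturityLevel> rank(Map<Category, int[]> answersByCategory) {
        Map<Category, MaturityLevel> ranks = new EnumMap<>(Category.class);
        for (Map.Entry<Category, int[]> entry : answersByCategory.entrySet()) {
            int weakest = 4; // start at AUTONOMIC and lower it per answer
            for (int choice : entry.getValue()) {
                weakest = Math.min(weakest, Math.max(0, Math.min(choice, 4)));
            }
            ranks.put(entry.getKey(), MaturityLevel.values()[weakest]);
        }
        return ranks;
    }
}
```

Ranking by the weakest answer errs on the conservative side: a category is reported only as mature as its least automated aspect.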
  • Analysis module 316 determines solutions and recommendations to achieve a target level of automated computing based on survey answers 324. Survey module 314 may provide multiple-choice answers to be selected by the customer. These multiple-choice answers may be associated with specific solutions and recommendations. For example, if the customer indicates in the answers to the survey that the IT staff can use cross-resource availability analyses to predict business system availability and manually make adjustments to maintain business system availability based on business objectives, then automation assessment tool 310 may recommend that the customer schedule education on workflow automation and business integration to enable automation of the best-practices processes that keep IT running.
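  • The association between specific multiple-choice answers and canned solutions could be realized as a simple lookup table, as in the hypothetical sketch below; the question identifiers and the key convention are assumptions.

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical mapping from (question, selected choice) to recommendations.
class RecommendationTable {
    private final Map<String, List<String>> table = new HashMap<>();

    void associate(String questionId, int choiceIndex, List<String> recommendations) {
        table.put(questionId + ":" + choiceIndex, recommendations); // key convention is illustrative
    }

    List<String> lookup(String questionId, int choiceIndex) {
        return table.getOrDefault(questionId + ":" + choiceIndex, Collections.emptyList());
    }
}
```

Under this scheme, the cross-resource availability answer described above would simply be associated with the workflow-automation education recommendation.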
  • Analysis module 316 may also determine a financial impact that may result from achieving a target level of automated computing capability or on-demand preparedness. For example, analysis module 316 may generate a graph that compares projected IT spending over time at the target level with IT spending based on current on-demand readiness. Analysis module 316 provides solutions, recommendations, and financial impact information as output 326, which may be stored in persistent storage or presented by an output device, such as display 322.
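  • One simple way to produce the two curves of such a graph is to project a baseline spend forward under two different growth assumptions. The sketch below is purely illustrative; the dollar figures and growth rates are invented inputs, and the patent does not define a financial model.

```java
// Illustrative projection of IT spending under two scenarios, suitable for a
// graph like the one in FIG. 8. All inputs are operator-supplied assumptions.
class FinancialImpact {
    static double[] projectSpending(double baseline, double annualGrowth, int years) {
        double[] spend = new double[years];
        double current = baseline;
        for (int y = 0; y < years; y++) {
            spend[y] = current;                // spending in year y
            current *= (1.0 + annualGrowth);   // compound growth into the next year
        }
        return spend;
    }

    public static void main(String[] args) {
        double[] asIs = projectSpending(10_000_000, 0.08, 5);   // current readiness: faster growth
        double[] target = projectSpending(10_500_000, 0.02, 5); // target level: up-front cost, slower growth
        for (int y = 0; y < 5; y++) {
            System.out.printf("year %d: as-is $%,.0f vs target $%,.0f%n", y + 1, asIs[y], target[y]);
        }
    }
}
```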
  • Automation assessment tool 310 may be an expert system that crawls through corporate databases and may infer information to create analysis output 326. An expert system is an artificial intelligence (AI) application that uses a knowledge base of human expertise or historical information for problem solving. The success of an expert system depends on the quality of the data and rules obtained from a human expert; in practice, expert systems may perform below or above the level of a human expert. Analysis module 316 may use rules (not shown) to derive answers by running information, such as survey answers 324, through an inference engine (not shown), which is software that processes results from rules and data in a knowledge base.
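  • A toy version of this inference step might look like the following sketch, where rules test the collected answers and contribute recommendations when they fire. The Map-based fact representation and all names here are assumptions, not the patent's design.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

// Minimal rule engine: each rule pairs a condition over the survey answers with
// a recommendation that is emitted when the condition holds.
class Rule {
    final Predicate<Map<String, Integer>> condition;
    final String recommendation;

    Rule(Predicate<Map<String, Integer>> condition, String recommendation) {
        this.condition = condition;
        this.recommendation = recommendation;
    }
}

class InferenceEngine {
    static List<String> run(List<Rule> rules, Map<String, Integer> answers) {
        List<String> output = new ArrayList<>();
        for (Rule rule : rules) {
            if (rule.condition.test(answers)) {
                output.add(rule.recommendation); // the rule fired
            }
        }
        return output;
    }
}
```

Re-running the engine after the operator edits the stored answers yields exactly the what-if navigation described in the next paragraph.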
  • An operator may change answers in survey answers 324 and determine changes in output 326. In this manner, a customer may see how changes in automation capabilities affect the assessment output. Thus, automation assessment tool 310 may provide navigation to higher or lower levels of autonomic computing to generate recommendations.
  • FIGS. 4A-4D illustrate example presentation material presented by an automation assessment tool in accordance with an exemplary embodiment of the present invention. More particularly, with reference to FIG. 4A, presentation display 400 may provide educational information about automation fundamentals including business service management, policy based orchestration, availability, security, optimization, provisioning, and virtualization.
  • Business service management is the activity of integrating business processes with one another. Business process integration provides the tools needed to manage service levels, meter system utilization, and bill customers for that usage, as well as to model, integrate, connect, monitor, and manage business processes from end to end for complete linkage of business applications and linkage of business processes to the IT environment.
  • Policy based orchestration helps customers automatically control and manage the four capabilities (availability, security, optimization, provisioning) so that the entire IT infrastructure is responding dynamically to changing conditions according to defined business policies. The orchestration builds on industry best practices and the collective IT experience of the customer to ensure that complex deployments are achieved, on demand, with speed and quality.
  • Availability management ensures the health and functioning of IT environments based on business objectives. Delivery of consistent and reliable service levels with reduced IT administration costs is key, enabled by dynamic event generation, correlation, and analysis, and by automated cures aligned with business views of the IT infrastructure. With respect to availability management, the automation assessment tool emphasizes self-healing.
  • Security management ensures that policies for identity management, including access and privacy control, are consistently defined and enforced across the IT environment. Security management enables the automated detection of and response to security threats, including intrusions and insecure configurations. With respect to security management, the automation assessment tool emphasizes self-protecting.
  • Optimization ensures the most productive utilization of IT infrastructure based on business objectives. Capabilities like transaction performance management, dynamic workload management, and dynamic job and task scheduling are key within application domains and across a heterogeneous IT infrastructure. With respect to optimization, the automation assessment tool emphasizes self-optimization.
  • Provisioning provides the ability to automatically and dynamically configure and deploy resources in response to changing business conditions and objectives in heterogeneous environments. Provisioning can be elemental (that is, server provisioning, storage provisioning, and so forth) and horizontal (that is, end-to-end application provisioning). With respect to provisioning, the automation assessment tool emphasizes self-configuration.
  • Virtualization enables resources to be shared, managed, and accessed across a workgroup, enterprise, or even across company boundaries, regardless of operating characteristics. Users benefit from seamless and uninterrupted access to resources, while the physical resources that compose a virtualized environment might reside in multiple locations. Resource virtualization provides access to processing power and data to improve asset utilization and efficiency, to rapidly solve complex business problems, to conduct computer-intensive research and data analysis, and to respond to real-time business fluctuations on demand.
  • Turning to FIG. 4B, display 410 presents a review of self-managing capabilities in autonomic computing. Automation capabilities of an enterprise include, for example, the ability to be self-configuring, the ability to be self-healing, the ability to be self-optimizing, and the ability to be self-protecting. A self-configuring environment can dynamically configure itself on the fly and can adapt itself to the deployment of new components or changes with minimal human intervention. A self-healing IT environment can detect improper operation of systems, transactions, and business processes, and then initiate corrective action without disrupting users or services. A self-optimizing IT environment addresses the complexity of managing system performance. A self-optimizing environment can learn from experience and can proactively tune itself in the context of an overall business objective. A self-protecting IT environment can allow the right people to access the right data at the right time. A self-protecting environment can automatically take the appropriate actions to make itself less vulnerable to attacks on its runtime infrastructure and on its business data.
  • With reference now to FIG. 4C, display 420 presents a review of automation maturity levels in accordance with an exemplary embodiment of the present invention. A user may navigate display 420 to view a description of each of the maturity levels to prepare for the assessment survey and the subsequent results. In the depicted example, the maturity levels include basic, managed, predictive, adaptive, and autonomic.
  • With reference to FIG. 4D, display 430 presents information concerning automation assessment categories. The automation assessment tool of the present invention uses the scale of automation maturity levels to assess on-demand preparedness of the client in each of these assessment categories. A user may navigate display 430 to view a description of each of the assessment categories to prepare for the assessment survey and the subsequent results. In the depicted example, the automation assessment categories include problem management, availability management, security management, solution deployment, user administration, and performance and capacity management.
  • FIG. 5 illustrates an example display presenting an automation assessment survey in accordance with an exemplary embodiment of the present invention. Display 500 presents survey questions for automation assessment. Sets of questions may be presented for availability management, performance and capacity management, security management, user administration, solution deployment, and problem management, for example. Each question may include a set of multiple-choice answers that are selectable using a set of radio buttons, as depicted in the illustrated example. The user may navigate the assessment categories using tabs or the like.
  • The automation assessment tool of the present invention may present sets of survey questions for other aspects of automation. For example, a similar display may be used to present survey questions for server and operating system provisioning. Similarly, a separate display may be used to present survey questions for skill sets, automation technology, or security, for instance.
  • FIGS. 6A-6C are example displays illustrating results of automation assessment in accordance with an exemplary embodiment of the present invention. More particularly, with reference to FIG. 6A, display 600 includes a “spider web” graphical representation of automation assessment. Maturity levels are represented radially and assessment categories are represented as spokes. The maturity level for each assessment category is indicated as a point at the intersection of the radial maturity level and the spoke of the assessment category. These points are connected to form a polygon. Ideally, the polygon should fill as much of the graph as possible.
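The described "spider web" view corresponds to what plotting libraries call a radar chart. A minimal matplotlib sketch follows; the category labels are abbreviated and the scores are assumed values, not results from the patent.

```python
import math
import matplotlib.pyplot as plt

categories = ["Problem mgmt", "Availability", "Security",
              "Solution deployment", "User admin", "Perf & capacity"]
scores = [2, 3, 2, 1, 3, 2]  # assumed maturity levels (1=basic .. 5=autonomic)

# One spoke per assessment category, evenly spaced around the circle.
angles = [2 * math.pi * i / len(categories) for i in range(len(categories))]
ax = plt.subplot(projection="polar")
ax.plot(angles + angles[:1], scores + scores[:1])   # close the polygon
ax.fill(angles + angles[:1], scores + scores[:1], alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(categories)
ax.set_ylim(0, 5)  # maturity levels are represented radially
plt.show()
```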
  • FIG. 6B illustrates an example display presenting a graphical representation of an automation capabilities profile. Display 610 presents an assessment of automation capabilities. In the depicted example, the automation capabilities include process assessment, technology assessment, and skills readiness.
  • FIG. 6C illustrates an example display presenting a graphical representation of a provisioning profile. Display 620 presents an assessment of provisioning. In the depicted example, provisioning includes server and operating system provisioning, identity provisioning, storage provisioning, application provisioning, and network provisioning.
  • FIGS. 7A and 7B are example displays illustrating solutions and recommendations in accordance with an exemplary embodiment of the present invention. More particularly, with reference to FIG. 7A, display 700 presents exemplary solutions for availability management based on a customer's answers to the assessment survey for availability management. Turning to FIG. 7B, display 710 presents exemplary recommendations for problem management based on the customer's answers to the assessment survey for problem management.
  • FIG. 8 is an example display illustrating estimated financial benefits with automated computing in accordance with an exemplary embodiment of the present invention. Display 800 presents a graph including a curve that estimates future IT spending over time based on current automation capabilities and a curve that estimates future IT spending over time based on a target level of on-demand preparedness. The graph depicted in FIG. 8 serves to illustrate to the customer the financial benefit of being on-demand ready.
  • FIG. 9 is a flowchart illustrating the operation of an automation assessment tool in accordance with an exemplary embodiment of the present invention. Operation begins and the automation assessment tool presents educational material about autonomic computing (block 902). As discussed above, the educational material may provide information about automation including information about automation fundamentals, autonomic self-managing capabilities, automation maturity levels, and automation assessment categories.
  • Next, an operator, such as a sales representative or a client, conducts a survey and the automation assessment tool collects survey answers (block 904). The survey may include sets of questions for various aspects of automated computing, including, for example, a number of predetermined assessment categories.
  • Thereafter, the automation assessment tool ranks aspects of automated computing based on a scale of maturity levels (block 906). The assessment tool then determines solutions and recommendations to achieve a target level of automated computing (block 908) and determines operational efficiency savings for the target level of automated computing (block 910). The tool then presents the solutions, recommendations, and efficiency savings output to the customer (block 912) and operation ends.
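Read as code, the flow of FIG. 9 might be expressed as a single driver function. The tool interface below is hypothetical and named only to mirror blocks 902-912; the patent does not define these methods.

```python
def run_assessment(tool, customer):
    """Sketch of the FIG. 9 flow; method names are assumed, not from the patent."""
    tool.present_education()                        # block 902: educational material
    answers = tool.collect_survey(customer)         # block 904: collect survey answers
    ranking = tool.rank_maturity(answers)           # block 906: rank on maturity scale
    plan = tool.determine_recommendations(ranking)  # block 908: solutions/recommendations
    savings = tool.estimate_savings(ranking)        # block 910: efficiency savings
    tool.present_results(plan, savings)             # block 912: present output
```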
  • Thus, the present invention solves the disadvantages of the prior art by providing an automation assessment tool that defines autonomic technology, processes, organization, and skill sets that apply to autonomic computing. The automation assessment tool provides educational material about autonomic computing and a scale used to measure on-demand preparedness. The automation assessment tool presents a survey and collects answers to the survey questions. The automation assessment tool then determines solutions and recommendations to achieve a target level of on-demand preparedness.
  • The present invention provides a unique scale of maturity levels for assessing automated computing. The assessment tool of the present invention is capable of applying specific technology to each level of automated computing and automates the business-level process of automated computing sales and marketing consultation. The present invention also overcomes the complexities of automated computing faced by customers and the sales force by providing a tool that guides the operator through educational materials and survey questions and automatically generates solutions and recommendations.
  • The automation assessment tool of the present invention may also be implemented to navigate through databases of skill sets, organizational information, existing technology, processes, etc., to collect on-demand readiness information, rather than using a question-and-answer survey. The assessment tool may also be applied to corporate education assessment and may extend the virtual engagement process of stand-alone electronic sales.
  • It is important to note that while the present invention has been described in the context of a fully functioning data processing system, those of ordinary skill in the art will appreciate that the processes of the present invention are capable of being distributed in the form of a computer readable medium of instructions and a variety of forms and that the present invention applies equally regardless of the particular type of signal bearing media actually used to carry out the distribution. Examples of computer readable media include recordable-type media, such as a floppy disk, a hard disk drive, a RAM, CD-ROMs, DVD-ROMs, and transmission-type media, such as digital and analog communications links, wired or wireless communications links using transmission forms, such as, for example, radio frequency and light wave transmissions. The computer readable media may take the form of coded formats that are decoded for actual use in a particular data processing system.
  • The description of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiment was chosen and described in order to best explain the principles of the invention, the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.

Claims (27)

1. A method, in a data processing system, for assessing automated computing capabilities, the method comprising:
receiving information about automated computing capabilities of a customer;
assigning a maturity level from a set of maturity levels to each of a plurality of assessment categories based on the information about the automated computing capabilities of the customer; and
providing information for achieving a target level of automated computing to the customer.
2. The method of claim 1, further comprising:
presenting educational information about automated computing to the customer.
3. The method of claim 2, wherein the educational material identifies the set of maturity levels.
4. The method of claim 2, wherein the educational material identifies the plurality of assessment categories.
5. The method of claim 1, wherein receiving information about the automated computing capabilities of the customer includes:
presenting a plurality of survey questions; and
receiving answers to the plurality of survey questions.
6. The method of claim 5, wherein the plurality of survey questions include multiple-choice answers.
7. The method of claim 1, wherein providing information for achieving a target level of automated computing includes:
determining solutions for automated computing based on the information about the automated computing capabilities of the customer.
8. The method of claim 1, wherein providing information for achieving a target level of automated computing includes:
determining recommendations for achieving a target level of automated computing based on the information about the automated computing capabilities of the customer.
9. The method of claim 1, wherein providing information for achieving a target level of automated computing includes:
determining operational efficiency savings for the target level of automated computing relative to the automated computing capabilities of the customer.
10. The method of claim 1, wherein the set of maturity levels includes basic, managed, predictive, adaptive, and autonomic.
11. The method of claim 1, wherein the plurality of assessment categories includes problem management, availability management, security management, solution deployment, user administration, and performance and capacity management.
12. The method of claim 1, wherein providing information for achieving a target level of automated computing to the customer includes:
changing at least a portion of the information about automated computing capabilities of the customer to form changed information; and
determining a change in the maturity level for each of the plurality of assessment categories based on the changed information.
13. The method of claim 1, further comprising:
using an expert system to analyze historical information using a set of rules.
14. An apparatus, in a data processing system, for assessing automated computing capabilities, the apparatus comprising:
means for receiving information about automated computing capabilities of a customer;
means for assigning a maturity level from a set of maturity levels to each of a plurality of assessment categories based on the information about the automated computing capabilities of the customer; and
means for providing information for achieving a target level of automated computing to the customer.
15. A computer program product, in a computer readable medium, for assessing automated computing capabilities, the computer program product comprising:
instructions for receiving information about automated computing capabilities of a customer;
instructions for assigning a maturity level from a set of maturity levels to each of a plurality of assessment categories based on the information about the automated computing capabilities of the customer; and
instructions for providing information for achieving a target level of automated computing to the customer.
16. The computer program product of claim 15, further comprising:
instructions for presenting educational information about automated computing to the customer.
17. The computer program product of claim 16, wherein the educational material identifies the set of maturity levels.
18. The computer program product of claim 16, wherein the educational material identifies the plurality of assessment categories.
19. The computer program product of claim 15, wherein the instructions for receiving information about the automated computing capabilities of the customer include:
instructions for presenting a plurality of survey questions; and
instructions for receiving answers to the plurality of survey questions.
20. The computer program product of claim 19, wherein the plurality of survey questions include multiple-choice answers.
21. The computer program product of claim 15, wherein the instructions for providing information for achieving a target level of automated computing include:
instructions for determining solutions for automated computing based on the information about the automated computing capabilities of the customer.
22. The computer program product of claim 15, wherein the instructions for providing information for achieving a target level of automated computing include:
instructions for determining recommendations for achieving a target level of automated computing based on the information about the automated computing capabilities of the customer.
23. The computer program product of claim 15, wherein the instructions for providing information for achieving a target level of automated computing include:
instructions for determining operational efficiency savings for the target level of automated computing relative to the automated computing capabilities of the customer.
24. The computer program product of claim 15, wherein the set of maturity levels includes basic, managed, predictive, adaptive, and autonomic.
25. The computer program product of claim 15, wherein the plurality of assessment categories includes problem management, availability management, security management, solution deployment, user administration, and performance and capacity management.
26. The computer program product of claim 15, wherein the instructions for providing information for achieving a target level of automated computing to the customer include:
instructions for changing at least a portion of the information about automated computing capabilities of the customer to form changed information; and
instructions for determining a change in the maturity level for each of the plurality of assessment categories based on the changed information.
27. The computer program product of claim 15, further comprising:
instructions for using an expert system to analyze historical information using a set of rules.
US10/900,959 2004-07-28 2004-07-28 Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations Abandoned US20060026054A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/900,959 US20060026054A1 (en) 2004-07-28 2004-07-28 Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations
US12/131,611 US8019640B2 (en) 2004-07-28 2008-06-02 Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/900,959 US20060026054A1 (en) 2004-07-28 2004-07-28 Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/131,611 Continuation US8019640B2 (en) 2004-07-28 2008-06-02 Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations

Publications (1)

Publication Number Publication Date
US20060026054A1 (en) 2006-02-02

Family

ID=35733531

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/900,959 Abandoned US20060026054A1 (en) 2004-07-28 2004-07-28 Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations
US12/131,611 Expired - Fee Related US8019640B2 (en) 2004-07-28 2008-06-02 Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/131,611 Expired - Fee Related US8019640B2 (en) 2004-07-28 2008-06-02 Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations

Country Status (1)

Country Link
US (2) US20060026054A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060271660A1 (en) * 2005-05-26 2006-11-30 Bea Systems, Inc. Service oriented architecture implementation planning
US20070038536A1 (en) * 2005-08-11 2007-02-15 Accenture Global Services Gmbh Finance diagnostic tool
US20070078831A1 (en) * 2005-09-30 2007-04-05 Accenture Global Services Gmbh Enterprise performance management tool
US20080235079A1 (en) * 2004-07-28 2008-09-25 International Business Machines Corporation Method, Apparatus, and Program for Implementing an Automation Computing Evaluation Scale to Generate Recommendations
WO2009011916A1 (en) * 2007-07-19 2009-01-22 Depalma Mark S Systems and methods for accumulating accreditation
US20090144120A1 (en) * 2007-11-01 2009-06-04 Ramachandran P G System and Method for Evolving Processes In Workflow Automation
US20120116848A1 (en) * 2010-11-10 2012-05-10 International Business Machines Corporation Optimizing business operational environments
US8873733B1 (en) 2007-06-08 2014-10-28 West Corporation Real-time feedback of survey results
US20150066804A1 (en) * 2013-08-28 2015-03-05 Konica Minolta Laboratory U.S.A., Inc. Method and system for generating a customer survey for an image forming apparatus
US20170024668A1 (en) * 2015-07-23 2017-01-26 Charlene G. Aldridge Automated Assessment and Solution Methodology
CN109426987A (en) * 2017-09-05 2019-03-05 本田技研工业株式会社 Evaluating apparatus, evaluation method, noise elimination apparatus and program storage medium
US20200073639A1 (en) * 2018-08-30 2020-03-05 Accenture Global Solutions Limited Automated process analysis and automation implementation
US11042884B2 (en) * 2004-05-25 2021-06-22 International Business Machines Corporation Method and apparatus for using meta-rules to support dynamic rule-based business systems
US11367512B2 (en) * 2011-11-29 2022-06-21 Eresearchtechnology, Inc. Methods and systems for data analysis

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050010388A1 (en) * 2003-07-11 2005-01-13 International Business Machines Corporation Dynamic online multi-parameter optimization system and method for autonomic computing systems
WO2007063605A1 (en) * 2005-12-02 2007-06-07 Netman Co., Ltd. Action improvement system
WO2009116126A1 (en) * 2008-03-17 2009-09-24 富士通株式会社 Information acquisition support apparatus
US20110066476A1 (en) * 2009-09-15 2011-03-17 Joseph Fernard Lewis Business management assessment and consulting assistance system and associated method
US8781884B2 (en) * 2010-08-19 2014-07-15 Hartford Fire Insurance Company System and method for automatically generating work environment goals for a management employee utilizing a plurality of work environment survey results
US8527326B2 (en) * 2010-11-30 2013-09-03 International Business Machines Corporation Determining maturity of an information technology maintenance project during a transition phase
US9613323B2 (en) * 2012-01-05 2017-04-04 International Business Machines Corporation Organizational agility determination across multiple computing domains
US10621591B2 (en) * 2015-12-01 2020-04-14 Capital One Services, Llc Computerized optimization of customer service queue based on customer device detection
US10885486B2 (en) * 2017-06-16 2021-01-05 Genpact Luxembourg S.a.r.l. System and method for determining automation potential of a process
US9930062B1 (en) 2017-06-26 2018-03-27 Factory Mutual Insurance Company Systems and methods for cyber security risk assessment
DE112019006558T5 (en) * 2019-02-06 2021-10-14 Mitsubishi Electric Corporation Device for evaluating information technology usage, system for evaluating information technology usage and method for evaluating information technology usage

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6654816B1 (en) * 2000-05-31 2003-11-25 Hewlett-Packard Development Company, L.P. Communication interface systems for locally analyzing computers
EP1410281A2 (en) * 2000-07-10 2004-04-21 BMC Software, Inc. System and method of enterprise systems and business impact management
US7003564B2 (en) * 2001-01-17 2006-02-21 Hewlett-Packard Development Company, L.P. Method and apparatus for customizably calculating and displaying health of a computer network
US7065566B2 (en) * 2001-03-30 2006-06-20 Tonic Software, Inc. System and method for business systems transactions and infrastructure management
US7360121B2 (en) * 2002-02-22 2008-04-15 Bea Systems, Inc. System for monitoring a subsystem health
US6856942B2 (en) * 2002-03-09 2005-02-15 Katrina Garnett System, method and model for autonomic management of enterprise applications
US20040059704A1 (en) * 2002-09-20 2004-03-25 International Business Machines Corporation Self-managing computing system
US7200657B2 (en) * 2002-10-01 2007-04-03 International Business Machines Corporation Autonomic provisioning of network-accessible service behaviors within a federated grid infrastructure
US20040117234A1 (en) 2002-10-11 2004-06-17 Xerox Corporation System and method for content management assessment
US20040193476A1 (en) * 2003-03-31 2004-09-30 Aerdts Reinier J. Data center analysis
US20040199417A1 (en) * 2003-04-02 2004-10-07 International Business Machines Corporation Assessing information technology products
US7216169B2 (en) * 2003-07-01 2007-05-08 Microsoft Corporation Method and system for administering personal computer health by registering multiple service providers and enforcing mutual exclusion rules
US7370098B2 (en) * 2003-08-06 2008-05-06 International Business Machines Corporation Autonomic management of autonomic systems
US7379923B2 (en) * 2003-11-06 2008-05-27 International Business Machines Corporation Benchmarking of computer and network support services
US7734561B2 (en) * 2003-12-15 2010-06-08 International Business Machines Corporation System and method for providing autonomic management of a networked system using an action-centric approach
US7426498B2 (en) * 2004-07-27 2008-09-16 International Business Machines Corporation Method and apparatus for autonomous classification
US20060026054A1 (en) * 2004-07-28 2006-02-02 International Business Machines Corporation Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations
US7487494B2 (en) * 2004-08-02 2009-02-03 International Business Machines Corporation Approach to monitor application states for self-managing systems

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5546502A (en) * 1993-03-19 1996-08-13 Ricoh Company, Ltd. Automatic invocation of computational resources without user intervention
US6633861B2 (en) * 1993-03-19 2003-10-14 Ricoh Company Limited Automatic invocation of computational resources without user intervention across a network
US6343275B1 (en) * 1997-12-22 2002-01-29 Charles Wong Integrated business-to-business web commerce and business automation system
US6363384B1 (en) * 1999-06-29 2002-03-26 Wandel & Goltermann Technologies, Inc. Expert system process flow
US6662355B1 (en) * 1999-08-11 2003-12-09 International Business Machines Corporation Method and system for specifying and implementing automation of business processes
US20030212583A1 (en) * 2001-07-25 2003-11-13 Perras Francis A. Automated tool set for improving operations in an ecommerce business
US20030065543A1 (en) * 2001-09-28 2003-04-03 Anderson Arthur Allan Expert systems and methods
US20040059966A1 (en) * 2002-09-20 2004-03-25 International Business Machines Corporation Adaptive problem determination and recovery in a computer system

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11042884B2 (en) * 2004-05-25 2021-06-22 International Business Machines Corporation Method and apparatus for using meta-rules to support dynamic rule-based business systems
US20080235079A1 (en) * 2004-07-28 2008-09-25 International Business Machines Corporation Method, Apparatus, and Program for Implementing an Automation Computing Evaluation Scale to Generate Recommendations
US8019640B2 (en) * 2004-07-28 2011-09-13 International Business Machines Corporation Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations
US20060271660A1 (en) * 2005-05-26 2006-11-30 Bea Systems, Inc. Service oriented architecture implementation planning
US20070038536A1 (en) * 2005-08-11 2007-02-15 Accenture Global Services Gmbh Finance diagnostic tool
US8719076B2 (en) * 2005-08-11 2014-05-06 Accenture Global Services Limited Finance diagnostic tool
US20070078831A1 (en) * 2005-09-30 2007-04-05 Accenture Global Services Gmbh Enterprise performance management tool
US8873733B1 (en) 2007-06-08 2014-10-28 West Corporation Real-time feedback of survey results
WO2009011916A1 (en) * 2007-07-19 2009-01-22 Depalma Mark S Systems and methods for accumulating accreditation
WO2009011925A1 (en) * 2007-07-19 2009-01-22 Depalma Mark S Systems and methods for accumulating accreditation
US20100217718A1 (en) * 2007-07-19 2010-08-26 Depalma Mark S Systems and methods for accumulating accreditation
US8341005B2 (en) * 2007-11-01 2012-12-25 International Business Machines Corporation System and method for evolving processes in workflow automation
US20090144120A1 (en) * 2007-11-01 2009-06-04 Ramachandran P G System and Method for Evolving Processes In Workflow Automation
US20130144678A1 (en) * 2007-11-01 2013-06-06 International Business Machines Corporation System and Method for Evolving Processes In Workflow Automation
US20120209655A1 (en) * 2007-11-01 2012-08-16 Ramachandran P G System and Method for Evolving Processes In Workflow Automation
US8301480B2 (en) * 2007-11-01 2012-10-30 International Business Machines Corporation Automatically evolving processes in workflow automation
US10755209B2 (en) * 2007-11-01 2020-08-25 International Business Machines Corporation Automatically evolving processes in workflow automation systems
US20120116848A1 (en) * 2010-11-10 2012-05-10 International Business Machines Corporation Optimizing business operational environments
US11798660B2 (en) 2011-11-29 2023-10-24 Eresearch Technology, Inc. Methods and systems for data analysis
US11367512B2 (en) * 2011-11-29 2022-06-21 Eresearchtechnology, Inc. Methods and systems for data analysis
US20150066804A1 (en) * 2013-08-28 2015-03-05 Konica Minolta Laboratory U.S.A., Inc. Method and system for generating a customer survey for an image forming apparatus
US20170024668A1 (en) * 2015-07-23 2017-01-26 Charlene G. Aldridge Automated Assessment and Solution Methodology
CN109426987A (en) * 2017-09-05 2019-03-05 本田技研工业株式会社 Evaluating apparatus, evaluation method, noise elimination apparatus and program storage medium
US11132699B2 (en) 2017-09-05 2021-09-28 Honda Motor Co., Ltd. Apparatuses, method, and computer program for acquiring and evaluating information and noise removal
US10831448B2 (en) * 2018-08-30 2020-11-10 Accenture Global Solutions Limited Automated process analysis and automation implementation
US20200073639A1 (en) * 2018-08-30 2020-03-05 Accenture Global Solutions Limited Automated process analysis and automation implementation

Also Published As

Publication number Publication date
US20080235079A1 (en) 2008-09-25
US8019640B2 (en) 2011-09-13

Similar Documents

Publication Publication Date Title
US8019640B2 (en) Method, apparatus, and program for implementing an automation computing evaluation scale to generate recommendations
US11030669B1 (en) Best practice analysis, optimized resource use
US11676087B2 (en) Systems and methods for vulnerability assessment and remedy identification
US9197502B1 (en) Best practice analysis, migration advisor
US8539589B2 (en) Adaptive configuration management system
US20140129389A1 (en) Cloud solutions for organizations
US11521143B2 (en) Supply chain disruption advisor
CN112016796B (en) Comprehensive risk score request processing method and device and electronic equipment
US20200090088A1 (en) Enterprise health control processor engine
US10885477B2 (en) Data processing for role assessment and course recommendation
McKinnie Cloud computing: TOE adoption factors by service model in manufacturing
US20200134568A1 (en) Cognitive assessment recommendation and evaluation
US20220207414A1 (en) System performance optimization
Ma et al. Collaborative optimization of service composition for data-intensive applications in a hybrid cloud
US20190333083A1 (en) Systems and methods for quantitative assessment of user experience (ux) of a digital product
Ravi et al. Analytics in/for cloud-an interdependence: A review
US20230117225A1 (en) Automated workflow analysis and solution implementation
US20200410387A1 (en) Minimizing Risk Using Machine Learning Techniques
US20220207443A1 (en) Local agent system for obtaining hardware monitoring and risk information
US11601347B2 (en) Identification of incident required resolution time
Ibrahim Assessing cloud computing adoption by IT professionals in small business using the technology acceptance model
Nanath et al. Individual and organizational factors affecting the implementation of Green IT: a case study of an Indian business school
Muralidharan et al. Risk analysis of cloud service providers by analyzing the frequency of occurrence of problems using E-Eclat algorithm
Kyriakou et al. Enterprise Systems, ICT Capabilities and Business Analytics Adoption–An Empirical Investigation
CN114357056A (en) Detection of associations between data sets

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BAREL, MILES A.;CARTER, SANDRA;CROSSKEY, JAMES P.;AND OTHERS;REEL/FRAME:015943/0462;SIGNING DATES FROM 20040728 TO 20050322

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION