|Publication number||US7873715 B1|
|Application number||US 10/740,162|
|Publication date||Jan 18, 2011|
|Filing date||Dec 18, 2003|
|Priority date||Dec 18, 2003|
|Original Assignee||Precise Software Solutions, Inc.|
1. Field of the Invention
This invention is related to the field of application performance management and, more particularly, to the instrumentation of web pages for collecting performance metrics.
2. Description of the Related Art
In the information technology (IT) departments of modern organizations, one of the biggest challenges is meeting the increasingly demanding service levels required by users. With more and more applications directly accessible to customers via automated interfaces such as the world wide web, “normal” business hours for many enterprises are now 24 hours a day, 7 days a week. The need for continuous availability and performance of applications has created complex, tiered IT infrastructures which often include web servers, middleware, networking, database, and storage components. These components may be from different vendors and may reside on different computing platforms. A problem with any of these components can impact the performance of applications throughout the enterprise.
The performance of key applications is a function of how well the infrastructure components work in concert with each other to deliver services. With the growing complexity of heterogeneous IT environments, however, the source of performance problems is often unclear. Consequently, application performance problems can be difficult to detect and correct. Furthermore, tracking application performance manually can be an expensive and labor-intensive task. Therefore, it is usually desirable that application performance management tasks be automated.
Automated tools for application performance management may assist in providing a consistently high level of performance and availability. These automated tools may result in lower costs per transaction while maximizing and leveraging the resources that have already been spent on the application delivery infrastructure. Automated tools for application performance management may give finer control of applications to IT departments. Application performance management tools may enable IT departments to be proactive and fix application performance issues before the issues affect users.
Historical performance data collected by these tools can be used for reports, trending analyses, and capacity planning. By correlating this collected information across application tiers, application performance management tools may provide actionable advice to help IT departments solve current and potential problems.
In a real-world environment, the performance of applications may be highly variable due to normal variations in resource usage over time. Furthermore, requirements such as user needs, usage patterns, customization requirements, system components, architectures, and platform environments may vary from business unit to business unit. These variations may also cause application performance to be highly variable. Tuning applications to work together efficiently and effectively in their unique environments can be crucial to reaching organizational goals and maintaining competitive advantages. Automated tools for application performance management can assist in these tuning operations.
Various embodiments of a system and method described herein may provide optimized instrumentation of web pages in a performance management system. In one embodiment, a method of performance management for web browsing comprises a web server receiving a request for a web page from a web client. A callout to a performance management agent is inserted into the requested web page. The web page, including the callout to the agent, is sent to the web client. The web client may use the callout to load the agent by sending a request for the agent to a collector server. The agent may collect performance metrics on the web client and send the performance metrics to the collector server for storage and/or analysis.
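The callout-insertion step described above can be illustrated with a minimal sketch. The function name, the collector URL, and the injection marker are all illustrative assumptions, not details taken from the patent:

```python
# Sketch of the callout-insertion step: the server injects a small callout
# tag into outgoing HTML, and the client later fetches the full agent from
# the collector server. COLLECTOR_URL and inject_callout are hypothetical.

COLLECTOR_URL = "http://collector.example.com/agent.js"  # assumed collector address
CALLOUT = f'<script src="{COLLECTOR_URL}"></script>'

def inject_callout(html: str) -> str:
    """Insert the callout tag just before </body> in the requested page."""
    marker = "</body>"
    if marker in html:
        return html.replace(marker, CALLOUT + marker, 1)
    return html + CALLOUT  # fallback: append if no closing body tag is found

page = "<html><body><h1>Hello</h1></body></html>"
print(inject_callout(page))
```

Because only the short callout tag is inserted, each page grows by a fixed, small amount regardless of the agent's size.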
While the invention is described herein by way of example for several embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the present invention as defined by the appended claims. As used throughout this application, the word “may” is used in a permissive sense (i.e., meaning “having the potential to”), rather than the mandatory sense (i.e., meaning “must”). Similarly, the words “include,” “including,” and “includes” mean “including, but not limited to.”
A performance management system may include one or more software programs for application performance management. By continuously monitoring key components and/or applications of computer systems, the performance management system may act to detect and correct performance problems among applications and other system components in a complex computing environment. The performance management system may provide performance management in a variety of stages, such as: identification of symptoms that could indicate a performance problem, identification of sources or locations of problems, discovery of root causes of problems, recommendation of measures to address the root causes and improve performance, and verification that the measures have achieved desired goals. By defining baselines for “normal” application behavior, the performance management system may automatically detect degradation based on those established norms.
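One way to realize the baseline-driven degradation detection described above is a simple statistical threshold. This is an illustrative sketch, not the patent's algorithm: it flags a sample as degraded when it exceeds the historical mean by a chosen number of standard deviations:

```python
# Illustrative baseline check (an assumption, not the patent's method):
# a new measurement is "degraded" if it exceeds mean + k * stdev of history.
import statistics

def detect_degradation(history, sample, k=3.0):
    """Return True if `sample` deviates from the established baseline."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return sample > mean + k * stdev

baseline = [110, 95, 102, 98, 105, 101, 99, 103]   # response times in ms
print(detect_degradation(baseline, 100))  # within normal behavior -> False
print(detect_degradation(baseline, 400))  # well outside the baseline -> True
```

In practice a performance management system would maintain per-metric, per-tier baselines and account for time-of-day patterns; the fixed threshold here is only for illustration.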
In one embodiment, the performance management system may be implemented in a variety of versions, each of which is customized for management of a particular class of target software: e.g., various products from PeopleSoft, Inc.; Oracle® database management software and related applications; web-based applications; SAP®; various products from Siebel Systems, Inc.; ClarifyCRM™; J2EE™; and other suitable targets. Furthermore, each of the versions may be implemented on one or more computing platforms (e.g., Solaris running on Sun Microsystems™ hardware, or a Microsoft Windows® OS running on Intel-based hardware). As used herein, the term “performance management system” is intended to include all of these disparate, customized software programs.
In one embodiment, the measurement component 102 uses agent software to capture performance metrics on servers running target applications. The measurement component 102 may provide a “breadth-wise” view of performance across multiple technology tiers (e.g., web clients, web servers, networks, application servers, database servers, storage servers, etc.). The measurement component 102 may measure, for example, end-to-end response times from the viewpoint of a user. The measurement component 102 may measure segmented response times from tier to tier and may therefore indicate the location of performance problems in a particular tier.
In one embodiment, a “base” version of the measurement component 102 may provide monitoring of a limited set of targets (e.g., TCP/IP-based applications). The functionality of the measurement component 102 may be augmented with optional agent modules that are customized to gather and analyze data for particular targets (e.g., web clients, web servers, networks, application servers, database servers, storage servers, etc.). For purposes of illustration and example, three agent modules 104 a, 106 a, and 108 a are shown. Other combinations of agent modules may be used in other configurations.
In one embodiment, the discovery component 112 provides identification and resolution of root causes of performance degradation. By permitting a user to “drill down” through various tiers of hardware and software (e.g., individual servers), the discovery component 112 may provide a “depth-wise” view of performance within each of the tiers that a target application crosses. The discovery component 112 may further indicate steps to be taken to fix current problems or avoid future problems.
In one embodiment, the console component 120 includes a “watchdog” layer that communicates key performance indicators, such as exceptions to service level agreements (SLAs), to appropriate users at appropriate times. The console component 120 may include functionality 122 for establishing SLAs and other thresholds. The console component 120 may include functionality 124 for reporting and charting. The console component 120 may include functionality 126 for providing alerts. Therefore, the console component 120 may function as a management console for user interaction with the measurement component 102 and discovery component 112.
In one embodiment, the performance warehouse 110 includes a repository of performance metrics which are accessible to the other components in the performance management system 100. For example, the historical data in the performance warehouse 110 may be used by the other components to provide short- and long-term analysis in varying degrees of detail.
The performance management system 100 of
In various configurations, the computer system 200 may include devices and components such as a keyboard & mouse 250, a SCSI interface 252, a network interface 254, a graphics & display device 256, a hard disk 258, and/or a CD-ROM 260, all of which are coupled to the processor 210 by a communications bus 207. The network interface 254 may provide a communications link to one or more other computer systems via a LAN (local area network), WAN (wide area network), internet, intranet, or other appropriate networks. It will be apparent to those having ordinary skill in the art that the computer system 200 can also include numerous elements not shown in the figure, such as additional storage devices, communications devices, input devices, and output devices, as illustrated by the ellipsis.
The agent 304 may comprise software which is executable on the client computer system 402. In one embodiment, the agent may comprise instructions which are expressed in a scripting language (e.g., an applet). The instructions may be executable by a web client 402 which is configured to interpret and/or execute the scripting language used for the agent 304. The web client 402 may comprise a web browser or other web-enabled application that is operable to exchange data with a web server 410.
The metrics 426 collected by the agent 304 may include measurements of web-related performance between the web client 402 and a web server 410. For example, the metrics 426 may include transmission times or response times for data sent between the web client 402 and web server 410. In particular, the metrics 426 may include the first byte time, i.e., the time from the client's request for a new web page until the web page source (e.g., the HTML document) arrives at the client from the web server 410. The metrics 426 may include the rendering time, i.e., the time from receipt of the web page source until all the content of the page (e.g., images and other additional content in the page) is completely downloaded by the client 402. The metrics 426 may include the overall wait time, which comprises both the first byte time and the rendering time. The metrics 426 may include the page views, i.e., the number of times a particular web page was requested. The metrics 426 may include an abandonment ratio, i.e., the ratio of requests in which the user left the web page before it was completely rendered or displayed. The metrics 426 may include an SLA compliance ratio, i.e., a ratio of requests falling within defined SLA thresholds. By “drilling down” into the metrics 426, performance problems may be correlated to particular web pages, particular types of web transactions, particular geographical locations, particular logical domains, particular connection types (e.g., LAN or dial-up), particular file types (e.g., .html or .asp), or particular protocols (e.g., HTTP or HTTPS).
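The metric definitions above can be sketched directly in code. The record fields and function names below are assumptions chosen for illustration; the arithmetic follows the definitions in the text (overall wait = first byte time + rendering time, abandonment and SLA compliance as ratios over page views):

```python
# Minimal sketch of the metrics described above; field names are assumptions.
from dataclasses import dataclass

@dataclass
class PageLoad:
    first_byte_time: float   # request sent -> HTML source arrives (seconds)
    rendering_time: float    # HTML arrives -> all page content downloaded
    abandoned: bool          # user left before the page finished rendering

def overall_wait(load: PageLoad) -> float:
    # Overall wait time comprises the first byte time and the rendering time.
    return load.first_byte_time + load.rendering_time

def summarize(loads, sla_threshold: float):
    """Return (page views, abandonment ratio, SLA compliance ratio)."""
    views = len(loads)
    abandonment_ratio = sum(l.abandoned for l in loads) / views
    sla_compliance = sum(overall_wait(l) <= sla_threshold for l in loads) / views
    return views, abandonment_ratio, sla_compliance

loads = [PageLoad(0.2, 1.0, False), PageLoad(0.5, 3.0, True),
         PageLoad(0.3, 1.2, False), PageLoad(0.1, 0.8, False)]
print(summarize(loads, sla_threshold=2.0))  # -> (4, 0.25, 0.75)
```

Grouping the same records by page URL, geography, or protocol would support the "drill down" correlation the text describes.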
In one embodiment, the callout to the agent is substantially smaller in size than the agent itself. By embedding the callout to the agent, rather than the agent itself, into web pages, the overhead of using the agent may be reduced. Instead of loading the agent 304 with each and every web page requested from the server 410, the client 402 may load the agent 304 once and store the agent 304 in a cache.
The use of the callout to the agent may also improve the modularity of the performance management system. Periodically, the agent 304 may be updated. Instead of updating every web page on the web server 410 with a new version of the agent 304, the agent may be updated only on the collector server 310. The callout in the web pages is usable to fetch the latest version of the agent 304 from the collector server 310. In one embodiment, loading the agent 304 at the client 402 may therefore comprise automatically updating the agent 304 at the client 402.
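The centralized-update property above follows from standard HTTP caching: if the collector server serves one copy of the agent with validation headers, clients cache it and revalidate, so replacing the agent on the collector updates every page that carries the callout. The sketch below is a hedged illustration of that mechanism (the header values and response shape are assumptions, not details from the patent):

```python
# Hypothetical collector-side handler for agent requests, sketched as a
# plain function returning (status, headers, body). Serving the agent with
# an ETag means a client with a current cached copy gets 304 Not Modified,
# while an updated agent on the collector produces a new ETag and a fresh 200.
import hashlib

def agent_response(agent_source: bytes, if_none_match=None):
    etag = '"%s"' % hashlib.sha256(agent_source).hexdigest()[:16]
    if if_none_match == etag:
        return 304, {"ETag": etag}, b""          # cached agent is still current
    headers = {"ETag": etag,
               "Cache-Control": "max-age=3600",  # cache, revalidate hourly
               "Content-Type": "application/javascript"}
    return 200, headers, agent_source

status, headers, body = agent_response(b"/* agent v1 */")
status2, _, _ = agent_response(b"/* agent v1 */", if_none_match=headers["ETag"])
print(status, status2)  # first fetch then revalidation -> 200 304
```

With this arrangement, loading the agent at the client amounts to an automatic update whenever the collector's copy changes.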
It is further noted that any of the embodiments described above may further include receiving, sending or storing instructions and/or data that implement the operations described above in conjunction with
Although the embodiments above have been described in considerable detail, numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5450586||Apr 30, 1992||Sep 12, 1995||Hewlett-Packard Company||System for analyzing and debugging embedded software through dynamic and interactive use of code markers|
|US5732218 *||Jan 2, 1997||Mar 24, 1998||Lucent Technologies Inc.||Management-data-gathering system for gathering on clients and servers data regarding interactions between the servers, the clients, and users of the clients during real use of a network of clients and servers|
|US5996010 *||Jul 28, 1997||Nov 30, 1999||Nortel Networks Corporation||Method of performing a network management transaction using a web-capable agent|
|US6049827 *||Feb 20, 1998||Apr 11, 2000||Hitachi, Ltd.||Network management tool for causing network equipment to display information of a network relevant to the network equipment|
|US6070190 *||May 11, 1998||May 30, 2000||International Business Machines Corporation||Client-based application availability and response monitoring and reporting for distributed computing environments|
|US6081518 *||Jun 2, 1999||Jun 27, 2000||Andersen Consulting||System, method and article of manufacture for cross-location registration in a communication system architecture|
|US6148335 *||Nov 25, 1997||Nov 14, 2000||International Business Machines Corporation||Performance/capacity management framework over many servers|
|US6167448||Jun 11, 1998||Dec 26, 2000||Compaq Computer Corporation||Management event notification system using event notification messages written using a markup language|
|US6317788 *||Oct 30, 1998||Nov 13, 2001||Hewlett-Packard Company||Robot policies for monitoring availability and response of network performance as seen from user perspective|
|US6321263 *||May 11, 1998||Nov 20, 2001||International Business Machines Corporation||Client-based application availability|
|US6697849 *||May 1, 2000||Feb 24, 2004||Sun Microsystems, Inc.||System and method for caching JavaServer Pages™ responses|
|US6701363||Feb 29, 2000||Mar 2, 2004||International Business Machines Corporation||Method, computer program product, and system for deriving web transaction performance metrics|
|US6760903||Aug 22, 2000||Jul 6, 2004||Compuware Corporation||Coordinated application monitoring in a distributed computing environment|
|US6792459||Dec 14, 2000||Sep 14, 2004||International Business Machines Corporation||Verification of service level agreement contracts in a client server environment|
|US6826606 *||Jan 23, 2001||Nov 30, 2004||Citrix Systems, Inc.||Method and apparatus for communicating among a network of servers|
|US6850252 *||Oct 5, 2000||Feb 1, 2005||Steven M. Hoffberg||Intelligent electronic appliance system and method|
|US6857119||Sep 25, 2001||Feb 15, 2005||Oracle International Corporation||Techniques for modifying a compiled application|
|1||"Turbo-Charging Dynamic Web Sites with Akamai EdgeSuite," 2001.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8266281 *||Apr 22, 2010||Sep 11, 2012||Imdb.Com, Inc.||Collecting client-side performance metrics and latencies|
|US8433750||Sep 10, 2012||Apr 30, 2013||Imdb.Com, Inc.||Collecting client-side performance metrics and latencies|
|US8543907 *||Oct 16, 2009||Sep 24, 2013||Google Inc.||Context-sensitive optimization level selection|
|US9134978 *||Jul 2, 2013||Sep 15, 2015||Google Inc.||Context-sensitive optimization level selection|
|U.S. Classification||709/223, 709/219, 707/634, 709/224|
|Dec 18, 2003||AS||Assignment|
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HABER, LIOR;REEL/FRAME:014832/0668
Owner name: VERITAS OPERATING CORPORATION, CALIFORNIA
Effective date: 20031218
|Jun 27, 2005||AS||Assignment|
Owner name: PRECISE SOFTWARE SOLUTIONS LTD., ISRAEL
Effective date: 20050622
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VERITAS OPERATING CORPORATION;REEL/FRAME:016722/0251
|Mar 15, 2012||AS||Assignment|
Owner name: SILICON VALLEY BANK, CALIFORNIA
Effective date: 20120313
Free format text: SECURITY AGREEMENT;ASSIGNOR:PRECISE SOFTWARE SOLUTIONS, INC.;REEL/FRAME:027883/0839
|Jun 18, 2014||FPAY||Fee payment|
Year of fee payment: 4
|Aug 20, 2014||AS||Assignment|
Owner name: PRECISE SOFTWARE SOLUTIONS, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PRECISE SOFTWARE SOLUTIONS, LTD.;REEL/FRAME:033574/0052
Effective date: 20140820
|Aug 21, 2014||AS||Assignment|
Owner name: PRECISE SOFTWARE SOLUTIONS, INC., CALIFORNIA
Free format text: RELEASE;ASSIGNOR:SILICON VALLEY BANK;REEL/FRAME:033597/0801
Effective date: 20140820
Owner name: SILICON VALLEY BANK, CALIFORNIA
Free format text: SECURITY INTEREST;ASSIGNOR:PRECISE SOFTWARE SOLUTIONS, INC.;REEL/FRAME:033626/0297
Effective date: 20120313
|Sep 8, 2014||AS||Assignment|
Owner name: COMERICA BANK, AS AGENT, MICHIGAN
Effective date: 20140905
Free format text: SECURITY INTEREST;ASSIGNORS:IDERA, INC.;PRECISE SOFTWARE SOLUTIONS, INC.;COPPEREGG CORPORATION;REEL/FRAME:033696/0004
|Nov 25, 2014||AS||Assignment|
Free format text: SECURITY INTEREST;ASSIGNORS:IDERA, INC.;PRECISE SOFTWARE SOLUTIONS, INC.;COPPEREGG CORPORATION;REEL/FRAME:034260/0360
Owner name: FIFTH STREET MANAGEMENT LLC, AS AGENT, CONNECTICUT
Effective date: 20141105