US 20030097320 A1
A system and method for detecting and analyzing information units, and particularly for the analysis and evaluation of data records by selecting particular aspects of the data record, are disclosed. The method comprises creating a complexity catalog based on an information unit and establishing at least one score unit based on the at least one complexity catalog. The system comprises an infrastructure server device to create at least one complexity catalog, a complexity catalog to hold at least one list of ordered complexity values associated with partitioned sub-unit blocks, an application server to build at least one information summary unit based on an information unit and on an associated complexity catalog, and a scoring component to provide scores.
1. In a computing environment accommodating at least one input device connectable to at least one server device connectable to at least one output device, a method of processing at least one information unit introduced by the at least one input device by the at least one server device to create at least one information score based on the at least one information unit, the method comprising the steps of:
creating at least one complexity catalog based on the at least one information unit; and
establishing at least one score unit based on the at least one complexity catalog.
2. The method of claim 1, further comprising the steps of:
obtaining at least one information unit from the at least one input device by the at least one server device; and
displaying the at least one score unit.
3. The method of
4. In a computing environment accommodating at least one input device connected to at least one server device having at least one output device, a system for processing at least one information unit introduced via the at least one input device by the at least one server device to create at least one information score based on the at least one information unit, the system comprising the elements of:
an infrastructure server device to create at least one complexity catalog; and
a complexity catalog to hold at least one list of ordered complexity values associated with partitioned sub-unit blocks; and
an application server to build at least one information summary unit based on the at least one information unit and on at least one associated complexity catalog; and
a scoring component to provide scores.
 A novel method and system for the analysis and evaluation of information units is disclosed. The information analysis and evaluation system (IAES) provides its user with information regarding the input data of the IAES. The incoming raw data is processed, analyzed and evaluated using the complexity system and method introduced in the pending PCT application PCT/IL01/01074.
 The use of the innovative complexity system and method for analyzing and evaluating data within the present invention provides the option of acquiring accurate information rapidly. The present invention regards the analysis and evaluation of data within a computerized environment. According to the present invention, the IAES can derive the required information from database records, text and graphics that are inserted into the IAES either on-line or off-line. The input data units received by the IAES are processed by a set of specifically developed computer programs, which read the data units and divide the data records into fragments or blocks known to the IAES. The division of the data records by the routines is performed in accordance with predetermined parameters associated with the format and the content of the data record collection. The fragments and blocks have substantially identical dimensions. Each of the dimensionally substantially identical record fragments is assigned an arbitrarily predetermined complexity value by a set of specifically developed computer programs that calculate the complexity value of the fragments in association with predetermined processing parameters. The division of the related data records into multiple fragments having identical dimensions, the assignment of the complexity value to the fragments, and the subsequent organization of the data fragments provide the option of creating meaningful data segmentation groups and detecting characteristic groups that provide conclusive information regarding the input information. The complexity value calculation requires no a-priori knowledge of the diverse input data received by the IAES. However, a minimum of indicative input regarding the type and format of the input data is required.
For instance, when the IAES is operative as a supervising system for a CCTSO, as illustrated in the preferred embodiment of the present invention, an indication is made to the IAES that the input data has a particular number of fields, which field types are present, and the like. The complexity values provided by the IAES are processed and organized in accordance with predetermined doctrines, such that the required information is provided most accurately.
 In the preferred embodiment of the present invention the IAES is supplied with data relating to transactions initiated and performed by credit card holders. Accordingly, the IAES operates as the supervising system for a CCTSO. The IAES can provide an indication of whether a transaction is fraudulent and consequently can provide an alert, a warning or a set of operating instructions following the analysis. Furthermore, the IAES can detect characteristic information regarding a particular credit card holder or a group of credit card holders. The complexity value can be calculated for each field of the input data or for any combination of one or more fields within the input data. The learning and scoring components provide the required output of the IAES.
 The complexity calculation method and system is described in detail in the pending PCT application PCT/IL01/01074, which is incorporated herein by reference. The system environment in which the preferred embodiment of the present invention could operate is illustrated in FIG. 1. Users 10, 12 and 14 are linked communicatively to a data communication network (DCN) 20. The users 10, 12 and 14 utilize computing and communicating devices to transmit transaction records via the DCN 20. In accordance with the preferred embodiment of the present invention, users 10, 12 and 14 transmit credit card transaction records to the IAES 18 through the DCN 20. The users 10, 12 and 14 may be individuals initiating and performing a credit card transaction or businesses, such as commercial retail outlets or agents providing credit card transaction transmission services. The transaction server 16 represents credit card companies and clearing houses. The transaction server 16 contains databases including information concerning the associated credit card holders, transaction histories of credit card holders, and the like. The DCN 20 links the transaction server 16 and the IAES 18. The transaction server 16 and the IAES 18 can be located at identical or adjacent locations and thereby could be provided with the option of being connected physically. The DCN 20 can be the Internet, a LAN, a WAN, a satellite communications network or any other communications network. The DCN 20 is typically a standard telephone network (POTS) that enables communication via ordinary telephone lines. Users 10, 12 and 14 typically use a dedicated telephone line or a dial-up connection for transmitting information via the POTS. The general structure of the IAES 18 is described below in association with FIG. 2.
 Referring now to FIG. 2, the IAES 18 includes an input device 56, a communication device 54, an output device 58 and an analysis and evaluation server platform 22. The input device 56 can be a pointing device, a keyboard device or the like. The output device 58 can be a printer, a screen display or the like. The communication device 54 can be a modem, a network interface card or any other suitable communication device providing transmission and reception of data via the DCN 20 of FIG. 1. The analysis and evaluation server platform 22 within the preferred embodiment of the present invention includes a processor device 24 and a memory device 26. The processor device 24 is the logic unit designed to perform arithmetic and logic operations by responding to and processing the basic instructions driving the computing device. The processor device 24 can be one of the Intel Pentium series, the PowerPC series, the K6 series, the Celeron, the Athlon, the Duron, the Alpha, or the like. The memory device 26 includes a reference transaction database 28, an operating system 30, a control database 32, a complexity database catalog 36 and an application server 38. The reference transaction database 28 includes a list of the credit card holders, personal information regarding credit card holders, history files containing the transactions performed by credit card holders, history files containing fraudulent transactions and other relevant information related to credit card holders and agents. The reference transaction database 28 can be located within the IAES 18 as illustrated in FIG. 2, within a transaction server 16 as illustrated in FIG. 1 or in any other separate location. The operating system 30 is responsible for managing the operation of the entire set of software programs implemented in the operation of the IAES 18.
The operating system 30 can be any known operating system, such as Windows NT, Windows XP, UNIX, Linux, VMS, OS/400, AIX, OS X and the like. The complexity database catalog 36 includes all the complexity values assigned to the records processed by the complexity engine 52. The complexity values stored within the complexity database catalog 36 are further discussed hereinunder in association with the description of the complexity engine 52. The control database 32 controls the input data received by the input device 56 and the transfer thereof to the application server 38. The control database 32 also directs the movement of the data from the reference transaction database 28 to the application server 38 and from the application server 38 to the complexity database catalog 36. The application server 38 within the preferred embodiment includes a complexity catalog handler 40, a scoring component 42, a learning component 44, a database handler 46, a resource allocation component 48, a user interface component 50 and a complexity engine 52. The complexity catalog handler 40 is responsible for obtaining the appropriate complexity metrics records created by the application server 38 from the complexity database catalog 36. The resource allocation component 48 is responsible for allocating variable resources to the processing of the separate records in accordance with the complexity metrics thereof. The user interface component 50 is a set of specifically designed and developed front-end programs. The component 50 allows the user of the system to interact dynamically with the system by performing a set of predefined procedures operative to the running of the method. Via the component 50 the user could select an application, such as the application selected for the CCTSO supervision purposes, activate the selected application, adjust specific processing parameters, select sets of records for processing according to the complexity metrics thereof, and the like.
The component 50 could be developed as a plug-in to any of the known user interfaces. The component 50 will preferably be a Graphical User Interface (GUI), but any other manner of interfacing with the user could be used, such as a command-driven interface, a menu-driven interface or the like. The database handler 46 receives the input data records from the control database 32 and provides the records to the complexity catalog handler 40. The database handler 46 further receives the complexity values and scores assigned to data records from the complexity catalog handler 40 and provides them to the control database 32, which in turn provides the complexity database catalog 36 and the reference transaction database 28 with the complexity values and scores regarding the data records. The learning component 44 provides a mechanism for matching a given input, such as the complexity vectors for each transaction, to a given output, such as a fraudulent indication. The learning component 44 provides the scoring component 42 with different scores that are then processed within the scoring component 42. The complexity engine 52 provides complexity values to data records received from the control database 32 within the application server 38 and handled by the database handler 46.
 For purposes of clarity the drawing under discussion includes only a single analysis and evaluation server platform 22, and the entire set of software routines is shown thereon as co-located on the single platform 22. In a realistic system configuration several platforms could be used for solving practical problems, such as activating load-balancing techniques for the enhancement of system performance and the like. Furthermore, in a realistic system the analysis and evaluation server platform 22 will include additional hardware elements and software components in order to support the system and method proposed by the present invention or any other non-related applications implemented on the platform 22.
 The stream of data records processed within the application server will be better understood in view of FIG. 3. FIG. 3 illustrates a flowchart including a database handler 60, a preprocessing component 62, a learning component 64, a scoring component 66 and an output generator 68. The stream represents the stages that provide the final fraud detection product. At the first stage, the preprocessing component 62 operates a preprocess module that processes the data records received from the external reference transaction database 28 (FIG. 2) and the new transactions received within the IAES from the input device 56 (FIG. 2). The preprocessing component 62 uses the internal database received from the complexity database catalog 36 and the complexity engine 52 to calculate the complexity vector value for each transaction. Each transaction processed, whether drawn from the reference transaction database 28 (FIG. 2) or a new transaction from the input device 56, is inserted as an input to the complexity engine 52 for processing. The database handler 46 provides the input data to the complexity engine 52. The complexity value is calculated within the complexity engine 52. The method used for the complexity calculation is the “Multiple Single Dimension” complexity calculation. The method and system of the operation of the complexity engine are further explained within the pending PCT application PCT/IL01/01074, which is incorporated herein by reference.
 The “Multiple Single Dimension” complexity calculation method is used within the description of the preferred embodiment of the present invention. According to the Multiple Single Dimension method, multiple feeds, each of a single dimension, are fed to the complexity engine 52, which performs the complexity calculation. At first, the data containing “n” bytes composed of “f” feeds, each of a single dimension, is segmented into “m” blocks, each block having “f” feeds of RF bytes each, where RF is the reading frame (i.e. n=m*f*RF bytes). For each block, a complexity calculation is made and the complexity metric (i.e. a complexity parameter for each block) is stored in a complexity file. The complexity calculation for each block is as follows: The first stage includes the determination of a word size list (WS), a feed number list (FN) and a letter parameter. The second stage includes the modification of each block according to the letter parameter. Accordingly, the maximum number of different words is calculated, as either the maximal words possible given the letter parameter, FN and WS (max1) or the maximal possible words in the given block (max2). The number of actual different words is then calculated, each word being described by WS*FN bytes. Finally, the ratio between the actual words and the maximal words is calculated. The result is multiplied by a scaling factor to give the final complexity parameter. The complexity value given by the complexity engine 52 is stored within the complexity database catalog 36.
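The per-block ratio calculation described above can be sketched as follows. This is a minimal illustration only: the function name, the collapsing of the word size list and letter parameter into single scalar parameters, and the omission of the letter-parameter modification and final multiplication are assumptions here; the exact algorithm is the one defined in PCT/IL01/01074.

```python
def block_complexity(block: bytes, word_size: int, alphabet_size: int) -> float:
    """Ratio of distinct words actually observed in the block to the
    maximum number of distinct words possible (simplified sketch)."""
    # Split the block into non-overlapping words of word_size bytes each.
    words = [bytes(block[i:i + word_size])
             for i in range(0, len(block) - word_size + 1, word_size)]
    actual = len(set(words))
    # Maximum distinct words: bounded either by the alphabet raised to the
    # word size (max1) or by the number of words the block holds (max2).
    max1 = alphabet_size ** word_size
    max2 = len(words)
    maximal = min(max1, max2)
    return actual / maximal if maximal else 0.0
```

A highly repetitive block thus scores low, while a block in which every word is distinct scores 1.0, before any final scaling.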
 In one particular example, one transaction record is inserted into the complexity engine 52 and a complexity calculation is performed for one of its fields or for any combination of fields. The next stage within the preprocessing component 62 is the operation of an access module that provides the complexity engine 52 with a predetermined number of similar last transactions according to the resource allocation component 48. The complexity engine 52 calculates the complexity values of the fields of the additional transactions. The output vector of said complexity values is the result of the first step of the preprocessing component 62. The second step of the preprocessing component 62 is the calculation of the average and standard deviation of the complexity of each parallel field within the transactions. The calculation of the average and standard deviation is performed only for non-fraudulent transactions. Accordingly, the complexity database catalog 36 stores the complexity vector of each record and of the last transactions, as calculated above, for each account of a credit card holder. Each account within the complexity database catalog 36 contains an average and standard deviation of each element of the calculated vector (e.g. average complexity for the amount field, average complexity for the date field, standard-deviation complexity for the amount field, etc.). The learning component 64 is the next stage within the preferred embodiment. However, other preferred embodiments can operate without the learning component 64. The proposed method includes a fraudulent behavior scoring module. The scoring modules used can employ known techniques for matching a given input (e.g. the complexity vectors for each transaction, the average and standard-deviation of the complexity of the account, demographic properties of the account, etc.) to a given output (e.g. Fraud\Not Fraud). These techniques can be neural network methods, linear regression techniques, genetic algorithms, etc.
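The second preprocessing step, computing per-field averages and standard deviations over non-fraudulent transactions only, can be sketched as below. The function name and data layout (one complexity vector plus a fraud flag per transaction) are hypothetical conveniences, not part of the original description.

```python
import statistics

def field_profile(complexity_vectors, fraud_flags):
    """Per-field average and standard deviation of complexity, computed
    over the non-fraudulent transactions only (sketch)."""
    # Keep only non-fraudulent transactions, as the text prescribes.
    clean = [v for v, is_fraud in zip(complexity_vectors, fraud_flags)
             if not is_fraud]
    n_fields = len(clean[0])
    averages = [statistics.mean(v[i] for v in clean) for i in range(n_fields)]
    stdevs = [statistics.pstdev(v[i] for v in clean) for i in range(n_fields)]
    return averages, stdevs
```

The returned pair corresponds to the per-account entries (average complexity per field, standard-deviation complexity per field) stored in the complexity database catalog 36.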
After the execution of the scoring module, the output received is an appropriate profile of the behavior scoring module components (e.g. weights, matrices, thresholds, etc.). The appropriate profile can produce a score based on a new transaction. The next stage is the scoring component 66, which produces the final calculation, the result of which is provided at the output generator 68 and indicates whether a transaction is fraudulent. In the scoring procedure, the integrating module receives the scores from all the scoring modules to produce a single score. The single score is produced by assigning weights to each associated score of the related transactions, combined with the output received from the learning component 64, to produce the final score of the IAES. The scoring component 66 can use an individual scoring module, a group scoring module, a fraud scoring module, a fraudulent behavior scoring module and other modules. The output generator 68 receives the output from the scoring component 66 and processes the result to be presented by the output device 58 to the user of the IAES. The output generator is positioned within the scoring component 42.
 As indicated, the scoring component uses different integrating modules that receive the scores from all the scoring modules previously calculated within the preprocessing component 62 and the learning component 64 to produce a single score.
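A weighted combination of the module scores, as the integrating module performs, might look like the following sketch; the weighting scheme is an assumption, since the text does not specify how the weights are chosen.

```python
def integrate_scores(scores, weights):
    """Combine the scores produced by the individual scoring modules into
    one weighted-average score (weights assumed, not given in the text)."""
    total = sum(w * s for w, s in zip(weights, scores))
    return total / sum(weights)
```

For example, two module scores of 0.2 and 0.8 with weights 1 and 3 combine to a single score of 0.65.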
 The individual scoring module produces a score based on the individual behavior and the new transaction's deviation from it. The new transaction was preprocessed and now has a complexity vector. The absolute value of the deviation of every element of the vector from the preprocessed average of the account is calculated (e.g. ABS((Amount complexity−Amount average complexity)/(Amount complexity standard-deviation)), etc.). This produces a vector of deviation for the new transaction. All the elements are averaged and scaled to produce a score between 0 and 0.999, where 0 denotes average behavior and 0.999 denotes maximal deviation from average behavior.
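The deviation-vector calculation of the individual scoring module can be sketched as follows. The final scaling to the interval [0, 0.999] is an assumed bounded map for illustration, since the text does not define the exact scaling.

```python
def individual_score(vec, avg, std):
    """Score a new transaction's complexity vector against the account's
    per-field average and standard deviation (sketch)."""
    # Absolute z-score style deviation per field; guard against zero std.
    devs = [abs((v - a) / s) if s else 0.0 for v, a, s in zip(vec, avg, std)]
    mean_dev = sum(devs) / len(devs)
    # Bounded map: 0 for average behavior, approaching 0.999 for large
    # deviations (the exact scaling is an assumption).
    return 0.999 * mean_dev / (1.0 + mean_dev)
```

A transaction exactly at the account averages scores 0; a transaction five standard deviations away on its only field scores 0.8325 under this assumed map.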
 The group-scoring module produces a score based on the group behavioral change and its relation to the new transaction. All accounts are segmented into groups (e.g. by demographic properties, average complexity, etc.). For each group, the average complexity is calculated by averaging the average complexity for each field of each account. Thus, for each field, the average of the average complexity of the accounts within the group is calculated, producing a vector. Taking a time interval parameter from the internal database, this vector is calculated for every such interval (e.g. day, week, month, etc.) and the change in the vector is also calculated. By multiplying each element of the vector with the appropriate element of the new transaction's deviation vector (i.e. the same as in the individual scoring module, only without the absolute value), a vector containing the relation between the group's behavioral change and the new transaction's deviation is received. Thus, if the group's change is the same as the transaction deviation (e.g. both increase in complexity or both decrease in complexity) the new vector will have a positive element, while if the group's change is different from the transaction deviation, the new vector will have a negative element. The new vector is averaged and scaled to produce a score between 0 and 0.999, where 0 denotes that the new transaction changed exactly as the group did and 0.999 denotes that the new transaction changed exactly opposite to the group's change.
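The sign logic of the group-scoring module can be sketched as below: agreement between the group's change and the transaction's deviation yields positive products and a low score, opposition yields negative products and a high score. The tanh-based scaling to [0, 0.999] is an assumption; the text only states that the averaged vector is scaled.

```python
import math

def group_score(group_change, deviation):
    """Relate the group's behavioral-change vector to the transaction's
    signed deviation vector (sketch)."""
    # Element-wise product: positive where the transaction moves with the
    # group, negative where it moves against it.
    products = [g * d for g, d in zip(group_change, deviation)]
    mean = sum(products) / len(products)
    # Map agreement toward 0 and opposition toward 0.999; tanh is one
    # bounded choice, assumed here for illustration.
    return 0.999 * (1.0 - math.tanh(mean)) / 2.0
```

Under this assumed map, strong agreement drives the score to 0 and strong opposition drives it to 0.999, matching the endpoints described in the text.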
 The fraud scoring module takes all the fraudulent transactions within the appropriate group and their preprocessed complexity vectors. For each such transaction, a complexity deviation vector is calculated using the account's average and standard deviation. Thus each fraudulent transaction within the group has a complexity deviation vector. These are averaged and each of their elements is multiplied by the corresponding element of the new transaction's deviation vector (without the absolute value). The elements of the new vector are averaged and scaled to produce a score between 0 and 0.999, where 0 denotes that the new transaction behaved exactly the opposite of the fraudulent transactions and 0.999 denotes that the new transaction behaved exactly as the fraudulent transactions.
 The fraudulent behavior scoring module takes the complexity deviation vector of the new transaction, as well as any other input variables needed (e.g. averages, demographic properties, etc.), and uses the learned profile to produce a score between 0 and 0.999, where 0 denotes that the new transaction does not exhibit the fraudulent behavior and 0.999 denotes that the new transaction exactly matches the fraudulent behavior.
 Referring now to FIG. 4, which illustrates the components of the general infrastructure and manner of operation of the complexity engine 52 (FIG. 2) that was explained within PCT/IL01/01074, incorporated into this application by reference. The server 72 accepts one or more input records from an input records stream 70. The input records stream 70 is provided to the server 72 via the diverse input devices described hereinabove. The server 72 is a set of functional computer programs specifically designed and developed to implement the method and system proposed by the present invention. The server 72 includes an input records handler 76, a control table 74, a record dividing component 78, a complexity assignment component 80, and a complexity catalog handler 82. The input records handler 76 receives the input records from the input records stream 70 and provides the records to the record-dividing component 78. The record-dividing component 78 accepts the records, obtains the suitable control parameters from the control table 74, and divides the records into dimensional blocks having a size determined by the control parameters. Subsequently the dimensional blocks are provided to the complexity assignment component 80. The component 80 obtains the suitable control parameters from the control table 74, assigns appropriate complexity metrics to the records, and passes the complexity metrics records to the complexity catalog handler 82. The complexity catalog handler 82 inserts the complexity metrics records, which include suitable pointers to the input records, into the complexity catalog 84. The catalog 84 is a data structure holding the list of the complexity records for further processing.
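The record-dividing step, under the n=m*f*RF relation given earlier, can be sketched as follows; the function name is hypothetical and the control parameters are passed directly rather than read from the control table 74.

```python
def divide_record(data: bytes, feeds: int, reading_frame: int):
    """Split a record of n = m*f*RF bytes into m blocks, each holding
    f feeds of RF bytes (sketch of the record-dividing component)."""
    block_size = feeds * reading_frame
    if len(data) % block_size:
        raise ValueError("record length must be a multiple of f * RF")
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]
```

An 8-byte record with f=2 feeds and RF=2 thus yields m=2 blocks of 4 bytes each.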
FIGS. 5A, 5B and 5C illustrate one example of the preferred embodiment of the present invention. Accordingly, FIG. 5A represents a transaction performed by one credit card holder. FIG. 5A presents the different fields included within the transaction record 104 received within the IAES. The record consists of a serial number field 90, a date field 92, a credit card holder name field 94, an agent/business name field 96, a sum field 98, and an address field 100. Additional fields could include the type of services or goods that were purchased during the specific transaction and other relevant information. Each of the fields indicating data concerning the transaction can indicate fraudulent behavior in accordance with the preferred embodiment of the present invention. For example, the date field can indicate fraudulent behavior when a large number of transactions are performed on one particular day. Other examples can demonstrate the ability of each field type within the transaction record to provide an indication of fraudulent behavior. The transaction 104 received within the IAES is processed to diagnose whether the transaction is fraudulent or not. The diagnosis can be a result of a fraudulent indicator emerging from the analysis of a single field of a particular transaction or from the analysis of more than one field of a transaction record. The data within the fields are processed to provide scores. The scores of the fields are concentrated into a cluster that is processed according to the procedures illustrated above in accordance with FIG. 3 to provide a final score. Each field within the transaction 104, or any combination thereof, is processed as predetermined by the complexity database catalog 36 (FIG. 2). The process includes the procedure stages indicated in FIG. 3, including the preprocessing procedure, a learning procedure and a scoring procedure.
The transaction record can be processed by the preprocessing procedure alone or by any other procedure or combination thereof. The complexity values received from the procedures mentioned above that process the transaction record 104 are stored in a record 106, illustrated in FIG. 5B, within the complexity database catalog 36 (FIG. 2). One simple example illustrating fraudulent behavior can be an anomaly in the sum of a single transaction. A credit card holder who initiates a transaction for about $10,000, in comparison to his usual habit of not exceeding about $200 per transaction, will indicate possible fraudulent behavior that will be reflected by a high score. The record 106 includes different fields regarding different transactions. Each row relates to a specific transaction. Column 108 includes serial numbers that indicate the different transactions. The different fields in each row include calculated values received from the procedures operated by one or more of the components illustrated in FIG. 3. The integrating module, as indicated above, processes the transactions concentrated within the record 106 to provide the score values indicated in column 110. The transactions are organized within the final result record 112. The final result record 112 includes a transaction serial number column 108. The transactions within the final result record 112 include all the processed data of the relevant transactions and the final score provided by the IAES. The highest score, 0.999, indicates the greatest anomaly of the particular transaction. As shown within FIG. 5B, the transactions are not arranged and organized according to their final score. FIG. 5C shows the transactions organized according to their anomaly.
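The reordering from FIG. 5B to FIG. 5C, arranging transactions by final score with the greatest anomaly first, amounts to a simple descending sort; the serial numbers and scores below are hypothetical examples.

```python
def rank_by_score(scored):
    """Sort (serial number, final score) pairs so the greatest anomaly
    (highest score) comes first, as in FIG. 5C."""
    return sorted(scored, key=lambda t: t[1], reverse=True)
```

For hypothetical entries such as ("0001", 0.12), ("0002", 0.999) and ("0003", 0.47), the transaction with score 0.999 is ranked first.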
 The person skilled in the art will appreciate that what has been shown is not limited to the description above. Many modifications and other embodiments of the invention will be appreciated by those skilled in the art to which this invention pertains. It will be apparent that the present invention is not limited to the specific embodiments disclosed, and that such modifications and other embodiments are intended to be included within the scope of the invention. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation. The invention, therefore, should not be restricted, except by the following claims and their equivalents.
 The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:
FIG. 1 is a schematic block diagram of the system environment of the preferred embodiment of the present invention;
FIG. 2 is a schematic block diagram of the data analysis and evaluation system of the preferred embodiment of the present invention;
FIG. 3 is a simplified flow chart illustrating the operation of the system and method of the preferred embodiment of the present invention;
FIG. 4 is a simplified infrastructure of the operation of the complexity engine in accordance with the preferred embodiment of the present invention; and
FIGS. 5A, 5B and 5C illustrate a typical transaction input record and the manner of storage thereof in the information analysis and evaluation system of the preferred embodiment of the present invention.
 The present invention generally relates to a system and method for detecting and analyzing information units. More specifically, the present invention relates to the analysis and evaluation of data records by selecting particular aspects of the data record.
 Enormous volumes of data and information are accessible to users worldwide as a consequence of the so-called Information Age. The development of computerized databases, high-volume storage devices and the capability of transmitting data over global data communications networks provides a great influx of data which is not always processed and evaluated completely. Due to the massive quantities of data and information and the resources required for the processing thereof, the evaluation processes are typically incomplete. Financial organizations that provide credit card usage supervision services, transaction processing services, business indicator analysis services and the like are handling an immense amount of data records daily. The large amount of data records that are formed by business transactions initiated and performed by credit card holders are typically transmitted to credit card supervision transaction services organizations (CCTSO) for appropriate processing. The required processing includes the updating of the relevant accounts of the credit card holders, the updating of the accounts of the services and goods suppliers, and the creation of records associated with clearing instructions. The update of the relevant accounts includes the entire set of information units associated with a single transaction, such as credit card number and type, account number, transaction date, point of sale, type of goods or services purchased and the like. Further processing of data by the CCTSO includes the validation, error checking, authorization, and evaluation of the transaction and most importantly detection or identification of fraudulent transactions. 
When a fraudulent transaction is recognized, a suitable “watchdog” procedure is performed, such as the issuance of an alert, a warning message or another pre-defined set of operating instructions that are transmitted to the appropriate functionaries within the CCTSO and, if applicable, transmitted in real-time back to the point of sale to be displayed on the supplier's display device.
 Detection of fraudulent credit card usage requires the CCTSO to analyze a particular transaction initiated and performed by a credit card holder by comparing transaction-specific information with personal characteristics of the cardholder. The characteristics may include age, address, previous credit card usage and the like. The type of goods or services purchased, the price, and the location of the point of sale are also analyzed in order to detect unusual transaction patterns. The immense number of transactions performed, and the urgency of detecting fraudulent behavior as soon as possible after its occurrence, necessitate the utilization of sophisticated computerized systems. Computerized evaluation of transactions and the identification of fraudulent behavior are known in the art. These procedures are typically performed by specific computer programs utilizing large decision trees. As a result, the CCTSO has the ability to recognize obvious fraudulent behavior. For example, one anomalous transaction could be associated with a credit card holder who purchases goods and services in a total sum that is significantly higher than the amounts of prior transactions of the same cardholder. The CCTSO typically maintains within its computerized database price limits regarding acceptable usage of a specific credit card, and anomalous credit card usage could trigger a suitable alert or warning. However, decision trees as currently known in the art are substantially limited and can thereby provide only comparatively simple diagnoses. All too often the diagnosis is inaccurate, and as a result fraudulent transactions are ignored while valid transactions generate alerts. Erroneous diagnoses, which generate unjustified alerts, warnings or operating instructions, have many disadvantages for a CCTSO or any other organization utilizing computerized analysis and evaluation of data.
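The price-limit check described above can be sketched as follows. This is a minimal illustration of the prior-art style of rule-based detection, not the patent's own method; the names (`is_anomalous`, `price_limits`) and the per-card limit values are hypothetical.

```python
# Minimal sketch of a prior-art rule check: flag a transaction whose amount
# exceeds the acceptable-usage limit the CCTSO stores for that card.
# All identifiers and data here are illustrative assumptions.

def is_anomalous(card_number: str, amount: float, price_limits: dict) -> bool:
    """Return True when the amount exceeds the stored limit for the card."""
    limit = price_limits.get(card_number)
    return limit is not None and amount > limit

price_limits = {"4111-0001": 500.0}  # acceptable-usage limit per card
print(is_anomalous("4111-0001", 1200.0, price_limits))  # prints True
```

A check of this kind captures only the obvious anomaly (a sum far above the cardholder's usual amounts); as the text notes, it cannot express more nuanced transaction patterns.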
Erroneous diagnoses provided by a CCTSO shape the attitude of the decision-making management toward alerts or warnings raised by the computerized system. Consequently, alerts and warnings are all too often not readily accepted, and as a result many fraudulent transactions are not detected in real time.
 A prior art system known in the field of fraudulent behavior detection is disclosed in U.S. Pat. No. 5,819,226. The patent provides an automated system and method for detecting fraudulent transactions using a neural network as a predictive model. The neural network model “learns” a pattern that it can later identify. The learning process is based on a given number of iterations executed by the neural-network-based detection system, thus providing its output result. Nevertheless, a fraudulent behavior detection system based upon a neural network is not accurate and can produce false diagnoses of transactions. The principal reason for the false diagnoses lies in the manner in which the neural network method operates. The neural network method is limited within a fraudulent behavior detection system because it learns the pattern of a single customer, credit card holder, or group of customers, and their fraudulent behavior, and produces a score based on the “learned” patterns. Consequently, the neural network provides a large number of false identifications, such as identifying a valid credit card transaction as fraudulent. The inefficiency of neural networks is due to their inability to deal with “trouble making” customers who have a non-simple or erratic behavior pattern.
 There is an urgent need to introduce a system and a method that will minimize false fraudulent behavior detection within a CCTSO. A further need exists for a system and method that is able to create a segmentation of the incoming data records, such as business data records, thereby characterizing data within groups separated in a predefined manner. The segmentation enables the processing of data concentrated within separate segments in an efficient and accurate manner, providing “clear cut” results. The required system and method will provide analysis and evaluation of the information in such a manner as to provide a minimum of false results. These requirements could be accomplished by the application of the complexity system and method introduced within PCT/IL01/01074.
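The segmentation step described above can be sketched as grouping incoming records by a predefined key so that each group is analyzed separately. The key used here (merchant category) and the record layout are assumptions for illustration; the patent does not fix the segmentation criterion in this excerpt.

```python
from collections import defaultdict

# Illustrative sketch: partition incoming data records into predefined
# segments so each segment can be processed separately. The segmentation
# key and record fields are hypothetical.

def segment_records(records, key):
    """Group records into segments keyed by the given field."""
    segments = defaultdict(list)
    for record in records:
        segments[record[key]].append(record)
    return dict(segments)

records = [
    {"card": "4111-0001", "category": "fuel", "amount": 40.0},
    {"card": "4111-0002", "category": "travel", "amount": 900.0},
    {"card": "4111-0001", "category": "fuel", "amount": 35.0},
]
segments = segment_records(records, "category")
```

Analyzing each segment in isolation is what allows the "clear cut" results the text refers to: transaction patterns are compared only against records of the same kind.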
 The present invention provides, in a computing environment accommodating at least one input device connectable to at least one server device connectable to at least one output device, a method of processing at least one information unit introduced by the at least one input device by the at least one server device to create at least one information score based on the at least one information unit, the method comprising the steps of: creating at least one complexity catalog based on the at least one information unit, and establishing at least one score unit based on the at least one complexity catalog.
 The method for processing the data mentioned above can further comprise the steps of: obtaining at least one information unit from the at least one input device by the at least one server device, and displaying the at least one scoring unit. The information unit used within the present invention may contain information on a transaction performed by a credit card holder.
 The present invention further includes, in a computing environment accommodating at least one input device connected to at least one server device having at least one output device, a system for processing at least one information unit introduced via the at least one input device by the at least one server device to create at least one information score based on the at least one information unit, the system comprising the elements of: an infrastructure server device to create at least one complexity catalog; a complexity catalog to hold at least one list of ordered complexity values associated with the partitioned sub-unit blocks; an application server to build at least one information summary unit based on the at least one information unit and on at least one associated complexity catalog; and a scoring component to provide scores.
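The claimed pipeline can be sketched as: partition an information unit into sub-unit blocks, attach a complexity value to each block, keep the ordered list as a complexity catalog, and reduce the catalog to a score unit. This is a hedged illustration only; the complexity measure used here (Shannon entropy of each block) and the mean-based score are stand-in assumptions, since this excerpt does not fix the particular measure.

```python
import math
from collections import Counter

# Hedged sketch of the claimed pipeline. The complexity measure (Shannon
# entropy) and the scoring rule (mean of the catalog) are assumptions,
# not the patent's defined method.

def partition(unit: str, block_size: int):
    """Split the information unit into fixed-size sub-unit blocks."""
    return [unit[i:i + block_size] for i in range(0, len(unit), block_size)]

def complexity(block: str) -> float:
    """Stand-in complexity value: Shannon entropy of the block's characters."""
    counts = Counter(block)
    total = len(block)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def build_catalog(unit: str, block_size: int = 8):
    """Complexity catalog: ordered list of complexity values, one per block."""
    return [complexity(b) for b in partition(unit, block_size)]

def score(catalog) -> float:
    """Score unit derived from the catalog (mean complexity, as an example)."""
    return sum(catalog) / len(catalog) if catalog else 0.0

catalog = build_catalog("4111-0001;2001-11-21;POS-17;fuel;40.00")
print(round(score(catalog), 3))
```

Under this reading, the infrastructure server corresponds to `build_catalog`, the catalog to the returned list of ordered values, and the scoring component to `score`.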
 This application claims priority from PCT Application No. PCT/IL01/01074, filed Nov. 21, 2001, and Israeli Patent Application No. 146597, filed Nov. 20, 2001, each of which is hereby incorporated by reference as if fully set forth herein.