Analysis of the time taken (in seconds) for each individual stage of the workflow process for the selected day(s), including data normalization, consolidation, validation and promotion, all of which can be filtered as appropriate.

A second analysis shows the number of individual data points processed within each stage for the selected day(s), including a drill-down for any stage to show the asset class (e.g. FX Rate), data vendor (e.g. Bloomberg), data attribute (e.g. Close) and the actual asset (e.g. GBP/USD FX Rate) associated with each point.

A third analysis allows the processed data points to be filtered by up to eight attributes, effectively showing the lineage (life-cycle) of any number of data points as they move between the various stages of the process. This also includes the names of the two Users involved in the 4-Eyes validation of each exception that was manually resolved.

The remaining analyses show various breakdowns of the types of validation carried out (e.g. Instrument, Curve and Surface), how many points passed and failed these validations, and the types of test applied to the data points: for example, flat spot, spike and outlier tests on FX Rates, and changes outside limits for each tenor within a curve or each tenor pair within a surface.
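The flat spot, spike and outlier tests mentioned above could be sketched as follows. This is an illustrative sketch only, not the product's actual implementation: the function names, thresholds and window conventions are all hypothetical.

```python
def flat_spot(rates, min_run=3):
    """Flag a flat spot: the same value repeated min_run or more times in a row
    (a possible sign of a stale feed rather than a genuinely unchanged rate)."""
    run = 1
    for prev, cur in zip(rates, rates[1:]):
        run = run + 1 if cur == prev else 1
        if run >= min_run:
            return True
    return False

def spike(rates, max_jump=0.05):
    """Flag a spike: any point-to-point relative change exceeding max_jump
    (5% here, purely an illustrative threshold)."""
    return any(abs(cur - prev) / prev > max_jump
               for prev, cur in zip(rates, rates[1:]))

def outlier(rates, z_threshold=3.0):
    """Flag an outlier: any value more than z_threshold standard deviations
    from the mean of the series."""
    mean = sum(rates) / len(rates)
    std = (sum((r - mean) ** 2 for r in rates) / len(rates)) ** 0.5
    if std == 0:
        return False  # constant series: no dispersion, so no outliers
    return any(abs(r - mean) / std > z_threshold for r in rates)

# Example on a hypothetical GBP/USD FX Rate series:
gbpusd = [1.27, 1.27, 1.27, 1.27, 1.28]
print(flat_spot(gbpusd))        # True  (four identical values in a row)
print(spike([1.27, 1.40]))      # True  (~10% jump exceeds the 5% limit)
```

The curve and surface checks described above would follow the same pattern, applied per tenor (or per tenor pair) with limits on the change between the current and previous value.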