TCM Framework: An Integrated Approach to Portfolio, Program and Project Management
(Rev. 2012-01-09)



10.4 Project Historical Database Management

10.4.1 Description

    Project historical database management is a process for collecting, maintaining, and analyzing project historical information so that it is ready for use by the other project control processes and for strategic asset management. Empirical information is the most fundamental project planning resource available and it is manifested in the form of quantified and documented historical data and information. The historical database management process captures empirical information and retains this experience within the institutional memory to support the development of continually improving project plans as well as improved methods and tools. The purpose of the process is not to repeat history, but to learn from it (i.e., to enable continuous improvement in the project system).

    To illustrate historical data’s importance, Figure 10.4-1 provides a simplified block flow diagram of the information flow in the project control process. Each block represents a project control process that incorporates data manipulation methods or tools (e.g., cost estimating system). The interconnecting lines show the general flow of data and information products among the processes. Clearly, if you remove the historical block from the diagram, there is no closure in the information flow. Without the historical data process, project planning methods and tools have no basis other than the personal knowledge of the project team members; no institutional memory is developed and no opportunity for a learning organization exists.


Figure 10.4-1 Project Control Process Information Flow

    Figure 10.4-1 is essentially the same as Figure 6.3-1 (Section 6.3) for the asset historical database management process. While the figures show separate blocks for the asset and project processes, the distinction between them is somewhat artificial. Both processes have similar needs for performance benchmarks, cost references, and other information. If an enterprise’s project system is viewed as a strategic asset of the enterprise, the asset database can be viewed as the master. In any case, through a relational database structure or some other means, the databases should be integrated, allowing users to access life cycle information about the asset and project portfolios.

10.4.2 Process Map for Project Historical Data Management

    Figure 10.4-2 illustrates the process map for project historical database management. The two main steps of the process are collecting data of various types and processing them into useful information products.


Figure 10.4-2 Process Map for Project Historical Database Management

    The following sections briefly describe the steps in the project historical database management process.

.1 Plan for Historical Database Management

    Project historical database management starts with planning. The database is a strategic asset of the enterprise. As such, planning the database management process for a given project must first consider the status and requirements of the master database(s). The database plan must then address the interface/interaction of the project data inputs and outputs with the master database as well as issues such as data format and level of granularity.

    The project inputs include planned and actual quantitative data and qualitative information about the performance of project work and control methods and tools. The quantitative data (e.g., cost estimates, actual costs, schedules, etc.) must be processed. Processing includes cleaning, organizing, and normalizing the data as required for inclusion in the master database(s) for continued use. The qualitative information (e.g., learnings, etc.) must be cleaned, organized, and standardized as required for inclusion as well. These collection and processing activities, many of which are done at the time of project completion (i.e., project closeout), must be planned and resources allocated for their performance.

    The processed quantitative data are then used for the development and maintenance of reference databases for planning (e.g., estimating line-item database), metrics for plan validation (e.g., check estimate competitiveness), and tools development (e.g., estimating algorithm development, templates, etc.). The processed information, including lessons learned, helps guide the effective application of the data in the development and maintenance tasks, but also serves as a direct reference to aid project planning. Much of this analysis and development is done on a project system or programmatic level (Section 6.3); however, a given project may require special data products that need to be developed for use on that project (e.g., estimating algorithms for equipment using new technology).

    The plan for project historical database management must address not only the collection and processing work that will be somewhat project specific, but also the project’s interface with the existing or future master database (i.e., guidelines, procedures, and systems used by all the enterprise’s assets and projects). Planning topics may include, but are not limited to, the following:

    Databases may capture both electronic and hard-copy information; each form has specific considerations. In whatever form the data are captured, the goal is to store them in a way that makes them easy to find, retrieve, update, and use. Additionally, there may be multiple databases that support specific purposes (e.g., databases specific to an estimating system for a particular technology); each must be considered during planning to ensure that the data collection process captures required information at the proper level of granularity. Finally, some enterprises restrict the retention of original data and records as a matter of policy, and such restrictions must be addressed in the plan (e.g., use the raw data to create metrics, then discard the originals).

    Data collection and processing activities may be a responsibility of cost engineers in a project control role supporting the project manager. The project data collected and processed are typically channeled to a single project control professional responsible for overall master database maintenance, analysis, and development supporting the enterprise’s overall project system. These parties must coordinate their efforts in order to develop, implement, and maintain an effective database. Another aspect to incorporate into the data collection and processing plan is the interaction/interface of suppliers and contractors with the owner’s data collection methods and format. Others on the project team are also likely to have roles in providing and processing raw data. Giving prior consideration to the available data format and granularity may significantly ease data integration into the database or facilitate the data’s later use.

.2 Collect and Process Data

Quantitative (Measures)

    The database inputs include estimated, planned, and actual quantitative data about the performance of project work. The quantitative data (cost, hours, schedule durations, etc.) must be collected and processed.

    Data collection is usually performed as a part of the project closeout process. Data are often collected by completing a form or forms that are set up with a standard coding structure and level of detail consistent with master database requirements. The forms should also capture information that identifies and characterizes the project in terms of name, location, project type, execution strategies used, and so on. Schedule durations can be captured in form tables, but it is also useful to keep a copy of the electronic schedule or a printout so that the schedule logic is retained.
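A closeout data collection form of the kind described above can be sketched as a simple record structure. This is a minimal, illustrative sketch; the field names, the account code, and the project header values are all hypothetical, not part of any standard coding structure:

```python
from dataclasses import dataclass

@dataclass
class CloseoutRecord:
    """One line of project closeout data, coded per the master
    database's code of accounts (all field names are illustrative)."""
    project_id: str
    account_code: str      # hypothetical code of accounts entry
    description: str
    estimated_cost: float
    actual_cost: float
    unit_of_measure: str
    actual_quantity: float

    def actual_unit_cost(self) -> float:
        """Actual unit costs feed reference database development."""
        return self.actual_cost / self.actual_quantity

# Project characterization captured alongside the line data
project_header = {
    "name": "Plant Expansion A",
    "location": "US Gulf Coast",
    "project_type": "brownfield process plant",
    "execution_strategy": "EPC lump sum",
}

rec = CloseoutRecord("P-1001", "03300", "Cast-in-place concrete",
                     estimated_cost=210000.0, actual_cost=231000.0,
                     unit_of_measure="m3", actual_quantity=1050.0)
print(round(rec.actual_unit_cost(), 2))  # actual cost per m3
```

Keeping the project header with the line items is what later allows normalization and like-for-like comparison across projects.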

    Data processing includes cleaning, organizing, and normalizing as appropriate to incorporate the data into the master database(s) for continued use. Data cleaning refers to ensuring that data are complete and acceptably accurate for database purposes, which may differ from accounting and finance purposes. Organizing data refers to ensuring that data are coded in accordance with the code of accounts and/or other structures used by the database and are otherwise identifiable (e.g., meaningful account titles, category descriptions, etc.).

    It should be noted that cost data as reported directly from cost accounting systems are useful, but are usually neither clean nor organized in a manner that best supports project control planning, methods and tools development, and so on.
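The cleaning and organizing steps can be illustrated with a sketch that recodes raw accounting extracts to the code of accounts and flags incomplete rows. The ledger codes, mapping, and row contents are invented for illustration; real mappings are enterprise specific:

```python
# Hypothetical mapping from accounting ledger codes to the project
# code of accounts.
LEDGER_TO_COA = {"6100": "03300", "6200": "05100"}

def clean_and_organize(raw_rows):
    """Drop incomplete rows and recode ledger accounts to the code of
    accounts so the data are identifiable in the master database."""
    organized, rejected = [], []
    for row in raw_rows:
        if row.get("cost") is None or row.get("ledger") not in LEDGER_TO_COA:
            rejected.append(row)   # flag for manual review, not silent discard
            continue
        organized.append({
            "account_code": LEDGER_TO_COA[row["ledger"]],
            "title": row.get("title", "").strip().title(),  # meaningful titles
            "cost": float(row["cost"]),
        })
    return organized, rejected

raw = [
    {"ledger": "6100", "title": "  concrete work ", "cost": "231000"},
    {"ledger": "9999", "title": "misc", "cost": "500"},             # unmapped account
    {"ledger": "6200", "title": "structural steel", "cost": None},  # incomplete
]
good, bad = clean_and_organize(raw)
print(len(good), len(bad))  # 1 clean row, 2 flagged for review
```

The rejected list matters: data that are merely "complete enough" for accounting close may still be unusable as historical references.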

    Normalizing data is a more complex step that involves translating the data so that they are on a standard or "normal" basis in terms of time, location, and currency. For example, if the project was located in Canada in 2005, and costs are in Canadian dollars, but all the other projects in the database were U.S. projects, entered in 1998 U.S. dollars, then the project’s cost would need to be adjusted (for currency exchange rate and time value of money differences) before entry into the database. After processing, the data may then be entered in an electronic database and/or kept in hard-copy form.
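The Canadian example above can be worked through in a short sketch. The exchange rate and cost index values are hypothetical placeholders, chosen only to show the order of operations (currency first, then time basis):

```python
# Normalizing a 2005 Canadian project cost to a database kept in
# 1998 U.S. dollars. Both constants below are assumed, not actual rates.
CAD_PER_USD_2005 = 1.21                   # assumed 2005 exchange rate
COST_INDEX = {1998: 100.0, 2005: 118.0}   # assumed escalation index

def normalize_cost(cost_cad_2005: float) -> float:
    """Translate currency first, then de-escalate to the
    database's normal-basis year."""
    cost_usd_2005 = cost_cad_2005 / CAD_PER_USD_2005
    return cost_usd_2005 * COST_INDEX[1998] / COST_INDEX[2005]

print(round(normalize_cost(1_000_000.0)))  # 1998 USD equivalent
```

Documenting which indices and rates were used is part of the basis; without that record the normalized figure cannot be re-derived or trusted later.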

Qualitative Data (Process Lessons Learned)

    Database inputs also include qualitative information about the performance of the project process or system. This information may include assessments of how successful the project was in achieving its objectives, and what factors contributed to the project’s success or failure. Subjective information may be captured through the use of surveys, narrative descriptions, interviews, or formal lessons learned workshops. More objective information can be obtained by benchmarking the project. Benchmarking involves comparing the project’s practices and performance to those of selected projects or project systems that used the best practices and achieved the best performance.

    The qualitative information must also be cleaned, organized, and standardized as required for incorporation into master database(s). After processing, the data may then be entered in an electronic database and/or kept in hard-copy form. The goal is to capture qualitative information that will allow the next project team to build on successful approaches, avoid repeating unsuccessful ones, and provide context for assessing quantitative data.

Procedural (Methods and Tools Lessons Learned)

    In an extension of the above methods, additional qualitative information is captured about the performance of project control methods and tools. In this case, the goal is to capture information that will support efforts by the enterprise to develop or improve methods and tools (e.g., new or updated templates, forms, information systems, etc.) for the project system.

    For example, the process lessons learned may identify that project control results were poor because the cost accounting processes did not provide timely cost measurement to support performance assessment. In that case, lessons learned about accounting system methods and tools would be examined to find ways to improve project cost accounting timeliness.

.3 Analyze and Process Data

Reference Data Development

    Many planning and assessment methods and tools rely on reference databases of some sort. For example, the reference database for a cost estimating system may contain standard unit hours, unit costs, adjustment factors, and similar measures of work or material item cost and resource requirements. The reference data provide an empirical basis for planning.

    The reference data should be consistent, reliable, and competitive, with a well-defined basis (e.g., assumptions, conditions, etc.) such that any project team can determine how its requirements and basis conditions differ from the reference and adjust accordingly. The quality of a reference database is not judged by how correctly or accurately its entries represent the absolute cost or duration of any given item or activity on any given project. Rather, it is judged by how reliable the database is as a planning "base" in terms of competitiveness and consistency, where consistency means that the basis is known, is consistent between similar items, and does not change over time unless the change has been justified by analysis.

    Reference data are typically normalized to a standard basis (i.e., in terms of time, location, currency, conditions, etc.). Project data may be normalized at the time of collection and processing, or at the time that the reference database is created or updated. Established reference databases generally do not require constant updating; annual updates are common. At the time of review and update, the data from projects collected over the period are analyzed to determine if the existing data are still good references. If new reference data are developed, the basis must be consistent and well documented.

Benchmarks and Metrics Development

    Benchmarks and metrics are a form of reference data, but the purpose is primarily to support the validation of project plans (Section 8.1). Benchmarking is a process that compares practices, processes, and relevant measures to those of a selected basis of comparison (i.e., the benchmark) with the goal of improving performance. The comparison basis includes internal or external competitive or best practices, processes, or measures. Validation is a form of benchmarking applied specifically to plans to assess whether the plan results are competitive and achieve the performance objectives.

    A planning tool reference database typically contains data that support deterministic (i.e., bottom-up or detailed) planning of low level components of the work breakdown. A metrics database will typically contain data (e.g., factors, ratios, etc.) that support assessment of top levels of the work breakdown structure. For example, an estimating system’s reference database may contain standard unit hours for discrete items of work (e.g., 2 hours per cubic meter to install rebar for a concrete spread footer) while a metrics database may contain standard unit hours for aggregated items of work (e.g., 8 hours per cubic meter to install all process plant cast-in-place concrete). Metrics are also useful references for stochastic (i.e., top-down or conceptual) planning methods and tools.
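The relationship between detailed reference data and aggregated metrics can be shown by rolling line-item actuals up to a metric, using the concrete example from the text. The line items and hours below are invented so that the rates match the text's 2 and 8 hours per cubic meter:

```python
# Detailed actuals for cast-in-place concrete work (illustrative data).
detail_lines = [
    {"item": "formwork",      "hours": 3150.0},
    {"item": "rebar",         "hours": 2100.0},
    {"item": "place/finish",  "hours": 3150.0},
]
total_concrete_m3 = 1050.0

# Detailed reference datum: rebar installation hours per m3 of concrete
rebar_rate = (next(d["hours"] for d in detail_lines if d["item"] == "rebar")
              / total_concrete_m3)

# Metric: all-in concrete hours per m3, for top-down validation checks
all_in_rate = sum(d["hours"] for d in detail_lines) / total_concrete_m3

print(rebar_rate, all_in_rate)  # 2.0 and 8.0 hours per m3
```

The same closeout data thus feed both databases; only the level of aggregation differs.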

Methods and Tools Development

    Each of the project control processes includes a step for methods and tools development (e.g., new or updated templates, forms, systems, etc.). Each of these processes has historical information as an input for development of methods and tools. This information typically includes examples of methods and tools used on other projects and lessons learned from their use. In addition to qualitative analyses to promote continuous improvement, quantitative data can be used to support the development of planning algorithms (e.g., regression analysis of inputs and outputs to develop parametric estimating or scheduling models). Models are particularly critical to the asset planning process (see Section 3.2).
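The regression analysis mentioned above can be sketched with a minimal least-squares fit of a linear parametric estimating model (cost = a + b × capacity). The four historical observations are fabricated for illustration only:

```python
# Fit cost = a + b * capacity to historical (capacity, cost) pairs
# using ordinary least squares. Observations below are invented.
history = [(100.0, 1.8e6), (250.0, 3.9e6), (400.0, 6.0e6), (550.0, 8.1e6)]

n = len(history)
sx = sum(x for x, _ in history)
sy = sum(y for _, y in history)
sxx = sum(x * x for x, _ in history)
sxy = sum(x * y for x, y in history)

b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope: cost per unit capacity
a = (sy - b * sx) / n                          # intercept: fixed scope cost

def parametric_estimate(capacity: float) -> float:
    """Top-down estimate for a new project from its capacity."""
    return a + b * capacity

print(round(b), round(a))  # fitted slope and intercept
```

In practice such models would be fitted on normalized data (see the normalization step), since mixing cost bases corrupts the fitted parameters.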

10.4.3 Inputs to Project Historical Data Management

.1 Project Implementation Basis. (See Section 4.1). This defines the basis: asset scope, objectives, constraints, and assumptions. The historical database management plans must be evaluated with respect to their alignment with project objectives and requirements, including the status and requirements of the master database(s).

.2 Project Control Plans. The project control plan (Section 8.1) describes specific systems and approaches to be used in project control including historical database management.

.3 Control Baseline Data. The historical database captures plan data (Section 8.1) as well as actual performance data.

.4 Actual Performance Data. Performance assessment (Section 10.1) feeds actual performance data to the historical database management process, most often at the time of project closeout.

.5 Performance and Methods and Tools Experiences. Qualitative lessons learned are collected from all project control processes (Chapters 7, 8, 9, and 10).

.6 Project System and External Information. The strategic asset management process (Section 6.3) provides both internal project system and external industry benchmarking data.

10.4.4 Outputs from Project Historical Data Management

.1 Project Control Plans. Historical database management plans are considered in overall project control planning (and all other aspects of measurement and assessment planning).

.2 Planning Reference Data. Many planning methods and tools (Chapter 7 sections) rely on historically based reference data.

.3 Plan Validation Data. Benchmarking and validation methods (Section 8.1) rely on historically based benchmarks and metrics.

.4 Data to Support Methods and Tools Development. Each of the project control processes (Chapters 7, 8, 9, and 10) includes a step for methods and tools development, and each of these steps has historical project information (e.g., go-bys, lessons learned, modeling inputs, etc.) as an input.

.5 Information for Project System Management. Project data are inputs to the strategic asset management measurement processes (Sections 5.1 and 5.2) and asset management database (Section 6.3). The database itself is a strategic asset of the enterprise.

10.4.5 Key Concepts for Project Historical Data Management

    The following concepts and terminology described in this and other chapters are particularly important to understanding the project historical database management process of TCM:

.1 Database. Any collection of data or information that is retained for future use. The database is the documented manifestation of experience.

.2 Reference Data. Any database data or information that is used by a system to support its function (primarily empirical, quantitative data). Reference data quality is judged by how reliable it is as a planning "base" in terms of competitiveness and consistency, with consistency meaning that the basis is known, is consistent between similar items, and does not change over time unless its change has been justified by analysis.

.3 Lessons Learned. Qualitative information that describes what was learned during the performance of a process, method, or tool. Lessons learned are captured in a database to support development or improvement of processes, methods, and tools.

.4 Metric. Database data (primarily empirical, quantitative factors, ratios, etc.) that are used to assess the results of a process, method, or tool (see Validation).

.5 Benchmark. A metric that supports the benchmarking process.

.6 Benchmarking. A measurement and analysis process that compares practices, processes, and relevant measures to those of a selected basis of comparison (i.e., the benchmark) with the goal of improving performance.

.7 Validation. In project control, a form of benchmarking applied specifically to project plans to assess whether the plan results are competitive and achieve the project’s performance objectives (see Benchmarking).

.8 Continuous Improvement. Quality or process management methods that continuously identify, assess, and implement ideas to improve process performance. The methods use quantitative performance measurements of the process to identify improvement opportunities and ideas and to assess the results of implemented ideas. Databases are often used to retain measurement data.

.9 Normalization. To adjust data to a standard (i.e., normal) basis in terms of time, location, currency, technology, or other characteristics that define the normal basis.

.10 Project Closeout. A process performed at the end of a project to ensure that all project work, obligations, measurements, and transactions (e.g., charges, payments, etc.) are complete and systems are closed; to perform and report final project performance assessments; and to ensure that all required data, information, lessons learned, and deliverables are collected and processed for the appropriate historical database.

.11 Basis. Documentation that describes how an estimate, schedule, or other plan or database component was developed and defines the information used in support of development. A basis document commonly includes, but is not limited to, a description of the scope included, methodologies used, references and defining deliverables used, assumptions and exclusions made, clarifications, adjustments, and some indication of the level of uncertainty.

.12 Code of Accounts. Systematic coding structures for organizing and managing asset, cost, resource, and schedule activity information. An index to facilitate finding, sorting, compiling, summarizing, and otherwise managing information that the code is tied to.

Further Readings and Sources

    Despite the importance of empirical data to cost engineering processes and methods, the project historical data management process is not well covered by industry texts. The following references provide some basic information and will lead to more detailed treatments.


Copyright © 2008 By AACE® International