Programming is a creative process, and attempts to measure any parameters of a project are often seen as almost seditious. Indeed, many programmers smirk at attempts to measure a team's performance by, say, the number of lines of source code written. On the other hand, any process has a set of properties and measurable parameters. Knowing their values is important: it helps you answer the question "how are things going?", find out what is happening right now, and plan steps, for example, to reduce the number of errors in the product.
Obviously, the more parameters you can measure, the more opportunities you have to analyze the situation. It is good if you have data on the current number of bugs in your system. It is better if you can break that number down by the main components, and better still if you also know how quickly changes are being made to those components, so that you can assess risks and prioritize fixing the existing errors. This is one of the simplest scenarios, but even it requires tools that collect data suitable for analysis. This is the "primary bookkeeping" of the project which, when turned into reports, lets you make decisions at a new level of quality.
In practice, it turns out that this data either has to be entered manually or is stored in such an unstructured form that analyzing it becomes a laborious task.
Where to get the data from?
Automation of development processes usually comes down to three main components: task management ("tasks and bugs"), source control, and automated builds. Many software products have been built on these "three whales", and the scheme has proved its viability for creating a continuous integration environment. There are quite a few solutions on the market, both free and paid, that help you manage code, tasks, and errors, and build the project. Moreover, there are suites that integrate these parts with each other and allow you, for example, to associate source code changes with errors or tasks, thereby letting you manage changes in the project at a higher level of quality.
The level of integration between these components varies, but measurement and reporting somehow never get much attention. And really, what kind of reports can you get from a source code repository alone? The situation with tasks is a little better: many systems let you filter "tasks and bugs" by any status. But if you look at the situation as a whole, it turns out that combining all three components into a single set of reporting data is a non-trivial task.
Tools
When Visual Studio Team System was being designed in 2003, it was decided to make a full-fledged analytics system a key component of the suite. The main goals were as follows:
1. Automatically collect all possible information
2. Minimize requests to the user for manual data entry
3. Provide fast ad-hoc analysis tools on demand
In this scheme, the version control, task management, and build components are treated as data sources that feed information into the reporting database through special adapters. Notably, the set of adapters can be extended, which lets you fill the database with information from additional data sources, extending the schema. A source code example of such an adapter can be found on the CodePlex website.
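To make the idea concrete, here is a conceptual sketch of such an adapter. It is not the actual TFS API: real warehouse adapters are .NET classes plugged into the warehouse, and every name below is a hypothetical illustration of the pattern of turning operational records into fact-table rows.

```python
# Conceptual sketch only: real TFS warehouse adapters are .NET classes;
# the source, warehouse, and field names below are hypothetical.

from dataclasses import dataclass
from datetime import date
from typing import Iterable, List


@dataclass
class CheckinFact:
    """One row for a hypothetical 'code churn' fact table."""
    day: date               # dimension: date/time
    component: str          # dimension: project area / module
    author: str             # dimension: team member
    lines_changed: int      # measure


class CodeChurnAdapter:
    """Reads change sets from a source system and emits warehouse facts."""

    def __init__(self, source, warehouse):
        self.source = source          # e.g. a version-control client
        self.warehouse = warehouse    # e.g. a relational staging database

    def make_data_changes(self) -> None:
        """Read new operational records, reshape them into facts,
        and append them to the warehouse."""
        changesets: Iterable[dict] = self.source.changes_since_last_run()
        facts: List[CheckinFact] = [
            CheckinFact(
                day=c["date"],
                component=c["path"].split("/")[1],
                author=c["author"],
                lines_changed=c["lines_added"] + c["lines_deleted"],
            )
            for c in changesets
        ]
        self.warehouse.insert("FactCodeChurn", facts)
```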

The processed data in the OLAP database can then be analyzed using reports based on Reporting Services or Excel pivot tables.
This approach improved the quality of the source data by an order of magnitude, but most importantly it allows the data to be analyzed as a single whole, with a variety of filters and slices. Team Foundation Server 2010 already ships with several ready-made reports based on Reporting Services, and these reports are described in the documentation for the Agile and CMMI process templates. But far more interesting is what can be built on top of OLAP and Excel pivot tables.
Excel as a Swiss Army knife
Working with the project's analytical data in Excel is very simple. Just connect to the TFS cube as an external data source:

Typically, the OLAP server is the same server that hosts the TFS databases. Do not forget to grant access rights to the user account under which you connect to this server.

Then all that remains is to choose how the information should be displayed:

Now you can build your first pivot table based on TFS data.

The TFS analytical cube contains 15 fact tables. These hold the data that can be expressed in aggregated form: for example, the number of modified lines of code, the number of unit tests passed, the number of lines of code covered by unit tests (code coverage), and so on. This information can be sliced by more than 60 dimensions: for example, date/time, project areas, iterations, modules, team members, test categories, projects, work item fields, and so on.
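These facts and dimensions can also be queried outside of Excel by sending MDX to the cube from a script. A minimal sketch, assuming the third-party pyadomd package (a Python wrapper over the ADOMD.NET client, so it needs Windows with the ADOMD.NET library installed); the server name is made up, and the measure and dimension names should be checked against your own Tfs_Analysis cube:

```python
# Minimal sketch: running an MDX query against the TFS cube from Python
# via the third-party pyadomd package. Server name is hypothetical;
# verify the cube, measure, and dimension names against your warehouse.
from pyadomd import Pyadomd

CONN_STR = (
    "Provider=MSOLAP;"
    "Data Source=tfs-olap-server;"   # hypothetical server name
    "Catalog=Tfs_Analysis;"          # default TFS warehouse catalog
)

# Number of bug work items broken down by iteration.
MDX = """
SELECT
  { [Measures].[Work Item Count] } ON COLUMNS,
  NON EMPTY [Iteration].[Iteration Path].MEMBERS ON ROWS
FROM [Team System]
WHERE ( [Work Item].[System_WorkItemType].&[Bug] )
"""

with Pyadomd(CONN_STR) as conn:
    with conn.cursor().execute(MDX) as cur:
        for row in cur.fetchall():
            print(row)
```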

Connecting it all together
All these analytical slices and measurable parameters make it possible to build complex queries in literally "three clicks", or to take the ready-made samples and adjust them to your liking. With this information, the team and management can understand precisely what the current situation on the project is. For example, you can build a query that shows the project's components, the number of tests passed, and the rate at which changes are being made to those components. Such a report lets you analyze the situation through several sets of parameters at once. The point is that even if a component currently has only a small number of errors, a high rate of code changes can make it a candidate for closer attention from the testers.
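A toy sketch of that heuristic, with entirely made-up component names, numbers, and thresholds, just to show how "few bugs but high churn" can still raise a flag:

```python
# Toy illustration of the prioritization heuristic described above.
# All data and thresholds are invented for the example.

components = {
    # name: (open_bugs, tests_passed_ratio, changed_lines_last_week)
    "Billing":   (2,  0.98, 4200),
    "Reporting": (14, 0.91,  300),
    "Auth":      (1,  0.99,   50),
}

CHURN_THRESHOLD = 1000   # lines/week considered "high churn" (arbitrary)
BUG_THRESHOLD = 10       # open bugs considered "many" (arbitrary)

for name, (bugs, passed, churn) in components.items():
    # Flag a component if it has many open bugs OR is changing rapidly,
    # even when its current bug count is low.
    risky = bugs > BUG_THRESHOLD or churn > CHURN_THRESHOLD
    flag = "NEEDS TESTER ATTENTION" if risky else "ok"
    print(f"{name:10} bugs={bugs:3} passed={passed:.0%} "
          f"churn={churn:5} -> {flag}")
```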

Report Output Options
An additional interesting feature of Excel reports is their "self-sufficiency". Once you have built the report you need, you can save it and mail it to the interested parties, and whenever current data is needed, they simply open the file again and refresh it from the source. Another interesting scenario is publishing the Excel file to a document library on a SharePoint site with Excel Services enabled. This makes the charts and tables visible in HTML form, without requiring the user to have access to the TFS OLAP cube or the ability to connect to it.
The analytical capabilities of the OLAP data source can be used beyond Excel as well. Using standard SharePoint Portal Server tools, data from the TFS OLAP cube can be turned into KPIs and assembled into dashboards which, in the brief, concise form of "traffic lights", help you instantly assess the current state of the project. Examples of such indicators are thresholds on the number of errors in the project, the number of tasks still open by a given date, or the percentage of code covered by unit tests (code coverage).
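A minimal sketch of the traffic-light mapping itself, with illustrative thresholds (in the scenario above these would live in the SharePoint KPI definitions rather than in code):

```python
# Sketch of a KPI "traffic light": map a metric value against two
# thresholds to a red/yellow/green status. Thresholds are illustrative.

def traffic_light(value, green_limit, red_limit, higher_is_worse=True):
    """Return 'green', 'yellow' or 'red' for a KPI value."""
    if not higher_is_worse:
        # Flip the scale for metrics where bigger is better (e.g. coverage).
        value, green_limit, red_limit = -value, -green_limit, -red_limit
    if value <= green_limit:
        return "green"
    if value >= red_limit:
        return "red"
    return "yellow"

print(traffic_light(4, green_limit=5, red_limit=20))        # open bugs -> green
print(traffic_light(35, green_limit=10, red_limit=30))      # open tasks -> red
print(traffic_light(68, 80, 50, higher_is_worse=False))     # coverage % -> yellow
```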

I would also like to mention a new tool, Live Labs Pivot, a very interesting data visualization mechanism. There are already examples of its use as a tool for visualizing data from TFS data sources.
Conclusion
Forewarned is forearmed, as the proverb says. In building software systems, success depends more than ever on whether you can adequately assess the current situation on the project with the help of analytical tools, formal metrics, and reports. If you know what is happening, you can take adequate steps to prevent things from going wrong. With the tools of Team System 2010, assessing the current state of a project becomes a solvable task and lets you focus the team on the things that affect the success of the project as a whole.