OpenOffice.org Profiling Tools
To speed up the startup time of OpenOffice.org applications we have developed a set of tools for deriving profiling data from those applications. Our approach differs from that of other (commercial) tools in the following points:
The information that we gather from the applications covers larger blocks of code suitable for an overall analysis instead of logging every code line or every instruction.
The ability to write profiling information is compiled automatically into the code at (almost) every build.
Profiling data from different runs can be compared to each other, so that an automatic analysis can take place that informs you about speed-ups or slow-downs from version to version.
The profiling process can be divided into four steps:
Add macros for writing time stamps to your code. This of course requires a rough idea of which code parts are too slow and need to be optimized. Don't write too many time stamps: they incur a runtime penalty and, more importantly, the analysis is meant to work at a much higher level than individual code lines. This step has to be done only once, so the overhead of adding the macros should not be too high.
Run the office application you are interested in. This generates a log file that contains all time stamps. If you are interested in more than start-up or file-open times, use the scripting functionality of the office.
Post-process the log files. In this step you can filter out unwanted time stamps that would otherwise clutter the analysis results and obscure the parts you are interested in. The time stamps can also be augmented with additional information.
Analyze the log files and generate reports. Compare the new timings with those of previous runs; this shows whether a piece of code got slower or faster. Automatic notification of the responsible developers is also possible.
The classes, functions, and macros used for instrumenting the source code in the first step are ready for use. See the section Source Code for details.
A first set of Perl scripts for post-processing and for generating simple reports as Calc documents is available. Both post-processing and analysis will be extended in the near future. See the Tools section for details.
The classes and macros for writing the time stamps can be found in the sal project of the udk. They are defined in sal/rtl/source/logfile.cxx. To include them in your source code, use #include <rtl/logfile.h> or #include <rtl/logfile.hxx>, as described in the How-To (see below under Documentation).
The scripts for post-processing the log files can be found in the (sub) project tools/contrib/profiling. Note that you have to check them out by hand.
write-calc-doc.pl transforms a log file into a Calc document that contains two data sheets for every thread. The first sheet displays a pretty-printed version of the raw data. The second sheet contains a table with, for each function or other scope, the minimal, maximal, average, and total time as well as the number of calls.
There is a How-To that describes how to instrument the source code so that profiling time stamps are written to the log file.
This document describes the format of the time stamps written to the log file.
The mailing list for discussing matters concerning this profiling project is firstname.lastname@example.org.