

Christoph Bartels

Email: <christoph.bartels@desy.de>

This page is not finished yet, and heavy editing still has to be done!

Documentation of my PhD analyses

A full documentation of the results can be found in my thesis. The full source code of both analyses can be found on DESY dCache at:

/whatever/

Currently, all scripts also reside in

/group/pool/cbartels/ILD/thesisscripts/

Model-independent WIMP search

Analysis Walkthrough

The WIMP search is designed to be as model-independent as possible. This entails two important decisions:

Data Samples

The data samples used were produced by the ILD community for the detector optimisation effort in 2008. The fully reconstructed data files can be found in the International Linear Collider Simulations Database. The DST files are processed with the Marlin processor TreeFiller, which writes the most important event data into a ROOT tree structure. Most importantly, the MCParticles and ReconstructedParticle collections are written. Furthermore, the detection probability of high-energy electrons and photons in the forward calorimetry is calculated and stored.

Naming scheme for output files (example):

ILD00_n1n1aaa_Pe-1.0_Pp1.0Tree.root
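
For orientation, a minimal ROOT macro to inspect such a file could look like the sketch below. The tree and branch names ("EventTree", "nPhotons") are assumptions for illustration only; the actual names are defined by the TreeFiller processor.

#include "TFile.h"
#include "TTree.h"

// Minimal sketch for inspecting a TreeFiller output file.
// Tree and branch names are illustrative assumptions.
void InspectTree()
{
   TFile* f = TFile::Open("ILD00_n1n1aaa_Pe-1.0_Pp1.0Tree.root");
   TTree* tree = (TTree*)f->Get("EventTree");      // assumed tree name
   Int_t nPhotons = 0;
   tree->SetBranchAddress("nPhotons", &nPhotons);  // assumed branch name
   for (Long64_t i = 0; i < tree->GetEntries(); ++i) {
      tree->GetEntry(i);
      // ... work with the event data here ...
   }
   f->Close();
}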

The data flow is steered with .txt files located in the directory "steeringfiles". Depending on which step is to be performed, the corresponding scripts read the required steering file.
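
The format of these steering files is not documented here; purely as an illustration, such a file might list the input files to be processed, one per line:

ILD00_n1n1aaa_Pe-1.0_Pp1.0Tree.root
ILD00_n1n1aaa_Pe1.0_Pp-1.0Tree.root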

Post-reconstruction data processing

Two corrections have been applied to the simulated data samples. Both are linked to photons being uncharged: for charged particles, tracking information is used in the clustering stage to match particle momentum and energy depositions; without this information, the clustering is purely topological. This has two consequences: electromagnetic clusters from single incident photons are split, with each subcluster identified as an individual photon candidate, and energy is lost in the insensitive material of the cracks in the calorimeters.

Photon splitting

The photon splitting is recovered by iteratively merging photon candidates with a cone-based method. The cone opening angle has been optimised with respect to purity and efficiency. The optimisation algorithm can be found in FindMergeAngle.C.

The opening angle found this way is then used to merge photon candidates via the function MergeRecoPhotons() in the ROOT script PreprocessData.C.
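
As an aid to reading the code, the merging idea can be sketched as follows. This is a simplified illustration under assumed data structures (a plain vector of four-vectors), not the actual MergeRecoPhotons() implementation.

#include <vector>
#include "TLorentzVector.h"

// Simplified sketch of iterative cone-based photon merging.
// Names and data structures are assumed for illustration.
void MergePhotonCandidates(std::vector<TLorentzVector>& photons, double coneAngle)
{
   bool merged = true;
   while (merged) {                      // iterate until no pair merges
      merged = false;
      for (size_t i = 0; i < photons.size() && !merged; ++i) {
         for (size_t j = i + 1; j < photons.size() && !merged; ++j) {
            // merge if the two candidates lie within the optimised cone
            if (photons[i].Angle(photons[j].Vect()) < coneAngle) {
               photons[i] += photons[j];  // sum the four-momenta
               photons.erase(photons.begin() + j);
               merged = true;
            }
         }
      }
   }
}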

The routine adds an additional ROOT tree to the data files

ILD00_n1n1aaa_Pe-1.0_Pp1.0Tree.root

Energy calibration

After the merging procedure, the photon energies are recalibrated as a function of their polar angle. The calibration function is determined with the script CalibrationFunctions.C and subsequently applied to the photon-merged data samples with CalibrateData.C.
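
Schematically, the correction applied per photon could look like the following; the function and parameter names are placeholders, and the actual parametrisation comes from CalibrationFunctions.C.

#include "TF1.h"

// Sketch of a polar-angle dependent energy recalibration.
// The calibration curve 'calib' is a placeholder for the response
// parametrisation determined by CalibrationFunctions.C.
double CalibratePhotonEnergy(double rawE, double cosTheta, TF1* calib)
{
   // Dividing by the measured relative response at this polar angle
   // corrects the reconstructed energy back towards the true value.
   return rawE / calib->Eval(cosTheta);
}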

The calibrated data is written into a new set of ROOT files (mainly to keep file sizes within reasonable limits), e.g.

ILD00_Calibrated_n1n1aa_Pe1.0_Pp-1.0Tree.root

Event Selection

The routine Selection.C performs the event selection; a schematic sketch is given below. The following criteria are applied:

For the signal definition, at least one photon with

Then:

The output is a new set of ROOT files

ILD00_Selection_aaaa_Pe-1.0_Pp1.0Tree.root

each holding an additional tree with the properties of the selected photon.
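
To illustrate the structure of such a selection, a minimal sketch follows; the cut values below are purely illustrative assumptions, not the criteria actually applied in Selection.C.

#include <cmath>
#include <vector>
#include "TLorentzVector.h"

// Schematic single-photon selection. All cut values are assumed
// for illustration; the actual criteria live in Selection.C.
bool PassesSelection(const std::vector<TLorentzVector>& photons)
{
   for (size_t i = 0; i < photons.size(); ++i) {
      const TLorentzVector& p = photons[i];
      // assumed example cuts: minimum energy and angular acceptance
      if (p.E() > 10.0 && std::fabs(std::cos(p.Theta())) < 0.98)
         return true;   // at least one photon passes
   }
   return false;
}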

Next, the

Signal Generation

The selected event files are reweighted with the scripts CreateDataForSignalAndBackground.C and CreateDataForSignalAndBackgroundEqualSign.C. Both scripts have to be compiled for ROOT by calling, e.g.,

.L CreateDataForSignalAndBackground.C++

on the ROOT command line. They use the Fortran library libnunugamma_DiffXSec.so written by O. Kittel. The library has to be compiled in advance from the file nunugamma_DiffXSec.f.
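
A typical way to build and load such a library might look as follows; the exact compiler flags are an assumption, since the build command is not documented here:

gfortran -shared -fPIC -o libnunugamma_DiffXSec.so nunugamma_DiffXSec.f

Then, on the ROOT command line, the library can be loaded before compiling the scripts:

root [0] gSystem->Load("libnunugamma_DiffXSec.so");
root [1] .L CreateDataForSignalAndBackground.C++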

Theoretical Signal and Background predictions

For the creation of the theoretical expectations for signal and background, the following routines are used:

Analysis

Helicity structure and signal cross section

The scripts CouplingsXsecInputPp30.C and CouplingsXsecInputPp60.C extract the fully polarised cross sections sigma_LR and sigma_RL from the simulated data and combine the results into the unpolarised cross section sigma_0. Both scripts require Systematics_functions.C, which provides functions to calculate systematic errors. To call the script CouplingsXsecInputPp30.C, first set the polarisation error dP_P in the header, then call the function with, in this order: the experimental luminosity, the candidate mass, 0 or 1 for s- or p-wave production, the unpolarised cross section, and the coupling scenario (0 = equal, 1 = helicity, 2 = anti-SM).
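
Put together, a call on the ROOT command line might look like the following; all numerical values are illustrative placeholders (e.g. a luminosity in fb^-1 and a mass in GeV), and depending on how the script includes Systematics_functions.C, loading it first may or may not be required:

root [0] .L Systematics_functions.C
root [1] .L CouplingsXsecInputPp30.C
root [2] CouplingsXsecInputPp30(500., 180., 0, 0.1, 1)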

The routines create plots of sigma_RL and also print information on the measured values. For the neutralino part of the analysis, there is the script NeutralinoCouplingsXsecInputPp30.C.

Mass determination and partial wave

The basic script to be called is WIMPMassScanXsecInput.C with the parameters, in this order: the experimental luminosity, Pe, Pp, the candidate mass, 0 or 1 for s- or p-wave production, the coupling scenario (0 = equal, 1 = helicity, 2 = anti-SM), the unpolarised cross section, whether to use only the edge of the signal spectrum (0 or 1), and a rebinning factor of N bins for nicer plots. The script stores the determined chi2 values and fitted masses in a MassScan_Lumi....root file. To simplify the procedure, the Python script RunWIMPMassScanSeriesXSecInput.py loops over all parameters.
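
For a single mass point, an invocation might look like this; all numerical values are illustrative placeholders:

root [0] .L WIMPMassScanXsecInput.C
root [1] WIMPMassScanXsecInput(500., -0.8, 0.3, 180., 0, 1, 0.1, 0, 4)

or, for the full scan:

python RunWIMPMassScanSeriesXSecInput.py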

The final results are obtained with the scripts ChiMin.C, FitMassVsTrueMass.C, and RelativeErrorMassDetermination.C, all of which need the results stored in the MassScan...root files.

Again, the neutralinos have their own script, NeutralinoMassScanXsecInput.C.

Cherenkov Detector Prototype

Simulation and simulated data
