Christoph Bartels and Andrii Chaus
This page is not finished yet, and heavy editing has to be done!!
Documentation of my PhD analyses
Currently, all scripts also reside in
Model independent WIMP search
The WIMP search is designed to be as model independent as possible. This entails two important decisions:
Use a model independent signal description. The "model" used can be found in the paper by Birkedal et al.
The signal contribution to the data is not explicitly generated and simulated, but obtained by reweighting events after the selection. This is possible because the dominant background process is indiscernible from the signal on an event-by-event basis.
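The reweighting idea can be sketched as follows. This is a minimal illustration, not the actual analysis code: the two differential cross sections below are placeholder formulas (in the analysis they come from the Fortran library libnunugamma_DiffXSec.so), and the event structure is hypothetical. Each selected neutrino-background event is assigned a weight equal to the ratio of the signal to the background differential cross section at the event's photon kinematics.

```cpp
#include <cmath>
#include <vector>

// Hypothetical stand-ins for the real differential cross sections
// (in the analysis these come from libnunugamma_DiffXSec.so).
double dsigma_signal(double x, double cosTheta)     { return (1.0 - x) * (1.0 + cosTheta * cosTheta); }
double dsigma_background(double x, double cosTheta) { return 1.0 / (x * (1.0 - cosTheta * cosTheta) + 0.1); }

// Photon kinematics of a selected event: x = E_gamma / E_beam, cos(theta).
struct PhotonEvent { double x; double cosTheta; };

// Reweight selected background events to model the signal contribution:
// w_i = dsigma_sig(x_i, cosTheta_i) / dsigma_bkg(x_i, cosTheta_i)
std::vector<double> reweight(const std::vector<PhotonEvent>& events) {
    std::vector<double> weights;
    weights.reserve(events.size());
    for (const auto& ev : events)
        weights.push_back(dsigma_signal(ev.x, ev.cosTheta) /
                          dsigma_background(ev.x, ev.cosTheta));
    return weights;
}
```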
- thesisscripts: holds the scripts, of course
- steeringfiles: self-explanatory
  - TreeFiller: steering files for the conversion from DST-slcio to Root data structures, see below
- results: holds all final and intermediate Root files (a lot of them, but usually not needed by the user)
- plots: should be clear
- data: top level directory for the data files:
  - Preprocessed: initial RootFiles from the SLCIO conversion. These files are expanded with an additional tree in the photon merging, see below
  - Calibrated: new set of RootFiles with calibrated photon energies
  - Selection: results of feeding the calibrated data through the event selection
  - Final: the selected files are split into three independent sets (Data, Signal, Background, see below)
  - Template: holds the polarisation mixed background and signal files, normalised to L = 50 fb^-1. The parametrised spectra for signal and background are also stored here
  - Auxiliary: location of mostly intermediate files for the parametrisation, for example the theoretical spectra at tree level that have to be corrected for efficiencies, detector effects, etc.
The data samples used were produced by the ILD community for the detector optimisation effort in 2008. The fully reconstructed data files can be found in the International Linear Collider Simulations Database. The DST-files are processed with the Marlin processor TreeFiller, which writes the most important event data into a Root tree structure. Most importantly, the MCParticles and ReconstructedParticle collections are written. Furthermore, the detection probability of high energy electrons and photons in the forward calorimetry is calculated and stored.
Naming scheme for output files (example):
The data flow is steered with .txt files located in the directory "steeringfiles". Depending on which step is to be performed, the corresponding scripts will read the required steeringfile.
Post-reconstruction data processing
Two corrections have been performed on the simulated data samples; both are linked to photons being uncharged. For charged particles, tracking information is used in the clustering stage to match particle momentum and energy depositions; without this information, the clustering is purely topological. This results in split electromagnetic clusters from single incident photons, each subcluster being identified as an individual photon candidate, and in energy lost in the insensitive material of the cracks in the calorimeters.
The photon splitting is recovered by an iterative merging of photon candidates with a cone-based method. The cone opening angle has been optimised with respect to purity and efficiency; the optimisation algorithm can be found in FindMergeAngle.C.
The resulting opening angle is then used to merge photon candidates via the function MergeRecoPhotons() in the Root script PreprocessData.C. The routine adds an additional RootTree to the data files.
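A simplified sketch of such an iterative cone merge is given below. This is hypothetical illustration code, not MergeRecoPhotons() itself: candidates whose pairwise opening angle is below the optimised cone angle have their momenta added, starting from the most energetic candidate, and the procedure repeats until no pair qualifies.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// A photon candidate as a massless 3-momentum.
struct Photon {
    double px, py, pz;
    double e() const { return std::sqrt(px*px + py*py + pz*pz); }
};

// Opening angle between two candidates, in radians.
double openingAngle(const Photon& a, const Photon& b) {
    double dot = a.px*b.px + a.py*b.py + a.pz*b.pz;
    return std::acos(std::max(-1.0, std::min(1.0, dot / (a.e() * b.e()))));
}

// Iteratively merge candidates closer than coneAngle (radians),
// starting from the most energetic one, until no pair qualifies.
std::vector<Photon> mergePhotons(std::vector<Photon> cands, double coneAngle) {
    bool merged = true;
    while (merged) {
        merged = false;
        std::sort(cands.begin(), cands.end(),
                  [](const Photon& a, const Photon& b) { return a.e() > b.e(); });
        for (std::size_t i = 0; i < cands.size() && !merged; ++i)
            for (std::size_t j = i + 1; j < cands.size() && !merged; ++j)
                if (openingAngle(cands[i], cands[j]) < coneAngle) {
                    cands[i].px += cands[j].px;  // absorb the softer candidate
                    cands[i].py += cands[j].py;
                    cands[i].pz += cands[j].pz;
                    cands.erase(cands.begin() + j);
                    merged = true;
                }
    }
    return cands;
}
```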
After the merging procedure, the photon energies are recalibrated as a function of their polar angle. The calibration function is determined with the script CalibrationFunctions.C and subsequently applied to the photon-merged data samples with CalibrateData.C.
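Applying such a calibration can be sketched like this. The response function below is purely illustrative (the real one is fitted from simulation in CalibrationFunctions.C): each measured photon energy is divided by a polar-angle-dependent detector response.

```cpp
#include <cmath>

// Hypothetical detector response vs. cos(theta): slightly below unity,
// with extra loss near a calorimeter crack (the real function is
// determined from simulation in CalibrationFunctions.C).
double response(double cosTheta) {
    double r = 0.97;
    if (std::fabs(std::fabs(cosTheta) - 0.78) < 0.02) r -= 0.05; // crack region
    return r;
}

// Recalibrate a measured photon energy as a function of polar angle.
double calibrate(double eMeasured, double cosTheta) {
    return eMeasured / response(cosTheta);
}
```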
The calibrated data is written into a new set of RootFiles (mostly to keep file sizes within reasonable limits), e.g.
The routine Selection.C performs the event selection. The following criteria are applied:
- Signal definition: at least one photon with 10 GeV < E < 220 GeV and -0.98 < cos(Theta) < 0.98
- Maximal visible energy excluding the most energetic photon: E_vis < 20.0 GeV
- Maximal track p_T < 3.0 GeV
- Rejection of high energy electrons tagged in the forward calorimetry.
The output is a new set of RootFiles
each holding an additional Tree with the properties of the selected photon.
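The cuts above can be condensed into a single predicate, sketched below. This is a simplified illustration of the selection logic, not Selection.C itself; the event fields are hypothetical names.

```cpp
#include <cmath>
#include <vector>

// Hypothetical per-event quantities needed by the selection.
struct Event {
    std::vector<double> photonE;        // photon energies [GeV]
    std::vector<double> photonCosTheta; // matching polar angles
    double visibleEnergyRest;           // E_vis excluding the hardest photon [GeV]
    double maxTrackPt;                  // hardest track p_T [GeV]
    bool   forwardElectronTag;          // high energy e tagged in forward calorimetry
};

bool passesSelection(const Event& ev) {
    // At least one photon inside the signal definition
    bool hasSignalPhoton = false;
    for (std::size_t i = 0; i < ev.photonE.size(); ++i)
        if (ev.photonE[i] > 10.0 && ev.photonE[i] < 220.0 &&
            std::fabs(ev.photonCosTheta[i]) < 0.98)
            hasSignalPhoton = true;
    return hasSignalPhoton
        && ev.visibleEnergyRest < 20.0   // veto additional visible activity
        && ev.maxTrackPt < 3.0           // veto hard tracks
        && !ev.forwardElectronTag;       // veto tagged forward electrons
}
```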
Next, the selected event files are split into three sets of independent data files using the routine Create_S_B_D_Files.C. For each Selection root file, three new files are created: for the signal contribution to the data (Signal), the background contribution (Data), and for the spectrum parametrisation (Background). The files are named accordingly:
Signal Generation and Polarisation Mixing
Before the parameter determination, the signal contribution to the data for all WIMP parameters and polarisation states is obtained from the ILD00_Signal_n1n1a_Pe-1.0_Pp1.0Tree_1.root files. The cross section normalization of the signal is done later during analysis.
on the root command line. They use the Fortran library libnunugamma_DiffXSec.so coded by O. Kittel. The library has to be compiled in advance from the file nunugamma_DiffXSec.f
Output files are named as:
Also, with the script CreateDataBackground.C, the polarisation mixed Background contribution to the data is obtained.
Theoretical Signal and Background predictions
For the creation of the theory expectations of signal and background, the following routines are used:
- BackgroundToParametrise.C creates the background spectra to be parametrised
- ParametriseBackground.C does the parametrisation with a seventh-order polynomial. In effect, it only corrects for the remaining differences between the output of CorrectBackgroundXSec.C and the data spectra to be parametrised.
- CreateSignalExpectation.C finally generates the expected signal contribution from the parametrised background samples, in a similar way as the signal in the data.
Helicity structure and signal cross section
The scripts CouplingsXsecInputPp30.C and CouplingsXsecInputPp60.C extract the fully polarised cross sections sigma_LR and sigma_RL from the simulated data and combine the results into the unpolarised cross section sigma_0. Both scripts require Systematics_functions.C, which provides functions to calculate systematic errors. To call the script CouplingsXsecInputPp30.C, first decide in the header on the polarisation error dP_P, then call the function with, in this order: (experimental luminosity, candidate mass, 0 or 1 for s- or p-wave production, the unpolarised cross section, the coupling scenario equal/helicity/anti-SM (0, 1, 2))
The routines create some nice plots for sigma_RL and furthermore give information on the measured values. For the neutralino part of the analysis, there exists the script NeutralinoCouplingsXsecInputPp30.C
Mass determination and partial wave
The basic script to be called is WIMPMassScanXsecInput.C with the parameters (experimental luminosity, Pe, Pp, candidate mass, 0 or 1 for s- or p-wave production, the coupling scenario equal/helicity/anti-SM (0, 1, 2), the unpolarised cross section, use only the signal spectrum edge (0 or 1), rebin N bins for nicer plots). The script stores the determined chi2 values and fitted masses in a MassScan_Lumi....root file. To simplify the procedure, the python script RunWIMPMassScanSeriesXSecInput.py loops over all parameters.
Again, the neutralinos have their own script NeutralinoMassScanXsecInput.C
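The core of such a mass scan is a template chi2 comparison, sketched schematically below. The spectrum model here is a toy stand-in (a flat spectrum with a mass-dependent kinematic edge), not the real parametrised signal-plus-background spectra: for each candidate mass the predicted photon energy spectrum is compared to the data spectrum, and the chi2 minimum gives the fitted mass.

```cpp
#include <algorithm>
#include <vector>

// Toy spectrum: flat up to a kinematic edge that moves with the WIMP
// mass (stand-in for the real parametrised spectra).
std::vector<double> toySpectrum(double mass, int nBins) {
    std::vector<double> s(nBins, 0.0);
    int edge = static_cast<int>(nBins * (1.0 - mass / 250.0)); // toy edge position
    for (int i = 0; i < edge && i < nBins; ++i) s[i] = 100.0;
    return s;
}

// Binned chi2 with Poisson-like errors.
double chi2(const std::vector<double>& data, const std::vector<double>& model) {
    double c = 0.0;
    for (std::size_t i = 0; i < data.size(); ++i) {
        double err2 = std::max(model[i], 1.0);
        c += (data[i] - model[i]) * (data[i] - model[i]) / err2;
    }
    return c;
}

// Scan candidate masses and return the one minimising the chi2.
double massScan(const std::vector<double>& data, int nBins) {
    double best = -1.0, bestChi2 = 1e30;
    for (double m = 50.0; m <= 200.0; m += 10.0) {
        double c = chi2(data, toySpectrum(m, nBins));
        if (c < bestChi2) { bestChi2 = c; best = m; }
    }
    return best;
}
```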
Cherenkov Detector Prototype
Simulation and simulated data
Sensitivity reach for ILC
Currently, all scripts are located in
Simulation data for operator approach
The selected event files are reweighted with the script CreateDataForSignalAndBackground.C. This script has to be compiled for Root by calling, e.g.,
on the root command line. Alternatively, run the NAF script CreateDataForSignalAndBackground.py.
For different operators you need to change the parameter "k":
int k = 0;    // Vector
//int k = 1;  // Scalar, s-channel
//int k = 2;  // Axial-vector
In addition, the cross section formula for the corresponding operator has to be selected in the function "WIMPdsigmadx_new" in the file WIMP_functions.C.
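Selecting the operator by commenting lines in and out can be made less error-prone with a switch over k, as sketched below. The three formulas are placeholders, not the real expressions from WIMPdsigmadx_new in WIMP_functions.C.

```cpp
#include <stdexcept>

enum Operator { kVector = 0, kScalarSChannel = 1, kAxialVector = 2 };

// Placeholder differential cross sections per operator; the real
// expressions live in WIMPdsigmadx_new in WIMP_functions.C.
double WIMPdsigmadx(int k, double x, double cosTheta) {
    switch (k) {
        case kVector:         return (1.0 - x) * (1.0 + cosTheta * cosTheta);
        case kScalarSChannel: return x * (1.0 - cosTheta * cosTheta);
        case kAxialVector:    return (1.0 - x) * (1.0 - cosTheta * cosTheta);
        default: throw std::invalid_argument("unknown operator k");
    }
}
```

This way an invalid k fails loudly instead of silently using a stale commented-out value.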
Output files are named as:
Also, with the script CreateDataBackground.C, the polarization mixed Background contribution to the data is obtained.
The ntuples can be found in
Each folder is named like Lumi250_Vector_full, giving the luminosity and the name of the operator.
There are two background folders, Background_500 and Background_2000, where the number indicates the luminosity for which the background was generated.
For the sensitivity calculation, the TSysLimit package is used (Sensitivity_3Sigma or Sensitivity_90percent_CL folders).
TSysLimit_searches.C implements the sensitivity reach calculation, including systematic errors, for one of the three operators (Vector, Axial-vector, or Scalar s-channel) and a single WIMP mass.
To compile TSysLimit_searches.C, the command "make TSysLimit" is used.
Output files are named as:
and saved to the "Result" folder.
The final plot can be obtained with the hadd command once all jobs are finished.
These steps apply when the shape information is taken into account.
For the "one bin" calculation, TSysLimit_searches_onebin.C is used.