Christoph Bartels and Andrii Chaus

Email: <christoph.bartels@desy.de>

This page is not finished yet, and heavy editing has to be done!!

Documentation of my PhD analyses

A full documentation of the results can be read in my thesis. The full source code of both analyses can be found on DESY dCache at:

/pnfs/desy.de/ilc/users/cbartels/

Currently, all scripts also reside in

/group/pool/cbartels/ILD/thesisscripts/

Model independent WIMP search

Analysis Walkthrough

The WIMP search is designed to be as model independent as possible. This entails two important decisions:

  • Use a model independent signal description. The "model" used can be found in the paper of Birkedal et al.

  • The signal contribution to the data is not explicitly generated and simulated, but obtained by reweighting events after the selection. This is possible because the dominant background process is indiscernible from the signal on an event-by-event basis.
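The reweighting idea can be sketched as follows: each selected background event receives a weight given by the ratio of the signal to the background differential cross section at the event's photon kinematics. The function names and shapes below are toy placeholders (in the analysis the real cross sections come from libnunugamma_DiffXSec.so), so this is an illustration of the technique only, not the analysis code.

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Toy stand-ins for the true differential cross sections; in the
// analysis these are provided by libnunugamma_DiffXSec.so.
double dsigma_signal(double x, double cosTheta) {
    return (1.0 - x) * (1.0 + cosTheta * cosTheta);   // placeholder shape
}
double dsigma_background(double x, double cosTheta) {
    return 1.0 + cosTheta * cosTheta;                  // placeholder shape
}

// Per-event weight: ratio of signal to background differential cross
// sections at the event's photon kinematics (x = E_gamma / E_beam).
double signalWeight(double x, double cosTheta) {
    return dsigma_signal(x, cosTheta) / dsigma_background(x, cosTheta);
}

// Reweight a sample of selected (x, cosTheta) pairs into a signal yield.
double weightedYield(const std::vector<std::pair<double, double> >& events) {
    double sum = 0.0;
    for (size_t i = 0; i < events.size(); ++i)
        sum += signalWeight(events[i].first, events[i].second);
    return sum;
}
```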

Directory structure:

  • thesisscripts: holds the scripts, of course
  • steeringfiles: self-explanatory
    • TreeFiller: steering files for the conversion from DST-slcio to Root Data structures, see below

  • results: holds all final and intermediate Root files (a lot of them, but usually not needed for the user)
  • plots: should be clear
  • data: top level directory for the data files:
    • Preprocessed: initial RootFiles from SLCIO conversion. These files are expanded with an additional tree in the photon merging, see below

    • Calibrated : new set of RootFiles with calibrated photon energies

    • Selection: Results of feeding the calibrated data through the event selection
    • Final: The selected files are split into three independent sets, (Data, Signal, Background, see below)
    • Template: holds the polarisation-mixed background and signal files, normalised to L = 50 fb⁻¹.
      • Also the parametrized spectra for signal and background are stored here
    • Auxiliary: Location of mostly intermediate files for the parametrization, for example the theoretical spectra on tree level that have to be corrected for efficiencies, detector effects etc.

Data Samples

The data samples used were produced by the ILD community for the detector optimisation effort in 2008. The fully reconstructed data files can be found in the International Linear Collider Simulations Database. The DST-files are processed with the Marlin processor TreeFiller, which writes the most important event data into a Root tree structure. Most importantly, the MCParticles and ReconstructedParticle collections are written. Furthermore, the detection probability of high energy electrons and photons in the forward calorimetry is calculated and stored.

Naming scheme for output files (example):

ILD00_n1n1aaa_Pe-1.0_Pp1.0Tree.root

The data flow is steered with .txt files located in the directory "steeringfiles". Depending on which step is to be performed, the corresponding scripts will read the required steeringfile.

Post-reconstruction data processing

Two corrections have been performed on the simulated data samples. Both corrections are linked to photons being uncharged. For charged particles, tracking information is used in the clustering stage to match particle momentum and energy depositions; without this information, the clustering is purely topological. This results in split electromagnetic clusters from single incident photons, with each subcluster identified as an individual photon candidate, and in energy lost in the insensitive material of the cracks in the calorimeters.

Photon splitting

Split photons are recovered by iteratively merging photon candidates with a cone-based method. The cone opening angle has been optimised with respect to purity and efficiency. The optimisation algorithm can be found in FindMergeAngle.C.

The opening angle found this way is then used to merge photon candidates via the function MergeRecoPhotons() in the Root script PreprocessData.C

The routine adds an additional RootTree to the data files

ILD00_n1n1aaa_Pe-1.0_Pp1.0Tree.root
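The merging logic of MergeRecoPhotons() is not reproduced here; the following is only a sketch of the general iterative cone-merging idea (the data layout and the threshold value are illustrative assumptions):

```cpp
#include <cmath>
#include <vector>

struct Photon { double px, py, pz, E; };

// Opening angle (radians) between two photon candidates.
double openingAngle(const Photon& a, const Photon& b) {
    double dot = a.px * b.px + a.py * b.py + a.pz * b.pz;
    double na = std::sqrt(a.px * a.px + a.py * a.py + a.pz * a.pz);
    double nb = std::sqrt(b.px * b.px + b.py * b.py + b.pz * b.pz);
    return std::acos(dot / (na * nb));
}

// Iteratively merge candidate pairs closer than mergeAngle,
// summing their four-momenta, until no pair lies within the cone.
std::vector<Photon> mergePhotons(std::vector<Photon> cands, double mergeAngle) {
    bool merged = true;
    while (merged) {
        merged = false;
        for (size_t i = 0; i < cands.size() && !merged; ++i) {
            for (size_t j = i + 1; j < cands.size() && !merged; ++j) {
                if (openingAngle(cands[i], cands[j]) < mergeAngle) {
                    cands[i].px += cands[j].px; cands[i].py += cands[j].py;
                    cands[i].pz += cands[j].pz; cands[i].E  += cands[j].E;
                    cands.erase(cands.begin() + j);
                    merged = true;
                }
            }
        }
    }
    return cands;
}
```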

Energy calibration

After the merging procedure, the photon energies are recalibrated as a function of their polar angle. The calibration function is determined with the script CalibrationFunctions.C and subsequently applied to the photon-merged data samples with CalibrateData.C

The calibrated data is written into a new set of RootFiles (mostly for keeping filesize in reasonable limits), e.g.

ILD00_Calibrated_n1n1aa_Pe1.0_Pp-1.0Tree.root
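Conceptually, the calibration is a polar-angle dependent correction factor applied to each photon's energy. The polynomial form and the coefficients below are invented placeholders; the real function is produced by CalibrationFunctions.C:

```cpp
#include <cmath>

// Hypothetical calibration: correction factor as a polynomial in
// |cos(theta)|. The real coefficients are fitted by CalibrationFunctions.C.
double calibrationFactor(double cosTheta) {
    double c = std::fabs(cosTheta);
    return 1.02 + 0.05 * c * c;   // placeholder coefficients
}

// Apply the correction to a reconstructed photon energy.
double calibratedEnergy(double rawE, double cosTheta) {
    return rawE * calibrationFactor(cosTheta);
}
```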

Event Selection

The routine Selection.C performs the event selection. The following criteria are applied:

For the signal definition at least one photon with

  • 10 GeV < E < 220 GeV and

  • -0.98 < cos( Theta ) < 0.98

Then:

  • Maximal visible energy excluding the most energetic photon E_vis < 20.0 GeV

  • Maximal track p_T < 3.0 GeV

  • Rejection of high energy electrons tagged in the forward calorimetry.

The output is a new set of RootFiles

ILD00_Selection_aaaa_Pe-1.0_Pp1.0Tree.root

each holding an additional Tree with the properties of the selected photon.
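The cuts listed above can be sketched as a single predicate. The struct and variable names are illustrative only; the actual implementation lives in Selection.C:

```cpp
// Minimal event summary for the selection sketch (names are assumptions).
struct Event {
    double maxPhotonE;         // most energetic photon energy [GeV]
    double maxPhotonCosTheta;  // its polar angle
    double visibleE;           // visible energy excluding that photon [GeV]
    double maxTrackPt;         // hardest track transverse momentum [GeV]
    bool   forwardElectronTag; // high-energy electron tagged in fwd calorimetry
};

// Apply the selection cuts from the walkthrough above.
bool passesSelection(const Event& ev) {
    if (ev.maxPhotonE <= 10.0 || ev.maxPhotonE >= 220.0) return false;
    if (ev.maxPhotonCosTheta <= -0.98 || ev.maxPhotonCosTheta >= 0.98) return false;
    if (ev.visibleE >= 20.0) return false;     // veto additional visible energy
    if (ev.maxTrackPt >= 3.0) return false;    // veto hard tracks
    if (ev.forwardElectronTag) return false;   // veto tagged forward electrons
    return true;
}
```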

Next, the selected event files are split into three sets of independent data files using the routine Create_S_B_D_Files.C. For each Selection Root file, three new files are created: one for the signal contribution to the data (Signal), one for the background contribution (Data), and one for the spectrum parametrization (Background). The files are named accordingly:

ILD00_Signal_n1n1a_Pe-1.0_Pp1.0Tree_1.root

ILD00_Data_n1n1a_Pe1.0_Pp-1.0Tree.root

ILD00_Background_e1e1_Pe1.0_Pp1.0Tree.root

Signal Generation and Polarisation Mixing

Before the parameter determination, the signal contribution to the data for all WIMP parameters and polarisation states is obtained from the ILD00_Signal_n1n1a_Pe-1.0_Pp1.0Tree_1.root files. The cross section normalization of the signal is done later during analysis.

The selected event files are reweighted with the scripts CreateDataForSignalAndBackground.C and CreateDataForSignalAndBackgroundEqualSign.C. Both scripts have to be compiled in ROOT by calling, e.g.,

.L CreateDataForSignalAndBackground.C++

on the ROOT command line. They use the Fortran library libnunugamma_DiffXSec.so coded by O. Kittel. The library has to be compiled in advance from the file nunugamma_DiffXSec.f.

Output files are named as:

1D_Data_Neutralino_Pe-0.8_Pp0.0_Lumi50.root

Also, with the script CreateDataBackground.C, the polarisation-mixed background contribution to the data is obtained.

Output files:

Data_Background_Pe-0.8_Pp-0.3_Lumi50.root

Theoretical Signal and Background predictions

For the creation of the theoretical expectations for signal and background, the following routines are used

Analysis

Helicity structure and signal cross section

The scripts CouplingsXsecInputPp30.C and CouplingsXsecInputPp60.C extract the fully polarised cross sections sigma_LR and sigma_RL from the simulated data and combine the results into the unpolarised cross section sigma_0. Both scripts require Systematics_functions.C, which provides functions to calculate systematic errors. To call the script CouplingsXsecInputPp30.C, first set the polarisation error dP_P in the header, then call the function with, in this order: experimental luminosity, candidate mass, 0 or 1 for s- or p-wave production, the unpolarised cross section, and the coupling scenario equal/helicity/anti-SM (0, 1, 2).

The routines create plots for sigma_RL and furthermore give information on the measured values. For the neutralino part of the analysis, there exists the script NeutralinoCouplingsXsecInputPp30.C
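The combination of the fully polarised cross sections into the cross section for arbitrary beam polarisations follows the standard polarisation-weighting formula (for the dominant W-exchange neutrino process the equal-helicity terms sigma_LL and sigma_RR vanish, but they are kept here for generality):

```cpp
#include <cmath>

// Standard combination of fully polarised cross sections;
// Pe, Pp are the electron/positron beam polarisations in [-1, 1].
double polarisedXsec(double Pe, double Pp,
                     double sLR, double sRL, double sLL, double sRR) {
    return 0.25 * ( (1.0 - Pe) * (1.0 + Pp) * sLR
                  + (1.0 + Pe) * (1.0 - Pp) * sRL
                  + (1.0 - Pe) * (1.0 - Pp) * sLL
                  + (1.0 + Pe) * (1.0 + Pp) * sRR );
}

// Unpolarised cross section: average over the four helicity combinations.
double unpolarisedXsec(double sLR, double sRL, double sLL, double sRR) {
    return 0.25 * (sLR + sRL + sLL + sRR);
}
```

For unpolarised beams (Pe = Pp = 0) the first function reduces to the second, and for fully polarised beams (Pe = -1, Pp = +1) it returns sigma_LR alone.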

Mass determination and partial wave

The basic script to be called is WIMPMassScanXsecInput.C with the parameters (experimental luminosity, Pe, Pp, candidate mass, 0 or 1 for s- or p-wave production, the coupling scenario equal/helicity/anti-SM (0, 1, 2), the unpolarised cross section, use only the signal spectrum edge (0 or 1), rebin N bins for nicer plots). The script stores the determined chi2 values and fitted masses in a MassScan_Lumi....root file. To simplify the procedure, the Python script RunWIMPMassScanSeriesXSecInput.py loops over all parameters.
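Conceptually, the mass scan compares the measured photon energy spectrum to template spectra for a grid of candidate masses and picks the mass with the smallest chi2. The binning and the error model below are simplified assumptions, not the actual implementation:

```cpp
#include <cmath>
#include <vector>

// Chi2 between a measured spectrum and one mass template, with
// Poisson-like bin errors sigma_i^2 = data_i (simplified error model).
double chi2(const std::vector<double>& data,
            const std::vector<double>& tmpl) {
    double c = 0.0;
    for (size_t i = 0; i < data.size(); ++i) {
        if (data[i] <= 0.0) continue;     // skip empty bins
        double d = data[i] - tmpl[i];
        c += d * d / data[i];
    }
    return c;
}

// Scan all templates and return the index of the best-fitting mass.
size_t bestMassIndex(const std::vector<double>& data,
                     const std::vector<std::vector<double> >& templates) {
    size_t best = 0;
    double bestChi2 = chi2(data, templates[0]);
    for (size_t m = 1; m < templates.size(); ++m) {
        double c = chi2(data, templates[m]);
        if (c < bestChi2) { bestChi2 = c; best = m; }
    }
    return best;
}
```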

The final results are obtained with the scripts ChiMin.C, FitMassVsTrueMass.C, and RelativeErrorMassDetermination.C, all of which need the results stored in the MassScan...root files.

Again, the neutralinos have their own script NeutralinoMassScanXsecInput.C

Cherenkov Detector Prototype

Simulation and simulated data



Andrii Chaus

Sensitivity reach for ILC

Currently, all scripts are located in

/afs/desy.de/group/flc/pool/achaus/FINAL_script_and_data/scripts

Simulation data for operator approach

The selected event files are reweighted with the script CreateDataForSignalAndBackground.C. This script has to be compiled in ROOT by calling, e.g.,

.L CreateDataForSignalAndBackground.C++

on the ROOT command line, or by running the script CreateDataForSignalAndBackground.py on the NAF.

For a different operator you need to change the parameter "k" in the script:

int k = 0;     // Vector
//int k = 1;   // Scalar, s-channel
//int k = 2;   // Axial-vector

  • In addition, the cross section formula for the corresponding operator needs to be selected in the function "WIMPdsigmadx_new" in the file WIMP_functions.C.
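The role of the k flag can be sketched as a switch over the operator type. The functional forms below are placeholders only; the real operator cross sections live in WIMPdsigmadx_new in WIMP_functions.C:

```cpp
#include <stdexcept>

// Operator selection mirroring the k flag in
// CreateDataForSignalAndBackground.C.
enum Operator { kVector = 0, kScalarSChannel = 1, kAxialVector = 2 };

// Placeholder spectra; the real expressions are in WIMP_functions.C.
double dsigmadx(int k, double x) {
    switch (k) {
        case kVector:         return 1.0 - x;               // placeholder
        case kScalarSChannel: return (1.0 - x) * (1.0 - x); // placeholder
        case kAxialVector:    return x * (1.0 - x);         // placeholder
        default: throw std::invalid_argument("unknown operator k");
    }
}
```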

Output files are named as:

1D_Data_Signal_Pe0.0_Pp0.0_Lumi500.root

Also, with the script CreateDataBackground.C, the polarization mixed Background contribution to the data is obtained.

Sensitivity calculation

For the sensitivity calculation, the TSysLimit package is used.

TSysLimit_searches.C implements the sensitivity reach calculation for the three operators (Vector, Axial-vector, Scalar s-channel), including systematic errors, for one WIMP mass.

To compile TSysLimit_searches.C, the command "make TSysLimit" is used.
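The internals of TSysLimit are not documented here. As a rough illustration of the underlying idea, a simple one-bin 90% CL counting-experiment limit (without systematics) finds the smallest signal yield s such that the Poisson probability of observing at most n_obs events, given background b plus signal s, drops below 10%:

```cpp
#include <cmath>

// Cumulative Poisson probability P(N <= n | mu).
double poissonCDF(int n, double mu) {
    double term = std::exp(-mu), sum = term;
    for (int k = 1; k <= n; ++k) {
        term *= mu / k;
        sum += term;
    }
    return sum;
}

// Smallest signal yield excluded at the given CL for a one-bin
// counting experiment with expected background b and observation nObs.
double upperLimit(int nObs, double b, double cl = 0.90) {
    double s = 0.0;
    const double step = 0.001;
    while (poissonCDF(nObs, b + s) > 1.0 - cl) s += step;
    return s;
}
```

For zero background and zero observed events this reproduces the textbook 90% CL limit of about 2.3 signal events.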

Output files are named as:

Sensitivity_Pe0.0_Pp0.0_Lumi500_1Mass_90CLTSys_lumi_pole_polp_beamspec_select_withsys_corel.root

The scripts submitter.sh and submitter_multi.sh are used to run a list of jobs on the NAF.

The final plot can be obtained with the hadd command once all jobs have finished.

These steps apply when shape information is taken into account. For the "one bin" calculation, TSysLimit_searches_onebin.C is used.

WIMPs searches on ILC (last edited 2015-02-20 18:27:35 by AnnikaVauth)