General

Useful links:

Status

Status (18.4. evening):

April18 cosmics data is here:

/nfs/dust/ilc/group/flchcal/AHCAL_Commissioning_2017_2018/2018Apr/Full_stack_test/cosmics/slcio

See elog for run numbers

Dummy calibration constants are in DB Folder:

/cd_calice_Ahc2/TestbeamMay2018

Merged calibration constants from the Feb18 and Mar18 commissioning test beams are stored with tag ahc2_001 for the Pedestal, MIP and Gain constants.

Challenges

Cosmics Challenge:

Run the Quasi-Online Monitor on data from the cosmic test stand (40 layers). See cosmic-ray events in the event display.

Tasks to do:

Main Goal done. We see cosmics!

LED Challenge:

Look into the LED data of the amplitude scan with autotrigger. Check that the channels start firing as the intensities go up.

Determine trigger thresholds on MIP scale

Tasks to do:

Tasks Overview

Reconstructed Data

First plots from cosmics: cosmics

To run on the cosmic data, use the LATEST CALICE soft version

and this steering file: steering_cosmics_2018April.xml. (Only runs with the latest CALICE soft version, revision 4808.)

Added module-to-layer conversion parameters to Ahc2CalibrateProcessor. Steering file: steering_cosmics_2018April_4812.xml (revision 4812).

Fixed the Module --> Layer conversion for the latest CALICE soft version. Steering file: steering_cosmics_2018April_4813.xml (revision 4813).
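
The reconstruction itself is then run with Marlin on the chosen steering file, e.g. (generic Marlin invocation; input and output files are configured inside the steering file):

Marlin steering_cosmics_2018April_4813.xml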

Software Packages

DQM4HEP

Experts: Olin, Vladimir

Instructions to install ILCSoft on Ubuntu 16.04 LTS

git clone https://github.com/CALICETB/ILCSoftInstall.git
cd ILCSoftInstall
emacs CMakeLists.txt # change XERCES_version to 3.2.1
mkdir build
cd build
cmake ../
make

Then add the following lines to your ~/.bashrc:

source /your/path/to/ILCSoftInstall/ilcsoft/init_ilcsoft.sh
export LCIO_DIR=$LCIO
export ILCUTIL_DIR=$ILCSOFT/ilcutil/v01-02-01/
source /your/path/to/ILCSoftInstall/ilcsoft/root/5.34.36/bin/thisroot.sh

To install DQM4HEP:

First download the git-hub package:
git clone https://github.com/DQM4HEP/dqm4hep.git
cd dqm4hep
git checkout v01-04-04
mkdir build
cd build
cmake -C /your/path/to/ILCSoftInstall/ilcsoft/ILCSoft.cmake  -DBUILD_DQMVIZ=on -DBUILD_DQM4ILC=on -DFORCE_DIM_INSTALL=on -DFORCE_DIMJC_INSTALL=on -DBUILD_TESTING=off -DUSE_MASTER=off ../
make # will do make install automatically

After that, DQM4ILC needs to be installed from its separate package; otherwise any make in DQM4HEP will overwrite the modified files:

git clone https://github.com/jkvas/DQM4ILC.git
cd DQM4ILC/
git checkout Testbeam2018May # a new branch will appear soon
mkdir build
cd build/
cmake -C /your/path/to/ILCSoftInstall/ilcsoft/ILCSoft.cmake -DDQMCore_DIR=/your/path/to/dqm4hep/ -Dxdrlcio_DIR=/your/path/to/dqm4hep/ -DBUILD_AHCAL=on ../
make install # caution: do not forget the install step!

Then update the ~/.bashrc file again; the DQM4HEP_PLUGIN_DLL variable should contain all of the plugin libraries that are used:

export LD_LIBRARY_PATH=/your/path/to/dqm4hep/lib:$LD_LIBRARY_PATH
export PATH=/your/path/to/dqm4hep/bin:$PATH
export DIM_DNS_NODE=localhost
export DQM4HEP_PLUGIN_DLL=/your/path/to/DQM4ILC/lib/libDQM4ILC.so:/your/path/to/DQM4ILC/lib/libAHCAL_15Channels.so:/your/path/to/DQM4ILC/lib/libAHCAL_15Layers.so:/your/path/to/DQM4ILC/lib/libBIF_AHCAL_Correlation.so:/your/path/to/DQM4ILC/lib/libBIF_FindOffset.so

Setting up the monitor

Environment

The first step is to set up the correct environment, since each element should be run in a separate terminal window. Add the following lines to your .bashrc, changing the directories as necessary.

export LD_LIBRARY_PATH=/your/path/to/dqm4hep/lib:$LD_LIBRARY_PATH
export PATH=/your/path/to/dqm4hep/bin:$PATH
export DIM_DNS_NODE=localhost
export DQM4HEP_PLUGIN_DLL=/your/path/to/DQM4ILC/lib/libAHCAL_15Layers.so:/your/path/to/DQM4ILC/lib/libAHCAL_40Layers.so:/your/path/to/DQM4ILC/lib/libDQM4ILC.so:/your/path/to/DQM4ILC/lib/libAHCAL_15Channels.so

DNS

In a new terminal window with init.sh already executed, enter the following:

dns &

Leave this terminal open for the entire duration; closing it will break the connections between all other processes.

Run Control

In a new terminal window with init.sh already executed, enter this command to start the run control:

dqm4hep_start_run_control_server -v ALL -r AHCALRunControl&

The argument after -r is the name of this process; it must be the same as in your XML file. The argument after -k (not used in the example above) is the password needed to begin or end a run.

Then you need to start the interface to the server. Enter this command:

dqm4hep_start_run_control_interface -v ALL -r AHCALRunControl&

The argument after -r is the name of this process; it must match the name in your XML file as well as the server started above. A window with a GUI will open. As far as I know, all that is necessary for online monitoring is to press the START RUN button, which ensures that the event collector and analysis module work correctly.

DQM Monitor GUI

In a new terminal window with init.sh already executed, enter this command to start the monitor GUI:

 dqm4hep_start_dqm_monitor_gui -v WARN -f /your/path/to/test_canvases1.xml 

test_canvases1.xml

This will open a GUI for viewing plots and histograms. To see your histograms, click "Browse", then choose the name of your monitor element collector from the drop-down list. You can select which plots to add using the checkboxes. Clicking "Append" adds all checked plots to the main window. No options will be displayed if the monitor element collector is not running.

Plots will not be updated or kept unless they are subscribed. Individual plots can be subscribed to by checking the box by their names. All plots can be subscribed to by right-clicking anywhere on the list of plots and selecting "Subscribe all". Once subscribed, monitor elements are stored regardless of whether the plot is displayed. A plot can be displayed by double-clicking it. You can click "Update" to manually refresh all plots, or click "Start Update" to make them update automatically. New canvases for plots can be opened using the + sign at the top-right of the window.

Event Collector

In a new terminal window with init.sh already executed, enter this command to start the event collector:

dqm4hep_start_event_collector -v ALL -c ahcal_event_collector&

The argument after -c is the name of this process; the argument after -s is the name of the file streamer to save events from. These must be the same as in your XML file.

Monitor Element Collector

In a new terminal window with init.sh already executed, enter this command to start the monitor element collector:

dqm4hep_start_monitor_element_collector -v ALL -c ahcal_me_collector&

The argument after -c is the name of this process. This must be the same as in your XML file.

Analysis Module

In a new terminal window with init.sh already executed, enter this command to start the analysis module:

 dqm4hep_start_analysis_module -v ALL -f /your/path/to/DQM4ILC/scripts/steering4dqm_tb_2018.xml&

steering4dqm_tb_2018.xml

The argument after -f is the path to the XML file used to define your histograms and other processes. IMPORTANT: Make sure that all the process names in the XML file are the same as the ones you use at runtime. Also ensure that you use the same type of histogram in the XML file as in your module code; otherwise the analysis module will crash with a segmentation fault as soon as it tries to push data into the histograms.

LCIO File Service

This should be the last component to start: as soon as the LCIO file service is started, the run begins and data starts being processed by the analysis module. In a new terminal window with init.sh already executed, enter this command to start the LCIO file service:

dqm4hep_start_lcio_file_service  -c ahcal_event_collector -t 1000 -f test/180318120334_ahcalandbif_run009595.slcio&

The argument after -f is the path to the SLCIO file to read events from (separate several files with a colon), the argument after -c is the name of the event collector to send events to, and the argument after -t is the time interval between reading events and sending them to the event collector (in microseconds).
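
For example, to chain two runs in one go (the file names here are just placeholders):

dqm4hep_start_lcio_file_service -c ahcal_event_collector -t 1000 -f /path/to/run_A.slcio:/path/to/run_B.slcio&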


Quasi-Online Monitor

Experts: David, Saiva

myMarlin.sh

Reco_sed.sh

Reco.xml

steering.xml

Old Documentation


Installation:

- ssh username@nafhh-ilc01.desy.de (nafhh-ilc02 works just as well)

- cd /afs/desy.de/group/flc/pool/username/

Making sure the compiler is set correctly:

function iniilcsoftsl6_v17_11(){

for QtReco:

git clone https://github.com/CALICETB/QtReco.git
cd QtReco
mkdir build
cd build
cmake -DBUILD_DOCUMENTATION=ON ..
make
make install

for QtReco-Client:

git clone https://github.com/CALICETB/QtReco-Client.git
cd QtReco-Client
mkdir build
cd build
cmake -DBUILD_DOCUMENTATION=ON ..
make
make install

after installation:

In /afs/desy.de/group/flc/pool/username/QtReco/source/src, change tmp/ to tmp/username/ in the path of the variable steering in the run() function of MarlinModule.cpp.

In /afs/desy.de/group/flc/pool/username/QtReco/xml, change the path tmp/ to tmp/username/ in the value of the parameter name="File" in steering.xml.

In /afs/desy.de/group/flc/pool/username/QtReco/scripts, change tmp/ to tmp/username/ everywhere in Reco_sed.sh; a sed one-liner covering all three files is sketched below.
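
A quick way to apply all three replacements at once is a sed one-liner (this assumes the literal string tmp/ appears only in the places that need to be changed; check with grep first):

cd /afs/desy.de/group/flc/pool/username/QtReco
grep -n "tmp/" source/src/MarlinModule.cpp xml/steering.xml scripts/Reco_sed.sh
sed -i 's|tmp/|tmp/username/|g' source/src/MarlinModule.cpp xml/steering.xml scripts/Reco_sed.sh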


Instructions for adding histograms

Event Display

Expert: David

The event display is a Marlin processor that runs on reconstructed data.

steering file for reconstruction: >> reconstructRawData_slcioIutput.xml <<

steering file for event display: >> eventDisplay.xml <<

Steps to run the event display:

Next step: Explore if we can integrate the event display in the Quasi-Online monitor

We can run on cosmic data with 38 layers!

Calibration Constants

Experts: Anna, Daniel, Yuji

Remarks:

Summary of current DB entries

April2018_SoftwareWorkshop/DBEntriesCommissioning

Software to adapt the module numbers

The first two programs swap the layer number with the correct module number; the first one is for the February testbeam and the second one for the March testbeam.

sortConstantsFeb2018.py

sortConstantsMar2018.py

The third program merges all the constants from the different testbeams and batches into one large file per constant (gain, MIP and pedestal) and filters out duplicates for modules that were used in both testbeams; in these cases, only the constants from the February testbeam are kept.

mergeCalibValues.py
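
For illustration only, a minimal shell sketch of this merge/filter logic (the Python scripts above are the actual tools; the file names and the assumption that the module number is the first whitespace-separated column are hypothetical):

# keep all February entries, then append only those March entries whose
# module number (assumed to be column 1) is not already present
awk 'NR==FNR {seen[$1]=1; print; next} !($1 in seen)' \
    DB_constants_mip_feb2018.txt DB_constants_mip_mar2018.txt > DB_constants_mip.txt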

Final values:

DB_constants_gain.txt: complete and OK!

DB_constants_mip.txt: 130 channels are missing in the DB due to low statistics -> dummy values (0) put in these lines

DB_constants_pedestal.txt: Module 1 is missing Chips 0-9 and Module 21 has Chips 0-9 twice (different values; the duplicates are the values of Module 1!)

Corrected Files:

DB_constants_gain_corr.txt

DB_constants_mip_corr.txt

DB_constants_pedestal_corr.txt

Database

/cd_calice_Ahc2/TestbeamMay2018

dead channels and dummy values

DB Tags

Anna's combined files for gain constants, MIP constants and pedestals are uploaded to the database with tag ahc2_001.

How to run the DBviewer:

On a NAF machine:

source /afs/desy.de/group/flc/bin/setflcini
flcini calice_pro_test
export DBTestFld=cd_calice_Ahc2

change cd_calice_Ahc2 to the folder you want to investigate.

Then run:

dbview -v&

How to export DB to local files (.slcio)

First, install the CALICE database tools (calice_db_tools) as part of the CALICE software:

https://svnsrv.desy.de/public/calice/calice_db_tools/trunk/

copy_dbfolder2dbfile --folder /path/to/database/folder --tag nameoftag

For example, to extract the Pedestal values with tag ahc2_001 for the 38-layer prototype from the DB:

copy_dbfolder2dbfile --folder /cd_calice_Ahc2/TestbeamMay2018/Pedestal --tag ahc2_001

The .slcio file is named automatically, depending on what you exported and when. The file is saved in your current working directory.

IMPORTANT

-> A small bash script that creates a folder in your current working directory and exports all DB-constant subfolders as .slcio files into it (DB path hard-coded!): DBtoslcio.sh

/nfs/dust/ilc/user/heucheld/dbfiles_may2018
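
A minimal sketch of what such an export script can look like (the output directory name and the subfolder names Mip and Gain are assumptions; only the Pedestal subfolder is confirmed above):

#!/bin/bash
# create an output folder and export the constant subfolders of the
# May 2018 DB folder as .slcio files into it
OUTDIR=dbfiles_$(date +%Y%m%d)
mkdir -p "$OUTDIR"
cd "$OUTDIR"
for SUBFOLDER in Pedestal Mip Gain; do
    copy_dbfolder2dbfile --folder /cd_calice_Ahc2/TestbeamMay2018/$SUBFOLDER --tag ahc2_001
done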

How to use DB local files (.slcio) within reconstruction via Marlin

Instead of using the parameter "DBCondHandler", use "DBFileHandler" in the steering file for reconstruction, like:

 <processor name="GeoConditions" type="ConditionsProcessor">
    <parameter name="DBFileHandler" type="StringVec">

    conditionname1     path/to/local/calibration/container/file1.slcio   collectionname1   
    conditionname2     path/to/local/calibration/container/file2.slcio   collectionname2
    ...
    ...
         </parameter>
  </processor>

(Further information about the different file/condition handlers here: https://github.com/iLCSoft/Marlin/blob/master/source/src/ConditionsProcessor.cc)

For example:

 <processor name="GeoConditions" type="ConditionsProcessor">
    <parameter name="DBFileHandler" type="StringVec">

     E4DGainConstants    /afs/desy.de/group/flc/pool/heucheld/DBfiles_mar18_batch1/condDB_gain_constants_ahc2_001_20180426_094603.slcio    gain_constants
     ...
     ...
         </parameter>
  </processor>

where E4DGainConstants is the condition name used later in the steering file and gain_constants is the collection name as stored in the DB.

After doing this for all needed conditions/constants, the reconstruction can be started as usual.
