Status (18.4. evening):
- We got it running on Feb2018 data
- Got it running on April18 cosmics data with 38 layers
- Can select events with a minimum number of hits
We got it running on Feb2018 data (currently we only see the EnergySum plot: problems with mapping).
- We can add histograms (e.g., timing histogram was added)
- can run on April18 cosmics data and see plots
DQM4HEP: Fighting (Vladimir, Olin: please update status)
- Duplicate modules resolved.
- Merging and changing layer to module number done.
- Learned how to export as slcio file
April18 cosmics data is here:
See elog for run numbers
Dummy calibration constants are in DB Folder:
Merged calibration constants from the Feb18 and Mar18 commissioning test beams have the tag ahc_001 for pedestal, MIP and gain constants.
Run the Quasi-Online Monitor on data from the cosmic test stand (40 layers). See cosmic-ray events in the event display.
Tasks to do:
Prepare calibration constants for 40 layers --> Done
Get the event display running, maybe integrate it into the Quasi-Online Monitor --> Runs; integration into the Quasi-Online Monitor will be tried
Select proper events --> Needs extra processor
Main Goal done. We see cosmics!
Look into LED data from the amplitude scan with autotrigger. Check that the channels start firing as the intensities go up.
Determine trigger thresholds on MIP scale
Tasks to do:
- Run reconstruction on LED data
- Combine over several reconstructed runs
- Get the mapping right for the current 38-layer setup: need to change module number / layer
Combine calibration constants --> Done
Upload --> Done
- Get event display running
Run on February 2018 data --> Done
Run on April2018 data --> Done
Explore if we can integrate it into QtReco
- Can we select certain events?
Implementing a cut on minimum number of hits --> Done
- Get Quasi-Online monitor running
Run on February 2018 data --> Done
Run on April 2018 cosmics data --> Done
Try to add new histograms --> Done for timing
- Add useful histograms
- Get DQM4HEP running
Set up a development environment --> In progress
- Try to develop new features
- Learn how to export DB as a file
Export --> Done
Import --> Done
- Integrate pedestal offset values for memory cells
- Fix missing-collection errors in the software
All: Work on documentation
First plots from cosmics: cosmics
To run on the cosmic data use the LATEST Calice soft version
and this steering file: steering_cosmics_2018April.xml (only runs with the latest CALICE soft version, revision 4808).
Fixed module --> layer conversion for the latest CALICE soft version; steering file: steering_cosmics_2018April_4813.xml (revision 4813)
Experts: Olin, Vladimir
Instructions to install ILCsoft in Ubuntu 16 LTS release
git clone https://github.com/CALICETB/ILCSoftInstall.git
cd ILCSoftInstall
emacs CMakeLists.txt   # change XERCES_version to 3.2.1
mkdir build
cd build
cmake ../
make
Then put following lines to the ~/.bashrc:
source /your/path/to/ILCSoftInstall/ilcsoft/init_ilcsoft.sh
export LCIO_DIR=$LCIO
export ILCUTIL_DIR=$ILCSOFT/ilcutil/v01-02-01/
source /your/path/to/ILCSoftInstall/ilcsoft/root/5.34.36/bin/thisroot.sh
To install DQM4HEP:
First download the GitHub package:
git clone https://github.com/DQM4HEP/dqm4hep.git
cd dqm4hep
git checkout v01-04-04
mkdir build
cd build
cmake -C /your/path/to/ILCSoftInstall/ilcsoft/ILCSoft.cmake -DBUILD_DQMVIZ=on -DBUILD_DQM4ILC=on -DFORCE_DIM_INSTALL=on -DFORCE_DIMJC_INSTALL=on -DBUILD_TESTING=off -DUSE_MASTER=off ../
make   # will do make install automatically
After that, DQM4ILC needs to be installed from the separate package; otherwise any make in DQM4HEP will overwrite the modified files.
git clone https://github.com/jkvas/DQM4ILC.git
cd DQM4ILC/
git checkout Testbeam2018May   # a new branch will appear soon
mkdir build
cd build/
cmake -C /your/path/to/ILCSoftInstall/ilcsoft/ILCSoft.cmake -DDQMCore_DIR=/your/path/to/dqm4hep/ -Dxdrlcio_DIR=/your/path/to/dqm4hep/ -DBUILD_AHCAL=on ../
make install   # caution: do not forget the install!
Then update the .bashrc file again; the DQM4HEP_PLUGIN_DLL variable should contain all used libraries:
export LD_LIBRARY_PATH=/your/path/to/dqm4hep/lib:$LD_LIBRARY_PATH
export PATH=/your/path/to/dqm4hep/bin:$PATH
export DIM_DNS_NODE=localhost
export DQM4HEP_PLUGIN_DLL=/your/path/to/DQM4ILC/lib/libDQM4ILC.so:/your/path/to/DQM4ILC/lib/libAHCAL_15Channels.so:/your/path/to/DQM4ILC/lib/libAHCAL_15Layers.so:/your/path/to/DQM4ILC/lib/libBIF_AHCAL_Correlation.so:/your/path/to/DQM4ILC/lib/libBIF_FindOffset.so
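A typo in one of these paths makes the plugin silently fail to load, so a quick existence check can save debugging time. A minimal sketch (not part of the official setup; it just inspects the variable set above):

```shell
#!/bin/bash
# Sketch: verify that every library listed in DQM4HEP_PLUGIN_DLL actually exists.
# Purely a local sanity check; adjust nothing in your setup for it.
IFS=':' read -ra LIBS <<< "$DQM4HEP_PLUGIN_DLL"
for lib in "${LIBS[@]}"; do
    if [ ! -f "$lib" ]; then
        echo "missing plugin: $lib"
    fi
done
```

Run it after sourcing your .bashrc; any line it prints points at a path to fix.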
Setting up the monitor
The first step is to set up the correct environment, since each element should be run in a separate terminal window. Add the following lines to your .bashrc, changing the directories as necessary.
export LD_LIBRARY_PATH=/your/path/to/dqm4hep/lib:$LD_LIBRARY_PATH
export PATH=/your/path/to/dqm4hep/bin:$PATH
export DIM_DNS_NODE=localhost
export DQM4HEP_PLUGIN_DLL=/your/path/to/DQM4ILC/lib/libAHCAL_15Layers.so:/your/path/to/DQM4ILC/lib/libAHCAL_40Layers.so:/your/path/to/DQM4ILC/lib/libDQM4ILC.so:/your/path/to/DQM4ILC/lib/libAHCAL_15Channels.so
In a new terminal window with init.sh already executed, enter the following in the terminal window:
Leave this terminal open for the entire duration; closing it will break the connections between all other processes.
In a new terminal window with init.sh already executed, enter this command to start the run control:
dqm4hep_start_run_control_server -v ALL -r AHCALRunControl&
The argument after -r is the name of this process. This must be the same as in your XML file. If the -k option is used, its argument is the password needed to begin or end a run.
Then you need to start the interface to the server. Enter this command:
dqm4hep_start_run_control_interface -v ALL -r AHCALRunControl&
The argument after -r is the name of this process. This must be the same as in your XML file, as well as the server above. A window with a GUI will open. As far as I know, all that is necessary for online monitoring is to press the START RUN button, which ensures that the event collector and analysis module work correctly.
DQM Monitor GUI
In a new terminal window with init.sh already executed, enter this command to start the monitor GUI:
dqm4hep_start_dqm_monitor_gui -v WARN -f your/path/to/test_canvases1.xml
This will open a GUI for viewing plots and histograms. To see your histograms, click "Browse", then choose the name of your monitor element collector from the drop-down list. You can select which plots to add using the checkboxes. Clicking "Append" adds all checked plots to the main window. There may be no options displayed if the monitor element collector is not running.
Plots will not be updated or kept unless they are subscribed. Individual plots can be subscribed to by checking the box by their names. All plots can be subscribed to by right-clicking anywhere on the list of plots and selecting "Subscribe all". Once subscribed, monitor elements are stored regardless of whether the plot is displayed. A plot can be displayed by double-clicking it. You can click "Update" to manually refresh all plots, or click "Start Update" to make them update automatically. New canvases for plots can be opened using the + sign at the top-right of the window.
In a new terminal window with init.sh already executed, enter this command to start the event collector:
dqm4hep_start_event_collector -v ALL -c ahcal_event_collector&
The argument after -c is the name of this process; the argument after -s is the name of the file streamer to save events from. These must be the same as in your XML file.
Monitor Element Collector
In a new terminal window with init.sh already executed, enter this command to start the monitor element collector:
dqm4hep_start_monitor_element_collector -v ALL -c ahcal_me_collector&
The argument after -c is the name of this process. This must be the same as in your XML file.
In a new terminal window with init.sh already executed, enter this command to start the analysis module:
dqm4hep_start_analysis_module -v ALL -f /your/path/to/DQM4ILC/scripts/steering4dqm_tb_2018.xml&
The argument after -f is the path to the XML file used to define your histograms and other processes. IMPORTANT: Make sure that all the process names in the XML file are the same as the ones you use at runtime. Ensure that you are using the same type of histogram in the XML file as in your module file; otherwise the analysis module will throw segmentation faults as soon as it tries to push data into histograms.
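Since mismatched names are a common source of silent failures here, a quick grep over the steering XML can confirm that the names you pass on the command line actually appear in it. A sketch (the XML path and the three names are the ones used on this page; adjust for your setup):

```shell
#!/bin/bash
# Sketch: check that the process names used at runtime appear in the steering XML.
# XML path and names are taken from this page; adjust for your setup.
XML=/your/path/to/DQM4ILC/scripts/steering4dqm_tb_2018.xml
for name in AHCALRunControl ahcal_event_collector ahcal_me_collector; do
    if grep -q "$name" "$XML"; then
        echo "found: $name"
    else
        echo "MISSING: $name"
    fi
done
```

Any "MISSING" line means the runtime name and the XML disagree.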
LCIO File Service
This should be the last thing to be run, as when the LCIO file service is started the run will begin and data will start being processed by the analysis module. In a new terminal window with init.sh already executed, enter this command to start the LCIO file service:
dqm4hep_start_lcio_file_service -c ahcal_event_collector -t 1000 -f test/180318120334_ahcalandbif_run009595.slcio&
The argument after -f is the path to the SLCIO file to read events from (separate several files with a colon), the argument after -c is the name of the event collector to send events to, and the argument after -t is the time interval between reading events and sending them to the event collector (in microseconds).
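To process several runs in one go, the colon-separated list for -f can be assembled from a directory listing instead of typing it by hand. A sketch, assuming the SLCIO files sit in a test/ directory (the service binary and collector name are as above):

```shell
#!/bin/bash
# Sketch: build a colon-separated list of all SLCIO files in test/ and
# hand the whole list to the LCIO file service in one invocation.
FILES=$(ls test/*.slcio 2>/dev/null | paste -sd: -)
echo "$FILES"   # e.g. test/runA.slcio:test/runB.slcio
dqm4hep_start_lcio_file_service -c ahcal_event_collector -t 1000 -f "$FILES" &
```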
Experts: David, Saiva
- ssh firstname.lastname@example.org (nafhh-ilc02 should also work)
- cd /afs/desy.de/group/flc/pool/username/
Making sure the compiler is set correctly:
- echo "Init ilcsoft 17-11 sl6 g++ 4.8"
export CC=/cvmfs/sft.cern.ch/lcg/external/gcc/4.8.1/x86_64-slc6-gcc48-opt/bin/gcc
export CXX=/cvmfs/sft.cern.ch/lcg/external/gcc/4.8.1/x86_64-slc6-gcc48-opt/bin/g++
source /cvmfs/ilc.desy.de/sw/x86_64_gcc48_sl6/v01-17-11/init_ilcsoft.sh
git clone https://github.com/CALICETB/QtReco.git
cd QtReco
mkdir build
cd build
cmake -DBUILD_DOCUMENTATION=ON ..
make
make install
git clone https://github.com/CALICETB/QtReco-Client.git
cd QtReco-Client
mkdir build
cd build
cmake -DBUILD_DOCUMENTATION=ON ..
make
make install
In /afs/desy.de/group/flc/pool/username/QtReco/source/src/MarlinModule.cpp, in the run() function, change the path for the variable steering from tmp/ to tmp/username/.
In /afs/desy.de/group/flc/pool/username/QtReco/xml/steering.xml, change the path in the value of the parameter name="File" from tmp/ to tmp/username/.
In /afs/desy.de/group/flc/pool/username/QtReco/scripts/Reco_sed.sh, change tmp/ to tmp/username/ everywhere.
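The three edits above are plain tmp/ -> tmp/username/ substitutions, so they can also be applied with sed. A sketch, assuming the checkout location used above (back up the files first; username is a placeholder, and note the substitution is not idempotent, so run it only once per file):

```shell
#!/bin/bash
# Sketch: apply the tmp/ -> tmp/username/ substitution to the three files.
# BASE and username are placeholders for your own paths; run only once per file.
BASE=/afs/desy.de/group/flc/pool/username/QtReco
for f in "$BASE/source/src/MarlinModule.cpp" \
         "$BASE/xml/steering.xml" \
         "$BASE/scripts/Reco_sed.sh"; do
    sed -i 's|tmp/|tmp/username/|g' "$f"
done
```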
The event display is a Marlin processor that runs on reconstructed data.
steering file for reconstruction: >> reconstructRawData_slcioIutput.xml <<
steering file for event display: >> eventDisplay.xml <<
Steps to run the event display:
- source ILCsoft
- export CED_PORT=7927 (check with what is written in eventDisplay.xml)
- Marlin eventDisplay.xml
Next step: Explore if we can integrate the event display in the Quasi-Online monitor
We can run on cosmic data with 38 layers!
Experts: Anna, Daniel, Yuji
- If modules were in both the February and March 2018 test beams: use the February calibration constants
Important: Link calibration constants from DB to module numbers NOT layer numbers
Summary of current DB entries
Software to adapt the module numbers
The first two programs swap the layer number with the correct module number; the first one is for the February test beam and the second one for the March test beam.
The third program merges all the constants from the different test beams and batches into one large file per constant (gain, MIP and pedestal) and filters out the modules that were used in both test beams. In these cases, only the constants from the February test beam are used.
DB_constants_gain.txt: complete and OK!
DB_constants_mip.txt: 130 channels are missing in the DB due to low statistics -> dummy values (0) were put in those lines
DB_constants_pedestal.txt: Module 1 is missing chips 0-9, and Module 21 has chips 0-9 twice (different values; the duplicates are the values of Module 1!)
dead channels and dummy values
Anna's combined files for gain constants, MIP constants and pedestals are uploaded to the database with the tag ahc2_001.
How to run the DBviewer:
On a NAF machine:
source /afs/desy.de/group/flc/bin/setflcini
flcini calice_pro_test
export DBTestFld=cd_calice_Ahc2
Change cd_calice_Ahc2 to the folder you want to investigate.
How to export DB to local files (.slcio)
- First: install the CALICE database tools as part of the CALICE software
- Second: use the tool copy_dbfolder2dbfile in the following way:
copy_dbfolder2dbfile --folder /path/to/database/folder --tag nameoftag
For example, to extract the Pedestal values (tag ahc2_001) for the 38-layer prototype from the DB:
copy_dbfolder2dbfile --folder /cd_calice_Ahc2/TestbeamMay2018/Pedestal --tag ahc2_001
The .slcio file is named automatically, depending on what you exported and when. The file is saved in your current working directory.
- The connection to the DB (flccaldb02.desy.de:calice:caliceon:Delice.1:3306) is hard-coded and done automatically
- The default for the tag option is HEAD
- It is not possible to export a main folder that includes DB subfolders with the calibration constants, like /cd_calice_Ahc2/TestbeamMay2018. Only the individual subfolders with the calibration constants, like /cd_calice_Ahc2/TestbeamMay2018/Pedestal or /cd_calice_Ahc2/TestbeamMay2018/mip-constants, are possible.
-> A small bash script, DBtoslcio.sh, creates a folder in your current working directory and exports all DB-constant subfolders to it as .slcio files (DB path hard-coded!)
- Latest local .slcio DB data of April/May prototype can be found in :
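A script like the DBtoslcio.sh mentioned above can be sketched as follows. Assumptions: the subfolder list beyond Pedestal and mip-constants is a guess; copy_dbfolder2dbfile comes from the CALICE DB tools; DBTOOL is a hypothetical override added here so the loop can be dry-run with echo.

```shell
#!/bin/bash
# Sketch of a DBtoslcio.sh-style script: export all DB-constant subfolders
# as .slcio files into a new folder in the current working directory.
# DBTOOL is a hypothetical override (defaults to copy_dbfolder2dbfile);
# subfolder names other than Pedestal and mip-constants are assumptions.
DBTOOL=${DBTOOL:-copy_dbfolder2dbfile}
BASE=/cd_calice_Ahc2/TestbeamMay2018
TAG=ahc2_001
OUT=DBconstants_$(date +%Y%m%d)
mkdir -p "$OUT"
cd "$OUT" || exit 1
for SUB in Pedestal mip-constants gain-constants; do
    "$DBTOOL" --folder "$BASE/$SUB" --tag "$TAG"
done
```

A dry run with DBTOOL=echo prints the commands that would be executed without touching the DB.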
How to use DB local files (.slcio) within reconstruction via Marlin
Instead of the parameter "DBCondHandler", use "DBFileHandler" in the steering file for reconstruction, like:
<processor name="GeoConditions" type="ConditionsProcessor">
  <parameter name="DBFileHandler" type="StringVec">
    conditionname1 path/to/local/calibration/container/file1.slcio collectionname1
    conditionname2 path/to/local/calibration/container/file2.slcio collectionname2
    ... ...
  </parameter>
</processor>
(Further information about the different file/condition handlers here: https://github.com/iLCSoft/Marlin/blob/master/source/src/ConditionsProcessor.cc)
<processor name="GeoConditions" type="ConditionsProcessor">
  <parameter name="DBFileHandler" type="StringVec">
    E4DGainConstants /afs/desy.de/group/flc/pool/heucheld/DBfiles_mar18_batch1/condDB_gain_constants_ahc2_001_20180426_094603.slcio gain_constants
    ... ...
  </parameter>
</processor>
where E4DGainConstants is the condition-name used further in the steering file and gain_constants is the collection-name as in the DB.
Important: the TAG parameter is no longer an input parameter in the string vector, since the downloaded local .slcio files already represent specific TAGs.
After doing this for all needed conditions/constants, the reconstruction can be started as normal.