WAC Collection

Daily

Data Monitoring

The WAC is primarily tasked with verifying the quality of the data taken during each shift; the WAC Notes pages are the place to record this information.

Analysis Minding

The WAC must keep track of the various runs, conditions, cuts, channel maps, and pedestal changes as a function of time and of file system location.

Reanalyzing with updated cuts as needed

The WAC is responsible for updating the "official" rootfiles that everyone else uses (no one else may change these rootfiles without WAC permission).

Using the "Prompt" WAC version of the "operations" branch of Japan is the ideal way to accomplish this.
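
A minimal sketch of keeping that copy of Japan up to date, assuming the checkout lives under ~/PREX/japan (the path and remote name here are assumptions, not the official procedure):

cd ~/PREX/japan                # assumed location of the Japan checkout
git fetch origin               # pick up the latest commits
git checkout operations        # the branch Prompt runs on
git pull origin operations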

Prompt Code

The prompt code lives in the master branch of Prex-Prompt

A cascade of shell scripts living in the ~/scripts/ folder makes it easy to launch a new terminal, log into another processor, and run a copy of Prompt in the right environment:

[apar@aonl1 ~/scripts]$ ./terminal_prompt_1.sh adaq3 3602
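
In rough outline (this is an illustrative sketch, not the actual script; prompt.sh and the paths are placeholders), the cascade opens a terminal, ssh'es to the requested machine, sets up the Japan environment, and runs Prompt on the given run:

#!/bin/bash
# Sketch of terminal_prompt_*.sh: $1 = target host (e.g. adaq3), $2 = run number
HOST=$1
RUN=$2
# Open a terminal that logs into the analysis machine, sources the Japan
# environment, and runs the prompt analysis on the requested run.
xterm -e bash -c "ssh -t apar@${HOST} 'cd ~/PREX/prompt && source ../setup_japan.tcsh && ./prompt.sh ${RUN}'" &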

Tracking Data over time

The run-wise plots are uploaded to the folder /u/group/halla/www/hallaweb/html/parity/prex/onlinePlots, which is visible from Hallaweb.

The WAC is also responsible for keeping track of the data over the course of "slugs" (generally defined as in/out insertions of the IHWP or flips of the Wien).

  • There are several codebases: initially we used Tao's postpan collection tool, the "Collector".
  • This was later enhanced with a more general drawing and text-file printing macro.
  • For general data analysis and daily aggregation, it has since been replaced by the rootfile-reading and histogram filling/parsing set of methods referred to as the "Aggregator".
  • The Aggregator is designed to pull the data out of rootfiles directly and calculate histogram-based quantities from it, rather than trusting intermediary programs to have the extensibility needed for arbitrary data manipulations (a minimal illustration follows below).
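
As a minimal illustration of that idea (not the Aggregator itself; the filename, tree name mul, branch asym_usl, and cut ErrorFlag==0 are assumptions to be replaced with what is actually in your rootfile), a histogram-based mean and RMS can be pulled straight out of a Japan rootfile at the ROOT prompt:

root prexPrompt_3602.root
root [1] mul->Draw("asym_usl>>h_usl(200,-0.01,0.01)","ErrorFlag==0","goff")
root [2] h_usl->GetMean()
root [3] h_usl->GetRMS()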

Collector Code

The collector code lives in Tao's collector repository

To Collect:

  1. log in as apar@adaq3
  2. cd into ~/PREX/prompt/collector
  3. source ../../setup_japan.tcsh (the full sequence is shown below)
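
Put together (assuming an ssh login to the adaq machine), the setup is:

ssh apar@adaq3
cd ~/PREX/prompt/collector
source ../../setup_japan.tcsh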

WAC List Creation

  1. create a list of the run numbers you want to collect (see for example test.list)
  2. use the PVDB/RCND/RCDB commands or website
  3. make a slug#.list (an example appears below)
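
The list itself is just a plain-text file of run numbers, assumed here to be one per line; for example, a hypothetical slug7.list might contain (run numbers are illustrative):

3600
3601
3602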

Collector:

  1. run the command: ./collector -d ../results/ -l test.list (Note: replace test with your file name)
  2. a root file named prexPrompt_test.root will be stored in the rootfiles directory
  3. to produce plots, stay in the collector directory and run ./aggregate
  4. you will be asked to enter the name part of the rootfile (e.g. test); type it and hit Enter
  5. an aggregate_plots_test.pdf file will be created in the plots directory (a combined example follows below)
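
Put together, for a list named slug10.list (the name simply follows the pattern above):

./collector -d ../results/ -l slug10.list   # writes prexPrompt_slug10.root to the rootfiles directory
./aggregate                                 # enter "slug10" when prompted; makes aggregate_plots_slug10.pdf in plots/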

Draw Post Pan:

  1. cd into drawPostpan
  2. [apar@adaq1 ~/PREX/prompt/collector/drawPostpan]$ root -l -b -q drawPostPan.C'("../rootfiles/prexPrompt_slug10.root","slug10-Left","list.txt")'

Aggregator

To reanalyze and make new run and minirun files for each run:

  • Make your slug list, then do:
gojapan ; cd ../prompt/ ; ./agg_prompt_list.sh ~/PREX/prompt/collector/run_list/slug7.list
  • Once the aggregator is done running on all runs in a given slug, you can hadd them together and add the units branch to them with
~/PREX/prompt/accumulate_aggFiles_list.sh ~/PREX/prompt/Aggregator/drawPostpan/run_lists/slug11.list slug11.root
  • Then, to make the plots and CSV file, do
[apar@adaq2 ~/PREX/prompt/Aggregator/drawPostpan]$ root -l -b -q plotAgg.C'("slugRootfiles/run_slug9.root","run_slug9")'
  • To look at the data by hand, find the output slug aggregated files and do:
root /chafs2/work1/apar/aggRootfiles/slugRootfiles/run_slug11.root
root [1] agg->Draw("reg_asym_usl_mean:run_number","","*")
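
To save that plot from the same session (the output filename is arbitrary; c1 is the default canvas ROOT creates for the Draw):

root [2] c1->SaveAs("reg_asym_usl_vs_run.pdf")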