This repository has been archived by the owner on Jun 5, 2024. It is now read-only.

Update fast_simulator (#54)
* Harmonize with the latest WFSim/epix versions
* Fix.
* Fix.
* Added a draft of the nT macro-clustering based on Micha's study
* Fixed some calculation parameters, added a new example notebook
* Add NR filter before quanta generation (#53)
* New branch version.
* NR-only cut string.
* Fixed import of the NR cut config parameter
* Fixed version label
* nr cut with ER component smaller than 32kev
* nr cut
* field
* change er threshold
* add nr to bin run_epix
* Add NR cut info to README
* store_true for NR cut

Co-authored-by: Pavel Kavrigin <[email protected]>
Co-authored-by: Shenyang Shi <[email protected]>
Co-authored-by: ramirezdiego <[email protected]>

* Update HISTORY for release
* Bump version: 0.2.2 → 0.3.0
* Set arg defaults (#55)
* set nr_only cut default to false
* add error for invalid input format
* Save primary particle position (#56)
* Added x_pri, y_pri, z_pri to the output
* Debugged x_pri, y_pri, z_pri recording
* WFSim requirements

Co-authored-by: Diego Ramírez García <[email protected]>

* Update HISTORY for release
* Bump version: 0.3.0 → 0.3.1
* Fix the notebook + fix cS2 calculation.
* First implementation of BBF quanta generator (#57)
* Added BBF yield quanta generation in epix
* Fix white space
* updated readme
* New intermediate results as defaults
* Nex/Ni ratio for NR is computed inside and does not need to be provided
* update nr parameters
* add back fixed parameters in BBF fitting
* Correct syntax

Co-authored-by: Zihao Xu <[email protected]>
Co-authored-by: Diego Ramírez García <[email protected]>

* Update HISTORY for release
* Bump version: 0.3.1 → 0.3.2
* fix conflict with master
* removing import simulator for merging with master
* adding simulator to init

Co-authored-by: Pavel Kavrigin <[email protected]>
Co-authored-by: Shenyang Shi <[email protected]>
Co-authored-by: ramirezdiego <[email protected]>
Co-authored-by: Andrii Terliuk <[email protected]>
Co-authored-by: Zihao Xu <[email protected]>
Co-authored-by: Giovanni Volta <[email protected]>
7 people authored Sep 29, 2022
1 parent f27f1b5 commit 442274a
Showing 18 changed files with 1,496 additions and 714 deletions.
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
- current_version = 0.2.2
+ current_version = 0.3.2
files = setup.py epix/__init__.py
commit = True
tag = True
14 changes: 14 additions & 0 deletions HISTORY.md
@@ -1,3 +1,17 @@
0.3.2 (2022-08-15)
==================
* First implementation of BBF quanta generator (#57)

0.3.1 (2022-06-15)
==================
* Save primary particle position (#56)
* Set arg defaults (#55)

0.3.0 (2022-05-20)
==================
* Add NR filter before quanta generation (#53)
* Update energy selection for Kr83m model (#52)

0.2.2 (2021-10-31)
==================
* Fixed field map loading (#50)
3 changes: 3 additions & 0 deletions README.md
@@ -38,7 +38,10 @@ The other keyword arguments are:
| `--MicroSeparationTime` | Clustering time (ns) | `10` |
| `--TagClusterBy` | Decide whether to tag the cluster (particle type, energy-depositing process) according to the first interaction in it (`time`) or the most energetic one (`energy`) | `energy` |
| `--MaxDelay` | Time after which we cut the rest of the event (ns) | `1e7` |
| `--YieldModel` | Model for yield/quanta generation (`nest` or `bbf`) | `nest` |
| `--SourceRate` | Event rate for event separation<br /> - `0` for no time shift (G4 time remains)<br /> - `-1` for clean time shift between events<br /> - `>0` (Hz) for random spacing | `0` |
| `--CutNeutron` | Add to keep only nuclear-recoil events (maximum ER energy deposit of 10 keV) | `false` |
| `--JobNumber` | Job number in full chain simulation. Offset is computed as `JobNumber * n_simulated_events/SourceRate`, where `n_simulated_events` is read from file. | `0` |
| `--OutputPath` | Output file path | Same directory as input file |
| `--Debug` | Tell epix if you want timing outputs | `false` |
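
The `--SourceRate` options above can be illustrated with a short, hypothetical helper (not part of epix; units and the helper name are illustrative only): `0` keeps the Geant4 times, `-1` spaces events evenly, and a positive rate draws random times.

```python
import random

def spread_event_times(n_events, source_rate, seed=0):
    """Sketch of the --SourceRate behavior (illustrative, not epix code):
    0 -> keep G4 times, -1 -> clean even spacing, >0 -> random times at rate (Hz)."""
    if source_rate == 0:
        return None  # G4 time remains, no shift applied
    if source_rate == -1:
        return [float(i) for i in range(n_events)]  # clean, even spacing (arbitrary unit)
    rng = random.Random(seed)
    total_time = n_events / source_rate  # interval covered at the given rate
    return sorted(rng.uniform(0, total_time) for _ in range(n_events))
```

The actual offsets used by epix also fold in `--JobNumber`, as described in the table above.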

8 changes: 6 additions & 2 deletions bin/run_epix
@@ -19,6 +19,9 @@ def pars_args():
parser.add_argument('--CutOnEventid', dest='cut_by_eventid',
action='store_true', default=False,
                        help='If true, event start/stop acts on eventid instead of rows.'),
parser.add_argument('--CutNeutron', dest='nr_only',
action='store_true', default=False,
                        help='Keep events containing NR with an ER component smaller than 10 keV'),
parser.add_argument('--EntryStart', dest='entry_start', type=int,
action='store',
                        help='First event to be read. Default is zero.'),
@@ -44,6 +47,8 @@ def pars_args():
action='store', default=0,
help='Event rate for event separation. 0 (default) means no time shift is applied for the '
'different events. Use -1 for clean spacing or give a rate >0 to space events randomly.')
    parser.add_argument('--YieldModel', dest='yield', type=str,
                        default='nest', help='Switch between the BBF and NEST yield models')
parser.add_argument('--JobNumber', dest='job_number', type=int,
action='store', default=0,
help='Job number in full chain simulation. Offset is computed as '
@@ -55,8 +60,7 @@
'in the same dir as the input file.'))
parser.add_argument('--Debug', dest='debug',
action='store_true', default=False,
                        help='If specified, additional information is printed to the console.')

args = parser.parse_args(sys.argv[1:])
return args
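
The `--CutNeutron` and `--Debug` flags above use argparse's `store_true` action, so they default to `False` and flip to `True` only when passed. A minimal stand-alone sketch of that pattern (reduced to two of the flags shown in the diff):

```python
import argparse

def make_parser():
    """Minimal sketch of the store_true flags in bin/run_epix."""
    parser = argparse.ArgumentParser()
    parser.add_argument('--CutNeutron', dest='nr_only',
                        action='store_true', default=False,
                        help='Keep only events whose ER component is below 10 keV')
    parser.add_argument('--Debug', dest='debug',
                        action='store_true', default=False,
                        help='Print additional information to the console')
    return parser

args = make_parser().parse_args(['--CutNeutron'])
# args.nr_only is now True, while args.debug keeps its default of False
```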
2 changes: 1 addition & 1 deletion configs/example_xenonnt.ini
@@ -19,4 +19,4 @@ xe_density = 2.862
[GasPhase]
to_be_stored = False
electric_field = 200
xe_density = 0.0176
3 changes: 2 additions & 1 deletion epix/__init__.py
@@ -1,4 +1,4 @@
- __version__ = '0.2.2'
+ __version__ = '0.3.2'

from .common import *
from .io import *
@@ -9,4 +9,5 @@
from .event_separation import *
from .detectors import *
from .run_epix import *
from .quantgen_wrapper import BBF_quanta_generator
from .simulator import *
81 changes: 56 additions & 25 deletions epix/io.py
@@ -90,6 +90,7 @@ def __init__(self,
outer_cylinder=None,
kwargs={},
cut_by_eventid=False,
cut_nr_only=False,
):

self.directory = directory
@@ -98,21 +99,23 @@ def __init__(self,
self.outer_cylinder = outer_cylinder
self.kwargs = kwargs
self.cut_by_eventid = cut_by_eventid
self.cut_nr_only = cut_nr_only

self.file = os.path.join(self.directory, self.file_name)

self.column_names = ["x", "y", "z",
"t", "ed",
"type", "trackid",
"parenttype", "parentid",
"creaproc", "edproc"]

#Prepare cut for root and csv case
if self.outer_cylinder:
self.cut_string = (f'(r < {self.outer_cylinder["max_r"]})'
f' & ((zp >= {self.outer_cylinder["min_z"] * 10}) & (zp < {self.outer_cylinder["max_z"] * 10}))')
else:
self.cut_string = None

def load_file(self):
"""
Function which reads a root or csv file and removes
@@ -128,6 +131,8 @@ def load_file(self):
interactions, n_simulated_events, start, stop = self._load_root_file()
elif self.file.endswith(".csv"):
interactions, n_simulated_events, start, stop = self._load_csv_file()
else:
            raise ValueError(f'Cannot load events from file "{self.file}": .root or .csv file needed.')

if np.any(interactions['ed'] < 0):
warnings.warn('At least one of the energy deposits is negative!')
@@ -140,15 +145,21 @@ def load_file(self):
m = m & m2
interactions = interactions[m]

if self.cut_nr_only:
m = ((interactions['type'] == "neutron")&(interactions['edproc'] == "hadElastic")) | (interactions['edproc'] == "ionIoni")
e_dep_er = ak.sum(interactions[~m]['ed'], axis=1)
e_dep_nr = ak.sum(interactions[m]['ed'], axis=1)
interactions = interactions[(e_dep_er<10) & (e_dep_nr>0)]

# Removing all events with no interactions:
m = ak.num(interactions['ed']) > 0
interactions = interactions[m]

return interactions, n_simulated_events
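
The NR-only cut above tags interactions as nuclear recoil when they are elastic neutron scatters or `ionIoni` deposits, sums the NR and ER energies per event, and keeps events whose ER sum is below 10 keV while the NR sum is positive. A simplified per-event sketch of the same selection, using plain Python lists of dicts in place of awkward arrays (field names taken from the diff):

```python
def nr_only_cut(events, er_threshold=10.0):
    """Keep events with a nuclear-recoil component and an
    electronic-recoil energy sum below er_threshold (keV).
    Illustrative reimplementation of epix's cut_nr_only logic."""
    kept = []
    for interactions in events:
        # NR deposits: elastic neutron scatters or ionIoni energy deposits
        e_nr = sum(i['ed'] for i in interactions
                   if (i['type'] == 'neutron' and i['edproc'] == 'hadElastic')
                   or i['edproc'] == 'ionIoni')
        # Everything else counts toward the ER component
        e_er = sum(i['ed'] for i in interactions) - e_nr
        if e_er < er_threshold and e_nr > 0:
            kept.append(interactions)
    return kept
```

In epix the same per-event sums are built with `ak.sum(..., axis=1)` on the jagged interaction array.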

def _load_root_file(self):
        """
        Function which reads a root file using uproot,
performs a simple cut and builds an awkward array.
Returns:
@@ -157,7 +168,7 @@ def _load_root_file(self):
start: Index of the first loaded interaction
stop: Index of the last loaded interaction
"""

ttree, n_simulated_events = self._get_ttree()

if self.arg_debug:
@@ -192,10 +203,10 @@ def _load_root_file(self):

# Conversions and parameters to be computed:
alias = {'x': 'xp/10', # converting "geant4" mm to "straxen" cm
                 'y': 'yp/10',
                 'z': 'zp/10',
                 'r': 'sqrt(x**2 + y**2)',
                 't': 'time*10**9'
}
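
The aliases convert Geant4 conventions (mm, s) to straxen conventions (cm, ns) and derive the cylindrical radius. A stand-alone version of the same arithmetic (hypothetical helper name, stdlib only):

```python
import math

def convert_units(xp, yp, zp, time):
    """Convert one Geant4 hit position/time (mm, s) to straxen units (cm, ns),
    mirroring the uproot aliases 'xp/10', 'time*10**9', etc."""
    x, y, z = xp / 10.0, yp / 10.0, zp / 10.0  # mm -> cm
    r = math.sqrt(x**2 + y**2)                 # cylindrical radius in cm
    t = time * 1e9                             # s -> ns
    return x, y, z, r, t
```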

# Read in data, convert mm to cm and perform a first cut if specified:
@@ -207,7 +218,18 @@ def _load_root_file(self):
eventids = ak.broadcast_arrays(eventids['eventid'], interactions['x'])[0]
interactions['evtid'] = eventids

xyz_pri = ttree.arrays(['x_pri', 'y_pri', 'z_pri'],
aliases={'x_pri': 'xp_pri/10',
'y_pri': 'yp_pri/10',
'z_pri': 'zp_pri/10'
},
**self.kwargs)

interactions['x_pri'] = ak.broadcast_arrays(xyz_pri['x_pri'], interactions['x'])[0]
interactions['y_pri'] = ak.broadcast_arrays(xyz_pri['y_pri'], interactions['x'])[0]
interactions['z_pri'] = ak.broadcast_arrays(xyz_pri['z_pri'], interactions['x'])[0]

return interactions, n_simulated_events, start, stop
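
`ak.broadcast_arrays` above replicates the single primary-vertex position of each event across all interactions of that event, so `x_pri` has the same jagged shape as `x`. The same idea with plain lists (illustrative, awkward-free sketch):

```python
def broadcast_per_event(per_event_values, per_interaction_lists):
    """Replicate one value per event across that event's interactions,
    mirroring ak.broadcast_arrays on jagged arrays."""
    return [[v] * len(hits)
            for v, hits in zip(per_event_values, per_interaction_lists)]
```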

def _load_csv_file(self):
"""
@@ -228,7 +250,10 @@ def _load_csv_file(self):
#unit conversion similar to root case
instr_df["x"] = instr_df["xp"]/10
instr_df["y"] = instr_df["yp"]/10
instr_df["z"] = instr_df["zp"]/10
instr_df["x_pri"] = instr_df["xp_pri"]/10
instr_df["y_pri"] = instr_df["yp_pri"]/10
instr_df["z_pri"] = instr_df["zp_pri"]/10
instr_df["r"] = np.sqrt(instr_df["x"]**2 + instr_df["y"]**2)
instr_df["t"] = instr_df["time"]*10**9

@@ -291,20 +316,23 @@ def _awkwardify_df(self, df):
_, evt_offsets = np.unique(df["eventid"], return_counts = True)

dictionary = {"x": reshape_awkward(df["x"].values , evt_offsets),
"y": reshape_awkward(df["y"].values , evt_offsets),
"z": reshape_awkward(df["z"].values , evt_offsets),
"x_pri": reshape_awkward(df["x_pri"].values, evt_offsets),
"y_pri": reshape_awkward(df["y_pri"].values, evt_offsets),
"z_pri": reshape_awkward(df["z_pri"].values, evt_offsets),
"r": reshape_awkward(df["r"].values , evt_offsets),
"t": reshape_awkward(df["t"].values , evt_offsets),
"ed": reshape_awkward(df["ed"].values , evt_offsets),
"type":reshape_awkward(np.array(df["type"], dtype=str) , evt_offsets),
"trackid": reshape_awkward(df["trackid"].values , evt_offsets),
"parenttype": reshape_awkward(np.array(df["parenttype"], dtype=str) , evt_offsets),
"parentid": reshape_awkward(df["parentid"].values , evt_offsets),
"creaproc": reshape_awkward(np.array(df["creaproc"], dtype=str) , evt_offsets),
"edproc": reshape_awkward(np.array(df["edproc"], dtype=str) , evt_offsets),
"evtid": reshape_awkward(df["eventid"].values , evt_offsets),
}

return ak.Array(dictionary)
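
`reshape_awkward` turns each flat per-interaction column into an event-wise jagged array using the per-event counts from `np.unique(df["eventid"], return_counts=True)`. A stdlib-only sketch of that reshaping step:

```python
def reshape_by_counts(flat_values, counts):
    """Split a flat column into per-event sublists given event counts,
    mirroring what reshape_awkward does for awkward arrays."""
    out, start = [], 0
    for n in counts:
        out.append(flat_values[start:start + n])
        start += n
    return out
```

Applied to every column in the dictionary, this yields the jagged event structure that `ak.Array` then wraps.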

# ----------------------
@@ -334,6 +362,9 @@ def awkward_to_wfsim_row_style(interactions):
res['x'][i::2] = awkward_to_flat_numpy(interactions['x'])
res['y'][i::2] = awkward_to_flat_numpy(interactions['y'])
res['z'][i::2] = awkward_to_flat_numpy(interactions['z'])
res['x_pri'][i::2] = awkward_to_flat_numpy(interactions['x_pri'])
res['y_pri'][i::2] = awkward_to_flat_numpy(interactions['y_pri'])
res['z_pri'][i::2] = awkward_to_flat_numpy(interactions['z_pri'])
res['time'][i::2] = awkward_to_flat_numpy(interactions['t'])
res['g4id'][i::2] = awkward_to_flat_numpy(interactions['evtid'])
res['vol_id'][i::2] = awkward_to_flat_numpy(interactions['vol_id'])
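
The `res[...][i::2]` assignments above write two output rows per interaction (in WFSim's row style, presumably one S1-type and one S2-type instruction) by filling even and odd indices with stride-2 slicing. A small sketch of that interleaving pattern:

```python
def interleave_two_channels(a, b):
    """Interleave two equal-length sequences with stride-2 slice assignment,
    as in res['x'][0::2] = ...; res['x'][1::2] = ... ."""
    res = [None] * (2 * len(a))
    res[0::2] = a  # rows written on the i == 0 pass
    res[1::2] = b  # rows written on the i == 1 pass
    return res
```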