sbnana's People

Contributors

brucehoward-physics, bzamorano, cjbacchus, dpmendez, etworcester, etyley, fernandapsihas, gartung, gputnam, grayputnam, henrylay97, ikatza, jacoblarkin, jedori0228, jzennamo, lgarren, marcodeltutto, mastbaum, miquelnebot, petrilloatwork, rhowell42, sfbaylaser, steverdennis, wesketchum, woodtp

sbnana's Issues

Changing Standard Record Requires a Clean Build

This is an issue in the interaction between sbnana and sbnanaobj.

The SRProxy generator code does not automatically refresh after StandardRecord is updated (adding or removing a variable, etc.), so one needs to zap the build directory and rebuild from scratch after any update.

It would be useful if this were not necessary and SRProxy could detect updates automatically.

I believe this is also a regression from the code reorganization.

NuMI XSec 2024 Updated Production Patch: CAFAna Core framework updates

The CAFAna Core framework will want a few updates for this analysis:

  1. CAFAna Trees: enable saving products in a useful ROOT format outside of the Spectrum family
  2. Truth{Var,Cut}: enable using the true neutrinos directly in the main event loop (a rough sketch of a possible interface follows below)

These would also want to be pushed to develop, and likely/hopefully Jaesung and I can use the same commits for both the production patch and develop.
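As an illustration of item 2, here is a minimal sketch of what a TruthVar/TruthCut pair might look like, modeled on the existing Var/Cut pattern. The header path, the ana:: names, and the fields on the true-interaction proxy are assumptions for illustration, not a confirmed interface:

#include "sbnana/CAFAna/Core/Var.h" // assumed home of the Var/Cut machinery

// Hypothetical: a TruthVar acts on a true-interaction record rather than a
// reconstructed slice, mirroring the usual Var pattern.
const ana::TruthVar kTrueNuE([](const caf::SRTrueInteractionProxy* nu)
                             {
                               return nu->E; // assumed true-neutrino-energy field
                             });

// Likewise, a TruthCut selects true interactions directly in the event loop.
const ana::TruthCut kIsTrueCC([](const caf::SRTrueInteractionProxy* nu)
                              {
                                return nu->iscc; // assumed charged-current flag
                              });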

osclib needs to be updated

osclib depends on root v6_22_06a, whereas the new LArSoft release (and sbnanaobj) sets up root v6_22_08b.

Supporting XRootD

When we use an XRootD path (useful for grid jobs) in SpectrumLoader, the file does not pass the following check and is silently dropped, since stat() returns a non-zero value for remote paths.

// In SpectrumLoader's wildcard expansion, only paths that stat() succeeds on
// are kept, which silently drops remote XRootD URLs:
if(stat(p.we_wordv[i], &sb) == 0)
  fileList.push_back(p.we_wordv[i]);

An XRootD file can be used if the stat() call is commented out (though wildcards will still not work for XRootD paths). A possible fix is sketched below.
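One minimal way to support this, assuming we are content to accept XRootD URLs without an existence check, would be to stat() only local paths and pass xroot-style URLs straight through. A sketch against the fragment above, not tested in the actual SpectrumLoader code:

// Keep the stat() existence check for local paths, but pass remote XRootD
// URLs (which stat() cannot see) straight through to the file list.
const std::string path = p.we_wordv[i];
const bool isXRootD = (path.rfind("root://", 0) == 0) ||
                      (path.rfind("xroot://", 0) == 0);
struct stat sb;
if(isXRootD || stat(path.c_str(), &sb) == 0)
  fileList.push_back(path);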

StandardRecord's FwdDeclare.h isn't part of the sbnana product

With a local checkout, this file (and SRProxy.h) are generated as part of the build, and I have them at
localProducts*/sbnana/v09_19_00/include/sbnana/CAFAna/StandardRecord/Proxy/

Whereas after setup sbnana v09_19_00 -qe19:prof, ls $SBNANA_INC/sbnana/CAFAna/StandardRecord/ shows only SREnums.h, with no sign of the Proxy/ subdirectory at all.

I assumed the ups products were essentially made from the appropriate localProducts directory. Is it more complicated than that? Why are these files missed?

Ed correctly points out that we will want this working in time for the next workshop (i.e., in the upcoming release) so users don't have to check out and build the whole package. I am blaming this on something in the packaging step, so assigning Wes, though it's certainly possible that conditions during that build confuse my makefile.

Difference between flat CAFs and CAFAna for chi2 proton plots

Discussion is ongoing in the #sbn_preprod_debugging and #sbn_preprod_debugging_proton channels on Slack. It is possible that there is no real issue here, but we would like a resolution to know whether something needs to change in the 2022A branch or not.

Support flatcafs in concat_cafs

Currently the only way to get a concatted flatcaf is to concat first and then flatten. It would definitely be convenient to be able to go in the other order too.

The code would be somewhat different from what exists for non-flat CAFs, but, at least for straight concatenation with no other modifications, it should be very simple: you literally concatenate the TTrees, for which there is presumably a ROOT function (see the sketch below).
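For the straight case, ROOT's TChain can already do the concatenation. A minimal sketch, assuming the flat tree is named recTree (worth checking against the actual flatcaf layout) and ignoring the POT/metadata trees that a real implementation would also need to carry along:

#include "TChain.h"

void concat_flatcafs()
{
  // Chain the per-file flat trees together...
  TChain chain("recTree");
  chain.Add("flatcaf_a.root");
  chain.Add("flatcaf_b.root");

  // ...and write the concatenated tree into a single output file.
  chain.Merge("flatcaf_concat.root");
}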

Unable to compile sbnana

I'm trying to compile sbnana on its own, but can't. I tried with v09_16_00, v09_17_00, and v09_17_02; they all fail with the same error.

I'm following the typical recipe of

source localProducts*/setup  # set up the local products area
mrbsetenv                    # set up the development/build environment
mrb i                        # build and install

Is this package supposed to be built differently?

The error in all attempts looks something like this:

INFO Parsing source file "/cvmfs/sbnd.opensciencegrid.org/products/sbnd/sbnanaobj/v09_16_03/include/sbnanaobj/StandardRecord/StandardRecord.h" ...
In file included from /cvmfs/sbnd.opensciencegrid.org/products/sbnd/sbnanaobj/v09_16_03/include/sbnanaobj/StandardRecord/StandardRecord.h:10:
In file included from /cvmfs/sbnd.opensciencegrid.org/products/sbnd/sbnanaobj/v09_16_03/include/sbnanaobj/StandardRecord/SRCRTHit.h:7:
In file included from /cvmfs/sbnd.opensciencegrid.org/products/sbnd/sbnanaobj/v09_16_03/include/sbnanaobj/StandardRecord/SRVector3D.h:9:
In file included from /cvmfs/larsoft.opensciencegrid.org/products/root/v6_22_06a/Linux64bit+3.10-2.17-e19-p383b-prof/include/TVector3.h:16:
In file included from /cvmfs/larsoft.opensciencegrid.org/products/root/v6_22_06a/Linux64bit+3.10-2.17-e19-p383b-prof/include/TMatrix.h:23:
In file included from /cvmfs/larsoft.opensciencegrid.org/products/root/v6_22_06a/Linux64bit+3.10-2.17-e19-p383b-prof/include/TMatrixF.h:20:
In file included from /cvmfs/larsoft.opensciencegrid.org/products/root/v6_22_06a/Linux64bit+3.10-2.17-e19-p383b-prof/include/TMatrixT.h:23:
In file included from /cvmfs/larsoft.opensciencegrid.org/products/root/v6_22_06a/Linux64bit+3.10-2.17-e19-p383b-prof/include/TMatrixTBase.h:73:
In file included from /cvmfs/larsoft.opensciencegrid.org/products/root/v6_22_06a/Linux64bit+3.10-2.17-e19-p383b-prof/include/TString.h:28:
/cvmfs/larsoft.opensciencegrid.org/products/root/v6_22_06a/Linux64bit+3.10-2.17-e19-p383b-prof/include/ROOT/RStringView.hxx:19:10: fatal error: 'string_view' file not found
#include <string_view>
         ^~~~~~~~~~~~~
1 error generated.
Traceback (most recent call last):
  File "/cvmfs/sbn.opensciencegrid.org/products/sbn/srproxy/v00.21//bin/gen_srproxy", line 475, in <module>
    main()
  File "/cvmfs/sbn.opensciencegrid.org/products/sbn/srproxy/v00.21//bin/gen_srproxy", line 433, in main
    decls = pygccxml.parser.parse([input_header], config)
  File "/cvmfs/larsoft.opensciencegrid.org/products/pygccxml/v2_1_0/lib/python3.8/site-packages/pygccxml-2.1.0-py3.8.egg/pygccxml/parser/__init__.py", line 51, in parse
  File "/cvmfs/larsoft.opensciencegrid.org/products/pygccxml/v2_1_0/lib/python3.8/site-packages/pygccxml-2.1.0-py3.8.egg/pygccxml/parser/project_reader.py", line 264, in read_files
  File "/cvmfs/larsoft.opensciencegrid.org/products/pygccxml/v2_1_0/lib/python3.8/site-packages/pygccxml-2.1.0-py3.8.egg/pygccxml/parser/project_reader.py", line 292, in __parse_file_by_file
  File "/cvmfs/larsoft.opensciencegrid.org/products/pygccxml/v2_1_0/lib/python3.8/site-packages/pygccxml-2.1.0-py3.8.egg/pygccxml/parser/source_reader.py", line 299, in read_file
  File "/cvmfs/larsoft.opensciencegrid.org/products/pygccxml/v2_1_0/lib/python3.8/site-packages/pygccxml-2.1.0-py3.8.egg/pygccxml/parser/source_reader.py", line 318, in read_cpp_source_file
  File "/cvmfs/larsoft.opensciencegrid.org/products/pygccxml/v2_1_0/lib/python3.8/site-packages/pygccxml-2.1.0-py3.8.egg/pygccxml/parser/source_reader.py", line 264, in create_xml_file
RuntimeError: Error occurred while running CASTXML:  status:1
make[2]: *** [sbnana/sbnana/CAFAna/StandardRecord/Proxy/SRProxy.cxx] Error 1
make[1]: *** [sbnana/sbnana/CAFAna/StandardRecord/Proxy/CMakeFiles/StandardRecordProxy.dir/all] Error 2
make: *** [all] Error 2

------------------------------------
ERROR: Stage install / package failed.
------------------------------------

Scripting for locating and displaying specific events

We should provide a couple of simple scripts:

A - take a dataset name and a (run, subrun, event) tuple and report which file the event is found in
B - given a list of such tuples in a text file, use A to look each one up and launch an event display navigated to the relevant event, or potentially take other actions

Outdated Qualifiers e20 vs. e26

sbnana is out of date with respect to the most recent qualifier, e26. It seems osclib also needs to be updated to the new qualifiers; otherwise further development can't be done.

sbnana releases should include dictionaries for pycafana

Filing this issue against myself so I don't forget.

Interactively, one would run build_dicts.sh. This is not done as part of the build scripts because it is slow and most people don't need it. But we should try to detect that we are building a release, and create and install these libraries in that case.

Potential Issue in (Livetime) Exposure Accounting with CAFAna

I believe there is an issue in the Livetime accounting in the CAFs. Tagging @jedori0228 as a watcher, since Jack pointed out a message from Jaesung that got me thinking about this (I would also tag Jack, but I don't see him in the users list).

There are a few possible philosophies I can think of for generating neutrino-beam MC samples. The most straightforward to understand would be to literally simulate spills whether or not there were neutrino interactions near the detector. However, what I understand we actually do is pick some volume and simulate a given number of spills that each have an interaction in this volume (which, by the way, we sometimes filter down to just the events with neutrinos in the active volume when the total generation volume is a bit larger, at least in ICARUS).

I think/guess (I'm not an expert here, but I think) that the POT assigned to a given event must depend on the volume size (and possibly the materials?), since a neutrino interaction in our considered volume may take, on average, more than one spill's worth of POT.

For example, if you expect one neutrino interaction in the simulated volume every 5 spills, then the POT assigned to a given event would be something like 5 times the nominal POTPerSpill (on average), no? Or, putting it another way, getting an event in half a detector "requires more POT" than getting an event in a full detector.
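In symbols (my reading of the accounting, not a verified formula): if p is the expected number of interactions in the simulated volume per spill, then on average

    POT/event = POTPerSpill / p

so p = 0.2 (one interaction per five spills) gives POT/event = 5 × POTPerSpill, and halving the volume roughly halves p and doubles the POT per event.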

As a partial check of this, I did a test run of 50 events in ICARUS with the volDetEnclosure and volCryostat volumes, where I think the DetEnclosure volume is the larger. Asking the GENIEHelper to print out some details (I slightly moved and uncommented a printout that was in the Stop() function), I saw the POTPerSpill printed as 6E13, the expected setting. However, the SpillExposure averaged over these 50 events was significantly higher than this when using the DetEnclosure option, and higher still when using the (smaller) Cryostat option.

I'm not an expert, but this is my observation, and it fits with what @jedori0228 was seeing (more POT/event reported by a CAF file than the nominal POTPerSpill).

Now, this points us to where I believe there is at least one exposure-accounting bug. The Livetime is tied to fNGenEvt. However, if what I describe above is correct, then the POT/event will actually differ from the POT/spill, and it is the number of beam spills that we want to know for scaling the cosmics, no? Maybe one can scale the Livetime by the ratio (POT/event) / (nominal POTPerSpill) to recover this in the meantime? A sketch of that correction follows below.
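A minimal sketch of that interim correction, assuming the total POT, number of generated events, and nominal POT per spill are all available from the CAF accounting (the function name and signature here are purely illustrative):

// Hypothetical interim fix: rescale the recorded livetime, which is tied to
// the number of generated events, into an effective number of beam spills.
double CorrectedLivetime(double livetime, double totalPOT, long nGenEvt,
                         double potPerSpill = 6e13 /* nominal, assumed */)
{
  const double potPerEvent = totalPOT / nGenEvt;  // average POT per generated event
  return livetime * (potPerEvent / potPerSpill); // livetime in spill-equivalents
}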

Side/follow-up question: does running with an extra generation area and then filtering run us into further problems? What if it changes the average nuclear density (cross section) or something? Here I fully defer to the experts…

I don't know who the right experts to tag here are, but I tag @cjbackhouse, since I understand he set the Livetime to be NGenEvt, and I tag @PetrilloAtWork, as he has been involved in some fashion with the gen scripts and committed the file using volDetEnclosure. Please tag others as needed if you know better people to tag!

[[ I’m putting this Issue in sbnana but if this is better placed in sbncode or elsewhere let me know! ]]
