stir-gate-connection's People

Contributors: francescaleek, kristhielemans, robbietuk, robert-prescientimaging, samdporter
stir-gate-connection's Issues

Hard wired voxel sizes

# to get mu map (REQUIRES GATE > 8.2)
/gate/actor/addActor MuMapActor getMuMap
/gate/actor/getMuMap/attachTo VoxPhantom
/gate/actor/getMuMap/save images/output/Phantom{SimuId}.hdr
/gate/actor/getMuMap/setVoxelSize 4 4 4.0625 mm
/gate/actor/getMuMap/setResolution 110 110 64
/gate/actor/getMuMap/setEnergy 511 keV
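Rather than hard-wiring the values above, they could be read from the phantom's Interfile header. A minimal sketch, assuming standard Interfile keys and using a stand-in header file created on the spot:

```shell
# Sketch: read matrix size and voxel size from a STIR Interfile header
# instead of hard-wiring them. phantom.hv is a stand-in file created
# here purely for illustration; the keys are standard Interfile ones.
cat > phantom.hv <<'EOF'
!matrix size [1] := 110
scaling factor (mm/pixel) [1] := 4
EOF
Res1=$(sed -n 's/^!matrix size \[1\] := //p' phantom.hv)
Vox1=$(sed -n 's/^scaling factor (mm\/pixel) \[1\] := //p' phantom.hv)
# These values could then be substituted into the GATE macro lines above.
echo "resolution[1]=${Res1}, voxel size[1]=${Vox1} mm"
```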

Suggestion: Fix file structure to make more sense

Currently the file structure is confusing. Everything is run from STIR-GATE-Connection/VoxelisedSimulation/, where the main scripts live. The project "Home" directory contains ExampleScanners/, ExamplePhantoms/ and VoxelisedSimulation/.

I propose that the content of VoxelisedSimulation/ be moved into the main directory, which would continue to contain ExampleScanners/ and ExamplePhantoms/.

There is no rush for this, as it would break a lot of backwards compatibility. Maybe aim for v2.0?

GATE output images shifted, does not allow for forward projection

Example: I am trying to estimate scatter. I am using an attenuation map output by GATE (.hdr), but I am running into the following error.

ERROR: ProjMatrixByBinUsingRayTracing sadly doesn't support shifted x/y origin yet
terminate called after throwing an instance of 'std::string'

The image has the following properties:

AttenuationImage=Phantom1-MuMap.hdr

list_image_info $AttenuationImage

WARNING: Unable to determine patient position. Internally this will generally be handled by assuming HFS

Origin in mm {z,y,x}    :{-150.42, 210, 210}
Voxel-size in mm {z,y,x}:{3.27, 3, 3}
Min_indices {z,y,x}     :{0, -70, -70}
Max_indices {z,y,x}     :{46, 70, 70}
Number of voxels {z,y,x}:{47, 141, 141}
Physical coordinate of first index in mm {z,y,x} :{-150.42, 0, 0}
Physical coordinate of last index in mm {z,y,x}  :{0, 420, 420}
Physical coordinate of first edge in mm {z,y,x} :{-152.055, -1.5, -1.5}
Physical coordinate of last edge in mm {z,y,x}  :{1.63499, 421.5, 421.5}
Image min: 0.00011125
Image max: 0.130242
Image sum: 29608.8

I have a temporary workaround to fix this translation:

## Create zeros image with same origin as GATE input
stir_math --times-scalar 0.0 --including-first zeros.hv $GATE_Atten_input
## Add the GATE output mu-map to the template
stir_math --add STIR_AttenuationImage zeros.hv $AttenuationImage

It is correct to use the GATE output as the mu-map ($GATE_Atten_input) in the simulation setup; see VoxelisedSimulation/GATESubMacros/AttenuationConv.dat and issue #44.

source this_SGC.sh from a different directory issue

Running sh ../this_SGC.sh from the VoxelisedSimulation directory sets SGCPATH to /Path/To/SGC (observed via echo $SGCPATH in the script).

However, running source ../this_SGC.sh sets SGCPATH to /Path/To/SGC/VoxelisedSimulation.
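The difference comes from $0 changing meaning when a script is sourced. A sourcing-safe sketch (assuming bash): ${BASH_SOURCE[0]} refers to the script file both when it is executed and when it is sourced, whereas $0 is the calling shell in the sourced case.

```shell
# Sketch: derive SGCPATH from the script's own location so that both
# `sh this_SGC.sh` and `source this_SGC.sh` give the same result.
# ${BASH_SOURCE[0]} is the script file even when sourced; $0 is a fallback.
SGCPATH="$(cd "$(dirname "${BASH_SOURCE[0]:-$0}")" && pwd)"
export SGCPATH
echo "$SGCPATH"
```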

stir_math unable to read .hdr outputs from GATE simulations

stir_math is unable to read the .hdr/.img outputs of GATE

VoxelisedSimulation/images/output/CreateActivity.sh https://github.com/UCL/STIR-GATE-Connection/blob/master/VoxelisedSimulation/images/output/CreateActivity.sh
and VoxelisedSimulation/images/output/CreateAttenuation.sh https://github.com/UCL/STIR-GATE-Connection/blob/master/VoxelisedSimulation/images/output/CreateAttenuation.sh
were written to convert the true source image and the GATE attenuation map into Interfile format.

However, I get the following error:

stir_math attenuation.hv Phantomtest-MuMap.hdr

Available input file formats:
Interfile

ERROR: no file format found that can read file 'Phantomtest-MuMap.hdr'
libc++abi.dylib: terminating with uncaught exception of type std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >
zsh: abort      stir_math attenuation.hv Phantomtest-MuMap.hdr

where Phantomtest-MuMap.hdr exists and is an output from a test GATE simulation, but is a binary file with content:

5c01 0000 6473 7200 0100 0000 0101 2f68
6f6d 6500 0100 0000 30ab 3617 d37f 0000
0000 0000 0000 7230 0400 6e00 6e00 4000
0100 0000 0000 0000 6d6d 000c 4271 2f63
6300 a01a 0100 1000 0000 0000 0000 0000
0000 8040 0000 8040 0000 8240 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 803f 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 ff7f 0000
0000 0000 696d 6167 6500 12e4 fe7f 0000
d6f9 e30b 0100 0000 6038 12e4 fe7f 0000
0000 0000 0000 0000 a040 5117 3800 0000
0100 0000 0031 0000 3800 0000 0000 0000
3800 0000 0000 0000 0039 12e4 3800 0000
0100 0000 6e6f 6e65 0000 0000 2100 0000
0000 0000 0000 0000 6008 5317 306e 6f6e
6500 5117 d37f 006e 6f6e 6500 0000 0000
306e 6f6e 6500 0000 0039 126e 6f6e 6500
a7f8 6714 016e 6f6e 6500 0000 0000 f06e
6f6e 6500 0000 0000 006e 6f00 0000 0000
0000 0000 0000 0000 0000 0000 0000 0000
0000 0000 0000 0000 0000 0000 

I have added this issue to this project as I believe it is a GATE issue, not a STIR one.

Scripts are dependent on running from the SGC VoxelisedSimulation directory

Related to #56 (3) [PATHS... and setting environment variables. The idea here is to define things like $SGC_HOME as the STIR-GATE-Connection project directory and then set all paths from there. In some places paths are defined relatively, e.g. ../PATH/TO/FILE]

Currently, if scripts are run from an alternative directory, or away from the SGC project entirely, a number of them will fail due to path problems.

It might be of interest to add a sourceable script that sets the environment variable SGC_Home, or let the user define it. Then all subscript calls can be referenced from SGC_Home, e.g. ${SGC_Home}/VoxelisedSimulation/ExampleSTIR-GATE.sh

Any ideas? I haven't done this before for a project.
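One possible shape for that sourceable script (SGC_Home is the hypothetical variable name from the suggestion above; the sub-script path is illustrative):

```shell
# Sketch: honour a user-defined SGC_Home, otherwise derive it from this
# script's own location, then build all sub-script paths from it.
: "${SGC_Home:=$(cd "$(dirname "${BASH_SOURCE[0]:-$0}")" && pwd)}"
export SGC_Home
# Sub-script calls then become location-independent, e.g.:
# sh "${SGC_Home}/VoxelisedSimulation/ExampleSTIR-GATE.sh"
echo "SGC_Home=${SGC_Home}"
```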

Things to do

    • Change the mMR template to the "real" one (8x8 crystals, not 8x9) (needs STIR mods).
    • When creating XCAT, find XCAT dimensions from the XCAT parameter file as opposed to hard-wiring.
    • Find GATE image specifications from Interfile headers.
      This is used in source.mac in the /gate/source/SourcePhantom/setPosition.
    • Find GATE sinogram specifications from STIR Interfile header or vice versa.
    • Find durations and energy windows from Interfile headers.
    • Add GE D690 root header. Will allow for unlisting.
    • Allow for STIR parameter files generated and simulated using GATE.
    • Add testing capability and check on script errors.
    • Tie the cluster $TASK_ID to the start_time and end_time of the GATE simulation. Perhaps have a settable time period (e.g. 0.5s, 1s, 2s). Note this information is recorded in root files and is related to radioactive decay.

Distance map is created at each simulation

When running parallel cluster jobs, this will dump for each script (and then, I think, be reread, but this needs testing).

#comment out the following line once created
/gate/VoxPhantom/geometry/buildAndDumpDistanceTransfo images/output/dmap.hdr
/gate/VoxPhantom/geometry/distanceMap images/output/dmap.hdr

The simulations do read these files; they may be overwritten and accessed simultaneously by different sub-jobs, probably resulting in problems.

If this is an issue, there may need to be a setup/preparation script.

Need executable errors to exit scripts

Some scripts can error due to bad inputs (or older versions of STIR). These scripts will continue, even though an output of the failed function is required later. This leads to much unnecessary computation in some cases and makes debugging harder. An example is:

## multiply ones with the norm factors to get a sino
echo "apply_normfactors3D"
apply_normfactors3D $OutputFilename $factors ones.hs 1 1 $eff_iters

where apply_normfactors3D fails due to an older version of STIR. Yet the script continues, but there are other examples.
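A minimal sketch of how the scripts could stop at the first failed tool instead of continuing with missing outputs. set -e aborts on any non-zero exit status, and run_step is a hypothetical helper that adds a clearer diagnostic:

```shell
#!/usr/bin/env bash
# Sketch: abort at the first failing step rather than continuing with
# missing outputs. set -e exits on any non-zero status.
set -euo pipefail

run_step() {
  # Run a command; on failure, print a diagnostic and exit with its status.
  "$@" || { status=$?; echo "ERROR: '$*' failed (exit $status)" >&2; exit $status; }
}

# In the real script this would wrap the fragile calls, e.g.:
# run_step apply_normfactors3D "$OutputFilename" "$factors" ones.hs 1 1 "$eff_iters"
run_step echo "all steps succeeded"
```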

EstimateSTIRGATENorm mistakenly adds 1s to the normalisation sinogram

stir_math -s --including-first --add-scalar 1 ${OutputFilename} ${OutputFilename}"_span1.hs"

This is an attempt at renaming the files from ${OutputFilename}"_span1.hs" to ${OutputFilename}".hs", because no output template (for SSRB reformatting) has been specified. Is there a better way to do this in STIR? @KrisThielemans
If I were to correct this in this form, I would use

stir_math -s --including-first --add-scalar 0 ${OutputFilename} ${OutputFilename}"_span1.hs" 

@francescaleek If you use the *span1.hs sinogram, there won't be a problem; otherwise there may well be.

normalisation and discretisation

  • The phantom used for the normalisation uses 1 sample per voxel, leading to a jagged boundary. Things will work better when using more samples here:
    Z number of samples to take per voxel := 1
    Y number of samples to take per voxel := 1
    X number of samples to take per voxel := 1
  • Normalisation factors depend on the projector. We currently use the default projector, which is the ray-tracing matrix with a single LOR. This has its own discretisation errors, which don't really match what GATE does. The estimated efficiencies therefore try to compensate for this, but this means they are not very good for smaller phantoms. It is therefore recommended to use a par file setting number of rays in tangential direction to trace for each bin := 5 or similar here:
    forward_project ${model_data} ${FOVCylindricalActivityVolumeFilename} ${MeasuredData} > /dev/null 2>&1

Note that the doc should make it clear that the norm factors are for the given projector.

Suggestion: lm_to_projdata_template.par can be removed and the real parameter file could be created in UnlistRoot.sh

As discussed in #41 (comment)

Currently the template is used and {variable} strings are replaced using sed commands.

#============= create parameter file from template =============
cp Templates/lm_to_projdata_template.par lm_to_projdata_${ROOT_FILENAME}.par
sed -i.bak "s/{ROOT_FILENAME}/${ROOT_FILENAME}/g" lm_to_projdata_${ROOT_FILENAME}.par
sed -i.bak "s/{SinogramID}/${SinogramID}/g" lm_to_projdata_${ROOT_FILENAME}.par
sed -i.bak "s|{UNLISTINGDIRECTORY}|${UnlistingDirectory}|g" lm_to_projdata_${ROOT_FILENAME}.par

The actual parameter file could be created directly, as only three lines are important in the whole list-mode parameter file:

lm_to_projdata Parameters:=
  input file := {ROOT_FILENAME}.hroot
  output filename prefix := {UNLISTINGDIRECTORY}/{SinogramID}
  template_projdata := UnlistingTemplates/STIR_scanner.hs
End := 

ComputePoissonDataCorrections will fail if not called from `VoxelisedSimulation` directory

This is because of

sh SubScripts/EstimateRandomsFromDelayed.sh ${randoms3d} ${DelayedData}

The fix for this is possibly to introduce an environment variable, e.g. ${SGCPATH} or ${SGC_HOME}, and then set all SGC paths from this location. This will allow the SGC scripts to be run outside the SGC project. Related to #56 (item 3)

Suggestion: Populate a common template .hroot from STIR sinogram template header and GATE scanner geometry

The information present in the hroots duplicates what is present in the STIR scanner template and the GATE scanner geometry.

originating system
Number of rings
Number of detectors per ring
Inner ring diameter (cm)
Average depth of interaction (cm) 
Distance between rings (cm)    
Default bin size (cm)                   
View offset (degrees)       
Maximum number of non-arc-corrected bins

Information such as number of Rsectors, number of modules_X, number of submodules_X, number of crystals_X, etc. are all present in the GATE scanner geometry.

These fields can easily be populated using sed commands.
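A minimal sketch of that sed-based population. The placeholder names ({NumberOfRings}, {DetectorsPerRing}) and the values are illustrative; in practice they would be parsed from the GATE geometry macro and the STIR scanner template:

```shell
# Sketch: fill a common .hroot template via sed. Placeholder names and
# values here are illustrative stand-ins, not the project's actual ones.
NumberOfRings=64
DetectorsPerRing=504
cat > template.hroot <<'EOF'
Number of rings := {NumberOfRings}
Number of detectors per ring := {DetectorsPerRing}
EOF
sed -e "s/{NumberOfRings}/${NumberOfRings}/g" \
    -e "s/{DetectorsPerRing}/${DetectorsPerRing}/g" \
    template.hroot > scanner.hroot
cat scanner.hroot
```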

v1.1 TODO

Since v1.0 has been created, we are preparing for the next (sub-)release. Some of these issues/PRs are left over from the previous #37

  • mCT #26

  • Scatter Estimation #50

  • PATHS... and setting environment variables. The idea here is to define things like $SGC_HOME as the STIR-GATE-Connection project directory and then set all paths from there. In some places paths are defined relatively, e.g. ../PATH/TO/FILE #74

  • AttenuationConv.dat #44

  • Energy information should not be hardcoded #54

  • General corrections computation script #70

  • Move corrections computation scripts out of VoxelisedSimulation #75

  • Reconstruction (OSEM) example #76

Pending refactors and removals

This is a list of directories and files that do not make sense or are of the wrong format/style.

This will be updated in future.

Refactors

  • VoxelisedXCAT/ -> VoxelisedSimulation/?

  • Directories to CamelCase format.

Removals

  • VoxelisedXCAT/activity.h33 and VoxelisedXCAT/attenuation.h33 are unnecessary now.

AttenuationConv data should be in units of cm-1

In

10
0 0 Air
114 114 Lung
115 383 Body
384 384 Adipose
385 394 Body
395 395 Muscle
396 402 Body
403 403 Body
404 420 Body
421 1000 SpineBone

the attenuation coefficient ranges are not intuitive. STIR uses attenuation coefficient units of cm^-1.
The input attenuation voxelised phantom for this project should use this attenuation coefficient unit for future STIR processing.

Body tissue has an approximate attenuation coefficient of 0.096 cm^-1 (i.e. water).
In

stir_math --including-first --times-scalar 10000 $AttenuationFilenameGATE".hv" $AttenuationFilename

the Interfile attenuation image is multiplied by 10,000 to create a GATE Interfile image, which would correspond to an AttenuationConv value of 960 for body tissue/water. This would correspond to SpineBone in GATE.

Two options:

  1. Change the multiplier in

    stir_math --including-first --times-scalar 10000 $AttenuationFilenameGATE".hv" $AttenuationFilename
    to not be 10,000 but some scaling value that maps between the values in an XCAT phantom and the expected GATE materials.

  2. Change the values in

    10
    0 0 Air
    114 114 Lung
    115 383 Body
    384 384 Adipose
    385 394 Body
    395 395 Muscle
    396 402 Body
    403 403 Body
    404 420 Body
    421 1000 SpineBone

    to better align the multiplied coefficients.

In my opinion, 2. is the better method.
To do this, we need to use something like XCAT to ascertain the attenuation coefficient ranges of different body materials, multiply by 10,000, and update the AttenuationConv.dat file. Should be done for #37

Suggestion: Parameterise the size of the world based on the size of the phantoms/scanner

A voxelised phantom (activity or mumap) larger than the world will cause a crash.

We may however have a phantom that is an entire XCAT torso/body with different z lengths. For any particle, tracking stops when it escapes from the world volume. Therefore, we do not want to make the world too large by default because of computation time concerns.

We could therefore take the maximum length of the phantoms and scanner in each dimension and use that for the definition of the world size. This would be an extension of #6 (4.)
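A minimal sketch of that maximum-plus-margin rule for one dimension (all lengths are illustrative values in mm; the setZLength command is the standard GATE world geometry command):

```shell
# Sketch: world extent = max(phantom, scanner) extent plus a margin.
# The numeric values are illustrative stand-ins.
PhantomZ=220; ScannerZ=260; MarginMM=50
WorldZ=$(( (PhantomZ > ScannerZ ? PhantomZ : ScannerZ) + MarginMM ))
echo "/gate/world/geometry/setZLength ${WorldZ} mm"
```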

Suggestion: for the "ExampleSTIR-GATE.sh"

Hi,

On line 60, I suggest the script be changed from:
./SetupSimulation.sh $ScannerType $StoreRootFilesDirectory $Activity $Attenuation
to:
./SetupSimulation.sh $ScannerType $StoreRootFilesDirectory $ActivityFilename $AttenuationFilename

Given that a few lines earlier, we declared the file names.
Also, I have a question about executing .sh scripts. When I run one with "./" it gives me errors in the if statements. I have to switch everything to "bash" to run it properly (i.e. bash SetupSimulation.sh $ScannerType $StoreRootFilesDirectory $ActivityFilename $AttenuationFilename).

Regards,

George

Something wrong with the script ExampleSTIR-GATE.sh

There is an error "could not open header file '_GTAE.h33'" when I execute the command ./ExampleSTIR-GATE.sh "test". I am sure the STIR 'bin' and GATE 'bin' are on the right 'PATH'. It seems there is an error at line 27 in GeneratesSTIRGATEImage.sh.

GATE output voxelised images are reflected in z

This is somewhat related to #47, but slightly different and therefore a separate issue.

The Phantom-SourceMap and Phantom-MuMap output by GATE are reflected in the z axis compared to the input. The first figure shows the first slice of the input attenuation_corrected.hv and the output Phantom-MuMap.hdr:
[figure not shown]

The second figure shows the central slice of each (slice 24 of 47), indicating no reflection in x or y:
[figure not shown]

@francescaleek Have you noticed any of this in your experiments?

Issues with sed between macOS and linux systems

This is causing problems here:

# Add $NumberOfSlices and $SliceThickness at $LineNum
sed -i '' $LineNum'i\
!number of slices := '$NumberOfSlices'\
slice thickness (pixels) := '$SliceThickness'
' $GATEFilename

Handy tutorial on this: https://riptutorial.com/sed/topic/9436/bsd-macos-sed-vs--gnu-sed-vs--the-posix-sed-specification
To summarise this issue:

BSD (macOS) sed: MUST use -i ''
GNU sed: MUST use just -i (equivalently -i''); using -i '' does NOT work.

macOS can use GNU sed, as it can be installed with brew and then aliased, but this is enough of a workaround that it probably isn't worth it.

Working on a fix with sed "blah" > $GATEFilename
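One portable alternative is to avoid in-place editing entirely: write sed's output to a temporary file and move it back, which behaves identically under BSD and GNU sed. A sketch using stand-in values for the script's variables:

```shell
# Sketch of a portable fix: no -i at all. $GATEFilename, $LineNum,
# $NumberOfSlices and $SliceThickness are stand-in values here.
GATEFilename=demo.hv
NumberOfSlices=47
SliceThickness=3.27
LineNum=1
printf '!INTERFILE :=\n' > "$GATEFilename"
# POSIX `i\` insert, redirected to a temp file, then moved into place.
sed "${LineNum}i\\
!number of slices := ${NumberOfSlices}\\
slice thickness (pixels) := ${SliceThickness}
" "$GATEFilename" > "${GATEFilename}.tmp" && mv "${GATEFilename}.tmp" "$GATEFilename"
cat "$GATEFilename"
```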

error in norm-estimation script

apply_normfactors3D $eff_factors $factors $model_data 1 $outer_iters $eff_iters
echo "inverting the eff_factors to get norm"
stir_math -s --including-first --power -1 $OutputFilename $eff_factors

seems wrong on 2 levels:

  • the first step should use ones.hs. At the moment it's "unnormalising" the model_data (good to compare to the measured data, but not good to get norm factors!)
  • the second step isn't necessary as we could switch the bool value.

So I believe this should be

apply_normfactors3D $OutputFilename $factors $model_data 0 $outer_iters $eff_iters 

Some notes:

  • while digging through the code, I got confused, and "documented" what I found at UCL/STIR#706
  • it's possible that the current code wouldn't suffer from the division-by-zero mentioned in that issue, but I can't see any smart handling in stir_math either. I guess we just have to make sure that there are no zeroes in the estimated factors. (The estimation itself will be fine; it's passing it to BinNormalisationFromProjData that is the problem, as that needs the norm factors.)

TODO for v1.0

  • mCT #26 (optional?)

  • README

    • Project software Requirements (STIR and GATE, root?)
    • Better description outlining the usage of the project (feedback and help is needed)
  • Scanner view offset #19 checked (optional: test scripts)

  • Fix file structure

    • Output/ directory should not have any scripts or templates in it. The templates need to be moved.
    • The directory name VoxelisedSimulation/ is possibly not suitable anymore.
    • There are old scripts in images/ for some reason... (they have been moved)
  • PATHS... and setting environment variables. The idea here is to define things like $SGC_HOME as the STIR-GATE-Connection project directory and then set all paths from there. In some places paths are defined relatively, e.g. ../PATH/TO/FILE

  • AttenuationConv.dat #44

Proposal for a general corrections script to compute all corrections for reconstruction

There should be a script that calls each of the correction estimation scripts after a GATE simulation and unlisting into STIR sinograms. The following are comments on each of the correction computation scripts that should or should not be included in a general corrections computation script.

Randoms:

INCLUDE in general corrections script
https://github.com/UCL/STIR-GATE-Connection/blob/master/VoxelisedSimulation/SubScripts/EstimateRandomsFromDelayed.sh

  • The first script that should be called.
  • Requires delayed sinogram and outputs a single randoms estimate file, assuming cleanup.

Attenuation Coefficient Factors

INCLUDE in general corrections script

Normalisation:

DO NOT INCLUDE in general corrections script
https://github.com/UCL/STIR-GATE-Connection/blob/master/VoxelisedSimulation/SubScripts/EstimateGATESTIRNorm.sh
This is a hard script to include in the proposed idea, as it should only be run once per scanner and is really independent of the GATE "scan". It should be assumed this has been precomputed.

  • Output normalisation file is used in scatter correction.

Scatter correction:

INCLUDE in general corrections script
https://github.com/UCL/STIR-GATE-Connection/blob/master/VoxelisedSimulation/SubScripts/EstimateScatter.sh

  • Most computationally demanding script
  • Currently has lots of arguments for various filenames. Any

Multiplicative Factors:

INCLUDE in general corrections script
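A minimal sketch of what the proposed wrapper could look like. The sub-script names are the real ones discussed above; the argument handling and the run_if_present helper are illustrative, and normalisation is deliberately assumed precomputed:

```shell
#!/usr/bin/env bash
# Sketch of a general corrections wrapper (argument handling illustrative).
set -e
run_if_present() {
  # Call a sub-script only if it exists; warn loudly otherwise.
  script=$1; shift
  if [ -f "$script" ]; then
    sh "$script" "$@"
  else
    echo "skipping missing script: $script" >&2
  fi
}
run_if_present SubScripts/EstimateRandomsFromDelayed.sh "$@"   # first
# ... attenuation coefficient factors and multiplicative factors here ...
run_if_present SubScripts/EstimateScatter.sh "$@"              # most expensive, last
```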

D690 has significantly fewer coincidences in root file than mMR

This issue attempts to summarise and expand on the discussions of #14 and #15.
This issue was likely introduced in #15 with the introduction of the correct D690 scanner geometry.


Details

This issue is based upon a comparison between the mMR (previously validated ECAT system) and the D690 (not validated cylindrical PET system).

Performing 10 second (off-centre) point source simulations using both the D690 and mMR leads to significantly different data in the output root file.

Counts (Results are out of date, refer to followup)

Tree             D690    mMR     ratio
Hits             112k    246k    2.2
Singles Readout  82k     135k    1.6
Singles          17.6k   88k     5
Coincidences*    1.67k   20.2k   12

*The coincidences are what STIR unlists, and therefore the issue is clearly in the GATE simulation and not STIR.
Data was collected by setting the following output flags (only the coincidences flag set to true):

/gate/output/root/setRootSinglesAdderFlag 0
/gate/output/root/setRootSinglesReadoutFlag 0
/gate/output/root/setRootHitFlag 0
/gate/output/root/setRootSinglesFlag 0
/gate/output/root/setRootOpticalFlag 0
/gate/output/root/setRootNtupleFlag 0
/gate/output/root/setRootCoincidencesFlag 1

What we find is that the ratios for "Hits" and "Singles Readout" are acceptable and within expectation. However, the "Singles" and "Coincidences" ratios are significant.

Sinograms

These root files were unlisted using STIR to investigate their shape. The shape of the sinograms looks acceptable in both cases (after they were compressed into 1 segment using STIR's SSRB functionality). Example of the D690 sinogram:
[figure not shown]
and mMR:
[figure not shown]
Please note that the scales are not equivalent.

Thoughts

It appears the issue is not with the geometry but with the digitiser, with regard to the readout of singles:

## For details regarding readout, see: https://opengate.readthedocs.io/en/latest/digitizer_and_detector_modeling.html#readout
/gate/digitizer/Singles/insert adder
/gate/digitizer/Singles/insert readout
/gate/digitizer/Singles/readout/setDepth 4


Previous Discussion

The mMR does have a long axial FOV, hence higher sensitivity.
#14 (comment)

mMR: 13.3 cps/kBq from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4766138/
D690: 7.5 cps/kBq from http://dx.doi.org/10.1118/1.3635220
#14 (comment)

Cluster job submission keyword mismatch

There is a discrepancy between UCL's CS and Myriad cluster job submission keywords.

Example:
CS:

#$ -l h_vmem=1G
#$ -l tmem=1G

Myriad:

$ -l mem=1G

As this is high level, and will likely vary further with additional systems, I won't fix this and will leave it to users. As I personally work on the Myriad cluster, I will convert everything to that format.

about norm factors of mMR

Sorry to bother you. If I want to get the normalisation factors of the mMR, which par file should I use in the simulation?

RandomsEstimate problems

This is related to the findings found here:
#59 (comment)

The EstimateRandomsFromDelayed.sh uses the find_ML_norm code. This will always compute the geometric factors (see: UCL/STIR#843), which results in an incorrect dependency on the scanner.

Combining many ROOT files using `hadd` can exceed the (default) 100GB limit

I was using the following command to combine the root files output by many parallel GATE simulations

hadd SingleSource10MCombine.Coincidences.root SingleSource10MSim_{1..100}.Coincidences.root

This errored with:

Fatal in <TFileMerger::RecursiveRemove>: Output file of the TFile Merger (targeting SingleSource10MCombine.Coincidences.root) has been deleted (likely due to a TTree larger than 100Gb)

Solution

There is a known solution for this issue, see: https://root-forum.cern.ch/t/root-6-04-14-hadd-100gb-and-rootlogon/24581. I have detailed it here:

Create a script:

// Script name `startup.C`

#include "TTree.h"

int startup() {
  TTree::SetMaxTreeSize( 1000000000000LL ); // 1 TB
  return 0;
}

namespace {
  static int i = startup();
}

Compile this using the command

root.exe -b -l -q startup.C

and run hadd on macOS with

DYLD_INSERT_LIBRARIES=startup_C.so hadd SingleSource10MCombine.Coincidences.root SingleSource10MSim_{1..100}.Coincidences.root

and run hadd on linux with

LD_PRELOAD=startup_C.so hadd SingleSource10MCombine.Coincidences.root SingleSource10MSim_{1..100}.Coincidences.root

Question

Should we include the solution script in SGC and how should we document this methodology?

Suggestion: Create dmap with unique $TASK_ID variable in name

The current implementation runs the SetupDmap.mac macro during the SetupSimulation.sh script.
The purpose of this GATE macro is to create dmap.hdr and dmap.img so that multiple parallel GATE simulations can run without overwriting these files and potentially causing crashes, see #12.

It would be possible to create a dmap for each parallel job by adding the unique identifier (SimuId/$TASK_ID) to the name, e.g. dmap_1.hdr or dmap_500.hdr.

While not required for current implementation, it might be useful later when more than one phantom is used, such as with motion.
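A minimal sketch of stamping the task id into the dmap filenames by generating the macro from a template. The {SimuId} placeholder and template file name are hypothetical; SimuId=7 stands in for the cluster task id:

```shell
# Sketch: generate a per-job dmap macro from a template, so each parallel
# job writes dmap_<SimuId>.hdr instead of a shared dmap.hdr.
SimuId=7
cat > SetupDmap_template.mac <<'EOF'
/gate/VoxPhantom/geometry/buildAndDumpDistanceTransfo images/output/dmap_{SimuId}.hdr
/gate/VoxPhantom/geometry/distanceMap images/output/dmap_{SimuId}.hdr
EOF
sed "s/{SimuId}/${SimuId}/g" SetupDmap_template.mac > "SetupDmap_${SimuId}.mac"
grep dmap "SetupDmap_${SimuId}.mac"
```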

D690 Simulations will not unlist

WARNING: CListModeDataROOT: I've set the scanner from STIR settings and ignored values in the hroot header.

ERROR: the number of transaxial blocks per bucket, the number of axial crystals per block, the number of axial crystals per singles unit, the number of transaxial crystals per singles unit, 
libc++abi.dylib: terminating with uncaught exception of type std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >
sub_scripts/unlist.sh: line 28: 29470 Abort trap: 6           lm_to_projdata lm_to_projdata_${SGE_TASK_ID}.par

Due to a mismatch between STIR and the root file (unsure if .hroot or .root, probably .root). https://github.com/UCL/STIR/blob/c11e9673cad3702aa83b288edcc27527e25fea7e/src/listmode_buildblock/CListModeDataROOT.cxx#L260-L264

mMR unlisting works fine.

Unlisting root files with random rejection does not use a random seed

UnlistRoot.sh has the ability to unlist with random rejection. However, unless a seed is stated, STIR will unlist with a seed of 42. This means unlisting the same root file with the same $AcceptanceProb produces the same sinogram.

seed is a parameter-file variable that can be added to the lm_to_projdata parameter file and could be randomised with $RANDOM.
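A minimal sketch of writing a randomised seed into the parameter file ("seed" is the parameter named above; the file name and the surrounding parameter lines are a minimal stand-in):

```shell
# Sketch: emit a lm_to_projdata parameter file with a randomised seed so
# repeated unlistings with the same acceptance probability differ.
ParFile=lm_to_projdata_demo.par
Seed=$RANDOM
{
  printf 'lm_to_projdata Parameters:=\n'
  printf 'seed := %s\n' "$Seed"
  printf 'End :=\n'
} > "$ParFile"
grep 'seed :=' "$ParFile"
```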

Example GATE geometries may not be compatible with STIR 5.0.1 due to rotation/offset changes

With the merge of UCL/STIR#1016, the quarter_block rotation has been removed from STIR after STIR v5.0.0. GATE geometries are rotated by 90 degrees about the z axis. A possible solution is to rotate the GATE geometry, i.e.

/gate/cylindricalPET/placement/setRotationAxis  0 0 1
/gate/cylindricalPET/placement/setRotationAngle -90 deg

Alternatively, but less desirable, would be to modify the view offset (degrees) value by 90 degrees in the STIR interfile headers.


Additionally, rotational checks need to be rerun for the scanners because all the block/detector offsets have been removed in STIR. This means the mMR geometry is likely broken, see:

; STIR will try to align the data.
; If you have used non standart GATE axes,
; rotate using:
offset (num of detectors) := -4

D690 View offset (STIR) for unlisting

See thread for original details: #15 (review)

STIR ignores this for the moment (until merge of STIR/pull/181). Therefore, this issue is for future proofing.


The problem

The GATE D690 scanner has problems (see #18), but this issue focuses on the unlisting and the GATE geometry's rotation when compared with the STIR projectors.

The previous scanner geometry centered a bucket on the axes. The more realistic scanner geometry uses buckets that are composed of two blocks (in y); see the transaxial scanner setup:
[figure not shown]
where the wireframe blocks are marked in magenta. There are 32 buckets around the scanner.

The current STIR implementation has offset_det and half_block offsets of crystals.

https://github.com/UCL/STIR/blob/7ad42836e6d813313b8f2f5a83f8c6d3e04a47fb/src/IO/InputStreamFromROOTFileForCylindricalPET.cxx#L111-L118


Comments

I believe, after checking what @KrisThielemans discussed, that in future this value should be 0, as offset_det and half_block are used.

Definitions and comments regarding:
offset_det:
https://github.com/UCL/STIR/blob/7ad42836e6d813313b8f2f5a83f8c6d3e04a47fb/src/include/stir/IO/InputStreamFromROOTFile.h#L201-L202

half_block:
https://github.com/UCL/STIR/blob/7ad42836e6d813313b8f2f5a83f8c6d3e04a47fb/src/include/stir/IO/InputStreamFromROOTFileForCylindricalPET.h#L167-L170
and computation:
https://github.com/UCL/STIR/blob/7ad42836e6d813313b8f2f5a83f8c6d3e04a47fb/src/IO/InputStreamFromROOTFileForCylindricalPET.cxx#L59

Suggestion: Add time per simulation for the Jobs scripts

Currently the time is hardcoded to be 1 second.

StartTime=$(expr $TASK_ID - 1) ## Start time in GATE time
EndTime=$(expr $TASK_ID) ## End time in GATE time

The start and end times can easily be multiplied by an integer value to allow for longer simulation times without overlap.

Something like

TimePerGATESim=10  # in seconds
StartTime="$(echo $TASK_ID $TimePerGATESim | awk '{ tmp=(( $1 - 1 ) * $2) ; printf"%0.0f", tmp }')"
EndTime="$(echo $TASK_ID $TimePerGATESim | awk '{ tmp=( $1  * $2) ; printf"%0.0f", tmp }')"

Unlisting has hardcoded energy information

lm_to_projdata gets the energy window lower level and energy window upper level from the .hroot file even if present in the template. This energy information is currently hardcoded:

LowerEnergyThreshold=0
UpperEngeryThreshold=1000

UnlistRoot.sh should in fact take this information as an optional argument; otherwise it should be left unset.

The energy information is important for the scatter estimation scripts.
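A minimal sketch of that defaulting behaviour (the variable names follow the snippet above with the spelling corrected; the :- fallbacks are the currently hardcoded values, assumed to be in keV):

```shell
# Sketch: default the thresholds instead of hardcoding them, so a caller
# can override them via the environment (or pass them through as arguments).
LowerEnergyThreshold=${LowerEnergyThreshold:-0}
UpperEnergyThreshold=${UpperEnergyThreshold:-1000}
echo "energy window: ${LowerEnergyThreshold}-${UpperEnergyThreshold} keV"
```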

the unit of reconstructed image

Hello, I'm using the mMR macro file from this repository and have successfully reconstructed an image from a root file. Is the unit of the image reconstructed using STIR Bq/ml, or other units?
