refnx / refnx
Neutron and X-ray reflectometry analysis in Python
Home Page: https://refnx.readthedocs.io
License: BSD 3-Clause "New" or "Revised" License
If constant dq/q is deselected and the experiment is saved, that choice isn't retained when the experiment is reloaded.
Hi,
I’d like to use a MixedSlab component as the last component in a Structure. However, I’ve run into the exception raised in the slabs method of the Structure class that checks that both the fronting and backing components are slabs. From a quick test, where I've removed this check along with the similar check in the sld_profile method, I can still fit data and the SLD profile is correctly calculated. Would it be possible to allow components other than slabs in the fronting and backing positions, or will it cause issues elsewhere?
Thanks,
Sandy
From the divergence of the beam it should be possible to calculate the total beam width at the detector. This could be used to check that the foreground region width is sensible, although this might depend on whether the sample is convex or concave. Perhaps just raise a warning.
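The geometry can be sketched with a small-angle estimate (a sketch assuming a flat sample; the function name and arguments are illustrative, not refnx code):

```python
import numpy as np

def beam_width_at_detector(w_beam, divergence, distance):
    """Rough geometric estimate of the total beam width after propagating
    `distance` (same units as w_beam) with full-angle `divergence` (rad).
    Assumes a flat sample; a convex/concave sample would change the figure."""
    return w_beam + 2.0 * distance * np.tan(divergence / 2.0)
```

A foreground region much narrower than this estimate would be grounds for the proposed warning.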
Create a MixedReflectModel that takes a sequence of structures and gives each a scale factor.
This allows the modelling of surfaces which are patchy, etc.
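The idea can be sketched as an incoherent, scale-weighted sum over the patches (a numpy sketch; the names are illustrative, not the eventual refnx API):

```python
import numpy as np

def mixed_reflectivity(reflectivities, scale_factors, bkg=0.0):
    # R(Q) = sum_i s_i * R_i(Q) + bkg -- incoherent average over patches
    R = np.asarray(reflectivities)   # shape (n_structures, n_Q)
    s = np.asarray(scale_factors)    # shape (n_structures,)
    return s @ R + bkg
```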
Using pathlib.Path objects makes dealing with filesystem objects much simpler and less verbose, for example eliminating many of the os.path.join dances. The multicontrast example could, for instance, become:
from pathlib import Path
datadir = Path(refnx.__file__).parent / "analysis" / "test"
e361 = ReflectDataset(datadir / "e361r.txt")
Support for Path objects instead of str objects is now quite good across most of the stdlib, so it would be nice for refnx to support them too. Given that refnx is already testing types, adding isinstance(data, pathlib.PurePath) would work; possibly_open_file looks like it will already accept Path objects.
From jhykes/rebin#9
x_old is [1.5 2.5 3.5 4.5 5.5 6.5]
y_old is [10 10 10 10 10]
x_new is [1.7 2.27332857 2.84665714 3.41998571 3.99331429 4.56664286
5.13997143 5.7133]
The y_new I get on using your script is:
[ 55.73328571 5.73328571 5.73328571 5.73328571 5.73328571 5.73328571 5.73328571]
There is clearly some problem with this result, especially the first bin. Could you give some hints on what could be going wrong?
I am using the assumption of 'piecewise_constant' for bins.
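For reference, piecewise-constant rebinning can be sketched via the cumulative sum of counts (a sketch, not the rebin package's implementation); with the inputs above every new bin should come out at roughly 5.733, not 55.7:

```python
import numpy as np

def rebin_piecewise_constant(x_old, y_old, x_new):
    # y_old[i] are the counts in bin [x_old[i], x_old[i+1]]; integrate the
    # implied step density over the new bin edges (x_new assumed inside x_old).
    cdf = np.concatenate(([0.0], np.cumsum(y_old)))
    F = np.interp(x_new, x_old, cdf)  # linear in x within each old bin
    return np.diff(F)
```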
AttributeError Traceback (most recent call last)
in ()
----> 1 f.write_spectrum_xml('PLP0024619.spec')
C:\Miniconda3\envs\dev3\lib\site-packages\refnx\reduce\platypusnexus.py in write_spectrum_xml(self, f, scanpoint)
953 d['runnumber'] = 'PLP{:07d}'.format(self.cat.datafile_number)
954
--> 955 d['r'] = string.translate(repr(r[scanpoint].tolist()), None, ',[]')
956 d['dr'] = string.translate(repr(dr[scanpoint].tolist()), None, ',[]')
957 d['l'] = string.translate(repr(l[scanpoint].tolist()), None, ',[]')
AttributeError: 'module' object has no attribute 'translate'
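The module-level string.translate was removed in Python 3; the same stripping of ',[]' can be done with str.translate and str.maketrans (a sketch of one possible fix):

```python
# Python 3 replacement for string.translate(repr(...), None, ',[]')
vals = [1.0, 2.0, 3.0]  # stand-in for r[scanpoint].tolist()
table = str.maketrans("", "", ",[]")
text = repr(vals).translate(table)
# text == "1.0 2.0 3.0"
```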
@llimeht there is a large amount of noise coming from the tests:
python runtests.py
results in a lot of lines that look like:
/Users/anz/miniconda3/envs/dev3/lib/python3.4/site-packages/IPython/terminal/ipapp.py:260: DeprecationWarning: metadata {'config': True} was set from the constructor. Metadata should be set using the .tag() method, e.g., Int().tag(key1='value1', key2='value2')
file or command."""
also, make the text that's about to be written bytes, not str.
This is probably due to negative reflectivities in the transform. I thought these were supposed to be being filtered out.
If MCMC is chosen as the fitting algorithm, and an experiment is then loaded with DE as the saved algorithm, both algorithms end up selected in the menu. Only one choice should be selected.
Hi there,
When trying to fit data containing points of zero reflectivity, understandably, NaNs are produced in the Objective.logl method and the following runtime error is raised:
Perhaps adding a method to the ReflectDataset class to ignore zeros, or to remove them, would be beneficial?
I'm currently using the following as a workaround:
dataset = np.loadtxt("test.dat")
dataset = dataset[(dataset != 0).all(1)]  # remove any points with 0 reflectivity
q, r, r_error = dataset[:, 0], dataset[:, 1], dataset[:, 2]
data = ReflectDataset([q, r, r_error])
Thanks,
James
Creating a model with the Jupyter GUI is very intuitive, and the "to code" button is great for model reproducibility. However, if you change the plot type (Options --> Plot Type) to logY and then use the "to code" button, the Objective is not given the transform type selected in the GUI.
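For context, a logY transform fits in log10(y) space and propagates the uncertainties accordingly; conceptually (a numpy sketch of the idea, not refnx's Transform implementation):

```python
import numpy as np

def transform_logY(y, y_err=None):
    # fit in log10(y) space; error propagation: d(log10 y) = dy / (y ln 10)
    yt = np.log10(y)
    if y_err is None:
        return yt
    return yt, y_err / (y * np.log(10.0))
```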
From my understanding of the code, the CurveFitter.sample() method does not allow user modification of the default parameters of the emcee sampler:
CurveFitter(...):
    def sample(....):
        for state in self.sampler.sample(self._state, **kwargs):  # <-- **kwargs not accessible in the method call
            self._state = state
            _callback_wrapper(state, h=h)
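One possible shape for the fix is to forward a separate sampler_kwargs dict through to the underlying sampler; a toy illustration of the plumbing (stand-in classes, not refnx or emcee code):

```python
class ToySampler:
    # stand-in for an emcee-like sampler
    def sample(self, state, iterations=1, thin_by=1):
        for _ in range(iterations):
            state = state + thin_by  # pretend to advance the chain
            yield state

class ToyFitter:
    def __init__(self):
        self.sampler = ToySampler()
        self._state = 0

    def sample(self, steps, sampler_kwargs=None):
        # forwarding sampler_kwargs exposes the sampler's own options
        kwds = sampler_kwargs or {}
        for state in self.sampler.sample(self._state, iterations=steps, **kwds):
            self._state = state
        return self._state
```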
After curve fitting with differential_evolution, etc., the uncertainties are estimated by calculating the covariance matrix. The covariance matrix is obtained by calculating the Jacobian matrix with numdifftools, then inverting (J^T J). The matrix inversion (and therefore the estimation of uncertainties) can fail if the matrix is singular.
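One common mitigation (a sketch, not necessarily what refnx should adopt) is to fall back to the Moore-Penrose pseudo-inverse when (J^T J) is singular:

```python
import numpy as np

def covariance_from_jacobian(J, resid_var=1.0):
    JTJ = J.T @ J
    try:
        cov = np.linalg.inv(JTJ)
    except np.linalg.LinAlgError:
        # singular (J^T J): fall back to the pseudo-inverse; variances along
        # unidentifiable parameter directions are then not meaningful
        cov = np.linalg.pinv(JTJ)
    return resid_var * cov
```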
When examining further:
Trying to install the latest version of this package via pip leads to the installation being aborted with the error message
setup.py line7: ModuleNotFoundError: No module named '_open_mp_helpers'
Checking setup.py in the tarball for version 0.1.9 (the newest one on PyPI) indeed shows that it tries to import _open_mp_helpers, but that module is not included in the tarball.
The next older version (0.1.8) can be installed via pip successfully.
Hi, I really like the package. I wondered if there are any plans for a jax backend?
I noticed some comments in reflect_model.py that seemed to indicate this was something being considered.
I'm new to reflectometry, but would a jax backend enable autograd on the parameters of the model? Would we expect the model to be differentiable w.r.t. the parameters?
add_data says it accepts a data tuple that is between 2 and 4 long. However, it only accepts a 4-length tuple at the moment.
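A sketch of accepting the shorter tuples (the helper is hypothetical, not the actual add_data signature):

```python
import numpy as np

def unpack_data(data):
    # accept (x, y), (x, y, y_err) or (x, y, y_err, x_err)
    if not 2 <= len(data) <= 4:
        raise ValueError("data tuple should be 2-4 long")
    x, y, *rest = data
    y_err = np.asarray(rest[0]) if len(rest) > 0 else None
    x_err = np.asarray(rest[1]) if len(rest) > 1 else None
    return np.asarray(x), np.asarray(y), y_err, x_err
```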
I have Motofit on my Windows 10 PC and it crashes frequently. There is a frequent error, ValueError("x" must be strictly increasing sequence), usually followed by the software crashing. Afterwards I cannot open the saved experiment again, and get the message "Failed to load experiment. It may have been saved in a previous refnx version (0.1.19)."
raw data:
7.599559981455088729e-03 5.213826491056721135e-01 4.243400930856116443e-02 3.661709719107202792e-04
7.742441650939418767e-03 3.642551744899117927e-01 1.933701856570909858e-02 3.732699748664838912e-04
7.888384865706121393e-03 4.695387230610218454e-01 2.730133021787140013e-02 3.805166935176594383e-04
8.037447335102247681e-03 4.683678949027353933e-01 2.058098775440134892e-02 3.879139844566911722e-04
8.189688001664290756e-03 4.587522700272596010e-01 1.583231609585667929e-02 3.954647642040634136e-04
8.345167065836545106e-03 5.534588733091591051e-01 2.101541305034648754e-02 4.031720103739539016e-04
8.503946007442236138e-03 4.931240671240281204e-01 1.839655446691918286e-02 4.110387627674602683e-04
8.666087611255911607e-03 4.514596819949677764e-01 1.583540515575311211e-02 4.190681245852288317e-04
8.831655992373014380e-03 4.743817054218485985e-01 1.531638308109771300e-02 4.272632636473209806e-04
9.000716620456417766e-03 4.834806149544817466e-01 1.420397228961449258e-02 4.356274135966207734e-04
9.173336346199975622e-03 4.666036851359535720e-01 1.214487907527011037e-02 4.441638751729345253e-04
9.349583427376684241e-03 4.526237524849369698e-01 1.086582390645935217e-02 4.528760174894488596e-04
9.529527557015129643e-03 4.617982569604561216e-01 1.119252568289392721e-02 4.617672793778770327e-04
9.713239888600970831e-03 4.488023456531289312e-01 1.131460889750190990e-02 4.708411706698260911e-04
9.900793066437103829e-03 4.121551485855888264e-01 1.054495398509967298e-02 4.801012736260660088e-04
1.009226125407673529e-02 4.200108551061211837e-01 1.059868834416802762e-02 4.895512443296365628e-04
1.028772016246616225e-02 3.921599876536964180e-01 8.698907751423439610e-03 4.991948140857007657e-04
1.048724708064542252e-02 3.766409140181901649e-01 7.526050787911986797e-03 5.090357909022591790e-04
1.069092090692939055e-02 3.531106034795797877e-01 6.375900295252574443e-03 5.190780609980207277e-04
1.089882217765676849e-02 3.418903221418345129e-01 6.053958371329964082e-03 5.293255902621272395e-04
1.111103310330241629e-02 3.375662927008362679e-01 5.871102058726914395e-03 5.397824259200660786e-04
1.132763759647822423e-02 2.965539090250646770e-01 5.017803671952743814e-03 5.504526980046764049e-04
1.154872130780261637e-02 2.779858479049108211e-01 4.781115214384502182e-03 5.613406210469422961e-04
1.177437165928183101e-02 2.680841299104053110e-01 4.780851715098900258e-03 5.724504957182420181e-04
1.200467787789149740e-02 2.591708658298848000e-01 4.820866929975697185e-03 5.837867104941091680e-04
1.223973103305434594e-02 2.339196365583754433e-01 4.183418799154747965e-03 5.953537434354960701e-04
1.247962407076262832e-02 2.127928784954379871e-01 3.453205999809800912e-03 6.071561638999984712e-04
1.272445184879029273e-02 2.054553206162908763e-01 3.159680117574236585e-03 6.191986342985561194e-04
1.297431117934462474e-02 1.890514737913715493e-01 2.701446113049673498e-03 6.314859120621898678e-04
1.322930086201067924e-02 1.859042902723558321e-01 2.559857285340468285e-03 6.440228513753011585e-04
1.348952172689498255e-02 1.637010098997919716e-01 2.153850995745611088e-03 6.568144051911054167e-04
1.375507667050647823e-02 1.467565274603537251e-01 1.893171569583913154e-03 6.698656270776371692e-04
1.402607069992515068e-02 1.353787268867921167e-01 1.744339655069591802e-03 6.831816732969793037e-04
1.430261097306832763e-02 1.269392160684790216e-01 1.619253172813204904e-03 6.967678048027511808e-04
1.458480683886640913e-02 1.116573425058501601e-01 1.438115231027857228e-03 7.106293892548430418e-04
1.487276988642877336e-02 9.973450483022074098e-02 1.303403765982375688e-03 7.247719032864625857e-04
1.516661398423579769e-02 9.089033344764581301e-02 1.183578339145615905e-03 7.392009345336746995e-04
1.546645532146174833e-02 8.095282866735188654e-02 1.081132057748498748e-03 7.539221837408340400e-04
1.577241246699782884e-02 7.225247355943230365e-02 9.836988361257884787e-04 7.689414673443344377e-04
1.608460640015152779e-02 6.307643976146737330e-02 8.688350319987062954e-04 7.842647193461861448e-04
1.640316056980267029e-02 5.644573448923200648e-02 7.914446075036508802e-04 7.998979939446242398e-04
1.672820094157191628e-02 5.054137829827078793e-02 7.325057042714031800e-04 8.158474678773668110e-04
1.705985604468592681e-02 4.452309838993329105e-02 6.647586273648678403e-04 8.321194427798696064e-04
1.739825702299970661e-02 3.755146385029481393e-02 5.632429432520569051e-04 8.487203476741915931e-04
1.774353768592815475e-02 3.191148218058344893e-02 4.879574538352491838e-04 8.656567414791883994e-04
1.809583456875684185e-02 2.799588864026724550e-02 4.371699654646714274e-04 8.829353157869230346e-04
1.845528697405600382e-02 2.313247707150960578e-02 3.858024151500929360e-04 9.005628971760257879e-04
1.882203704642417333e-02 1.994677432354460475e-02 3.397568249576239999e-04 9.185464504102198741e-04
1.919622980176078145e-02 1.715877223824826756e-02 3.032826750911320722e-04 9.368930804892179956e-04
1.957801319990171846e-02 1.416542556540937962e-02 2.612772877867924697e-04 9.556100358446171714e-04
1.996753821363068254e-02 1.152038571385591986e-02 2.292504817467397381e-04 9.747047114687157126e-04
2.036495886976818701e-02 9.945969912730022455e-03 2.090405573362733960e-04 9.941846513503052422e-04
2.077043232270570777e-02 7.811567222714584785e-03 1.786381364088555446e-04 1.014057551775060387e-03
2.118411889723042715e-02 6.156474876496182379e-03 1.535840357436587774e-04 1.034331263862206269e-03
2.160618217664062382e-02 5.082770657130412061e-03 1.358455295872379931e-04 1.055013797297947109e-03
2.203678906031091705e-02 3.858843405059949380e-03 1.152153359427287546e-04 1.076113323310379840e-03
2.247610981398370095e-02 3.233632196200909859e-03 1.045720674011518944e-04 1.097638177487020479e-03
2.292431816053817906e-02 2.543166377364174981e-03 9.311971782766483626e-05 1.119596863666432905e-03
2.338159132962687806e-02 2.093385592542826128e-03 8.347647460677792169e-05 1.141998056800626457e-03
2.384811014055389020e-02 1.694526568564506258e-03 7.339039646108142629e-05 1.164850606706236292e-03
2.432405906251886712e-02 1.527524450811253934e-03 6.961648559807358477e-05 1.188163541264724340e-03
2.480962629726897115e-02 1.237339090639502041e-03 6.032300940611741696e-05 1.211946070233233495e-03
2.530500385572444222e-02 1.024288093696810661e-03 5.428233881237394228e-05 1.236207588933878028e-03
2.581038760909869223e-02 9.792942313169789373e-04 5.336978008881747314e-05 1.260957681320312718e-03
2.632597741839892008e-02 8.316648701114790409e-04 4.854346688306595311e-05 1.286206125099059175e-03
2.685197713842814998e-02 7.843306735174979246e-04 4.671226736955350859e-05 1.311962893653187108e-03
2.738859478005017598e-02 6.881862724547388108e-04 4.323985373653603959e-05 1.338238162080139515e-03
2.793604252541310659e-02 7.110593081883520052e-04 4.521390536376433472e-05 1.365042309478613648e-03
2.849453684322947322e-02 7.629859580709460239e-04 4.708013186141104850e-05 1.392385923851306977e-03
2.906429858926675228e-02 6.411159907741929842e-04 4.328558137397925512e-05 1.420279806665364917e-03
2.964555304728012583e-02 6.774720204395253244e-04 4.519125383185483536e-05 1.448734975918500225e-03
3.023853007386563888e-02 6.018618748276018289e-04 4.163251678828592894e-05 1.477762671922297185e-03
3.084346413147568203e-02 6.112005459662898672e-04 4.281581516192463084e-05 1.507374360246282151e-03
3.146059442834196751e-02 5.297962967069307473e-04 4.084262562275820213e-05 1.537581737457642365e-03
3.209016500924811999e-02 3.955153104104824034e-04 3.580240570101514143e-05 1.568396735637935012e-03
3.273242480125263670e-02 3.894424457765164921e-04 3.596824638470905416e-05 1.599831525783826976e-03
3.338762777424723188e-02 2.791040233926636908e-04 3.029000575083840041e-05 1.631898524209721970e-03
3.405603300229374797e-02 2.635321990169312142e-04 2.907603365937417106e-05 1.664610396441081963e-03
3.447296185783672168e-02 2.924103087055016461e-04 9.440595443991695382e-05 1.672509018342889682e-03
3.515558055123071263e-02 4.300633739487962504e-04 6.483331150176163498e-05 1.705847368957118249e-03
3.585210059482129424e-02 3.221877411519473607e-04 4.421155708903929472e-05 1.739860286518544758e-03
3.656279740951365320e-02 2.269254456509144460e-04 4.510764352933276171e-05 1.774561221545332013e-03
3.728795202179065182e-02 1.297336901200966616e-04 3.986997769974968595e-05 1.809963896989036152e-03
3.802785117624768452e-02 2.011110278613124252e-04 3.774319493476977020e-05 1.846082313712983517e-03
3.878278744664877264e-02 2.082207034931878660e-04 3.471375968042376959e-05 1.882930755985806398e-03
3.955305935287842245e-02 1.262004308793730658e-04 2.909202587029946884e-05 1.920523797181669353e-03
4.033897147952657941e-02 2.138184104483222805e-04 2.712553043724362693e-05 1.958876305578272775e-03
4.114083459522496072e-02 2.093037531960818235e-04 2.780566431762273742e-05 1.998003450231043351e-03
4.195896577610760958e-02 1.087781315022363134e-04 2.616111256581964061e-05 2.037920707011722961e-03
4.279368853080545848e-02 1.795773377526748056e-04 2.791755575047405973e-05 2.078643864745921815e-03
4.364533292955425497e-02 1.862321740133637653e-04 2.605221620782602337e-05 2.120189031517286365e-03
4.451423573235913089e-02 2.736889179151323170e-04 2.677043903954454330e-05 2.162572641009131817e-03
4.540074052437966662e-02 1.813588552783329999e-04 2.454080677457853870e-05 2.205811459095693115e-03
4.630519785150043893e-02 2.219046255118249623e-04 2.341456024722954565e-05 2.249922590502609170e-03
4.722796535776402349e-02 2.370405632131552794e-04 2.286214358215819944e-05 2.294923485581538015e-03
4.816940792755431550e-02 1.521977786330035252e-04 1.937823187428976788e-05 2.340831947274682988e-03
4.912989783049938330e-02 1.372532138854382457e-04 1.782983202823328270e-05 2.387666138218515720e-03
5.010981486623008324e-02 1.562629056235031663e-04 1.668671634318934186e-05 2.435444587914116412e-03
5.110954651882323313e-02 9.657278879458375091e-05 1.469533373711248727e-05 2.484186200219526061e-03
5.212948810552114204e-02 1.006066518016499853e-04 1.303200844516597351e-05 2.533910260767953133e-03
5.317004293573487222e-02 9.600338210440151397e-05 1.164480350979628022e-05 2.584636444726709417e-03
5.423162247003642661e-02 7.826137391265153460e-05 1.035132235306134253e-05 2.636384824632807648e-03
5.531464648187454231e-02 6.626878217725950876e-05 8.616425366062262112e-06 2.689175878377584022e-03
5.641954322575578468e-02 3.601874171666256919e-05 7.457989781142735733e-06 2.743030497438741274e-03
5.754674960470089473e-02 4.144201578061317210e-05 7.028195547808961300e-06 2.797969995176057857e-03
5.869671134146790120e-02 3.442338898827230337e-05 6.238422959638832481e-06 2.854016115308609467e-03
5.986988315993795051e-02 2.906399095551795325e-05 5.741212095458311442e-06 2.911191040740565474e-03
6.106672895959364827e-02 2.487271278729686960e-05 5.117092113466999094e-06 2.969517402296895092e-03
6.228772200302694351e-02 3.901146082193545050e-05 5.348340515205985504e-06 3.029018287885839808e-03
6.353334509909421002e-02 4.741060229594620535e-05 5.056923333517812061e-06 3.089717251641440395e-03
6.480409079730693744e-02 2.572624146330207187e-05 5.001460263839616169e-06 3.151638323450771818e-03
6.610046158134352812e-02 3.974102326459696597e-05 4.940617837017392956e-06 3.214806018555490651e-03
6.742297006554920369e-02 2.426287388828528193e-05 4.239600313632101516e-06 3.279245347329792608e-03
6.877213920355521037e-02 4.167897556534937273e-05 4.547937549326330298e-06 3.344981825472835828e-03
7.014850249014081829e-02 2.224857331711221025e-05 4.150316623022222487e-06 3.412041484131123417e-03
7.155260416849133775e-02 2.455621924983376279e-05 3.787076656663640302e-06 3.480450880266843308e-03
7.298499945846735615e-02 1.827034170786300037e-05 3.541294461240136386e-06 3.550237107677855705e-03
7.444625475998715991e-02 1.429033977588983963e-05 3.084541886278215989e-06 3.621427807487751557e-03
7.593694788829845332e-02 1.232114618924396605e-05 3.062835896514750721e-06 3.694051179573742440e-03
7.745766830081628196e-02 1.252496705656416319e-05 2.738716998521390884e-06 3.768135993894868464e-03
7.900901732726789417e-02 1.447815632711990197e-05 2.726117666017170588e-06 3.843711602026522040e-03
8.059160840767165546e-02 1.741429111374085707e-05 2.668551598980064689e-06 3.920807949021254055e-03
8.220606733398122534e-02 1.423538005433377669e-05 2.497985029847501208e-06 3.999455585491304595e-03
8.385303250492852356e-02 1.405527321726848283e-05 2.424843427176616683e-06 4.079685680162416402e-03
8.553315516590290213e-02 1.426889785090836694e-05 2.310935113322736890e-06 4.161530032175762145e-03
8.724709968610400157e-02 5.283154850349717875e-06 2.165403807861982225e-06 4.245021084488905319e-03
8.899554379434909679e-02 9.417197548686544151e-06 2.205507365006875186e-06 4.330191936351922809e-03
9.077917886233644040e-02 1.203083973816843591e-05 2.230870929217033557e-06 4.417076357152639743e-03
9.259871018852147695e-02 6.848666367270758143e-06 2.084849368586095416e-06 4.505708800426923288e-03
9.445485725842317493e-02 2.043970060782080068e-05 2.001967172961944386e-06 4.596124417414452872e-03
9.634835404173805462e-02 1.231181557869309655e-05 2.022645799649350840e-06 4.688359071721394031e-03
9.827994926327623948e-02 1.313516623108118561e-05 2.077915893552664426e-06 4.782449353469942800e-03
1.002504067237127566e-01 1.245585195850324519e-05 1.960586909967675604e-06 4.878432594899393022e-03
1.022605055945021990e-01 8.621818847219296460e-06 1.882747216222397348e-06 4.976346885472097102e-03
1.043110407102973425e-01 1.315045862264523980e-05 1.934769567267957097e-06 5.076231087091617397e-03
1.064028229066577014e-01 9.259336988454914649e-06 1.803212283950989229e-06 5.178124850670462985e-03
1.085366793216400172e-01 1.342089822909908162e-05 1.865654742142013044e-06 5.282068631953015965e-03
1.107134537356495585e-01 9.878463135535724780e-06 1.740974358234873856e-06 5.388103708518007275e-03
1.129340068938492481e-01 1.067040813510995834e-05 1.720687640232034035e-06 5.496272196529182648e-03
1.151992168562147867e-01 1.076215350963682719e-05 1.785159102590859006e-06 5.606617068403386625e-03
1.175099793469439602e-01 1.109207515644309410e-05 1.750087947044002713e-06 5.719182170672596453e-03
1.198672080838734666e-01 8.838312473075775964e-06 1.658112519964997156e-06 5.834012241547440435e-03
1.222718351917669449e-01 1.027582618984679066e-05 1.661647622701717520e-06 5.951152930867735676e-03
1.247248114959672027e-01 9.183337490158789948e-06 1.661451542379839511e-06 6.070650817198764738e-03
1.272271069799270904e-01 1.022575424331777344e-05 1.683879141877419323e-06 6.192553429392382708e-03
1.297797111018208771e-01 1.171183302054409348e-05 1.676435810342743878e-06 6.316909264760115633e-03
1.323836332171378083e-01 1.204141905939989127e-05 1.697558616984465179e-06 6.443767810233794538e-03
1.350399029926797057e-01 1.076491545474115086e-05 1.710843058760015890e-06 6.573179563567290232e-03
1.377495707673768166e-01 1.053878482899999463e-05 1.737439286843465327e-06 6.705196053438165424e-03
1.405137080233157532e-01 8.043912220813658197e-06 1.722153488891441243e-06 6.839869862672653088e-03
1.433334077516280891e-01 1.082901709391120592e-05 1.636941892799713954e-06 6.977254649049053660e-03
1.462097849318356735e-01 9.458368440247056445e-06 1.688236605430866921e-06 7.117405169328115157e-03
1.491439769689291150e-01 6.964378892241260243e-06 1.692052581742171228e-06 7.260377302503995094e-03
1.521371440924113583e-01 6.790830056535694161e-06 1.698443855541887278e-06 7.406228072396115955e-03
1.551904698771277291e-01 9.906061569262005653e-06 1.713215544784203451e-06 7.555015673715895566e-03
1.583051616722408739e-01 1.538197814134771430e-05 1.790353673185308810e-06 7.706799496109579053e-03
1.614824511337922819e-01 1.393431030582741412e-05 1.829871082137000559e-06 7.861640151231343021e-03
1.647235946099745163e-01 1.120602009756404888e-05 1.874310889369136748e-06 8.019599496388259141e-03
1.680298737295591316e-01 1.285913013893806792e-05 1.835648360666953673e-06 8.180740663811188890e-03
1.714025958676780592e-01 1.281574826105827122e-05 1.755468338945156938e-06 8.345128087161701735e-03
1.748430947238615585e-01 1.421254129832019325e-05 1.731987319275674869e-06 8.512827531350321969e-03
1.783527306724646266e-01 1.201969343456800250e-05 1.825535029599323926e-06 8.683906116912979214e-03
1.819328915894144749e-01 1.043095272878834479e-05 1.892083263863651327e-06 8.858432357125425871e-03
1.855849931291780330e-01 9.385953664230022866e-06 1.870269476258467164e-06 9.036476181397865362e-03
1.893104793847306400e-01 8.377842152879553169e-06 1.682650562066749396e-06 9.218108969035611333e-03
1.931108235421425123e-01 1.109850057043948884e-05 1.785687979980989539e-06 9.403403583365320742e-03
1.969875283157863921e-01 1.017577742796924668e-05 1.861463938177747679e-06 9.592434400723621329e-03
2.009421266846889986e-01 7.975287274273907485e-06 1.846994649806913470e-06 9.785277347763028655e-03
2.049761823762698831e-01 6.880251680474834028e-06 1.814843575546700014e-06 9.982009932806658326e-03
2.090912905072203620e-01 8.246727443765017166e-06 1.884431246735864281e-06 1.018271128185005946e-02
2.132890784031984122e-01 8.591274846148101843e-06 1.865005835929647063e-06 1.038746217979672620e-02
2.175712058550111072e-01 9.989349226131584731e-06 2.082503374438805130e-06 1.059634509778979369e-02
2.219393661553441255e-01 9.196395516618223925e-06 2.112144986651856645e-06 1.080944424136452524e-02
2.263952866417510146e-01 5.952355432522756625e-06 2.099466327218956650e-06 1.102684558655568116e-02
2.309407292017132962e-01 9.478603353169988462e-06 2.212322176579200235e-06 1.124863691575461386e-02
2.355774912025769829e-01 7.630862645645273073e-06 2.354592852173417488e-06 1.147490786530076801e-02
2.403074062440209990e-01 9.005300941932485777e-06 2.201055042091646164e-06 1.170574996928940015e-02
2.451323445543422130e-01 4.939648292358372390e-06 2.206978828229185162e-06 1.194125669501224646e-02
2.500542140116138179e-01 1.342025995237382916e-05 2.543583675178233791e-06 1.218152349540776135e-02
2.550749609417899966e-01 8.113227520485951670e-06 2.557601985895412696e-06 1.242664785666339200e-02
2.601965705387065397e-01 1.559616108570362523e-05 2.755751280099635031e-06 1.267672933700084333e-02
My fitting parameters are:
Objective - fit_c_PLP0051780
Dataset = fit_c_PLP0051780
datapoints = 181
chi2 = 5.626907906311002
Weighted = False
Transform = Transform('logY')
Parameters: 'fit_c_PLP0051780'
Parameters: 'instrument parameters'
<Parameter: 'scale' , value=1.05888 +/- 0.0803, bounds=[0.8, 1.2]>
<Parameter: 'bkg' , value=2.44049e-07 +/- 7.37e-07, bounds=[0.0, 5e-06]>
<Parameter:'dq - resolution', value=5 (fixed) , bounds=[-inf, inf]>
<Parameter: 'q_offset' , value=0 (fixed) , bounds=[-inf, inf]>
Parameters: 'Structure - '
Parameters: 'BULk'
<Parameter:'fronting - thick', value=0 (fixed) , bounds=[-inf, inf]>
<Parameter:'fronting - sld', value=2.19365 +/- 0.1 , bounds=[2.0, 2.3]>
<Parameter:'fronting - isld', value=0 (fixed) , bounds=[-inf, inf]>
<Parameter:'fronting - rough', value=0 (fixed) , bounds=[-inf, inf]>
<Parameter:'fronting - volfrac solvent', value=0 (fixed) , bounds=[0.0, 1.0]>
Parameters: 'spline'
<Parameter:'spline - spline extent', value=67.5286 +/- 45.8 , bounds=[30.0, 200.0]>
Parameters: 'dz - spline'
<Parameter:'spline - spline dz[0]', value=0.142761 +/- 1.39 , bounds=[0.0, 1.0]>
<Parameter:'spline - spline dz[1]', value=0.220703 +/- 1.37 , bounds=[0.0, 1.0]>
<Parameter:'spline - spline dz[2]', value=0.431873 +/- 0.865, bounds=[0.0, 1.0]>
Parameters: 'vs - spline'
<Parameter:'spline - spline vs[0]', value=3.65488 +/- 1.41 , bounds=[2.0, 4.5]>
<Parameter:'spline - spline vs[1]', value=2.00744 +/- 0.84 , bounds=[2.0, 4.0]>
<Parameter:'spline - spline vs[2]', value=4.00071 +/- 2.48 , bounds=[4.0, 5.0]>
Parameters: 'Au'
<Parameter: '1 - thick' , value=213.611 (fixed) , bounds=[180.0, 300.0]>
<Parameter: '1 - sld' , value=4.5 (fixed) , bounds=[4.5, 4.7]>
<Parameter: '1 - isld' , value=0 (fixed) , bounds=[-inf, inf]>
<Parameter: '1 - rough' , value=2.56035 (fixed) , bounds=[1.0, 30.0]>
<Parameter:'1 - volfrac solvent', value=0 (fixed) , bounds=[0.0, 1.0]>
Parameters: 'Cr'
<Parameter: 'thick' , value=90.945 (fixed) , bounds=[80.0, 160.0]>
<Parameter: 'sld' , value=3.09841 (fixed) , bounds=[3.0, 3.1]>
<Parameter: 'isld' , value=0 (fixed) , bounds=[-inf, inf]>
<Parameter: 'rough' , value=9.98872 (fixed) , bounds=[1.0, 10.0]>
<Parameter: 'vfsolv' , value=0 (fixed) , bounds=[0.0, 1.0]>
Parameters: 'SiO2'
<Parameter: 'thick' , value=10.0109 (fixed) , bounds=[10.0, 100.0]>
<Parameter: 'sld' , value=4.18257 (fixed) , bounds=[4.18, 4.3]>
<Parameter: 'isld' , value=0 (fixed) , bounds=[-inf, inf]>
<Parameter: 'rough' , value=1.51061 (fixed) , bounds=[1.0, 10.0]>
<Parameter: 'vfsolv' , value=0 (fixed) , bounds=[0.0, 1.0]>
Parameters: 'Si'
<Parameter:'backing - thick', value=0 (fixed) , bounds=[-inf, inf]>
<Parameter:'backing - sld', value=2.13637 (fixed) , bounds=[2.0, 2.2]>
<Parameter:'backing - isld', value=0 (fixed) , bounds=[-inf, inf]>
<Parameter:'backing - rough', value=9.96248 (fixed) , bounds=[1.0, 10.0]>
<Parameter:'backing - volfrac solvent', value=0 (fixed) , bounds=[0.0, 1.0]>
I applied a Spline to describe the adsorbed, non-uniform molecular layer. I assume the Spline function is not running smoothly. I am now trying four Slabs in place of the Spline; so far there has been no crash.
My email is: [email protected]
I appreciate your help a lot.
Cheers
Yunxiao
The script below gives a minimal example.
This fails to run because the algorithm that identifies the spin state fails.
The issue can be fixed by modifying the if statement below in platypusnexus.py:
Line ~778: if channel.spin_state is _spin_channels[sc]
from refnx.reduce import PolarisedReduce, SpinSet
from refnx.reduce import PolarisationEfficiency
import matplotlib.pyplot as plt
directory = r"c:\Test\PNR_files"
spinset_rb1 = SpinSet(
    down_down="PLP0016430.nx.hdf",
    up_up="PLP0016431.nx.hdf",
    data_folder=directory,
)
spinset_db1 = SpinSet(
    down_down="PLP0016426.nx.hdf",
    up_up="PLP0016427.nx.hdf",
    data_folder=directory,
)
polreduce = PolarisedReduce(spinset_db1)
polreduce.reduce(
    spinset_rb1, save=True, scale=1.5, lo_wavelength=2.3,
    hi_wavelength=12, rebin_percent=2
)
plt.errorbar(
polreduce.reducers["uu"].x[0],
polreduce.reducers["uu"].y[0],
polreduce.reducers["uu"].y_err[0],
label="++"
)
plt.errorbar(
polreduce.reducers["dd"].x[0],
polreduce.reducers["dd"].y[0],
polreduce.reducers["dd"].y_err[0],
label="--"
)
plt.legend()
plt.yscale("log")
It would be best if this was only applied to the reflected beam, for situations where you want to test reductions with a specified position and FWHM determined by the manual beam finder. Are there any situations where you need to specify the position and FWHM for the direct beam?
For the measured data, R increases at the beginning of the curve (Q from 0 to 0.05 Å^-1), while for the fit R remains constant or decreases. Can someone tell me the reason?
As it stands the CurveFitter.fit() function will only minimise the negative log-likelihood (or residuals for least squares). This means that for systems such as the LipidLeaflet class, the logp() function has no effect on the initial fitting. Would it be relevant to allow the CurveFitter.fit() function to optimise the negative log posterior in addition to the negative log-likelihood?
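For reference, the proposed target is easy to state (a sketch, assuming nlpost = -(logl + logp)):

```python
import numpy as np

def neg_log_posterior(theta, logl, logp):
    # nlpost(theta) = -(logl(theta) + logp(theta)); minimising this
    # includes the priors, unlike minimising -logl alone
    lp = logp(theta)
    if not np.isfinite(lp):
        return np.inf  # outside the prior support
    return -(logl(theta) + lp)
```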
If the refnx gui loads a file from a network drive (e.g. GoogleDrive), then the filename associated with the datafile is something like 'G:/My Drive/...'
When the mcmc.py code fragment attempts to use that filename it fails because that path isn't resolvable because of the weird drive letter. The actual volume name is '/Volumes/Google Drive/ My Drive/...'. Resolution of this issue needs to look at getting the fully qualified name.
This project lists jupyter-sphinx in its sphinx configuration or requirements file, but does not appear to use this extension directly anymore. We have just rewritten jupyter-sphinx from scratch, and fully changed the interface.
Therefore we would like to give you a heads up, in case you find the new version useful.
Now all code execution is done via a single jupyter-execute sphinx directive, instead of the previous ipywidgets-setup/ipywidgets-embed.
The new version is also much more flexible: it allows embedding any output recognized by jupyter in sphinx documentation. See the project documentation for details.
We have released 0.2.0rc1 on pip; please give it a shot using pip install jupyter-sphinx==0.2.0rc1 --pre. We would love to hear your feedback.
Please also let us know by responding here or in jupyter/jupyter-sphinx#33 if the new release disrupts your workflow.
need to write a test for the objective.nlpost function
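Such a test could be sketched against the identity nlpost = -(logp + logl), here with a stand-in object rather than a real Objective:

```python
import numpy as np

class MockObjective:
    # stand-in satisfying the relationship the test should assert
    def logp(self):
        return -1.5
    def logl(self):
        return -4.0
    def logpost(self):
        return self.logp() + self.logl()
    def nlpost(self):
        return -self.logpost()

def check_nlpost(obj):
    assert np.isclose(obj.nlpost(), -(obj.logp() + obj.logl()))

check_nlpost(MockObjective())
```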
I'm not sure how this could even be possible, but the current readthedocs page for refnx doesn't search for me (on two different devices); I just get a "searching..." page but never any results. It has been this way for a few days at least.
I have been trying to work out how to view or implement a progress bar for the fit, or at least some verbose output. I am using Jupyter notebooks, but I don't think that matters.
Do a fit using the refnx frozen executable with differential_evolution. Multiple windows start spawning every time one does a fit. This occurs with the app executable for 0.1.14.
I opened up a bug, pyinstaller/pyinstaller#4865. The bug doesn't manifest when starting from the command line.
fitter = CurveFitter(objective, nwalkers=600)
fitter.fit(method='L-BFGS-B', target='nll', ######)
I expected (possibly incorrectly) that any args from https://docs.scipy.org/doc/scipy/reference/optimize.minimize-lbfgsb.html#optimize-minimize-lbfgsb would be accepted when put where ###### is. Instead I get an error that they are unexpected.
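Note that scipy.optimize.minimize itself takes solver-specific settings through an options dict rather than as bare keyword arguments, so (depending on how CurveFitter.fit forwards them) they may need wrapping; a plain scipy sketch:

```python
import numpy as np
from scipy.optimize import minimize

# L-BFGS-B settings such as maxiter/ftol/maxcor go inside options=...
res = minimize(lambda p: np.sum((p - 3.0) ** 2), x0=np.array([0.0]),
               method="L-BFGS-B", options={"maxiter": 50, "ftol": 1e-12})
```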
Hello,
There is a typo in the definition of the polarizer efficiency matrices in the PolarisationEfficiency class (reduce.py).
self.flipper2_matrix = [
[one, z, one, z],
[F2, (1 - F2), z, z],
[z, z, one, z],
[z, z, F2, (1 - F2)],
]
should be
self.flipper2_matrix = [
[one, z, z, z],
[F2, (1 - F2), z, z],
[z, z, one, z],
[z, z, F2, (1 - F2)],
]
This will make it consistent with the original reference, Review of Scientific Instruments 70, 4241 (1999), page 3. I found that using the incorrect matrix introduces a significant error (up to 30% over-subtraction) in the NSF channel corrections when strong SF signals are also present.
Cheers,
David
The try/except process designed to count the number of header rows does so by catching ValueErrors, but the enclosed operation throws ParseErrors.
Could we just allow the user to supply the number of header rows as a kwarg?
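A sketch of counting header rows by catching ValueError from float() directly, rather than relying on a parser's own exception type (illustrative only, not the refnx reader):

```python
def count_header_rows(lines):
    for i, line in enumerate(lines):
        tokens = line.split()
        if not tokens:
            continue            # skip blank lines
        try:
            [float(t) for t in tokens]
        except ValueError:
            continue            # still in the header
        return i                # first fully numeric row
    return len(lines)
```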
I ran into issues when trying to save and load models using Motofit. I can save the model fine, but when I try to load it I get an error saying:
AttributeError: 'Stack' object has no attribute 'thick'
My model is composed of a Stack component. I have attached my saved model file.
test_highchi1.txt
It seems to be impossible to initialise walkers from an existing chain. CurveFitter.initialise_with_chain (line 350 of refnx/refnx/analysis/curvefitter) seems to assume that fitter.chain is not None (which is a bad assumption, as you're trying to initialise the fitter).
Furthermore, it seems like there is a circular reference between fitter.initialise_with_chain (line 367) and fitter.initialise (line 292).
In previous versions this could be done by directly setting fitter._lastpos to the desired walker location, but this attribute seems to have gone missing at some point.
EDIT: I see that CurveFitter._state.coords has replaced ._lastpos
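For anyone after a workaround, seeding walkers from a saved chain can be sketched directly (assuming a chain of shape (nsteps, nwalkers, ndim); the helper is mine, not refnx API):

```python
import numpy as np

def walkers_from_chain(chain, nwalkers, rng=None):
    # take the last step of the saved chain; resample rows (with
    # replacement) if the new walker count differs from the saved one
    rng = rng if rng is not None else np.random.default_rng()
    last = np.asarray(chain)[-1]
    idx = rng.choice(last.shape[0], size=nwalkers, replace=True)
    return last[idx]
```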
lnprob doesn't return 0
parameters needs to return list of parameters
remove minimal roughness
speedup like brush code.