juliamixedmodels / embraceuncertainty
The book "Embrace Uncertainty: Fitting Mixed-Effects Models with Julia"
Home Page: https://embraceuncertaintybook.com/
Now that the release version of Julia is 1.10.0, I don't have a julia-1.9 IJulia kernel accessible through juliaup without adding it explicitly. @palday Would it be okay to bump the jupyter setting in the files in this repository to julia-1.10?
"These values are often considered as some sort of “estimates” of the random effects. It can be helpful to think of them this way but it can also be misleading. As we have stated, the random effects are not, strictly speaking, parameters — they are unobserved random variables. We don’t estimate the random effects in the same sense that we estimate parameters."
It could be worth pointing out that in Bayesian LMMs these actually have the status of parameters.
Are these bootstrapped samples of
"To reiterate, this model can be reduced to a linear model because the random effects are inert, in the sense that they have a variance of zero." Is this sentence correct? The random effects don't necessarily have a variance of zero; the estimate of that variance from the data is zero.
Also, regarding the model being called degenerate: is it the model that is degenerate, or the variance-covariance matrix? In Pinheiro and Bates, I think "degenerate" was used only in the context of the variance-covariance matrix of the random effects.
Finally, it may be worth mentioning that the 0 estimate of the random-effect sd is just due to sparse data. If we had much more data, the estimate would probably be non-zero. Simulation can demonstrate this point.
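That simulation can be sketched with base Julia alone. This is a hypothetical method-of-moments analogue, not the book's REML fit, and the parameters are illustrative: with few groups and a small true between-group sd, the moment estimate of the between-group variance is frequently truncated at zero, and the truncation becomes rare as the data grow.

```julia
using Random, Statistics

# Hypothetical sketch: fraction of simulations in which the method-of-moments
# estimate of the between-group variance, max(0, (MSB - MSW)/n), hits zero.
function zero_fraction(; ngroups = 4, n = 3, sigma_b = 0.2, sigma = 1.0,
                       reps = 1000, rng = Random.default_rng())
    cnt = 0
    for _ in 1:reps
        b = sigma_b .* randn(rng, ngroups)                 # group effects
        y = [b[g] + sigma * randn(rng) for g in 1:ngroups, _ in 1:n]
        msw = mean(var(y[g, :]) for g in 1:ngroups)        # within mean square
        msb = n * var([mean(y[g, :]) for g in 1:ngroups])  # between mean square
        cnt += (msb - msw) / n <= 0                        # boundary estimate
    end
    return cnt / reps
end

# Sparse data: zero estimates are common; with many more groups and
# observations they become rare.
println(zero_fraction(ngroups = 4, n = 3))
println(zero_fraction(ngroups = 100, n = 20))
```

With 4 groups of 3 observations the zero fraction is substantial; with 100 groups of 20 it is close to zero, which is the point the text makes about sparse data.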
I probably missed it, but got stuck for a bit trying to figure out how to install EmbraceUncertainty when running the code here: https://juliamixedmodels.github.io/EmbraceUncertainty/intro.html#sec-dyestuff
It might be useful to reiterate here that you need something like the following code (even if it is mentioned elsewhere):
using Pkg; Pkg.add(url="https://github.com/JuliaMixedModels/EmbraceUncertainty")
The prefixes are described here.
But by default pandoc treats the top level as a section and not a chapter, as discussed in the caveats:

Because pandoc-crossref offloads all numbering to LaTeX if it can, `chapters: true` has no direct effect on LaTeX output. You have to specify Pandoc's `--top-level-division=chapter` option, which should hopefully configure LaTeX appropriately.
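For concreteness, a minimal invocation combining the metadata setting with the required option might look like this (filenames are illustrative):

```shell
# `chapters: true` in the pandoc-crossref metadata alone does not switch
# LaTeX to chapter-level numbering; --top-level-division does.
pandoc book.md \
  --filter pandoc-crossref \
  --top-level-division=chapter \
  -o book.pdf
```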
I propose that we make all the datasets that we use available through the `EmbraceUncertainty.dataset` generic and remove the `data` directory. For the purposes of the book, `dataset(:foo)` will mean `EmbraceUncertainty.dataset(:foo)`.

Those datasets that are available through `MixedModels.dataset` will be grandfathered into `EmbraceUncertainty.dataset` by checking the names against the result of `MixedModels.datasets()`.
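A possible shape for that grandfathering logic, sketched with placeholder loaders. The names, the return values, and the `LOCAL_LOADERS` helper are hypothetical; a real implementation would consult `MixedModels.datasets()` and call `MixedModels.dataset` directly.

```julia
# Hypothetical sketch of the proposed EmbraceUncertainty.dataset dispatch.
# MIXEDMODELS_NAMES stands in for the result of MixedModels.datasets().
const MIXEDMODELS_NAMES = (:dyestuff, :sleepstudy)    # illustrative subset
const LOCAL_LOADERS = Dict{Symbol,Function}(          # book-specific data
    :elp => () -> "fetch ELP arrow table",            # placeholder loader
)

function dataset(name::Symbol)
    if name in MIXEDMODELS_NAMES
        return "MixedModels.dataset($(repr(name)))"   # delegate upstream
    elseif haskey(LOCAL_LOADERS, name)
        return LOCAL_LOADERS[name]()                  # book-specific retrieval
    end
    throw(ArgumentError("unknown dataset: $name"))
end

dataset(:dyestuff)   # -> "MixedModels.dataset(:dyestuff)"
```

The point of the sketch is only the dispatch order: known MixedModels names fall through to the upstream function, everything else is handled (or rejected) locally.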
Data tables from the English Lexicon Project are available in the `arrow` directory of https://github.com/dmbates/EnglishLexicon.jl
The datasets used in the longitudinal data chapter are available in the osf.io repository https://osf.io/sr46c/.
For the time being I think I will just customize the dataset retrieval for each name just so we can get it working. Later we can decide if there should be a unified storage and retrieval protocol.
everything in context of model structure
Currently we assign `Grouping()` contrasts to any grouping factors. In the past this was necessary to avoid unnecessarily creating large, dense contrast matrices when grouping factors had many levels. After JuliaStats/MixedModels.jl@9ee4a0a we no longer need to do so (thanks to @palday).

Shall we remove the explicit assignments of `Grouping()` contrasts? This would allow us to avoid creating a `contrasts` Dict until late in the second chapter, when we apply a +/- 1 encoding to a binary experimental factor.
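For reference, a hand-rolled version of that +/- 1 ("effects") encoding. The factor name and levels here are hypothetical; in the book this encoding would be requested through the contrasts Dict rather than coded by hand.

```julia
# Map a binary factor to -1/+1 by hand; the baseline level gets -1.
# Level names are illustrative.
pm1(x; baseline = "control") = x == baseline ? -1.0 : 1.0

pm1.(["control", "treatment", "treatment"])   # -> [-1.0, 1.0, 1.0]
```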
Also, the `show_progress` named argument to `parametricbootstrap` has been deprecated in favor of `progress`. It makes sense now to assign `const progress = false` and add a named argument `progress` to calls to `fit` and `parametricbootstrap`.
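One nicety this enables: with a global `const progress = false`, the call sites can use Julia's keyword shorthand, passing the variable as the same-named keyword argument. A stdlib-only illustration; the function `f` here is a stand-in for `fit` or `parametricbootstrap`, which is an assumption about how the calls would be written.

```julia
const progress = false             # suppress progress bars in rendered output

# Stand-in for a function accepting a `progress` keyword argument.
f(x; progress = true) = (x, progress)

f(1; progress)                     # shorthand for f(1; progress = progress)
```

The shorthand keeps every call site to one token while making the rendered-output behavior controllable from a single constant.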
I will create a PR for these changes if there is no objection.