pybamm-team / pybamm
Fast and flexible physics-based battery models in Python
Home Page: https://www.pybamm.org/
License: BSD 3-Clause "New" or "Revised" License
See #16.
The base model will internally hold an expression tree representing the rhs of an ODE model, e.g.
class BaseModel:
    def __init__(self):
        self._rhs = ...  # a dictionary mapping variables to expressions for the rhs
        self._initial_conditions = ...  # a dictionary mapping expressions to expressions representing the initial conditions
        self._boundary_conditions = ...  # a dictionary mapping expressions to expressions representing the boundary conditions

    @property
    def rhs(self):
        return self._rhs
    ...
Note: rather than having rhs return a dict, it might be nicer to have BaseModel emulate a dictionary, so you could write model[variable] and get the rhs expression associated with that variable (see https://docs.python.org/3/reference/datamodel.html?emulating-container-types#emulating-container-types)
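As a minimal sketch of this dictionary-emulation idea (the class and its contents here are hypothetical stand-ins, not the actual PyBaMM implementation), implementing __getitem__, __iter__ and __len__ is enough to get mapping-like behaviour:

```python
class BaseModel:
    """Sketch: emulate a mapping from variables to rhs expressions."""

    def __init__(self, rhs=None):
        self._rhs = rhs or {}

    def __getitem__(self, variable):
        # model[variable] returns the rhs expression for that variable
        return self._rhs[variable]

    def __iter__(self):
        return iter(self._rhs)

    def __len__(self):
        return len(self._rhs)


model = BaseModel({"c": "div(grad(c))"})
print(model["c"])  # -> div(grad(c))
```

With this, `for variable in model:` and `variable in model` also work for free, since Python falls back on __getitem__/__iter__ for membership and iteration.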
Add a solver test to StefanMaxwellDiffusion tests.
Following from #49
If each submodel (e.g. StefanMaxwellDiffusion) is a Model in its own right, we should have a function which allows for the simple concatenation of two submodels e.g. the functionality:
basemodel.create_from_submodels([submodelA, submodelB, submodelC, etc])
This would concatenate the rhs, initial_conditions, boundary_conditions, variables dictionaries.
It is possible that overlapping variable names may become an issue (e.g. the concentration in the negative and positive particles, if within the particle expression they are simply called c). By associating a unique name with each submodel when it is initially defined, this could be avoided.
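A sketch of the concatenation-with-namespacing idea (SubModel and create_from_submodels are hypothetical names for illustration; the real function would also merge initial_conditions, boundary_conditions and variables the same way):

```python
class SubModel:
    """Minimal sketch of a submodel holding an equation dictionary."""

    def __init__(self, name, rhs):
        self.name = name
        self.rhs = rhs  # variable name -> expression


def create_from_submodels(submodels):
    """Concatenate submodel rhs dicts, namespacing keys to avoid clashes."""
    rhs = {}
    for sub in submodels:
        for var, expr in sub.rhs.items():
            key = f"{sub.name}.{var}"  # unique submodel name avoids collisions
            if key in rhs:
                raise ValueError(f"duplicate variable {key}")
            rhs[key] = expr
    return rhs


combined = create_from_submodels(
    [SubModel("negative_particle", {"c": "expr_n"}),
     SubModel("positive_particle", {"c": "expr_p"})]
)
print(sorted(combined))  # ['negative_particle.c', 'positive_particle.c']
```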
Summary
Add functionality that allows the existing models to be solved using FEniCS instead of Finite Volumes
Motivation
Ideally we should be able to switch between different spatial discretisations (e.g. Finite Volumes and Finite Elements) without affecting the rest of the code (especially the model)
Additional context
Non-exhaustive to-do list:
This is a pre-requisite for #17
Refactor this class so that it simply returns a symbolic expression for the Homogeneous reactions.
Should this really be a class? A function is probably sufficient.
I was just looking at the discretisation of the boundary conditions. At the moment this seems to be mainly done in the gradient and divergence functions of MatrixVectorDiscretisation, which both do this:
# Add boundary conditions if defined
if symbol.id in boundary_conditions:
    lbc, rbc = boundary_conditions[symbol.id]
    discretised_symbol = self.concatenate(lbc, discretised_symbol, rbc)
gradient_matrix = self.gradient_matrix(domain)
return gradient_matrix * discretised_symbol
So it looks like extra ghost nodes are added to the current state vector to implement the bc. Should this be done instead when the pybamm.Variable node is processed in process_symbol? Also, the way the boundary conditions are implemented would depend on the method, so this should probably be done in FiniteVolumeDiscretisation? @tinosulzer?
Add a new solver class DAESolver
After discussions, we've decided to switch to using an expression tree to encode each model and submodel (see #14). The idea is that this would decouple the different aspects of PyBaMM (e.g. model/set_parameters/discretisation/solver) and allow for a pipeline approach, where the expression tree is mutated by each aspect in turn.
This issue is to provide an initial implementation of the expression tree so that it can start being used by the rest of the code.
class Symbol: base class, holds a list of child nodes, has a (string) name. symbol + symbol returns Addition(symbol, symbol)
class UnaryOperator(Symbol): provides a base class for all the unary operators (e.g. Negate)
class BinaryOperator(Symbol): same but for binary ops
class Addition(BinaryOperator): implementation of the addition op
class Laplacian(UnaryOperator): implementation of the laplacian op
class Parameter(Symbol): a parameter (symbolic, no value). A Parameter has a list of domains (text) that it is valid over
class Variable(Symbol): a variable that will (eventually) be discretised over the mesh. At the moment it is purely symbolic. A Variable has a list of domains (text) that it is valid over
class Vector(Symbol): a node representing a vector (the spatial discretisation class/function will add these in)
class Matrix(Symbol): a node representing a matrix (the spatial discretisation class/function will add these in)
class Scalar(Symbol): a node representing a scalar (the set_parameters function will add these in place of Parameter). A Scalar has a list of index ranges, and a value for each range
class IndependentVariable(Symbol): a node representing an independent variable (space or time)
provide functionality for iterating through the tree, e.g.
[node.name for node in PreOrderIter(f)] # pre-order traversal
[node.name for node in PostOrderIter(f)] # post-order traversal
[node.name for node in LevelOrderIter(f)] # level-order (i.e. breadth-first)
(note: these examples come from the anytree library)
There will be other functionality that we will need, e.g. converting to/from other types of expression trees like the FEniCS UFL, but this issue is for the initial implementation; other things can go in separate issues...
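A self-contained sketch of the node class and the traversal orders mentioned above (written here with plain generators rather than anytree's iterators, so the names are illustrative only):

```python
class Symbol:
    """Minimal expression-tree node: a name and a list of children."""

    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)


def pre_order(node):
    # visit the node, then each subtree left to right
    yield node
    for child in node.children:
        yield from pre_order(child)


def post_order(node):
    # visit each subtree left to right, then the node
    for child in node.children:
        yield from post_order(child)
    yield node


# f = a + (b * c)
f = Symbol("+", [Symbol("a"), Symbol("*", [Symbol("b"), Symbol("c")])])
print([n.name for n in pre_order(f)])   # ['+', 'a', '*', 'b', 'c']
print([n.name for n in post_order(f)])  # ['a', 'b', 'c', '*', '+']
```

Post-order is the natural order for evaluation and for parameter substitution, since children are processed before their parents.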
Summary
Edit parameters.py to load lithium-ion instead of lead-acid
Motivation
We should be able to switch between lithium-ion and lead-acid parameters simply by changing the name of the input file
To do
"A Scalar has a list of index ranges, and a value for each range". This is not implemented
For an expression with more than one grad(c) or div(c), the BaseDiscretisation will discretise each of them individually. We should identify these sub-expressions and replace them all with the same Matrix.
note that identifying sub-expressions is also in #55, so this will overlap a bit
Update
Scope of this issue has been extended to also implement the SPM and SPMe by calling submodels, since the forms of the models are so similar.
Summary
We need to implement a DFN class. To do this, we need:
- a butler_volmer function that creates a G given in terms of the other parameters and variables in the model
- a StefanMaxwellCurrent class which implements the conservation of current equations in the electrolyte as well as the McInnes equation for phi_e
- a StandardElectrode class which implements current conservation and Ohm's law in the macroscale electrode
- a DFN class which creates G using the butler_volmer function and calls StefanMaxwellDiffusion and StefanMaxwellCurrent, StandardElectrode, and StandardParticle. It must be somehow indicated that a particle must be inserted in every cell of the discretised model.
To the DFN we should also add the following useful variables:
Summary
Implement the SPM. We require:
- a StandardParticle class which implements spherical diffusion on a spherically symmetric particle, taking a boundary condition G for the flux on the particle surface
- an SPM class which calls homogeneous_reactions to create G and then calls StandardParticle twice to create a negative electrode particle and a positive electrode particle
- a variables dictionary of the model

Different parameters can have different values in different domains. This can be represented in a model by a concatenation of Parameters, which will change to a concatenation of Scalars. The discretisation class needs to change this to a concatenation of Vectors.
See #16.
It is generally useful to encode an expression tree to text, in order to:
Using method of lines: scipy.integrate.solve_ivp(..., method='BDF') to solve ODEs.
How to check convergence to a manufactured solution, both in space and time? Would want to do this in test_simulation.py
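A rough sketch of the method-of-lines idea referred to above (all names and numbers here are illustrative; a hand-rolled explicit Euler step stands in for scipy.integrate.solve_ivp, which would do the time integration in practice):

```python
# Method of lines for u_t = u_xx with u(0) = u(1) = 0: discretise in
# space first, producing a system of ODEs du/dt = rhs(u), then step
# that system in time. scipy.integrate.solve_ivp(..., method='BDF')
# would replace the Euler loop below.
N = 20
dx = 1.0 / N
u = [0.0] * (N + 1)
u[N // 2] = 1.0  # initial spike in the middle of the domain


def rhs(u):
    # centred second difference on the interior nodes
    du = [0.0] * (N + 1)
    for i in range(1, N):
        du[i] = (u[i - 1] - 2 * u[i] + u[i + 1]) / dx**2
    return du


dt = 0.4 * dx**2  # within the explicit-Euler stability limit dt <= 0.5*dx**2
for _ in range(200):
    du = rhs(u)
    u = [ui + dt * dui for ui, dui in zip(u, du)]

print(max(u) < 1.0)  # True: the initial spike has diffused
```

Convergence to a manufactured solution would then be checked by refining dx and dt and confirming the error decreases at the expected rates.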
After discussions in #21, it would be useful for the BaseModel class to have an attribute variables that maps "variable name" to a variable. This dict needs to be processed by parameters and discretisation just like rhs and initial_conditions. Will then be useful for visualisation.
For Parameters that vary with time, the user puts in the name of the function (functions defined in a .py file in the same directory as the parameter file). Then the parameter class replaces Parameter with a Function of that name.
In ParameterValues.process_model, the processing of boundary conditions assumes that boundary conditions are a single equation, not a dictionary ({"left": , "right": }).
Also, ParameterValues.process_model is currently untested.
Getting issues when trying to write
c = Variable('c')
1 - c
and also
-c
Does the BinaryOperator class account for the case of integer inputs or no input on one side but symbols on the other?
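Handling 1 - c and -c comes down to implementing the reflected and unary operator hooks. A sketch (all class names hypothetical, mirroring the expression-tree naming elsewhere in these issues; the key pieces are __rsub__, which Python calls when the left operand is a plain number, and __neg__):

```python
import numbers


class Symbol:
    """Sketch: overload arithmetic so plain numbers are wrapped as Scalars."""

    def __init__(self, name):
        self.name = name

    def _wrap(self, other):
        if isinstance(other, numbers.Number):
            return Scalar(other)
        return other

    def __sub__(self, other):
        return Subtraction(self, self._wrap(other))

    def __rsub__(self, other):
        # handles e.g. 1 - c, where the left operand is a plain number
        return Subtraction(self._wrap(other), self)

    def __neg__(self):
        # handles -c
        return Negate(self)


class Scalar(Symbol):
    def __init__(self, value):
        super().__init__(str(value))
        self.value = value


class Subtraction(Symbol):
    def __init__(self, left, right):
        super().__init__("-")
        self.children = [left, right]


class Negate(Symbol):
    def __init__(self, child):
        super().__init__("neg")
        self.children = [child]


c = Symbol("c")
expr = 1 - c   # works because Symbol defines __rsub__
neg = -c       # works because Symbol defines __neg__
```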
Doc build failing - see build report.
Problem seems to be that guzzle_sphinx_theme is not being installed. It is in requirements-docs.txt, but not in setup.py.
@martinjrobins is there a way to avoid having to remember to keep these two in sync (and is there anything else that also needs to be kept in sync?)
Finite Volumes convergence is not guaranteed to be h**2 if the grid is not uniform
(Diskin, Boris, and James L. Thomas. "Notes on accuracy of finite-volume discretization schemes on irregular grids." Applied numerical mathematics 60.3 (2010): 224-226.)
To do: Change Mesh class so that convergence is h**2 even if the mesh is not uniform
Currently the only class with __repr__ implemented is Symbol. Other classes need a more accurate __repr__ than just inheriting from Symbol.
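A sketch of what a more accurate subclass __repr__ might look like (names illustrative, following the expression-tree classes described elsewhere in these issues):

```python
class Symbol:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def __repr__(self):
        return f"{self.__class__.__name__}({self.name!r})"


class Addition(Symbol):
    def __init__(self, left, right):
        super().__init__("+", [left, right])

    def __repr__(self):
        # more informative than the inherited Symbol repr: shows the operands
        left, right = self.children
        return f"Addition({left!r}, {right!r})"


expr = Addition(Symbol("a"), Symbol("b"))
print(repr(expr))  # Addition(Symbol('a'), Symbol('b'))
```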
After ParameterValues swaps out all the Parameter for Scalar nodes, there is the possibility that you get things like Scalar(4) * Scalar(5) in the tree, which can be combined into Scalar(20).
See #36. It might be useful for the overloaded binary operators on Symbol (e.g. __mul__) to be able to take Numbers (e.g. int). They would then implicitly convert this to a Scalar node.
Note: also need to add __pow__ to this list.
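A sketch combining both ideas, the implicit Number-to-Scalar conversion in __mul__ and a bottom-up pass that folds Scalar * Scalar into a single Scalar (class and function names are hypothetical):

```python
import numbers


class Symbol:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

    def __mul__(self, other):
        if isinstance(other, numbers.Number):
            other = Scalar(other)  # implicit int/float -> Scalar conversion
        return Multiplication(self, other)

    __rmul__ = __mul__


class Scalar(Symbol):
    def __init__(self, value):
        super().__init__(str(value))
        self.value = value


class Multiplication(Symbol):
    def __init__(self, left, right):
        super().__init__("*", [left, right])


def simplify(node):
    """Fold Scalar * Scalar into a single Scalar, bottom-up."""
    node.children = [simplify(c) for c in node.children]
    if isinstance(node, Multiplication):
        left, right = node.children
        if isinstance(left, Scalar) and isinstance(right, Scalar):
            return Scalar(left.value * right.value)
    return node


expr = Scalar(4) * Scalar(5)
print(simplify(expr).value)  # 20
```

The same folding rule extends naturally to the other binary operators (including __pow__) once they accept Numbers.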
Summary
Describe the bug
The Domain class checks that domain is an iterable and raises a TypeError if it isn't. However, a str domain is not caught by this, since a string is itself iterable.
This isn't caught by the test in TestIndependentVariable; the test should be:
with self.assertRaises(TypeError):
    pybamm.IndependentVariable("a", domain="test")
Suggested fix
If domain is a string, replace it with list(domain).
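One caveat with the suggested fix: list("test") gives ['t', 'e', 's', 't'], so wrapping the string in a one-element list is probably what is intended. A sketch (function name hypothetical):

```python
def normalise_domain(domain):
    """Return domain as a list of domain names.

    Note that list("test") would split the string into characters,
    so a bare string is wrapped in a list instead.
    """
    if isinstance(domain, str):
        return [domain]
    try:
        return list(domain)
    except TypeError:
        raise TypeError(f"domain must be iterable, got {type(domain)}")


print(normalise_domain("negative electrode"))  # ['negative electrode']
```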
Note: #72 should be completed before this issue.
Summary
We require an SPMe class. This class builds upon the SPM class by adding an electrolyte model and modifying the voltage expression. This will be done by calling the StefanMaxwellDiffusion class to add a model for diffusion in the electrolyte. The StefanMaxwellDiffusion class will require a G as an input. You can either call the homogeneous_reactions function again to create this G, or it can be somehow extracted from SPM. The voltage expression should just be overwritten.
Note: SPMe could also be implemented without calling SPM, instead building the same things inside SPMe again, but calling SPM might be a good exercise to see how easy it is to add physics to a model.
Describe the bug
Binary operators are not being processed properly by ParameterValues. At line 119,
new_left = self.process_symbol(symbol.children[0])
new_right = self.process_symbol(symbol.children[1])
the children are being switched by the first line, so new_left is processed again by the second line:
# symbol.children = (left, right)
new_left = self.process_symbol(symbol.children[0])
# symbol.children = (right, new_left)
new_right = self.process_symbol(symbol.children[1])
# symbol.children = (right, new_new_left)
Solution
Possibly a hacky solution, but we can get around this by pre-setting pointers to left and right:
left, right = symbol.children
new_left = self.process_symbol(left)
# symbol.children = (right, new_left)
new_right = self.process_symbol(right)
# symbol.children = (new_left, new_right)
Need a node representing the state vector, to replace Variable in the discretisation step. It should have an internal slice object (https://docs.python.org/dev/library/functions.html#slice) that can be None if the node returns the whole state vector.
For example, from @tinosulzer:
class VariableVector(pybamm.Symbol):
    """A vector that depends on an input y."""

    def __init__(self, y_slice, name=None, parent=None):
        if name is None:
            name = str(y_slice)
        super().__init__(name=name, parent=parent)
        self._y_slice = y_slice

    @property
    def y_slice(self):
        return self._y_slice

    def evaluate(self, y):
        return y[self._y_slice]
Symbol types might need __neg__, __pow__, __abs__, others?
I have pushed a branch component_class to my fork of PyBaMM if you want an example. On that branch, I have briefly explored this potential class structure for models.
The basic idea is to have two sets of classes: Models and SubModels each of which have common methods and variables. Then we derive other classes from these such as:
class SPM(Model):
which then inherits the methods common to Model. It seems that this would make it easier to add a new type of model as we inherit all the standard methods.
The Models are a two-layer structure:
1. The superclass Model defines the methods and variables common to all models, such as .initial_conditions() and .rhs()
2. The subclasses on level 2 then just define things specific to that model, e.g. the domain and the type of submodel to use on that domain.
SubModels are arranged in a three-layer structure:
1. All SubModels should have a common structure so that they can be interacted with by Models in a consistent way. Therefore a superclass SubModel that contains common methods and variables is used (it might just be conceptually useful, as, for example, .rhs() will be different for each submodel).
2. We then have a different type of SubModel for each sub-domain: Particle, Electrolyte, etc. These define the methods and variables common to each of their domains, such as setting initial conditions, but allow for further specification.
3. Finally, on level 3 a particular SubModel is chosen for a particular domain. Here specifics like the exact model equations on that domain are stated.
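The layered structure above can be sketched as follows (a toy illustration under the naming conventions used in these issues, not the branch's actual code):

```python
class Model:
    """Superclass: methods and variables common to all models."""

    def __init__(self, submodels):
        self.submodels = submodels

    def rhs(self):
        # collect rhs equations from every submodel
        out = {}
        for sub in self.submodels:
            out.update(sub.rhs())
        return out


class SubModel:
    """Level 1: a common interface for all submodels."""

    def rhs(self):
        raise NotImplementedError


class Particle(SubModel):
    """Level 2: methods and variables common to all particle submodels."""


class StandardParticle(Particle):
    """Level 3: a concrete particle model with its own equations."""

    def __init__(self, domain):
        self.domain = domain

    def rhs(self):
        return {f"{self.domain} c": "spherical diffusion expression"}


class SPM(Model):
    """Level-2 model: picks the submodels to use on its domains."""

    def __init__(self):
        super().__init__(
            [StandardParticle("negative"), StandardParticle("positive")]
        )


spm = SPM()
print(sorted(spm.rhs()))  # ['negative c', 'positive c']
```

Adding a new model is then a matter of subclassing Model and listing the submodels for each domain.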
Motivation
Increase modularity of the model section of the code
Add class(es)/function(s) to visualise the result of a simulation. Options:
The formatting for the documentation on the expression tree node classes does not match the rest of pybamm, need to make these consistent
see #16.
the new spatial discretisation base class and derived classes need to process a Model, along with an input mesh, in order to:
- replace Variable nodes with Vector nodes
- replace spatial operator nodes with Matrix nodes
example use:
model = ReactionDiffusionModel()
param_file = Parameters("file.dat")
param_file.process(model)
disc = FiniteVolumeDiscretisation(mesh_dx=0.1, something_else=2)
y0 = disc.process(model)
My current thoughts are that the discretisation class should also generate initial conditions (perhaps from functions contained in the Variable nodes of the expression tree?), and output the initial state vector from the process function so that it can be used by the solver class.
so, for example, you can set the boundary conditions in a model like so:
self.boundary_conditions = {N_e: {"left": 0, "right": 0}}
Refactor the Simulation class to run the whole process (create model, set parameters, discretise, solve, visualise). Needs to take optional inputs for the parameter class, parameter file, mesh/discretisation, solver, and set defaults if these are None. Can later add capability to save a solution, load a saved solution, etc.
class Simulation(object):
    def __init__(
        self,
        model,
        parameter_values=None,
        discretisation=None,
        solver=None,
        name="unnamed",
    ):
        # Defaults
        if parameter_values is None:
            pass  # default parameter_values
        if discretisation is None:
            pass  # default discretisation
        if solver is None:
            pass  # default solver

        self.model = model
        self.parameter_values = parameter_values
        self.discretisation = discretisation
        self.solver = solver
        self.name = name

    def set_parameters(self):
        self.parameter_values.process(self.model)

    def discretise(self):
        y0 = self.discretisation.process(self.model)
        return y0

    def solve(self, y0):
        self.solver.solve(self.model, y0)

    def run(self):
        self.set_parameters()
        y0 = self.discretise()
        self.solve(y0)

    def load(self):
        pass

    def save(self):
        pass

    def plot(self):
        pass
Building on issue #18 / PR #35
For a particular model, parameter values may be needed that are combinations of "raw" parameter values. For example, the total electrode length L = Ln + Ls + Lp, or any dimensionless parameter.
It doesn't make sense to write these all in the csv file that gets imported, so there needs to be a class that calculates the "derived" parameters, maybe split by domain ("negative_electrode", "separator", "positive_electrode" or "whole_cell"). There may be overlap between the domains.
This class should also contain scales for non/re-dimensionalisation
For example,
class DiffusionParameterValues(BaseParameters):
    def __init__(self, base_parameters={}, optional_parameters={}):
        super().__init__(base_parameters, optional_parameters)
        self.scales = {"length": self.raw["Ln"] + self.raw["Ls"] + self.raw["Lp"],
                       "time": ...,
                       etc}
        self.whole_cell = {"L": self.raw["Ln"] + self.raw["Ls"] + self.raw["Lp"],
                           "Cd": self.scales["length"]**2 / self.raw["D"] / self.scales["time"],
                           etc}
        self.negative_electrode = {"L": self.raw["Ln"], etc}
        self.separator = {"L": self.raw["Ls"], etc}
        self.positive_electrode = {"L": self.raw["Lp"], etc}
Maybe make DiffusionParameterValues emulate a dictionary, similarly to #19, with keys being domain names.
Now if we want to include new parameters, say for a temperature model, we will need more scales and derived parameters, but these should be in a separate class, as they might call upon raw values that would not necessarily be given to a DiffusionParameterValues class.
class TemperatureParameterValues(BaseParameters):
    def __init__(self, base_parameters={}, optional_parameters={}):
        super().__init__(base_parameters, optional_parameters)
        self.scales = {"Temp": self.raw["T_init"]}
        self.whole_cell = {"k": self.raw["D_temp"] / bla bla}
        self.negative_electrode = {}
        self.separator = {}
        self.positive_electrode = {}
Then a BatteryParameterValues could multiply-inherit from DiffusionParameterValues and TemperatureParameterValues, combining the dictionary attributes from each of these (not sure how to do this).
@martinjrobins @Scottmar93 do you think this is a good approach? See previous implementation (quite different) for an idea of which derived parameters need to be implemented.
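On the "not sure how to do this" point: one way to combine the dictionary attributes under multiple inheritance is cooperative super().__init__ calls, with each parent update()-ing shared dicts rather than reassigning them. A simplified sketch (constructor signatures trimmed for illustration):

```python
class BaseParameters:
    def __init__(self, raw):
        self.raw = raw
        self.scales = {}
        self.whole_cell = {}


class DiffusionParameterValues(BaseParameters):
    def __init__(self, raw):
        super().__init__(raw)
        self.scales.update({"length": raw["Ln"] + raw["Ls"] + raw["Lp"]})
        self.whole_cell.update({"L": raw["Ln"] + raw["Ls"] + raw["Lp"]})


class TemperatureParameterValues(BaseParameters):
    def __init__(self, raw):
        super().__init__(raw)
        self.scales.update({"Temp": raw["T_init"]})


class BatteryParameterValues(DiffusionParameterValues, TemperatureParameterValues):
    # Python's MRO runs both parent __init__s in turn; because each parent
    # update()s the shared dicts instead of reassigning them, the scales
    # and per-domain dicts from both parents end up combined.
    pass


p = BatteryParameterValues({"Ln": 1, "Ls": 2, "Lp": 3, "T_init": 300})
print(sorted(p.scales))  # ['Temp', 'length']
```

The caveat is that every class in the hierarchy must call super().__init__ for the chain to run to completion.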
This issue must be completed prior to refactoring the reaction diffusion model (#17).
StefanMaxwellDiffusion should inherit from BaseModel so that it is a complete model in its own right. It will contain a dictionary with the set of equations in the electrolyte implemented in symbolic form employing the expression tree.
Will look something like:
class StefanMaxwellDiffusion(BaseModel):
    def __init__(self):
        self._rhs = {c: "expression"}
        self._initial_conditions = ...  # etc.
        self._variable = {"c": c, ...}

EDIT: The keys should be of class Symbol except for the _variable dictionary.
Process initial conditions to take in combinations of scalars and return a vector with the right shape and value.
- check that the initial condition evaluates to a scalar (numbers.Number); if not, raise NotImplementedError
- change scalar_to_vector to vector_of_ones, which takes in a domain and returns a Vector of the right shape with value 1

See #16.
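A sketch of the initial-condition processing described above (Vector is a minimal stand-in for the expression-tree node, and domain_size stands in for whatever shape information a domain carries):

```python
import numbers


class Vector:
    """Minimal stand-in for the expression-tree Vector node."""

    def __init__(self, entries):
        self.entries = list(entries)


def vector_of_ones(domain_size):
    """Return a Vector of the right shape with value 1."""
    return Vector([1.0] * domain_size)


def process_initial_conditions(value, domain_size):
    """Scale a vector of ones by a scalar initial condition."""
    if not isinstance(value, numbers.Number):
        raise NotImplementedError("initial condition must evaluate to a scalar")
    return Vector([value * e for e in vector_of_ones(domain_size).entries])


ic = process_initial_conditions(2.5, 4)
print(ic.entries)  # [2.5, 2.5, 2.5, 2.5]
```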
This class will take an instance of a model that has been processed by the Parameter (#18) and Spatial Discretisation (#20) classes. That is, the solver can assume that all Parameter, Variable, and Operator nodes representing gradients (e.g. Laplace) in the expression tree have been replaced with concrete Vector and Matrix nodes, so the expression tree now represents a concrete linear algebra expression.
For now we only consider Models representing ODE equations and solvers implementing the method of lines (MOL). But the implementation should be flexible enough to be extended to DAE equations, and to time discretisation with a Newton iteration solver rather than MOL.
The solver class needs to evaluate the discretised expressions in terms of y and t.
The Solver class should store the current state vector internally, and have an integrate function that can move the simulation forward in time, to allow for output/plotting over time, e.g.
model = ReactionDiffusionModel()
param_file = Parameters("file.dat")
param_file.process(model)
disc = FiniteVolumeDiscretisation(mesh_dx=0.1, something_else=2)
y0 = disc.process(model)
solver = RungeKuttaSolver(y0, tolerance=1e-5)
Tf = 1.0
nout = 100
nout_dt = Tf / nout
for i in range(nout):
    solver.integrate(nout_dt)
    plot(solver.y, disc)
Create a mesh class that takes a Parameter class with mesh parameters, for eventual input to the spatial discretisation class (#20)
e.g.
model = ReactionDiffusionModel()
param_file = Parameters("file.dat")
param_file.process(model)
mesh = Mesh(param_file)
disc = FiniteVolumeDiscretisation(mesh, something_else=2)
y0 = disc.process(model)
Add to expression tree hierarchy: class IndependentVariable(Symbol): a node representing an independent variable (space or time).