
algebraicrewriting.jl's People

Contributors

aaguinal, algebraicjuliabot, epatters, jeffbezanson, kris-brown, mehalter, olynch


algebraicrewriting.jl's Issues

Problem finding mk_sched in notebook code for "Agent-based modeling via graph rewriting" blog

Hi,
Our CEPHIL team has been having trouble running block 17 of the following notebook, which accompanies the "Agent-based modeling via graph rewriting" blog post.

https://nbviewer.org/github/AlgebraicJulia/AlgebraicRewriting.jl/blob/compat_varacsets/docs/src/Dynamic%20Tracing.ipynb

Specifically, we get an error message indicating that "mk_sched" cannot be found. Here's the exact code and error message:

[screenshot: code cell and error message]

My Project.toml file is attached as a .txt file (so that it can be attached); it is adapted from the July7_2023 commit, reflecting the July 7 date of the associated blog post (https://blog.algebraicjulia.org/post/2023/07/graphical-schedule/#cb3-2).
Project.toml.txt

I further attach my complete .ipynb file, again as a .txt file so that it can be attached.
ACT-ABM-BlogExample-Jan-2023_With-July7_2023_Project_toml.ipynb.txt

Thanks so much for any guidance that can be offered!

Rewriting every matching pattern in a graph

So I'm running into a problem where I want to rewrite all of the matching patterns in this graph:
[image: original graph]

This is the pattern that I'm matching for:
[image: matching pattern]

And this is the rewrite pattern, which identifies the respective outermost nodes of the matching pattern and discards the edges:
[image: rewrite pattern]

After the first rewrite, the graph looks like this:
[image: result after first rewrite]

But when I apply it for a second time, it performs it in the same place:
[image: result after second rewrite]
Instead, I would want it to rewrite the other two original occurrences of the pattern before it finishes.

What would be the best way to complete this kind of total rewrite?
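One way to approach this is to re-match after each rewrite and keep applying the rule until no match remains. The sketch below is illustrative, not a definitive answer: it assumes a `rule` and starting graph `G` like those in the images above, and uses AlgebraicRewriting's `get_matches` and `rewrite_match`; the function name `rewrite_all` is a hypothetical helper.

```julia
using Catlab, AlgebraicRewriting

# Hypothetical helper: repeatedly apply `rule` until the pattern no
# longer matches anywhere (or a step budget is exhausted).
function rewrite_all(rule, G; max_steps=100)
  for _ in 1:max_steps
    ms = get_matches(rule, G)          # all matches of the pattern L into G
    isempty(ms) && return G
    G = rewrite_match(rule, first(ms)) # apply the rule at one match
  end
  return G
end
```

Note that naively taking the first match each time can repeatedly hit the same region of the graph; filtering out matches overlapping previously rewritten parts, or randomizing the choice of match, spreads the rewrites across all occurrences.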

Rewriting with "universal quantification" in patterns

Often we want to say things like "match all incoming edges (and their sources) to a vertex" and other things that cannot be expressed as a single homomorphism from a single C-Set. There are a couple ideas for how to address this:

1.) An ad hoc (but straightforward) solution is to represent a pattern L as a map L --> U, where U is the universally-quantified part. This can be interpreted as a multispan with L as the apex and multiple copies of U as the legs, of which we take the colimit. (For n=0 it's just L, for n=1 it's just U, for n=2 it's the pushout of two copies of U over L, etc.) Such a rule can be interpreted as dynamically determining at runtime the highest n for which a match exists; alternatively, other constraints (e.g. max(n) such that n > 3 and n < 9) could be part of the rule definition. This is straightforward because the rewriting is still vanilla C-Set rewriting, with only the pattern determined at runtime.

2.) "Graph Rewriting in Span-Categories" (2010, Löwe) represents a pattern as E -> U -> L (existentially quantified, universally quantified, and the overall pattern) and represents matches as spans G <- X -> L satisfying a certain property. Rewrites are computed using final pullback complements.

3.) "A relation-algebraic approach to graph structure transformation" (2001, Kahl) has another approach that tries to address a similar problem.
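Idea (1) can be sketched with Catlab's colimit machinery. The graphs and the map below are illustrative placeholders: L is a single vertex and U is an edge pointing into it, so gluing n copies of U along L yields the pattern "a vertex with n incoming edges".

```julia
using Catlab, Catlab.Graphs, Catlab.CategoricalAlgebra

L = Graph(1)                          # the apex: a single vertex
U = path_graph(Graph, 2)              # one edge, from vertex 1 into vertex 2
f = ACSetTransformation(L, U; V=[2])  # include L as the target vertex of U

# For a given n, the runtime pattern is the colimit of the multispan
# with apex L and n copies of f as legs: n copies of U glued along L.
n = 3
pattern = ob(colimit(Multispan(L, fill(f, n))))
# `pattern` has n source vertices, one shared target, and n edges.
```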

Conegation rewriting

Using the logic of biheyting algebras (described here) it should be straightforward to implement conegation rewriting (described here).

This semantics would be intermediate between DPO (don't apply the rule if it would produce a dangling connection) and SPO (cascade-delete until there are no dangling connections). It would still apply the rule, adding/deleting/merging things, except that it skips deleting any particular element whose deletion would produce a dangling connection.

The corresponding PR could also implement the pushout complement code in terms of biheyting algebras (although the check of whether the dangling/identification conditions hold would remain unchanged).

Manifest files in repo

For packages, best practice is to not commit the Manifest.toml files (currently in root folder and docs) and rely on the Project.toml to specify the versions needed of the dependencies.

Inplace DPO rewrite method that returns maps

In the Inplace module https://github.com/AlgebraicJulia/AlgebraicRewriting.jl/blob/main/src/rewrite/Inplace.jl, it would be nice to have something like rewrite_match_maps!, which does the rewrite in place but returns the maps the way rewrite_match_maps does (https://github.com/AlgebraicJulia/AlgebraicRewriting.jl/blob/main/src/rewrite/DPO.jl).

This would be helpful, for example, for using in-place rewriting together with incremental homset updating. I'd be happy to look into how to make this contribution, but I don't understand what is going on in the Incremental module; with some rough guidance I could get started.

fix yoneda_cache support for DynamicACSets

I created a DynamicACSet using the following code:

@present SchAi2Thor(FreeSchema) begin
  ## Objects
  RoboGripper::Ob
  Thing::Ob  # all Null ∪ objects

  ## Attributes for Thing
  Label::AttrType
  IsOpen::AttrType
  IsBroken::AttrType
  IsCooked::AttrType
  IsSliced::AttrType
  IsToggled::AttrType
  IsDirty::AttrType
  IsFilledWithLiquid::AttrType
  IsUsedUp::AttrType

  thing_has_label::Attr(Thing, Label)
  thing_is_open::Attr(Thing, IsOpen)
  thing_is_broken::Attr(Thing, IsBroken)
  thing_is_cooked::Attr(Thing, IsCooked)
  thing_is_sliced::Attr(Thing, IsSliced)
  thing_is_toggled::Attr(Thing, IsToggled)
  thing_is_dirty::Attr(Thing, IsDirty)
  thing_is_filled_with_liquid::Attr(Thing, IsFilledWithLiquid)
  thing_is_used_up::Attr(Thing, IsUsedUp)

  PositionX::AttrType
  PositionY::AttrType
  PositionZ::AttrType
  thing_has_positionX::Attr(Thing, PositionX)
  thing_has_positionY::Attr(Thing, PositionY)
  thing_has_positionZ::Attr(Thing, PositionZ)

  ## Relations
  PartOfRel::Ob
  InRel::Ob  # Q: How do I enforce that things of a particular type can only be inside other things of certain types?

  is_holding::Hom(RoboGripper, Thing)
  is_part_of_l::Hom(PartOfRel, Thing)
  is_part_of_r::Hom(PartOfRel, Thing)
  is_in_l::Hom(InRel, Thing)
  is_in_r::Hom(InRel, Thing)
end

# specify types for attributes
const type_assignment = Dict(
  :Label => String,
  :IsOpen => Bool,
  :IsBroken => Bool,
  :IsCooked => Bool,
  :IsSliced => Bool,
  :IsToggled => Bool,
  :IsDirty => Bool,
  :IsFilledWithLiquid => Bool,
  :IsUsedUp => Bool,
  :PositionX => Float64,
  :PositionY => Float64,
  :PositionZ => Float64
)

# create dynamic acset object
const Ai2Thor = DynamicACSet("Ai2Thor", SchAi2Thor;
  type_assignment=type_assignment)

When I attempt to compute the representables for that ACSet, I get the following error:

ERROR: MethodError: no method matching yoneda_cache(::DynamicACSet)

Closest candidates are:
  yoneda_cache(::Type)
   @ AlgebraicRewriting ~/.julia/packages/AlgebraicRewriting/W29UX/src/rewrite/Representable.jl:41
  yoneda_cache(::Type, ::Any; clear, cache)
   @ AlgebraicRewriting ~/.julia/packages/AlgebraicRewriting/W29UX/src/rewrite/Representable.jl:41

Stacktrace:
 [1] top-level scope
   @ ~/Documents/Git/ai2thor-cset-experiments/julia/Ai2thorCsetProject/Ai2ThorWorld-Rules-CR.jl:114

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

show images in literate doc examples for ABMs

The ABM demos weren't written with literate programming in mind, so calls to Graphviz should be sprinkled throughout them to make it clear what is going on, rather than the current output, which is mostly large ACSet tables.

Pushout complement treatment of attrvars

When pushout_complement creates the K ACSet, distinct parts of G can be given the same AttrVar when they should be distinct: the subsequent pushout may then try to assign true to one occurrence and false to another, which raises a Catlab error. Here is a minimal example:

using AlgebraicRewriting, Catlab, DataMigrations

@present Sch(FreeSchema) begin
  X::Ob 
  D::AttrType
  f::Attr(X,D)
end

@acset_type AbsFoo(Sch)

const Foo = AbsFoo{Bool}

L = @acset Foo begin X=2; f=[false, false] end
I = @acset Foo begin X=2; D=2; f=AttrVar.(1:2) end
R = @acset Foo begin X=2; f=[false, true] end
rule = Rule(homomorphism(I, L; monic=[:X]), homomorphism(I, R; monic=[:X]))

rewrite_match(rule, id(L))

To address this, pushout_complement should be as generous as possible in assigning AttrVars to parts (essentially, anything in the image of I -> K should receive one) to prevent conflicts between the attribute values in K vs. R.

yoneda_cache file naming robustness to changing data structures

We cache the output of yoneda using yoneda_cache, but this creates a file based on the name of the ACSet. This could lead to name collisions (two different ACSet schemas which happen to share a name) and mystifying errors when one tries to read from the cache after the data of the schema has changed. One possible solution to these issues is to make the file name a hash of the content of the Presentation.
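A sketch of that proposed fix, hashing a printed form of the presentation; the helper name `cache_filename` and the choice of serializing via `show` are hypothetical, not the library's API.

```julia
using Catlab

@present SchExample(FreeSchema) begin
  X::Ob
end

# Hypothetical helper: derive the cache file name from the content of the
# presentation rather than from the ACSet's name alone, so that two schemas
# sharing a name get distinct cache files, and a stale cache is bypassed
# whenever the schema's data changes.
cache_filename(name::String, pres) =
  string(name, "_", string(hash(sprint(show, pres)); base=16), ".json")

cache_filename("Example", SchExample)
```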

Rules to ODEs

A potential feature to implement: section 6 of https://lmcs.episciences.org/6628 (the "voter model") gives an example that uses a combinatorial analysis of rewrite rules to produce continuous-time evolution dynamics. This requires computing partial overlaps between rewrite rules.

Integrate incremental hom search with rewriting programs

Each of the boxes of a graphical rewriting recipe has various hom sets it needs with respect to the current state of the world. Currently these are generated from scratch whenever they are needed (for example, consider a Query block that needs to know all the paths of length 2 in the current graph).

There are certainly scenarios where it would be better to not maintain the hom sets incrementally. For example, if we pass through that Query block just once and never again, then it's pointless to maintain the hom set through every change we make in a possibly very long simulation. So some decision should be made on how to opt-in or opt-out (perhaps on a box-by-box basis) to incremental hom search.

Furthermore, because rewriting programs presently only work for Mark As Deleted ACSets, the incremental hom search will have to be made to support that ACSet type too.
