
polytope's Introduction

TuLiP

This is the source repository for TuLiP, the temporal logic planning toolbox. The project website is http://tulip-control.org

Installation

In most cases, it suffices to:

pip install .

TuLiP can also be installed from PyPI:

pip install tulip

This will install the latest release, together with required dependencies. To find out what dependencies (including optional ones) are installed, call:

tulip.interfaces.print_env()

For detailed instructions, including notes about dependencies and troubleshooting, consult https://tulip-control.sourceforge.io/doc/install.html. The next section describes how to build the documentation. A test suite is provided under tests/; consult the section "Testing" below.

Pip can install the latest development snapshot too:

pip install https://github.com/tulip-control/tulip-control/archive/master.zip

Code under development can be unstable, so trying pip install tulip first is recommended.

Documentation

There are two main sources of documentation outside the code. The "user" documentation is under doc/ and is built with Sphinx, so the usual steps apply:

make html

API documentation is generated using Epydoc and can also be built from the doc directory:

make api

Built copies for the most recent release of TuLiP are available online at:

Command summaries are provided by make help. Besides the above sources, you can also read the API documentation using the standard pydoc tool, e.g.:

pydoc tulip

Testing

Tests are performed using pytest. From the root of the source tree (i.e., where setup.py is located), run:

./run_tests.py

to perform basic tests. To run all available tests, use ./run_tests.py full. For alternatives and a summary of usage, run ./run_tests.py -h.

Contact and support

License

This is free software released under the terms of the BSD 3-Clause License. There is no warranty; not even for merchantability or fitness for a particular purpose. Consult LICENSE for copying conditions.

When code is modified or re-distributed, the LICENSE file should accompany the code or any subset of it, however small. As an alternative, the LICENSE text can be copied within files, if so desired.

If this software is useful to you, we kindly ask that you cite us:

I. Filippidis, S. Dathathri, S. C. Livingston, N. Ozay and R. M. Murray,
"Control design for hybrid systems with TuLiP: The Temporal Logic Planning
toolbox," 2016 IEEE Conference on Control Applications (CCA), Buenos Aires,
Argentina, 2016, pp. 1030-1041, doi: 10.1109/CCA.2016.7587949.

polytope's People

Contributors

ajwagen, carterbox, drscotthawley, johnyf, krooken, murrayrm, necozay, pettni, rmattila, samuelkolb, slivingston, stephanietsuei


polytope's Issues

support Python 3.7

The future is upon us. At the time of opening this issue, visiting https://docs.python.org/ results in documentation for Python version 3.7. We should support it:

  • add to list of classifiers in setup.py
  • add to test matrix on Travis CI (.travis.yml) (EDIT: now done in GitHub Action .github/workflows/main.yml)
  • skim the "What's New?" section of the 3.7 announcement, and check for parts of Polytope that might be broken by it
  • if any code changes are needed for compatibility, write regression tests

cvxopt should be an optional dependency

cvxopt is a notorious dependency, both in terms of installing and of using it. We should aspire to treat it as optional, because:

  • polytope can still be useful without having an optimization package available, and
  • a long-term goal is to create a generic interface to several different optimization packages, and infrastructure to make cvxopt optional is some progress in that direction.

An example of how to achieve this is being considered for introduction into Tulip. The relevant pull request there is 122.

should `cvxopt` be included in requirements.txt?

On the current tip of master branch, cvxopt is included in requirements.txt. There is a comment above it to indicate that it is optional, but it has no semantic impact on the file.

I propose that it be removed. Thoughts?

My main motivation is to have requirements.txt enforce only required dependencies, to improve the reliability of using pip install -r requirements.txt. cvxopt is still somewhat difficult to install reliably and automatically from source.

According to the documentation of pip install, the -r option can be given multiple times, i.e., multiple requirements files can be supplied. One idea is to create a separate extra_requirements.txt that includes optional dependencies like cvxopt. Then, a full install can be achieved with

pip install -r requirements.txt -r extra_requirements.txt

while automated environments like Binder will continue to use the standard pip install -r requirements.txt.
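The proposed split could look like the following sketch (contents are illustrative; cvxopt is the known-optional dependency from the discussion above, and numpy and scipy are assumed to be required):

```
# requirements.txt -- required dependencies only
numpy
scipy

# extra_requirements.txt -- optional dependencies
cvxopt
```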

create regression test for bug fix of PR #56

In the author's own words:

This PR should really have a unit test that demonstrates the error in projection_interhull and its fix, but I need this for a fix in TuLiP and so leaving that until a later PR.

cf. #56

improve numerical accuracy of polytope.reduce (polytope.intersect)

polytope.intersect calls polytope.reduce to simplify the H-representation after intersection, which basically concatenates constraints. When the numbers in the given matrices have different orders of magnitude, reduce does not work well. Below is an example where a1 intersected with a2 and a2 intersected with a1 lead to two different results (with the cvxopt conelp solver). a1 and a2 are two continuous propositions for a problem where the proposition-preserving partition and the abstraction results tulip returns do not make sense, though the root cause seems to be reduce.

A short-term solution would be to include some warnings in docstrings/tutorial so that users can transform their system to have continuous state variables of similar magnitude. A mid-term solution might be to introduce some conditioning within the polytope package based on the size of input matrices. Long-term solution might be to interface with better LP solvers, which should partially alleviate this issue.

import numpy as np
import polytope as pc

a1 = pc.Polytope.from_box([[0., 30000.], [0., 25.]])

a2 = pc.Polytope.from_box([[0., 40000.], [0., 25.]])

pc.reduce(pc.Polytope(np.vstack((a2.A, a1.A)), np.vstack((a2.b, a1.b))))
Out[86]: 
Single polytope 
[[ 1.  0.] |    [[ 30000.]
 [ 0.  1.] x <=  [    25.]
 [-1. -0.] |     [     0.]
 [-0. -1.]]|     [     0.]]

pc.reduce(pc.Polytope(np.vstack((a1.A, a2.A)), np.vstack((a1.b, a2.b))))
Out[87]: 
Single polytope 
[[ 1.  0.] |    [[ 40000.]
 [ 0.  1.] x <=  [    25.]
 [-1. -0.] |     [     0.]
 [-0. -1.]]|     [     0.]]
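A short-term workaround along the lines of the mid-term suggestion above is to rescale the state variables before intersecting, so that all coordinates have similar magnitude. A minimal sketch in plain numpy (the rescale helper is hypothetical, not part of polytope):

```python
import numpy as np

# A polytope {x : A x <= b} can be rescaled by substituting x = S y,
# which gives {y : (A S) y <= b}.  Choosing S so that all state
# variables have similar magnitude keeps the concatenated
# H-representation better conditioned for reduce().
def rescale(A, b, scale):
    # scale: per-coordinate factors s_i, so x_i = s_i * y_i
    return A.dot(np.diag(scale)), b

# box 0 <= x1 <= 30000, 0 <= x2 <= 25 in H-representation
A = np.array([[1., 0.], [0., 1.], [-1., 0.], [0., -1.]])
b = np.array([30000., 25., 0., 0.])

# map x1 to y1 = x1 / 1000 so both coordinates lie in [0, 30]
A2, b2 = rescale(A, b, [1000., 1.])
```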

bounding_box format

Why is bounding box stored as a 2-tuple of vertical arrays?

For the purpose of applying transformations to the bounding box (#32), I think it would be better to store the bounding box as a single 2xN array where the min corner and max corner are their own rows.
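A sketch of the proposed conversion (helper name hypothetical): turn the current 2-tuple of (n, 1) column vectors into a single (2, n) array with the min corner as row 0 and the max corner as row 1.

```python
import numpy as np

# Convert a (lower, upper) pair of column vectors to a (2, n) array,
# as proposed above: row 0 is the min corner, row 1 the max corner.
def to_2xn(lower, upper):
    return np.vstack([lower.T, upper.T])

lower = np.array([[0.0], [-1.0]])
upper = np.array([[2.0], [3.0]])
bbox = to_2xn(lower, upper)   # shape (2, 2)
```

With this layout, an affine transformation of the corners becomes a single matrix product on the rows.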

release on PyPI

After some minor improvements (mainly logistics) the initial version will be released on PyPI to become available for automated installation through pip, easy_install or python setup.py install of a package that depends on it.

Before release a sandbox version can be found on testpypi.

Polytope installation requires setuptools>=26?

On a Travis build for another project using Python 2.7, the install of polytope failed at from setuptools import setup. I updated the version of setuptools on the Travis server from 20 to 26, after which the install succeeded. This probably means a setuptools>=26 requirement should be added to requirements.txt, but I can't confirm whether this is the most lenient bound.

parallelize set difference

Either use threading, to avoid the cost of creating new processes with multiprocessing, or a persistent pool of workers using multiprocessing, to which ConvexPolytope - ConvexPolytope jobs are passed.

The former approach is arguably preferable, because the computation tree is significantly parallelizable: the intermediate production of non-convex polytopes enables further differences to be passed to freshly spawned threads.

Note that if parallelization makes a significant difference, then it is expected to be observable also on personal machines, in those cases where the branching factor of a difference (the number of convex polytopes into which a convex polytope breaks after subtracting another convex polytope) is not much larger than 4 (a common number of processors on laptops).
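The thread-based variant could be organized as below. This is only a sketch with hypothetical names: convex_diff stands in for the real ConvexPolytope difference, with sets substituting for polytopes so the example runs; each job subtracts one convex piece, and a non-convex result would spawn further jobs.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of the thread-based variant (names hypothetical).  Each job
# subtracts one convex polytope from another; a non-convex result is a
# list of convex pieces, each of which could spawn further jobs.
# Sets stand in for ConvexPolytope so the sketch runs.
def convex_diff(p, q):
    remainder = p - q
    return [remainder] if remainder else []

def region_diff_parallel(pieces, q, max_workers=4):
    # one thread per pending ConvexPolytope - ConvexPolytope job
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = pool.map(lambda p: convex_diff(p, q), pieces)
        return [piece for sub in results for piece in sub]

region = [{1, 2, 3}, {3, 4}, {5}]
result = region_diff_parallel(region, {3})   # [{1, 2}, {4}, {5}]
```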

How to use cvxopt to find intersection between polytope

Dear Polytope Developers and users,
I am new to the polytope package.
For my project, I am using the Polytope package to create polytopes and regions.
I am able to create regions and plot them. In each region, I have many polytopes.

In one of the problems, I have a region (r1) with 16 polytopes and another region (r2) with 54 polytopes.

When I tried to find the intersection between these two regions, the polytope package took a very long time; even after 20 minutes it had not completed, and I terminated the process.

Hence I tried to import cvxopt, with

import cvxopt.glpk
import cvxopt
I am able to import cvxopt, but I do not know how to use it to find the intersection.

So can anyone give comments on how to use cvxopt for set operations?

Your answers would be very helpful. Thanks in advance.

Best regards,
Muthu

With large scales `reduce` can remove non-redundant hyperplanes

Hyperplanes whose b element is large relative to their A row can be incorrectly removed with default tolerances:

import numpy as np
import polytope

# construct a 10x10 square centred on (-995, 0)
box = polytope.Polytope.from_box([[-1000, -990], [-5, 5]])

# split the box in two by adding the constraint x1 <= -995
half_box = polytope.Polytope(
    np.concatenate([[[1, 0]], box.A], axis=0),
    np.concatenate([[-995], box.b], axis=0),
)

# validate the volume of the half box (prints 50, correct)
print(half_box.volume)

# reduce the half box and check its volume (prints 100, incorrect)
print(polytope.reduce(half_box).volume)

The non-redundant hyperplane is removed by the part of reduce that removes linearly dependent rows: https://github.com/tulip-control/polytope/blob/main/polytope/polytope.py#L1109

release `polytope == 0.2.2`

  • update the file CHANGES.md
  • check the file MANIFEST.in
  • if needed update requirements in setup.py and requirements files
  • test
  • upload to PyPI.

`.project` can return redundant hyperplanes despite `minrep == True`

When attempting to construct the Minkowski sum of two [-1, 1]^3 cubes using the method described in section 2 of this paper, I end up projecting a 6D polytope with 12 constraints, as constructed below. The polytope returned by project claims to be in minimal representation however it contains 8 redundant hyperplanes.

import numpy as np
from polytope import Polytope, reduce

pt = Polytope(
    np.array([[ 0,  0,  0,  1,  0,  0],
              [ 0,  0,  0,  0,  1,  0],
              [ 0,  0,  0,  0,  0,  1],
              [ 0,  0,  0, -1,  0,  0],
              [ 0,  0,  0,  0, -1,  0],
              [ 0,  0,  0,  0,  0, -1],
              [ 1,  0,  0, -1,  0,  0],
              [ 0,  1,  0,  0, -1,  0],
              [ 0,  0,  1,  0,  0, -1],
              [-1,  0,  0,  1,  0,  0],
              [ 0, -1,  0,  0,  1,  0],
              [ 0,  0, -1,  0,  0,  1]]),
    np.ones((12,))
)

projected_pt = pt.project([1, 2, 3], solver='exthull')

print('is projected_pt minrep?', projected_pt.minrep)
print('projected_pt:')
print(projected_pt)
print('reduced projected pt after unsetting minrep:')
print(reduce(Polytope(projected_pt.A, projected_pt.b)))

The above prints:

is projected_pt minrep? True
projected_pt:
Single polytope 
  [[ 0. -0. -1.] |    [[2.]
   [-1.  0. -0.] |     [2.]
   [ 0. -1. -0.] |     [2.]
   [ 0.  1. -0.] |     [2.]
   [ 1. -0. -0.] |     [2.]
   [-1.  0. -0.] |     [2.]
   [-0. -1. -0.] x <=  [2.]
   [-0. -0.  1.] |     [2.]
   [ 0.  0. -1.] |     [2.]
   [ 0.  1. -0.] |     [2.]
   [ 1.  0. -0.] |     [2.]
   [ 0.  1.  0.] |     [2.]
   [ 1.  0.  0.] |     [2.]
   [ 0. -0.  1.]]|     [2.]]

reduced projected pt after unsetting minrep:
Single polytope 
  [[-1.  0. -0.] |    [[2.]
   [-0. -1. -0.] |     [2.]
   [ 0.  0. -1.] x <=  [2.]
   [ 0.  1.  0.] |     [2.]
   [ 1.  0.  0.] |     [2.]
   [ 0. -0.  1.]]|     [2.]]

Remove or delete a particular polytope from a region

Dear Users and developers,

I have a region with almost 100 polytopes and a reference polytope. I compare each polytope from the region with the reference polytope, and I want to delete a polytope from the region if it is not contained in the reference polytope. Does the polytope package offer a short way to do this operation? Currently I do it as follows:

import polytope as pc

oldRegion  # --> contains 100 polytopes
newRegion = []
referencePoly = pc.Polytope(A, b)  # --> the reference polytope

for p in oldRegion:
    if p <= referencePoly:
        newRegion.append(p)

Then I form the new region as pc.Region(newRegion).

However, if there were an option to delete a polytope from oldRegion on the fly, that would be even more helpful.

Any suggestion on this issue, is highly appreciated

Thank you

Muthu Vallinayagam
Researcher, Institute of Experimental Physics,
TU Freiberg, Germany

clear memoized attribute values when geometry changes

It is advantageous to restrict ConvexPolytope and Polytope to have immutable geometry (except for their annotation, i.e., the propositions), so that expensive attributes (e.g., Chebyshev radius and center, envelope, bounding box, etc.) need not be recomputed.

Since there is no particular mechanism to monitor whether an already computed attribute is no longer up to date, these objects are in practice already immutable (also: there is no suggested way of altering them, though as of dd707a5 A and b are not yet protected).

Therefore we may also consider equipping them with a __hash__ function, though one complication would be that two Polytopes with nearly the same geometry can be created, yet they would have different __hash__ values associated with them.
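A minimal sketch of the invalidation mechanism discussed above (class and attribute names are hypothetical, not polytope's actual API): memoized values live in a single dict that is cleared whenever the geometry is reassigned.

```python
# Sketch only: names are hypothetical, not polytope's actual API.
class CachedPolytope:
    def __init__(self, A, b):
        self._A = A
        self._b = b
        self._cache = {}   # memoized expensive attributes

    @property
    def A(self):
        return self._A

    @A.setter
    def A(self, value):
        self._A = value
        self._cache.clear()   # geometry changed: drop memoized values

    @property
    def bounding_box(self):
        if 'bbox' not in self._cache:
            # placeholder for the real (expensive) computation
            self._cache['bbox'] = (min(min(r) for r in self._A),
                                   max(max(r) for r in self._A))
        return self._cache['bbox']

p = CachedPolytope([[1.0, 0.0], [0.0, 1.0]], [1.0, 1.0])
_ = p.bounding_box          # computed and memoized
p.A = [[0.0, 1.0]]          # reassigning geometry clears the cache
```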

setuptools 18.5 says invalid version

~/.virtualenvs/dev/lib/python2.7/site-packages/setuptools/dist.py:294: UserWarning: The version specified ('0.1.2-dev-93d397e7686c65df543f1cd40556ad438a6f8e3b') is an invalid version, this may not work as expected with newer versions of setuptools, pip, and PyPI. Please see PEP 440 for more details.
  "details." % self.metadata.version

Rigid Rotation / Reorientation

I want to add a method to polytope which reorients the polytope.

I believe (with no proof) that it will be computationally less expensive to reorient an existing polytope and bounding sphere than to define a new polytope and calculate the bounding sphere from scratch. I also believe (because profiling shows a lot of time spent on cheby_ball and reduce) that the cost of computing bounding boxes and half-space representations is large compared to the intersection computation. My application involves computing intersections of the same set of polytopes at varying orientations.

Is this something y'all are interested in? Should the implementation be more general and accept non-rigid transformations? Should it automatically calculate the minimum number of rotations needed to reorient the polytope given an orientation matrix (after 3D, it is not guaranteed that there is a one step rotation between given orientations)?
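On the rigid-rotation case: for an orthogonal R, the image {R x : A x <= b} has H-representation {y : (A R^T) y <= b}, since x = R^T y. So reorienting an existing H-polytope is a single matrix product, which supports the intuition that it is cheaper than rebuilding from scratch. A sketch in plain numpy (not polytope API):

```python
import numpy as np

# For a rigid rotation y = R x with R orthogonal, the rotated polytope
# {R x : A x <= b} has H-representation {y : (A R.T) y <= b}, so only
# A changes (one matrix product); b is unchanged.
def rotate_halfspaces(A, R):
    return A.dot(R.T)

# 90-degree rotation in the plane
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# unit square 0 <= x1, x2 <= 1
A = np.array([[1., 0.], [0., 1.], [-1., 0.], [0., -1.]])
b = np.array([1., 1., 0., 0.])
A_rot = rotate_halfspaces(A, R)
```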

Updated release on PyPI?

Hi everyone,

Are there any plans for a new release soon?

The latest version of the codebase has fixes for bugs present in 0.2.3, and it would be nice to have them on pip due to automatic dependency handling. I am building a package that uses Polytope's algorithms, and it would be nice to have the most up-to-date code available on pip.

Thanks for making this nice package available to the community!

Zero Volume for 14D polytope

Dear polytope devs,

I hope this is the right place for this. I have created an example similar to the one in the tutorial for a convex polytope in H-representation, however it gives me a volume of 0.0 and also, there is no warning in case the dimensions do not match. I think I am missing something here, as the same problem gives a nonzero volume with the Volesti package in R.

My Python version is 3.7.6.

import numpy as np
import polytope as pt

A = np.array([[0.01281, -0.00435, -0.00303,  0.00307, -0.00161, -0.00153,
         0.00728, -0.00249,  0.02289,  0.00303,  0.00186, -0.01623,
        -0.00289, -0.00246],
       [ 0.01246, -0.00457, -0.0033 ,  0.00266, -0.00236, -0.00184,
         0.0068 , -0.00278,  0.0233 ,  0.00258,  0.00129,  0.11378,
        -0.00316, -0.00274],
       [ 0.03503, -0.0095 , -0.013  , -0.01412,  0.15911, -0.01669,
        -0.00333, -0.01422,  0.06059, -0.02439, -0.07807,  0.1101 ,
        -0.01338, -0.01452],
       [-0.03503,  0.0095 ,  0.013  ,  0.01412, -0.15911,  0.01669,
         0.00333,  0.01422, -0.06059,  0.02439,  0.07807, -0.1101 ,
         0.01338,  0.01452],
       [-0.05537,  0.01528,  0.01336, -0.00266,  0.19761,  0.01101,
        -0.0184 ,  0.01247, -0.09187,  0.00186,  0.02847, -0.12141,
         0.01314,  0.01252]])

b = np.array([62.2,   62.2, 1416. ,  978. , 1059.])

p = pt.Polytope(A, b)       
print(p.volume)

increase the default max_iter in projection_iterhull

The polytope projection function does not expose the max_iter parameter of projection_iterhull. While playing with some problems in tulip with open_loop discretization, I found that projection_iterhull is used, because it is efficient for the polytope and projection dimensions considered in such cases; however, the random initialization fails because max_iter is small. I often get the error message "Exception: iterative_hull: could not find starting simplex". I think setting max_iter to a larger value (50000?) would be useful, and it should not affect the performance of already working examples. I just wanted to check whether anyone has concerns before implementing this.

numpy version if statement breaks for numpy 1.10

In numpy v. 1.6, unique1d was renamed to unique, which is taken care of in quickhull.py using an if statement. However, for numpy v. 1.10, the code on lines 49-50 gives numpy_ver = 1.1, so numpy v. 1.10 is mistaken for v. 1.1.

Suggested fix:

  1. Replace lines 49-50 with numpy_ver = [int(s) for s in np.version.version.split('.')], and change if statements on lines 217 and 384 to lexicographic comparison: if numpy_ver > [1,5,9]:
  2. Remove if statements and require numpy v. 1.6 or above
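Suggested fix 1 can be sketched as follows (parse_version is a hypothetical helper name; the point is that float('1.10') == 1.1 is what breaks the old check, while lists of ints compare lexicographically):

```python
import numpy as np

# Parse the first two version components into ints and compare
# lexicographically, instead of comparing floats.
def parse_version(version_string):
    return [int(s) for s in version_string.split('.')[:2]]

numpy_ver = parse_version(np.version.version)
assert parse_version('1.10.0') > [1, 5]   # correctly newer than 1.6
assert parse_version('1.5.1') < [1, 6]
```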

Performance Benchmarking

I've made a template to help with performance benchmarks on this branch carterbox/polytope@ec5fdd01c4e7046ec3e47436b0ec37cfbd1e36af. I chose timeit instead of cProfile because the output is easier to manage and compare. Though, if you wanted, you could easily switch out one for the other.

linear programming if `cvxopt` present but `cvxopt.glpk` absent

polytope.polytope.lpsolve tries to use cvxopt.glpk. If importing fails, then scipy is used.

install_requires includes scipy, so this should always work.
Nonetheless, cvxopt can be present while cvxopt.glpk is absent.

Should we try to:

  1. use cvxopt.glpk,
  2. if that fails, then try to use the Python LP solver from cvxopt (this is the proposed change),
  3. if that fails, then use the Python LP solver from scipy?

In other words, does anyone know how the Python LP solver of cvxopt compares to that of scipy?
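The proposed fallback chain could be sketched as below. The module names are real (cvxopt.glpk, cvxopt.solvers, scipy); the surrounding structure is illustrative, not polytope's current code.

```python
# Sketch of the proposed fallback chain for choosing an LP backend.
try:
    import cvxopt.glpk            # 1. prefer GLPK via cvxopt
    lp_solver = 'glpk'
except ImportError:
    try:
        import cvxopt.solvers     # 2. cvxopt's own Python solver
        lp_solver = 'cvxopt'
    except ImportError:
        lp_solver = 'scipy'       # 3. scipy is in install_requires
```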

Choosing between scipy and cvxopt solvers

When running the version of polytope in branch scipysolver (as of 524f4c5) with both scipy and cvxopt installed, it may be beneficial to have the ability to easily choose which solver is to be used. Below is a possible implementation of this to go in polytope.py:

def set_solver(solver):
    if solver == 'scipy':
        global optimize
        global lp_solver
        from scipy import optimize
        lp_solver = 'scipy'
    if solver == 'glpk':
        try:
            global matrix
            global solvers
            global esp
            global lp_solver
            from cvxopt import matrix, solvers
            import cvxopt.glpk
            from .esp import esp
            lp_solver = 'glpk'
            # Hide optimizer output
            solvers.options['show_progress'] = False
            solvers.options['LPX_K_MSGLEV'] = 0
            solvers.options['msg_lev'] = 'GLP_MSG_OFF'
        except ImportError:
            raise Exception("unable to import cvxopt.glpk")

Changing a global variable (lp_solver) may not be the best solution to switch between solvers but, given the current implementation, it is a relatively simple one. Are there any other thoughts on a safer or better way to implement this?
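One global-free alternative (a sketch with hypothetical names) keeps the choice in a single module-level dict that call sites consult at solve time, so no module globals are rebound:

```python
# Sketch of a global-free alternative (names hypothetical): the solver
# choice lives in one module-level dict, and call sites look it up at
# solve time instead of rebinding module globals.
_config = {'lp_solver': 'scipy'}

def set_solver(name):
    if name not in ('scipy', 'glpk'):
        raise ValueError('unknown solver: %s' % name)
    _config['lp_solver'] = name

def current_solver():
    return _config['lp_solver']

set_solver('glpk')
```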

collect bibliography from source files

In tulip, full bibliographic references are placed in the file doc/bib.txt from which the user guide bibliography is generated automatically as described in the tulip developer's guide. References from within individual source files mention keys from the centralized bibliography doc/bib.txt.

Over time there have been private discussions about this approach. I prefer to place plain-text full references in each source file (example). The purpose is to reduce the coupling between different modules/subpackages/developers introduced by "importing" a shared bibliography.

Placing the reference entries in the source code file makes the code file independent of the bibliography file. Also, the source code file is readable in plain text, without running any documentation generation tool. In that sense, it is self-contained, and independent of which package the module is part of. Relocating a module from one package to another then doesn't require remembering to also move the references (something I could forget).

I have debated the potential duplication with myself and decided that in this particular case I don't mind it. A posteriori I observe that no duplication ever arose anywhere (or so I think). There is a reason: we should not implement twice the same algorithm, but instead import and reuse it, whenever possible. Slightly more elaborated, algorithms that are sufficiently complex or specialized to justify citing a reference are usually implemented at one place and carefully (well, alternatives in Python, C, etc. may introduce duplication, but the references may appear only in one of the implementations--the most readable).

The question that arises naturally from these remarks is how to obtain a centralized bibliography from a decentralized one. I would like to propose automatically collecting references from individual source files. The format of entries in each file need not be strict. The most robust way is probably to mention the DOI or a URL wherever applicable, and raise warnings during bibliography collection if a bibliographic entry cannot be confirmed from online archives. Checking entries introduces the requirement for access to the internet, but it can be optional. Also, warnings can be generated when similar entries are detected (title match or lexical distance).

From a search, the package duecredit appears to collect references from source files. I haven't evaluated it yet.
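The DOI-based collection step suggested above could be prototyped with a regular expression. A rough sketch (real DOIs need more careful handling of trailing punctuation and edge cases):

```python
import re

# Rough sketch of DOI collection: scan file contents for DOI-shaped
# strings and merge them into one sorted, de-duplicated list.
DOI_RE = re.compile(r'10\.\d{4,9}/[-._;()/:\w]+')

def collect_dois(text):
    return sorted({m.rstrip('.') for m in DOI_RE.findall(text)})

sample = 'toolbox," 2016 IEEE CCA, doi: 10.1109/CCA.2016.7587949.'
dois = collect_dois(sample)
```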

The numpy boolean negative, the `-` operator, is not supported, use the `~` operator or the logical_not function instead

With numpy == 1.13.0 the tests break, though the same build used to pass (this numpy version was recently released). The error message below suggests as a possible replacement using the function logical_not.

ERROR: polytope_test.operations_test.polytope_intersect_test
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/travis/miniconda/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/home/travis/build/tulip-control/polytope/tests/polytope_test.py", line 197, in polytope_intersect_test
    p5 = p2.intersect(p4)
  File "/home/travis/miniconda/lib/python2.7/site-packages/polytope/polytope.py", line 275, in intersect
    return reduce(Polytope(iA, ib), abs_tol=abs_tol)
  File "/home/travis/miniconda/lib/python2.7/site-packages/polytope/polytope.py", line 1067, in reduce
    (np.array([b_arr]).T-np.dot(A_arr, lb)) < -1e-4)
TypeError: The numpy boolean negative, the `-` operator, is not supported, use the `~` operator or the logical_not function instead.
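The fix is mechanical; a minimal reproduction of the change:

```python
import numpy as np

# numpy >= 1.13 rejects unary `-` on boolean arrays; element-wise
# negation must use `~` (or np.logical_not) instead.
mask = np.array([True, False, True])
flipped = ~mask               # replaces the former `-mask`
assert list(flipped) == [False, True, False]
```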

Plotting polytopes on same plot/ adding "_get_patch" as an import option

Hi,

I've been using polytope for a while now after using MPT for quite some time in MATLAB. Congrats on the nice work.

I've been trying to plot several polytopes in the same figure (some of them within a Region, others resulting from other polytopic operations) and have found the handy _get_patch inside polytope.py:

def _get_patch(poly1, **kwargs):

But this import is not available unless __init__.py is changed as follows:

from .polytope import (
    Polytope, Region,
    is_empty, is_fulldim, is_convex, is_adjacent, is_subset,
    reduce, separate, box2poly, grid_region,
    cheby_ball, bounding_box, envelope, extreme, qhull,
    is_inside, union, mldivide, intersect, volume, projection, _get_patch)

With this subtle change I am able to use _get_patch to reconstruct the plot in a new figure, with all the polytopes I want to show in a single one.

In the case above I am plotting a collection of convex polytopes that were inside a Region, plus its respective bounding box, just for illustration purposes.

Am I overcomplicating things or should _get_patch be available for importing so it can be freely used?

Thanks.

silence cvxopt output

As of 441919b, cvxopt == 1.1.7 does not show glpk == 4.60 output. cvxopt == 1.1.8 does show this output.
Note that requirements.txt has pinned cvxopt == 1.1.7.
Silencing for cvxopt == 1.1.7 was introduced here by #11.

This is a cvxopt problem, and has been reported here: cvxopt/cvxopt#63

polytope volume changing on each trial

Dear Developers & users,
recently I faced an issue with polytope. I use a Jupyter notebook for all calculations; here is an example:

import numpy as np
import polytope as pc

a = np.array([[ 0.57735, -0.57735, -0.57735],
              [-0.57735, -0.57735,  0.57735],
              [ 0.57735,  0.57735, -0.57735],
              [ 0.     ,  0.     ,  1.     ],
              [-0.57735,  0.57735,  0.57735],
              [-0.     , -0.     , -1.     ]])
b = np.array([-0.0241, -0.19315, 0.20914, 0.16667, 0.03683, -0.125])

s = pc.Polytope(a, b)

print(" Volume is : ", s.volume)

executing the cell prints volume. "Volume is : 1.3701285938484307e-05"

If I execute the cell again, I get a different volume value. The second execution prints "Volume is : 1.2551849198678577e-05",

and the third: " Volume is : 1.3241511242562017e-05 ".

Every time the value changes. What causes this issue in the polytope package?

Though the values are small, I do not expect different volumes for the same polytope defined with the same a & b.

Looking forward to your suggestions

thank you


Best regards,
Muthu
TU Freiberg, Freiberg, Germany

support Python 3.10

Add the Trove classifier and a CI test environment for Python 3.10. Make any changes to the package polytope that may be necessary in order to ensure compatibility with Python 3.10.

According to PEP 619, Python 3.10.0 is expected to be available on Monday, 2021-10-04. Changes in Python 3.10 are documented in "What’s New In Python 3.10".

As far as I know, any changes to polytope code needed to be compatible with Python 3.10 have already been made.

The Trove classifier and CI test environment (for the development version of Python 3.10) have already been added in branch py310. However, there were errors in the CI build due to numpy and scipy not being yet available in wheel form on PyPI for Python 3.10. Moreover, locally scipy sources from PyPI did not compile successfully for Python 3.10. So it seems that this issue will be revisited after Python 3.10 has been released, when numpy and scipy will be available for Python 3.10 (as well as matplotlib and cvxopt).

Until then, the branch py310 will keep being rebased on top of the mainline branch, so that merge conflicts are avoided.

improve polytope efficiency

Optimize the heavily used functions and methods in polytope.
Possibly by pushing some things to C using Cython (optional, i.e., installation shouldn't break in case compilation fails).
The profiler says:

ncalls  tottime  percall  cumtime  percall filename:lineno(function)
...
10492/7224    0.615    0.000    3.158    0.000 polytope.py:1041(bounding_box)
     4158    1.008    0.000   20.218    0.005 polytope.py:1123(envelope)
5126/2873    0.081    0.000   22.546    0.008 polytope.py:1179(mldivide)
    82571    8.334    0.000   10.122    0.000 polytope.py:120(__init__)
  795/790    0.002    0.000    1.952    0.002 polytope.py:1220(intersect)
2601/2402    0.162    0.000    0.958    0.000 polytope.py:1242(volume)
      452    0.001    0.000    0.007    0.000 polytope.py:1290(extreme)
      442    0.031    0.000    2.131    0.005 polytope.py:1396(projection)
        2    0.000    0.000    0.002    0.001 polytope.py:147(__str__)
        2    0.000    0.000    0.006    0.003 polytope.py:1496(separate)
  404/123    0.013    0.000    0.082    0.001 polytope.py:1533(is_adjacent)
      410    0.045    0.000    1.652    0.004 polytope.py:1633(projection_fm)
     4691    1.358    0.000   14.216    0.003 polytope.py:1855(region_diff)
   250785    0.039    0.000    0.039    0.000 polytope.py:191(__len__)
     6778    0.034    0.000    0.895    0.000 polytope.py:194(__copy__)
        6    0.000    0.000    0.001    0.000 polytope.py:2054(box2poly)
       10    0.001    0.000    0.013    0.001 polytope.py:2062(_get_patch)
       10    0.000    0.000    0.003    0.000 polytope.py:2124(_plot_text)
     2247    0.003    0.000    2.291    0.001 polytope.py:221(__eq__)
     2247    0.007    0.000    2.288    0.001 polytope.py:227(__le__)
      488    0.001    0.000    1.316    0.003 polytope.py:236(union)
     3775    0.006    0.000   12.104    0.003 polytope.py:247(diff)
     2586    0.035    0.000    2.435    0.001 polytope.py:256(intersect)
     6778    0.011    0.000    0.906    0.000 polytope.py:280(copy)
        6    0.000    0.000    0.001    0.000 polytope.py:285(from_box)
      442    0.002    0.000    2.133    0.005 polytope.py:335(project)
        1    0.000    0.000    0.000    0.000 polytope.py:343(scale)
     6360    0.006    0.000    0.009    0.000 polytope.py:352(dim)
     2306    0.005    0.000    0.806    0.000 polytope.py:361(volume)
        1    0.000    0.000    0.000    0.000 polytope.py:367(chebR)
     2224    0.003    0.000    0.148    0.000 polytope.py:377(cheby)
       10    0.000    0.000    0.018    0.002 polytope.py:390(plot)
        1    0.000    0.000    0.000    0.000 polytope.py:424(Region)
    11745    0.056    0.000    0.108    0.000 polytope.py:443(__init__)
     7591    0.006    0.000    0.008    0.000 polytope.py:474(__iter__)
       30    0.000    0.000    0.000    0.000 polytope.py:477(__getitem__)
    27866    0.013    0.000    0.017    0.000 polytope.py:492(__len__)
        5    0.000    0.000    0.032    0.006 polytope.py:516(__le__)
      386    0.007    0.000   33.698    0.087 polytope.py:538(union)
      412    0.001    0.000   12.580    0.031 polytope.py:558(diff)
1444/1363    0.016    0.000    4.661    0.003 polytope.py:580(intersect)
        1    0.002    0.002    0.254    0.254 polytope.py:59(<module>)
      196    0.001    0.000    0.003    0.000 polytope.py:600(__copy__)
      196    0.000    0.000    0.003    0.000 polytope.py:605(copy)
     1805    0.002    0.000    0.003    0.000 polytope.py:609(dim)
       96    0.000    0.000    0.157    0.002 polytope.py:615(volume)
       50    0.000    0.000    0.000    0.000 polytope.py:631(cheby)
       10    0.000    0.000    0.018    0.002 polytope.py:644(plot)
       10    0.000    0.000    0.003    0.000 polytope.py:671(text)
121057/111800    0.154    0.000    0.480    0.000 polytope.py:676(is_empty)
    32985    0.112    0.000    4.712    0.000 polytope.py:698(is_fulldim)
     2281    0.086    0.000   26.889    0.012 polytope.py:728(is_convex)
     2252    0.011    0.000    2.312    0.001 polytope.py:771(is_subset)
13086/13008    2.004    0.000   10.197    0.001 polytope.py:793(reduce)
4283/1311    0.070    0.000   40.449    0.031 polytope.py:897(union)
78070/77939    2.160    0.000   18.825    0.000 polytope.py:977(cheby_ball)
        1    0.000    0.000    0.000    0.000 polytope.py:98(Polytope)

Polytope error for 1D

In one-dimensional case studies, the abstraction algorithm of TuLiP triggers a compatibility error. It can be fixed easily in the polytope file:

if n == 1:
    x = [sol['x']]
else:
    x = sol['x']

Apply the fix at lines 1285 to 1289 and lines 1297 to 1300.
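The guard above can also be written as a small helper. The following is a hedged sketch: `normalize_solution` is a hypothetical name, and `sol` stands in for the LP solution mapping returned by the solver.

```python
def normalize_solution(sol, n):
    """Return the LP solution as a sequence in every dimension.

    In the 1-D case the solver's 'x' entry behaves like a scalar,
    so wrap it in a list; otherwise return it unchanged.
    """
    if n == 1:
        return [sol['x']]
    return sol['x']


# 1-D: scalar gets wrapped; higher dimensions: already a sequence.
assert normalize_solution({'x': 0.5}, 1) == [0.5]
assert normalize_solution({'x': [0.5, 1.0]}, 2) == [0.5, 1.0]
```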

make plotting independent of `tulip`

polytope.polytope and polytope.plot both depend on tulip.graphics for plotting (remnants from a time when polytope was a subpackage of tulip). For example:

from tulip.graphics import newax

polytope is a dependency of tulip. Therefore, it cannot itself depend on tulip, because this results in a cyclic dependency (hence the labeling as "bug").
The usage is very limited: only tulip.graphics.newax and tulip.graphics.dom2vec are used, both of which are small or eliminable.

The module tulip.graphics originates from my porting to Python part of plotting utilities that I had developed in Matlab. In Python, either some things are not needed at all, or there is a much better and simpler way to do them, or someone - like matplotlib and friends - has already written what is needed. Also, tulip.graphics derives from code that was handling seamless plotting in 2 or 3 dimensions, which in practice is not used in tulip or polytope (and I don't think it would be very practical to plot 3D partitions).
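A minimal sketch of removing the dependency, assuming only the axis-creation behavior of tulip.graphics.newax is needed (the real function may have a different signature, so treat this as illustrative):

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt


def newax():
    """Stand-in for tulip.graphics.newax: return a fresh axes and its figure."""
    fig, ax = plt.subplots()
    return ax, fig
```

With such a helper defined locally, the `from tulip.graphics import newax` line can be dropped and the cyclic dependency disappears.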

cvxopt 1.2.0 bug

I keep encountering an error when I run Tulip with the latest version of Polytope when using the 1.2.0 version of cvxopt. The error disappears when using cvxopt 1.1.8.
This is an example of the output that I receive:
rebwin@rebwin-Lenovo-Yoga-2-13:~/tulip-control-master/examples$ python continuous.py
GLPK Simplex Optimizer, v4.65
4 rows, 3 columns, 8 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     3: obj =  -5.000000000e-01 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
GLPK Simplex Optimizer, v4.65
8 rows, 3 columns, 16 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     3: obj =  -5.000000000e-01 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
GLPK Simplex Optimizer, v4.65
6 rows, 2 columns, 6 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     1: obj =  -1.000000000e+00 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
GLPK Simplex Optimizer, v4.65
6 rows, 2 columns, 6 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     1: obj =  -1.000000000e+00 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
GLPK Simplex Optimizer, v4.65
6 rows, 2 columns, 6 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     1: obj =  -1.100000000e+00 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
GLPK Simplex Optimizer, v4.65
6 rows, 2 columns, 6 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     1: obj =  -1.100000000e+00 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
GLPK Simplex Optimizer, v4.65
6 rows, 2 columns, 6 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     1: obj =  -1.000000000e-01 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
GLPK Simplex Optimizer, v4.65
6 rows, 2 columns, 6 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     1: obj =  -1.000000000e-01 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
GLPK Simplex Optimizer, v4.65
4 rows, 3 columns, 8 non-zeros
     0: obj =   0.000000000e+00 inf =   0.000e+00 (1)
     3: obj =  -5.000000000e-01 inf =   0.000e+00 (0)
OPTIMAL LP SOLUTION FOUND
Traceback (most recent call last):
  File "continuous.py", line 68, in <module>
    cont_partition = prop2part(cont_state_space, cont_props)
  File "/home/rebwin/anaconda3/envs/pythonpoly/lib/python3.6/site-packages/tulip/abstract/prop2partition.py", line 119, in prop2part
    dummy = region_now.diff(cur_prop_poly)
  File "/home/rebwin/anaconda3/envs/pythonpoly/lib/python3.6/site-packages/polytope/polytope.py", line 754, in diff
    return mldivide(self, other)
  File "/home/rebwin/anaconda3/envs/pythonpoly/lib/python3.6/site-packages/polytope/polytope.py", line 1365, in mldivide
    Pdiff = mldivide(Pdiff, poly1, save=save)
  File "/home/rebwin/anaconda3/envs/pythonpoly/lib/python3.6/site-packages/polytope/polytope.py", line 1380, in mldivide
    P = region_diff(a, b)
  File "/home/rebwin/anaconda3/envs/pythonpoly/lib/python3.6/site-packages/polytope/polytope.py", line 1977, in region_diff
    N = len(reg)
  File "/home/rebwin/anaconda3/envs/pythonpoly/lib/python3.6/site-packages/polytope/polytope.py", line 675, in __len__
    return len(self.list_poly)
AttributeError: 'Region' object has no attribute 'list_poly'

It seems like the attribute 'list_poly' is never initialized.
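A defensive fix would be to initialize the attribute in the constructor. The following is a stripped-down sketch; the real polytope.Region constructor takes more arguments and does more work:

```python
class Region:
    """Minimal sketch of a Region that always defines `list_poly`."""

    def __init__(self, list_poly=None):
        # Guarantee the attribute exists even for an 'empty' Region,
        # so len() never raises AttributeError.
        self.list_poly = list(list_poly) if list_poly is not None else []

    def __len__(self):
        return len(self.list_poly)


assert len(Region()) == 0          # no AttributeError for an empty Region
assert len(Region(['a', 'b'])) == 2
```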

remove support for Python 2.7, 3.5, 3.6

Python 2.7 development has completely stopped. The Python ecosystem has now moved to Python 3. Most major packages have ceased support for Python 2.7; for example, numpy did so about 2 years ago (numpy == 1.17.0). So it is time to remove support for Python 2.7 from polytope. It suffices to remove 2.7 from the supported Python versions, pass the parameter python_requires to the function setuptools.setup, and use features available in Python 3, and in newer versions of dependencies where needed.
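The setuptools side of this change can be sketched as follows; the metadata shown is illustrative, not the actual setup.py of polytope:

```python
from setuptools import setup

setup(
    name='polytope',
    # pip refuses installation on Python 2.7, 3.5, and 3.6:
    python_requires='>=3.7',
)
```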

This change is motivated by the benefits of using the function numpy.random.default_rng that is available in newer versions of numpy. There is no reason to constrain polytope's code to remain compatible with Python 2.7, and thus not use new functionality available in recent versions of dependencies.
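For example, the generator API that motivates the change:

```python
import numpy as np

# numpy.random.default_rng (available since numpy 1.17) replaces the
# legacy RandomState interface with an explicit Generator object.
rng = np.random.default_rng(seed=0)
sample = rng.random(3)  # three floats drawn uniformly from [0, 1)
assert sample.shape == (3,)
```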

combining many polytopes into single polytope

Dear Users and developers,
I am trying to combine two different polytopes. I assumed that, given two polytopes as input, the output would be a single polytope.
In fact, the polytopes I deal with are very close to each other, or sometimes they partially overlap.

import polytope as pc

# creating polytopes
p1 = pc.Polytope(p1A, p1b)
p2 = pc.Polytope(p2A, p2b)

# combining them
r = p1 + p2

I expected r to be a polytope, but it is a Region that merely contains the given polytopes.

Is there a faster method to combine the given polytopes into a single polytope?

Currently I use the following method:

import numpy as np

p1_extreme = pc.extreme(p1)
p2_extreme = pc.extreme(p2)

# stack the vertex rows of both polytopes
new_extreme = np.vstack([p1_extreme, p2_extreme])

new_polytope = pc.qhull(new_extreme)

However, I feel this is not a safe method, since it also includes the space where p1_extreme and p2_extreme overlap with each other. Also, when I try to plot new_polytope, I get an error message.

Any suggestions would be greatly appreciated

Thank you

Best regards,
Muthu Vallinayagam
Researcher, Inst. of Experimental physics
TU Freiberg, Germany

BUG: deprecation warning inside region_diff

Bug

The following deprecation warning shows when running the test script with python==3.5.2 and numpy==1.11.1.

VisibleDeprecationWarning: converting an array with ndim > 0 to an index will result in an error in the future
  range(beg_mi[j], beg_mi[j] + mi[j])

This warning occurs because mi[j] is used to define a range, but it is a list of size 1 instead of an integer, i.e., [10] instead of 10. The cause is that mi is defined as a 2-D column array, mi = np.zeros([N, 1], dtype=int), so 1-D indexing of mi returns a row instead of an element.

Fix

Define mi as a 1-D array of shape (N,) instead. Since mi is not involved in matrix multiplications, it shouldn't matter whether it is a column or row vector.
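The difference between the two shapes, as a small demonstration:

```python
import numpy as np

N = 3

# Buggy: a 2-D column array; indexing one axis yields a length-1 array,
# which triggers the deprecation warning when passed to range().
mi_bad = np.zeros([N, 1], dtype=int)
assert mi_bad[0].shape == (1,)

# Fixed: a 1-D array of shape (N,); elements are plain integers.
mi = np.zeros(N, dtype=int)
mi[0] = 10
assert list(range(0, 0 + mi[0])) == list(range(10))
```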

Polytope.reduce and removal of possibly overlapping polytopes

Hi everyone,

I have a quick question: does the reduce function, responsible for removing redundant inequalities from the H-representation, indirectly remove overlapping polytopes (if there are any)?

The docstring for this function gives some information, but I wanted to double-check. I am working on a package that uses polytope as a requirement, and knowing that reduce would do this for me seems faster than the workaround I currently have implemented.

def reduce(poly, nonEmptyBounded=1, abs_tol=ABS_TOL):
    """Remove redundant inequalities from the hyperplane representation.

    Uses the algorithm described at [1],
    by solving one LP for each facet.

    [1] https://www.cs.mcgill.ca/~fukuda/soft/polyfaq/node24.html

    Warning:
      - nonEmptyBounded == 0 case is not tested much.

    @type poly: L{Polytope} or L{Region}

    @return: Reduced L{Polytope} or L{Region} object
    """

Thanks!
