
geojson-pydantic's Introduction

geojson-pydantic

Pydantic models for GeoJSON.



Documentation: https://developmentseed.org/geojson-pydantic/

Source Code: https://github.com/developmentseed/geojson-pydantic


Description

geojson_pydantic provides a suite of Pydantic models matching the GeoJSON specification (RFC 7946). These models can be used for creating or validating GeoJSON data.

Install

$ python -m pip install -U pip
$ python -m pip install geojson-pydantic

Or install from source:

$ python -m pip install -U pip
$ python -m pip install git+https://github.com/developmentseed/geojson-pydantic.git

Install with conda from conda-forge:

$ conda install -c conda-forge geojson-pydantic

Contributing

See CONTRIBUTING.md.

Changes

See CHANGES.md.

Authors

Initial implementation by @geospatial-jeff; taken liberally from https://github.com/arturo-ai/stac-pydantic/

See contributors for a listing of individual contributors.

License

See LICENSE

geojson-pydantic's Issues

`geojson-pydantic` not publishing types

When I try to depend on this project and validate my own with mypy I get the following error:

error: Skipping analyzing 'geojson_pydantic.features': found module but no type hints or library stubs
note: See https://mypy.readthedocs.io/en/latest/running_mypy.html#missing-imports

This means type information is not published.

I have made this work on a similar project by adding an empty py.typed in the root package directory, like this PR does: https://github.com/hawkaa/pygeojson/pull/11/files

Should I open a PR?
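
For reference, a sketch of the usual PEP 561 fix, assuming the project keeps a setuptools/setup.cfg layout (paths illustrative): ship an empty py.typed marker inside the package and declare it as package data.

$ touch geojson_pydantic/py.typed

# setup.cfg (sketch)
[options.package_data]
geojson_pydantic = py.typed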

Rationale for not supporting empty geometries?

Hi there! Just wondering if it is a deliberate decision that empty geometries don't pass validation because these constrained lists:

# Coordinate arrays
MultiPointCoords = conlist(Position, min_items=1)
LineStringCoords = conlist(Position, min_items=2)
MultiLineStringCoords = conlist(LineStringCoords, min_items=1)
LinearRing = conlist(Position, min_items=4)
PolygonCoords = conlist(LinearRing, min_items=1)
MultiPolygonCoords = conlist(PolygonCoords, min_items=1)

e.g:

>>> from shapely.geometry import mapping, Polygon
>>> from geojson_pydantic import Polygon as PydanticPolygon
>>> PydanticPolygon.parse_obj(mapping(Polygon()))
pydantic.error_wrappers.ValidationError: 1 validation error for Polygon
coordinates
  ensure this value has at least 1 items (type=value_error.list.min_items; limit_value=1)

Empty geometries seem to be permitted per the geojson spec

Cannot import geojson_pydantic after 'pip install geojson-pydantic'

Python version: 3.8.5
Pip: pip 20.2.3
Virtualenv: virtualenv 20.0.31
Package version: geojson-pydantic==0.2.1
OS: Ubuntu 18.04.5 LTS
Pydantic: pydantic==1.6.1
Error message:

> import geojson_pydantic
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 618, in _load_backward_compatible
  File "<frozen zipimport>", line 259, in load_module
  File "/home/PATH/fresh-env/lib/python3.8/site-packages/geojson_pydantic-0.2.1-py3.8.egg/geojson_pydantic/__init__.py", line 5, in <module>
  File "/home/PATH/fresh-env/lib/python3.8/site-packages/pkg_resources/__init__.py", line 467, in get_distribution
    raise TypeError("Expected string, Requirement, or Distribution", dist)
TypeError: ('Expected string, Requirement, or Distribution', None)

__geo_interface__ specification and implementation

from #94 (comment)

Feature / FeatureCollection

With the change introduced in the PR above (☝️), we get a __geo_interface__ for an empty feature which looks like:

Feature(type="Feature", geometry=None, properties=None).__geo_interface__
>> {'type': 'Feature', 'geometry': None, 'properties': None}

The spec says:

properties (optional)
A mapping of feature properties (labels, populations ... you name it. Dependent on the data). Valid for "Feature" types only.

geometry (optional)
The geometric object of a "Feature" type, also as a mapping.

So to me it's a bit unclear whether we should return properties=None, properties={}, or no properties key when empty. Same for empty geometries: should we return geometry=None or no geometry key 🤷

ref: https://gist.github.com/sgillies/2217756#__geo_interface__

Geometry

For now we don't support empty coordinates for geometry (see #89), but the same question applies: what should the __geo_interface__ look like for empty geometries?

ref:

¿ add PyGEOS in geojson-pydantic ?

It might be a nice feature to have pygeos as an optional dependency, which could then enable additional methods on geometries (https://pygeos.readthedocs.io/en/stable/measurement.html).

from pydantic import Field

try:
    import pygeos
except ImportError:  # pragma: nocover
    pygeos = None  # type: ignore

# _GeometryBase and PolygonCoords below are the library's existing internals
class Polygon(_GeometryBase):
    """Polygon Model"""

    type: str = Field("Polygon", const=True)
    coordinates: PolygonCoords

    @property
    def as_pygeos_geometry(self):
        assert pygeos is not None, "pygeos must be installed to use this method"
        return pygeos.from_geojson(self.json())

    @property
    def area(self):
        assert pygeos is not None, "pygeos must be installed to use this method"
        return pygeos.area(self.as_pygeos_geometry)

Errors in bbox validation logic?

Thanks for writing and maintaining this useful library, and apologies if my issue/PR betray some fundamental misunderstanding(s).

The 0.6.0 update seems to have broken our CI pipeline. As best I can tell, this relates to a collision of expectations between the new bbox validation on features and the need for explicit Nones: we don't include bboxes for our features, and I think that absence is somehow getting turned into an explicit None, which then raises an error when the bbox validation takes its len.

It also appears to me that the bbox tests were possibly raising ValueErrors for multiple reasons: they were missing geometries and so raised an error even if the bboxes were correct. I've added an assertion to make sure that they succeed.

I'm filing a related PR that I believe fixes these issues.

Parsing method for `Geometry`

Hi,

Right now I am in situation where I'd like to parse a PostGIS geometry into the Geometry models from this library. However, I don't know upfront which geometry type I'm dealing with.

I just wondered if it could be this library's responsibility to provide a method for parsing a geometry and returning the contents as the correct model. What do you think?

If so, I've been back and forth on where to do this. I think I've settled on putting a new function in utils.py or parser.py called parse_geometry_obj (same naming convention as pydantic's parse_obj). The function would return a Geometry type, which is a union of all geometries.

Alternatively, we turn _GeometryBase into Geometry and override parse_obj, but I come from a functional background so inheritance like that feels odd and inflexible to me. In my world, we would get rid of the base class and add the geo interface to all the classes in the union.

Let me know what you think. I'd love to put up a PR. Should be really simple if we decide how to do it.
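
A minimal sketch of what such a helper could look like (assuming the individual geometry models are importable from geojson_pydantic.geometries; the dispatch-dict approach is illustrative, not the final design):

from typing import Union

from geojson_pydantic.geometries import (
    GeometryCollection,
    LineString,
    MultiLineString,
    MultiPoint,
    MultiPolygon,
    Point,
    Polygon,
)

Geometry = Union[
    Point, MultiPoint, LineString, MultiLineString, Polygon, MultiPolygon, GeometryCollection
]

_TYPE_TO_MODEL = {
    "Point": Point,
    "MultiPoint": MultiPoint,
    "LineString": LineString,
    "MultiLineString": MultiLineString,
    "Polygon": Polygon,
    "MultiPolygon": MultiPolygon,
    "GeometryCollection": GeometryCollection,
}


def parse_geometry_obj(obj: dict) -> Geometry:
    """Dispatch on the GeoJSON "type" member and validate with the matching model."""
    try:
        model = _TYPE_TO_MODEL[obj["type"]]
    except (KeyError, TypeError) as exc:
        raise ValueError(f"Unknown or missing geometry type in {obj!r}") from exc
    return model.parse_obj(obj)


point = parse_geometry_obj({"type": "Point", "coordinates": [1.0, 2.0]})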

lat lon validation feasible?

Hi all, great work on the models so far 🎉

I'm wondering if it would make sense to include validation for latitude & longitude for coordinates and bbox?

So the idea is to check whether coordinate input is in the correct order (lon, lat as per the GeoJSON spec), e.g. flagging lat input >90 or <-90.

Boundary checks would be:
-180.0 <= lon <= 180.0
-90.0 <= lat <= 90.0

Not sure how to do this... Is it even possible to validate tuple elements that don't have a key?
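
One possible approach, as a sketch rather than anything the library currently does: subclass Point and add a pydantic validator on coordinates (WGS84Point and check_lon_lat are made-up names):

from geojson_pydantic import Point
from pydantic import validator


class WGS84Point(Point):
    """Point whose first two coordinate values must be valid lon/lat."""

    @validator("coordinates")
    def check_lon_lat(cls, coordinates):
        lon, lat = coordinates[0], coordinates[1]
        if not -180.0 <= lon <= 180.0:
            raise ValueError(f"longitude {lon} is outside [-180, 180]")
        if not -90.0 <= lat <= 90.0:
            raise ValueError(f"latitude {lat} is outside [-90, 90]")
        return coordinates


WGS84Point(type="Point", coordinates=(10.19543, 52.55629))  # ok
# WGS84Point(type="Point", coordinates=(200.0, 52.55629))   # raises ValidationError

A validator on bbox could check its pairs of values the same way.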

allow a third element in `position`!

A position is an array of numbers. There MUST be two or more
elements. The first two elements are longitude and latitude, or
easting and northing, precisely in that order and using decimal
numbers. Altitude or elevation MAY be included as an optional third
element.

A Position could have up to 3 elements.

add more docs!

would be nice to have at least some examples in the readme!

Problem with "wkt" method.

When inheriting from the Point class, the "wkt" method does not work as I expect. It uses the class name to build the WKT. Is this right?
Example:

from geojson_pydantic import geometries as geom
class PointType(geom.Point):
    ... (my custom logic)

test = PointType(coordinates=[123, 123])
print(test.wkt)

>>> POINTTYPE (123.0 123.0)

I expect to get POINT (123.0 123.0), for example to use this in a database.

Removing the `geojson` package dependency

Hi,

Thank you for a great library!

It does not matter at all, but I was a little confused that geojson was included as a dependency when depending on geojson-pydantic. Inspecting the source code, it looks like it is used to validate the coordinates.

It shouldn't be too hard to turn the validation into idiomatic pydantic validations. AFAIK the functionality that needs to be replicated is in these three methods:

Again, not sure the benefit, but we'll at least get rid of the geojson dependency and might improve validation handling by turning it into pydantic validations.

Håkon

collection iterator

What do we think about turning FeatureCollection and GeometryCollection into iterators?

I think this:

for feat in fc:
    ...

is nicer than this:

for feat in fc.features:
    ...
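
A rough sketch of how the library's own FeatureCollection could add this (pydantic v1 style; List[Dict] stands in for the real Feature field type, and note that BaseModel already defines __iter__ for dict(model), so overriding it is a trade-off):

from typing import Dict, Iterator, List

from pydantic import BaseModel, Field


class FeatureCollection(BaseModel):
    """Sketch of the proposal; the real model would keep its existing Feature field type."""

    type: str = Field("FeatureCollection", const=True)
    features: List[Dict]  # placeholder for the library's Feature model

    def __iter__(self) -> Iterator:
        # overrides pydantic's BaseModel.__iter__, which is used by dict(model)
        return iter(self.features)

    def __len__(self) -> int:
        return len(self.features)


fc = FeatureCollection(type="FeatureCollection", features=[{"type": "Feature"}])
for feat in fc:
    ...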

Bbox and Position invalid oas3 spec

Heyo, I've been using geojson_pydantic for our FastAPI application, and also openapi-spec-validator to check whether the generated openapi.json is a valid 3.0.2 OAS3 spec.
The Bbox and Position.coordinates properties both generate schemas that seem to have invalid array definitions.

          "coordinates": {
            "title": "Coordinates",
            "type": "array",
            "items": {
              "anyOf": [
                {
                  "maxItems": 2,
                  "minItems": 2,
                  "type": "array",
                  "items": [
                    {
                      "type": "number"
                    },
                    {
                      "type": "number"
                    }
                  ]
                },
                {
                  "maxItems": 3,
                  "minItems": 3,
                  "type": "array",
                  "items": [
                    {
                      "type": "number"
                    },
                    {
                      "type": "number"
                    },
                    {
                      "type": "number"
                    }
                  ]
                }
              ]
            }
          },

I played around with it and it seems like the valid definition should be:

"coordinates": {
    "title": "Coordinates",
    "maxItems": 3,
    "minItems": 2,
    "type": "array",
    "items": {
    "type": "number"
    }
}

I achieved this by changing the Position and BBox definitions in types.py:
Position = conlist(float, min_items=2, max_items=3)
BBox = List[conlist(float, min_items=4, max_items=4) | conlist(float, min_items=6, max_items=6)]

When installing without cache in a clean environment, the install fails because the wheel is not successfully built.

The issue:
The command pip3 install --no-cache-dir geojson-pydantic~=0.3.4 does not install pydantic.

Steps to reproduce the issue:

  1. Create a clean new Python environment with python3 -m venv venv
  2. Execute: pip3 install --no-cache-dir geojson-pydantic~=0.3.4
  3. Results in the issue: ModuleNotFoundError: No module named 'pydantic'. Note: This error originates from a subprocess, and is likely not a problem with pip.

Expected cause:
setup.py does not have a version. setup.cfg has the version, but pip has no knowledge of it because it does not know which build tool to use to build the project; therefore pip does not know how to retrieve the version from setup.cfg. To tell pip how it must build the project, a pyproject.toml file is required, so that pip knows what the build requirements are and which version of setuptools (or any other tool) it must run against.

MR with a possible fix:
#60

question: why are you returning tuples instead of arrays?

In the GeoJSON spec, a Point's coordinates are represented as an array [0,0]; however, when using this lib, the parsed value is a tuple (0,0).
Is there a specific reason for that?

 {'address': {'coordinates': (-43.297337, -23.013538), 'type': 'Point'}} != {'address': {'coordinates': [-43.297337, -23.013538], 'type': 'Point'}}

Bbox behavior different from spec

Current Behavior

When we initialize any GeoJSON type without passing a bbox, it is set to null.

from geojson_pydantic import FeatureCollection
from pprint import pprint

fc = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": "",
            "geometry": {
                "type": "Polygon",
                "coordinates": [
                    [
                        [-37.33175659179676, -9.424103736877383],
                        [-37.31844711303711, -9.420578002929688],
                        [-37.26972198486328, -9.45348930358881],
                        [-37.3108024597168, -9.467284202575684],
                        [-37.33392715454096, -9.487695693969727],
                        [-37.37298583984375, -9.524782180786133],
                        [-37.4452285766601, -9.476631164550724],
                        [-37.44975662231445, -9.453315734863281],
                        [-37.33175659179676, -9.424103736877383],
                    ]
                ],
            },
        }
    ],
}
fc = FeatureCollection(**fc)
pprint(fc.dict())

Will print

{'bbox': None,
 'features': [{'bbox': None,
               'geometry': {'bbox': None,
                            'coordinates': [[(-37.33175659179676,
                                              -9.424103736877383),
                                             (-37.31844711303711,
                                              -9.420578002929688),
                                             (-37.26972198486328,
                                              -9.45348930358881),
                                             (-37.3108024597168,
                                              -9.467284202575684),
                                             (-37.33392715454096,
                                              -9.487695693969727),
                                             (-37.37298583984375,
                                              -9.524782180786133),
                                             (-37.4452285766601,
                                              -9.476631164550724),
                                             (-37.44975662231445,
                                              -9.453315734863281),
                                             (-37.33175659179676,
                                              -9.424103736877383)]],
                            'type': 'Polygon'},
               'id': None,
               'properties': {},
               'type': 'Feature'}],
 'type': 'FeatureCollection'}

Expected Behavior

From the standard, in the definition of a GeoJSON Object:

GeoJSON Object

A GeoJSON object represents a Geometry, Feature, or collection of Features.
A GeoJSON object is a JSON object.
A GeoJSON object has a member with the name "type". The value of the member MUST be one of the GeoJSON types.
A GeoJSON object MAY have a "bbox" member, the value of which MUST be a bounding box array (see Section 5).

Then, from the section on bounding boxes:

  5. Bounding Box

    A GeoJSON object MAY have a member named "bbox" to include
    information on the coordinate range for its Geometries, Features, or
    FeatureCollections. The value of the bbox member MUST be an array of
    length 2*n where n is the number of dimensions represented in the
    contained geometries, with all axes of the most southwesterly point
    followed by all axes of the more northeasterly point. The axes order
    of a bbox follows the axes order of geometries.

In other words, when passing no bbox, the parsed geometry should contain no bbox fields.

{
 'features': [{
               'geometry': {
                            'coordinates': [[(-37.33175659179676,
                                              -9.424103736877383),
                                             (-37.31844711303711,
                                              -9.420578002929688),
                                             (-37.26972198486328,
                                              -9.45348930358881),
                                             (-37.3108024597168,
                                              -9.467284202575684),
                                             (-37.33392715454096,
                                              -9.487695693969727),
                                             (-37.37298583984375,
                                              -9.524782180786133),
                                             (-37.4452285766601,
                                              -9.476631164550724),
                                             (-37.44975662231445,
                                              -9.453315734863281),
                                             (-37.33175659179676,
                                              -9.424103736877383)]],
                            'type': 'Polygon'},
               'id': None,
               'properties': {},
               'type': 'Feature'}],
 'type': 'FeatureCollection'
}
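
Until the models change, a partial workaround (hedged: it is not an exact match for the expected output above, since it also drops other None-valued fields such as a Feature's id) is pydantic's export options:

fc.dict(exclude_none=True)   # drops every field whose value is None, bbox included
fc.json(exclude_unset=True)  # drops fields that were never explicitly supplied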

FastAPI validation errors for openapi schema

Using geojson-pydantic as a FastAPI model attribute type causes a pydantic openapi validation error.

The error occurs for all coordinates/bbox fields. I am not sure, but it may be related to stricter checking of the optional numeric coordinate values in the OpenAPI schema: previously the docs would simply fail to resolve the items, but now the docs do not load at all.

from geojson_pydantic import Point
from pydantic import BaseModel

class TestPoint(BaseModel):
    point: Point

This causes the following exception when attempting to view the fastapi openapi schema docs:

ERROR: Exception in ASGI application
Traceback (most recent call last):
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 375, in run_asgi
result = await app(self.scope, self.receive, self.send)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 75, in call
return await self.app(scope, receive, send)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\fastapi\applications.py", line 208, in call
await super().call(scope, receive, send)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\applications.py", line 112, in call
await self.middleware_stack(scope, receive, send)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\middleware\errors.py", line 181, in call
raise exc from None
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in call
await self.app(scope, receive, _send)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\middleware\cors.py", line 78, in call
await self.app(scope, receive, send)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\exceptions.py", line 82, in call
raise exc from None
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\exceptions.py", line 71, in call
await self.app(scope, receive, sender)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\routing.py", line 580, in call
await route.handle(scope, receive, send)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\routing.py", line 241, in handle
await self.app(scope, receive, send)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\starlette\routing.py", line 52, in app
response = await func(request)
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\fastapi\applications.py", line 161, in openapi
return JSONResponse(self.openapi())
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\fastapi\applications.py", line 136, in openapi
self.openapi_schema = get_openapi(
File "d:\projects\vrms\atlis_api\venv\lib\site-packages\fastapi\openapi\utils.py", line 410, in get_openapi
return jsonable_encoder(OpenAPI(**output), by_alias=True, exclude_none=True) # type: ignore
File "pydantic\main.py", line 406, in pydantic.main.BaseModel.init
pydantic.error_wrappers.ValidationError: 3 validation errors for OpenAPI
components -> schemas -> Point -> properties -> coordinates -> anyOf -> 0 -> items
value is not a valid dict (type=type_error.dict)
components -> schemas -> Point -> properties -> coordinates -> anyOf -> 1 -> items
value is not a valid dict (type=type_error.dict)
components -> schemas -> Point -> $ref
field required (type=value_error.missing)

"Enforce required keys and enforce defaults" makes code needlessly cumbersome

Regarding the change issued here
https://github.com/developmentseed/geojson-pydantic/blob/main/CHANGELOG.md#060a0---2023-04-04

# Before
Feature(geometry=Point(coordinates=(0,0)))

# Now
Feature(
    type="Feature",
    geometry=Point(
        type="Point",
        coordinates=(0,0)
    ),
    properties=None,
)

This change is unfortunate for a few reasons:

  • it violates semantic versioning, because the fundamental call has changed in an incompatible way without a major version change https://semver.org/
  • it is incorrect to say that the old system did not follow the GeoJSON spec. The GeoJSON spec is about JSON, not about Python objects. As long as the @validator and .json() work correctly, GeoJSON is met.
  • it is unpythonic to repeat identical information: Feature(type="Feature", ...

We have been using geojson-pydantic for several projects, and have had to issue patches (or pin to <0.6.0) throughout.

[QUESTION] Field validation on a Point

Hi 👋

I am currently working on a little project and I want to use your package to store geolocation data. It is very helpful but I can't seem to figure out how to add restrictions to make sure that the entered coordinates don't exceed -180 to 180 for long and -90 to 90 for lat.

Currently I use 2 different variables and one populates the other, but is there a way to do these 2 checks directly within the Point object?

Code I use to make sure the coordinates are in range:

class GeoLocation(BaseModel):
    longitude: confloat(ge=-180.0, le=180.0) = Field(..., description="Longitude of the position", example=10.19543)
    latitude: confloat(ge=-90.0, le=90.0) = Field(..., description="Latitude of the position", example=52.55629)


class SomeModel(BaseModel):
    coordinates: GeoLocation = Field(..., description="GeoLocation of the charger")
    geolocation: Point = Field(..., description="GeoJSON representation of the charger's location")

I have to manually populate the geolocation field at some point before the database and I would like to avoid this. The user gets to fill the coordinates variable when making requests to the API.

Also, is there a way to ensure any kind of precision on the coordinates?

Thanks in advance for your help
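
One possible direction, sketched under the assumption that subclassing the model is acceptable (BoundedPoint, Lon and Lat are made-up names, and this variant drops the optional third altitude element of a Position):

from typing import Tuple

from geojson_pydantic import Point
from pydantic import confloat

Lon = confloat(ge=-180.0, le=180.0)
Lat = confloat(ge=-90.0, le=90.0)


class BoundedPoint(Point):
    """Point whose longitude/latitude must stay within WGS84 bounds."""

    coordinates: Tuple[Lon, Lat]  # note: no optional altitude element in this sketch


BoundedPoint(type="Point", coordinates=(10.19543, 52.55629))  # validates
# BoundedPoint(type="Point", coordinates=(200.0, 52.55629))   # raises ValidationError

For precision, a validator that rounds or rejects values with too many decimal places could be added the same way.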

Proposal for Geometry class hierarchy reorganization

There are a few issues I see with the existing class hierarchy:

  1. GeometryCollection is defined as a Geometry in GeoJSON, but there's no relationship between it and either _GeometryBase or the Geometry Union.
  2. GeometryCollection does not implement __geo_interface__

In some recent code working with these classes, I had to create a special case for GeometryCollection like:

if hasattr(arg, "__geo_interface__"):
    return getattr(arg, "__geo_interface__")
else if (isinstance(arg, GeometryCollection)):
    return arg.dict()

I would like to see these reorganized more like the following:

# type is the only attribute here, rather than coordinates
class _GeometryBase(BaseModel, abc.ABC):
    """Base class for Geometry models"""

    type: str  # will be defined by child classes

    @property
    def __geo_interface__(self):
        return self.dict()

# Geometry Collection extends _GeometryBase instead of BaseModel
class GeometryCollection(_GeometryBase):
    """GeometryCollection Model"""

    type: str = Field("GeometryCollection", const=True)
    geometries: List[Geometry]

# another class that only has coordinates
class _GeometryWithCoordinatesBase(_GeometryBase):
    """Base class for geometry models containing coordinates"""

    coordinates: Any  # will be constrained in child classes


class Point(_GeometryWithCoordinatesBase):
    """Point Model"""

    type: str = Field("Point", const=True)
    coordinates: Position

Geometry = Union[Point, MultiPoint, LineString, MultiLineString, Polygon, MultiPolygon, GeometryCollection]

another option instead of this Union is to define Geometry (and a new type GeometryWithCoordinates) like:

from typing import Protocol

class GeometryWithCoordinates(Protocol):
    coordinates: Position


class Geometry(Protocol):
    coordinates: Position
    type: str

Nested GeometryCollection?

I know the spec discourages them:

To maximize interoperability, implementations SHOULD avoid nested GeometryCollections. Furthermore, GeometryCollections composed of a single part or a number of parts of a single type SHOULD be avoided when that single part or a single object of multipart type (MultiPoint, MultiLineString, or MultiPolygon) could be used instead.

But I ran into an interesting case while working on pycql2. Trying to translate from WKT to GeoJSON gets weird because GEOMETRYCOLLECTION(POINT(0 0), GEOMETRYCOLLECTION(POINT(1 1), LINESTRING(0 0, 1 1))) is technically a valid WKT. And shapely will load it in, as well as output it as nested GeoJSON.

>>> from shapely import wkt, to_geojson
>>> print(to_geojson(wkt.loads("GEOMETRYCOLLECTION(POINT(0 0), GEOMETRYCOLLECTION(POINT(1 1), LINESTRING(0 0, 1 1)))")))
# Manually formatted for readability
{
  "type": "GeometryCollection",
  "geometries": [
    {"type": "Point", "coordinates": [0.0, 0.0]},
    {
      "type": "GeometryCollection",
      "geometries": [
        {"type": "Point", "coordinates": [1.0, 1.0]},
        {"type": "LineString", "coordinates": [[0.0, 0.0], [1.0, 1.0]]}
      ]
    }
  ]
}

So, despite the spec saying not to do it for interoperability reasons, I would actually say the ability to parse it is needed for interoperability.

What if we included it and threw a warning about it? Something like this:

from __future__ import annotations

import warnings
from typing import List, Literal, Union

from pydantic import BaseModel, validator

# Geometry and GeoInterfaceMixin are the library's existing union/mixin
class GeometryCollection(BaseModel, GeoInterfaceMixin):
    type: Literal["GeometryCollection"]
    geometries: List[Union[Geometry, GeometryCollection]]

    ...

    @validator("geometries")
    def check_geometries(cls, geometries: List) -> List:
        """Add warnings for conditions the spec does not explicitly forbid."""
        if len(geometries) == 1:
            warnings.warn("GeometryCollection should not be used for single geometries.")
        if any(geom.type == "GeometryCollection" for geom in geometries):
            warnings.warn(
                "GeometryCollection should not be used for nested GeometryCollections."
            )
        if len(set(geom.type for geom in geometries)) == 1:
            warnings.warn(
                "GeometryCollection should not be used for homogeneous collections."
            )
        return geometries

It could also check something in the class Config and throw validation errors instead of warnings, maybe a Config.strict_geometry_collection. Then you would be able to subclass it, set the config, and use it however preferred.

"duplicate validator function" error while importing Feature class form geojson_pydantic

I've created a dataclass Abc :

from geojson_pydantic import Feature
from dataclasses import dataclass

@dataclass
class Abc:
    geojson: Feature
    myid: uuid.UUID

This model is present in the commons.py file inside the dtos folder.

I'm trying to import Abc from main.py and one other file.

Directory Structure:

src -->
    --> dtos
           - commons.py -> Abc
    --> anothermodule
           - create.py -> from dtos.commons import Abc
    main.py -> import dtos.commons.Abc

I get this error when I run the main.py file

File "<frozen importlib._bootstrap>", line 1109, in __import__
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/usr/local/anaconda3/envs/apsbackend/lib/python3.9/site-packages/geojson_pydantic/geometries.py", line 175, in <module>
    class Polygon(_GeometryBase):
  File "/usr/local/anaconda3/envs/apsbackend/lib/python3.9/site-packages/geojson_pydantic/geometries.py", line 186, in Polygon
    def check_closure(cls, coordinates: List) -> List:
  File "pydantic/class_validators.py", line 93, in pydantic.class_validators.validator.dec
  File "pydantic/class_validators.py", line 156, in pydantic.class_validators._prepare_validator
pydantic.errors.ConfigError: duplicate validator function "geojson_pydantic.geometries.Polygon.check_closure"; if this is intended, set `allow_reuse=True`

Support for 2D Geometry validation.

Currently the library allows positions to be x,y or x,y,z with no way to force a Geometry to be 2D only.

I'd like to propose a change where the geometries come in 2 flavours, 2D and 3D, to allow for better schemas/API specs and validation of geometries that are only supposed to be 2D.

I had shimmed the library in my own projects to add in Point2D and Polygon2D, but I think this would be better suited in the base library itself.

Thoughts?
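
For reference, a minimal version of such a shim (a sketch built by subclassing the existing models; it does not re-apply the original conlist length constraints):

from typing import List, Tuple

from geojson_pydantic import Point, Polygon

Position2D = Tuple[float, float]
LinearRing2D = List[Position2D]


class Point2D(Point):
    """Point restricted to two-dimensional positions."""

    coordinates: Position2D


class Polygon2D(Polygon):
    """Polygon restricted to two-dimensional positions."""

    coordinates: List[LinearRing2D]


Point2D(type="Point", coordinates=(1.0, 2.0))         # ok
# Point2D(type="Point", coordinates=(1.0, 2.0, 3.0))  # raises ValidationError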

Verbose validation error is difficult to understand

Hi, I was validating this Polygon that has a linear ring with different start/end coordinates:

{
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [
            [
                [-55.9947406591177, -9.26104045526505],
                [-55.9976752102375, -9.266589696568962],
                [-56.00200328975916, -9.264041751931352],
                [-55.99899921566248, -9.257935213034594],
                [-55.99477406591177, -9.26103945526505],
            ]
        ],
    },
}

And I received the following message:

18 validation errors for Feature
geometry -> type
  unexpected value; permitted: 'Point' (type=value_error.const; given=Polygon; permitted=['Point'])
geometry -> coordinates
  wrong tuple length 1, expected 2 (type=value_error.tuple.length; actual_length=1; expected_length=2)
geometry -> coordinates
  wrong tuple length 1, expected 3 (type=value_error.tuple.length; actual_length=1; expected_length=3)
geometry -> type
  unexpected value; permitted: 'MultiPoint' (type=value_error.const; given=Polygon; permitted=['MultiPoint'])
geometry -> coordinates -> 0
  wrong tuple length 5, expected 2 (type=value_error.tuple.length; actual_length=5; expected_length=2)
geometry -> coordinates -> 0
  wrong tuple length 5, expected 3 (type=value_error.tuple.length; actual_length=5; expected_length=3)
geometry -> type
  unexpected value; permitted: 'LineString' (type=value_error.const; given=Polygon; permitted=['LineString'])
geometry -> coordinates
  ensure this value has at least 2 items (type=value_error.list.min_items; limit_value=2)
geometry -> type
  unexpected value; permitted: 'MultiLineString' (type=value_error.const; given=Polygon; permitted=['MultiLineString'])
geometry -> coordinates
  All linear rings have the same start and end coordinates (type=value_error)
geometry -> type
  unexpected value; permitted: 'MultiPolygon' (type=value_error.const; given=Polygon; permitted=['MultiPolygon'])
geometry -> coordinates -> 0 -> 0
  ensure this value has at least 4 items (type=value_error.list.min_items; limit_value=4)
geometry -> coordinates -> 0 -> 1
  ensure this value has at least 4 items (type=value_error.list.min_items; limit_value=4)
geometry -> coordinates -> 0 -> 2
  ensure this value has at least 4 items (type=value_error.list.min_items; limit_value=4)
geometry -> coordinates -> 0 -> 3
  ensure this value has at least 4 items (type=value_error.list.min_items; limit_value=4)
geometry -> coordinates -> 0 -> 4
  ensure this value has at least 4 items (type=value_error.list.min_items; limit_value=4)
geometry -> type
  unexpected value; permitted: 'GeometryCollection' (type=value_error.const; given=Polygon; permitted=['GeometryCollection'])
geometry -> geometries
  field required (type=value_error.missing)

However this does not happen if Feature is omitted:

{
    "type": "Polygon",
    "coordinates": [
        [
            [-55.9947406591177, -9.26104045526505],
            [-55.9976752102375, -9.266589696568962],
            [-56.00200328975916, -9.264041751931352],
            [-55.99899921566248, -9.257935213034594],
            [-55.99477406591177, -9.26103945526505],
        ]
    ],
}

Result:

1 validation error for Polygon
coordinates
  All linear rings have the same start and end coordinates (type=value_error)

Can you look into that?
Thanks.

Version string mismatch

When installing geojson-pydantic==0.2.2 from PyPI, the version numbering is off:

>>> import geojson_pydantic
>>> geojson_pydantic.__version__
'0.2.1'

Create conda-forge recipe

Hey team,

Firstly thanks for this project. It's exactly what I need (especially the 0.3.0 release).

Just wanted to request permission from you to create a conda-forge package for this. I've already created a recipe for it here. I'm happy to maintain it moving forward, and an option is to also add another maintainer. Please let me know.

`Feature` validates arbitrary dictionaries

Hi @vincentsarago, I stumbled across some unexpected behaviour in these pydantic models and thought I'd ask for guidance here!

  1. Feature seems to validate arbitrary dictionaries:
>>> geojson_pydantic.Feature.parse_obj({"hi this is": "not a feature"})
Feature(type='Feature', geometry=None, properties=None, id=None, bbox=None)

This seems kind of counter-productive, is this intentional? At least judging from the geojson json-schema, id should not be optional!

  2. Feature validates the list ["vh", "vv"], but not the list ["not", "feature"]. This is causing problems in my application, because the list of bands ["vv", "vh"] is incorrectly parsed into a GeoJson, even though it's just a list.
>>> geojson_pydantic.Feature.parse_obj(["vh", "vv"])
Feature(type='Feature', geometry=None, properties=None, id=None, bbox=None)
>>> geojson_pydantic.Feature.parse_obj(["not", "feature"])

ValueError                                Traceback (most recent call last)
File .venv/lib/python3.9/site-packages/pydantic/main.py:522, in pydantic.main.BaseModel.parse_obj()
ValueError: dictionary update sequence element #0 has length 3; 2 is required

The above exception was the direct cause of the following exception:

ValidationError                           Traceback (most recent call last)
Cell In[4], line 1
----> 1 geojson_pydantic.Feature.parse_obj(["not", "feature"])
File .venv/lib/python3.9/site-packages/pydantic/main.py:525, in pydantic.main.BaseModel.parse_obj()
ValidationError: 1 validation error for Feature
root
  Feature expected dict not list (type=type_error)

This seems super strange too, do you have any ideas on why that specific list would validate and not the other? I have not been able to find another example.
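
A likely explanation (an educated guess from pydantic v1's behaviour, not something confirmed in this library): parse_obj falls back to dict(obj) when it receives a non-dict, and a list of two-character strings happens to be a valid input for dict():

>>> dict(["vh", "vv"])        # each 2-character string unpacks into a (key, value) pair
{'v': 'v'}
>>> dict(["not", "feature"])  # 3-character strings cannot be unpacked into pairs
Traceback (most recent call last):
  ...
ValueError: dictionary update sequence element #0 has length 3; 2 is required

The resulting {'v': 'v'} dict would then validate for the same reason as point 1 above: every Feature field has a default and unknown keys are ignored.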


I've run all these in a fresh virtual environment (Python 3.9.5) with only geojson_pydantic installed. Versions are the following:

print(geojson_pydantic.__version__)
print(pydantic.__version__)
>>> 0.5.0 
>>> 1.10.2

Thanks!

geojson-pydantic types expect dicts and not json

Pydantic's JSON validator expects a JSON-formatted string.

I understand why it's a good thing to validate a Python dict object against a valid GeoJSON structure, but what if I already have a GeoJSON string and I want to validate this string with the geojson-pydantic types?

Sure, I can json.loads() it first, but I think it should be automated by the pydantic type.
Also, validating strings would be consistent with the pydantic.Json validator.
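
For what it's worth, pydantic v1 models already accept a JSON string via parse_raw, which may cover this case without changing the types:

from geojson_pydantic import Point

geojson_string = '{"type": "Point", "coordinates": [1.0, 2.0]}'
point = Point.parse_raw(geojson_string)  # parses the JSON string and validates it in one step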

Support GeometryCollection as an input for geometry field

According to the spec, a GeoJSON Feature should be able to have a GeometryCollection as its geometry.

I propose changing this line:

Geom = TypeVar("Geom", bound=Optional[Geometry])

from

Geom = TypeVar("Geom", bound=Optional[Geometry])

to

Geom = TypeVar("Geom", bound=Optional[Union[Geometry, GeometryCollection]])

Is there a reason we don't currently support this change?

I've written some tests locally that work with this change. I'm happy to submit a PR

Module "geojson_pydantic" does not explicitly export attribute "Polygon"; implicit reexport disabled

When using this package in the following way:

from geojson_pydantic import Feature, Polygon
from pydantic import BaseModel

class MyModel(BaseModel):
    geojson: Feature[Polygon, dict]

and running mypy I seem to get the following errors:

schemas.py:5: error: Module "geojson_pydantic" does not explicitly export attribute "Feature"; implicit reexport disabled
    from geojson_pydantic import Feature, Polygon
    ^
schemas.py:5: error: Module "geojson_pydantic" does not explicitly export attribute "Polygon"; implicit reexport disabled
    from geojson_pydantic import Feature, Polygon
    ^
Found 2 errors in 1 file (checked 8 source files)

Am I using this in the right way? The code behaves as expected, but the type checking doesn't like me using the types from the main package. Should I explicitly import them from their source location?
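
As a workaround on the consumer side (a sketch; the longer-term fix would be explicit re-exports in the package's __init__.py), importing from the defining modules satisfies mypy's no-implicit-reexport mode:

from geojson_pydantic.features import Feature
from geojson_pydantic.geometries import Polygon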

add WKT geometry parser

In #64 we added a .wkt property to return a geometry as WKT. I know it's not the main purpose of this library, but I think it would be nice to have the ability to go the other way around!

TypeError: Object of type FeatureCollection is not JSON serializable

Hi

First of all, thank you for building this fantastic library.
I tried to use it in one of my projects.
I use FastAPI with pydantic validation.

My pydantic model looks like this

from typing import Optional
from geojson_pydantic.features import FeatureCollection
from pydantic import BaseModel

class CustomLayer(BaseModel):
    is_public: Optional[bool] = False
    layer: FeatureCollection

And my route looks like this

@router.post("/", response_model=FeatureCollection, status_code=status.HTTP_201_CREATED)
async def create_item(payload: CustomLayer, access_token: Optional[str] = Header(None)):
    if not access_token:
        raise_401_exception()
    user = await check_user_credentials(access_token)
    if not user:
        raise_401_exception()
    layer_id = await customlayers_repository.create(payload, user)
    layer = await customlayers_repository.get_one(layer_id)
    if not layer:
        raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
                            detail=f"Problem occured during item creation")
    return layer.geojson

But when I run it, I get some trouble with the JSON serialization of the FeatureCollection model.

Here is the traceback:

 Traceback (most recent call last):
   File "/usr/local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py", line 396, in run_asgi
     result = await app(self.scope, self.receive, self.send)
   File "/usr/local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__
     return await self.app(scope, receive, send)
   File "/usr/local/lib/python3.8/site-packages/fastapi/applications.py", line 199, in __call__
     await super().__call__(scope, receive, send)
   File "/usr/local/lib/python3.8/site-packages/starlette/applications.py", line 111, in __call__
     await self.middleware_stack(scope, receive, send)
   File "/usr/local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 181, in __call__
     raise exc from None
   File "/usr/local/lib/python3.8/site-packages/starlette/middleware/errors.py", line 159, in __call__
     await self.app(scope, receive, _send)
   File "/usr/local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 86, in __call__
     await self.simple_response(scope, receive, send, request_headers=headers)
   File "/usr/local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 142, in simple_response
     await self.app(scope, receive, send)
   File "/usr/local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 86, in __call__
     await self.simple_response(scope, receive, send, request_headers=headers)
   File "/usr/local/lib/python3.8/site-packages/starlette/middleware/cors.py", line 142, in simple_response
     await self.app(scope, receive, send)
   File "/usr/local/lib/python3.8/site-packages/starlette/exceptions.py", line 82, in __call__
     raise exc from None
   File "/usr/local/lib/python3.8/site-packages/starlette/exceptions.py", line 71, in __call__
     await self.app(scope, receive, sender)
   File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 566, in __call__
     await route.handle(scope, receive, send)
   File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 227, in handle
     await self.app(scope, receive, send)
   File "/usr/local/lib/python3.8/site-packages/starlette/routing.py", line 41, in app
     response = await func(request)
   File "/usr/local/lib/python3.8/site-packages/fastapi/routing.py", line 201, in app
     raw_response = await run_endpoint_function(
   File "/usr/local/lib/python3.8/site-packages/fastapi/routing.py", line 148, in run_endpoint_function
     return await dependant.call(**values)
   File "./app/api/customlayers.py", line 21, in create_item
     layer_id = await customlayers_repository.create(payload, user)
   File "./app/db/customlayers.py", line 9, in create
     query = CustomLayersTable.insert().values(geojson=json.dumps(payload.layer), user_id=user["user_id"], is_public=payload.is_public)
   File "/usr/local/lib/python3.8/json/__init__.py", line 231, in dumps
     return _default_encoder.encode(obj)
   File "/usr/local/lib/python3.8/json/encoder.py", line 199, in encode
     chunks = self.iterencode(o, _one_shot=True)
   File "/usr/local/lib/python3.8/json/encoder.py", line 257, in iterencode
     return _iterencode(o, 0)
   File "/usr/local/lib/python3.8/json/encoder.py", line 179, in default
     raise TypeError(f'Object of type {o.__class__.__name__} '
 TypeError: Object of type FeatureCollection is not JSON serializable

Does anybody have an idea how I can fix this?
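
The failing line is json.dumps(payload.layer): the standard-library JSON encoder does not know how to serialize pydantic models. A hedged suggestion (sketch only, keeping the surrounding code as posted) is to let pydantic or FastAPI do the serialization:

# either serialize via the model itself (pydantic v1: model -> JSON string) ...
query = CustomLayersTable.insert().values(
    geojson=payload.layer.json(),
    user_id=user["user_id"],
    is_public=payload.is_public,
)

# ... or convert to plain Python types first
from fastapi.encoders import jsonable_encoder

geojson_dict = jsonable_encoder(payload.layer)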

Polygon from_bounds method

The Polygon class has a from_bounds method for building a polygon from bounds. This is cool functionality! But often in Python projects the shapely library is used. It also has a Polygon class with a from_bounds method. The construction takes place in a similar way, but the polygon is built in the other direction. Is it possible to make the construction of the polygon match shapely's?

add top level exports

# now
from geojson_pydantic.features import Feature, FeatureCollection

# future 
from geojson_pydantic import Feature, FeatureCollection

maybe also for geometries
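
A sketch of what the package __init__.py could look like to support this (module layout inferred from the existing import paths; __all__ keeps the re-exports explicit for type checkers):

"""geojson_pydantic top-level exports (sketch)."""
from geojson_pydantic.features import Feature, FeatureCollection
from geojson_pydantic.geometries import (
    GeometryCollection,
    LineString,
    MultiLineString,
    MultiPoint,
    MultiPolygon,
    Point,
    Polygon,
)

__all__ = [
    "Feature",
    "FeatureCollection",
    "GeometryCollection",
    "LineString",
    "MultiLineString",
    "MultiPoint",
    "MultiPolygon",
    "Point",
    "Polygon",
]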

Mixed Dimensionality and WKT

Mixed dimensionality is a weird edge case that needs to be thought about.

We test WKT by comparing to shapely's output. In the case that ANY coordinate in a Geometry has a third dimension, all of the coordinates end up with the third dimension, which I think is correct for WKT.

shapely.MultiPoint([(0,0),(1,1,1)]).wkt
"MULTIPOINT Z (0 0 0, 1 1 1)"

geojson_pydantic.MultiPoint(type="MultiPoint", coordinates=[(0,0),(1,1,1)]).wkt
# Either this currently
"MULTIPOINT (0.0 0.0, 1.0 1.0 1.0)"
# Or if using a has_z function
"MULTIPOINT Z (0.0 0.0, 1.0 1.0 1.0)"
# Neither of which I think would be correct

This could be fixed by adding a boolean that defaults to False to all the *_wkt_coordinates helper functions, and adding some logic to the position helper.

def _position_wkt_coordinates(position: Position, force_z: bool = False) -> str:
    coordinates = " ".join(str(number) for number in position)
    if force_z and len(position) < 3:
        coordinates += " 0.0"
    return coordinates 

This would not affect the actual data, just output WKT better.
