
cube-in-a-box's People

Contributors

16eagle, alexgleith, bluetyson, dominicschaff, edeo2021, felipk101, luigidifraia, n3h3m, pindge, richardscottoz, riyasdeen, stratosgear, whatnick, woodcockr


cube-in-a-box's Issues

Update requirements.txt

The numpy version needs to be upgraded to numpy==1.24, otherwise make setup fails because of the following error (on Python 3.8):

docker-compose exec -T jupyter datacube -v system init
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/dask/array/__init__.py", line 2, in <module>
    from dask.array import backends, fft, lib, linalg, ma, overlap, random
  File "/usr/local/lib/python3.8/dist-packages/dask/array/backends.py", line 6, in <module>
    from dask.array.core import Array
  File "/usr/local/lib/python3.8/dist-packages/dask/array/core.py", line 30, in <module>
    from numpy.typing import ArrayLike
ModuleNotFoundError: No module named 'numpy.typing'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/bin/datacube", line 5, in <module>
    from datacube.scripts.cli_app import cli
  File "/usr/local/lib/python3.8/dist-packages/datacube/__init__.py", line 29, in <module>
    from .api import Datacube
  File "/usr/local/lib/python3.8/dist-packages/datacube/api/__init__.py", line 9, in <module>
    from .core import Datacube, TerminateCurrentLoad
  File "/usr/local/lib/python3.8/dist-packages/datacube/api/core.py", line 13, in <module>
    from dask import array as da
  File "/usr/local/lib/python3.8/dist-packages/dask/array/__init__.py", line 271, in <module>
    raise ImportError(str(e) + "\n\n" + msg) from e
ImportError: No module named 'numpy.typing'

Dask array requirements are not installed.

Please either conda or pip install as follows:

  conda install dask                 # either conda install
  python -m pip install "dask[array]" --upgrade  # or python -m pip install
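A quick way to confirm this inside the jupyter container is to check the numpy version directly; numpy.typing only exists from numpy 1.20 onwards, so anything older will reproduce the error (a minimal sketch, run in a notebook cell or a python3 shell):

import numpy
print(numpy.__version__)            # needs to be >= 1.20 for numpy.typing to exist
from numpy.typing import ArrayLike  # raises ModuleNotFoundError on older numpy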

osgeolive 2020 python3 synergy

Greetings AU and the good opendatacube team -- FYI, the Ubuntu 'Focal' 20.04 LTS system packaging includes an integrated Jupyter and a stable set of core libraries (e.g. GDAL + rasterio). OSGeoLive 14 (currently in pre-alpha) is reliably built on top of Ubuntu LTS each year by an international team led by OSGeo President 'kalxas' Angelos Tzotsos PhD in Athens, Greece. For some cases, opendatacube cube-in-a-box may benefit from simplifying its Python and Jupyter expectations and running "natively", without Docker, on the OSGeoLive Ubuntu Focal LTS. Please note that the Focal environment ships natively with Dask.

The ODC team is cordially invited to check out more details at the Trac ticket https://trac.osgeo.org/osgeolive/ticket/2247

Cannot find ./data/pg in this root path

- ./data/pg:/var/lib/postgresql/data

I didn't install docker-compose, so I started the containers manually with docker, using something like docker run -v ./data/pg:/var/lib/postgresql/data, which mounts the local directory ./data/pg into the container directory /var/lib/postgresql/data. However, I couldn't find the ./data/pg directory in this repository, as linked above.

Where can I find the ./data/pg directory?

Sentinel 2 notebook

Hi,

In github.com/opendatacube/cube-in-a-box/blob/master/notebooks/Sentinel_2.ipynb

Tried this with the cube I installed a while ago:

import sys
import datacube

ModuleNotFoundError                       Traceback (most recent call last)

      4
----> 5 from datacube.utils.deafrica_datahandling import load_ard
      6 from datacube.utils.deafrica_plotting import rgb

ModuleNotFoundError: No module named 'datacube.utils.deafrica_datahandling'

Do I need to update the cube in a box?
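In the meantime, a minimal sketch that loads Sentinel-2 data without the DE Africa helper modules, using plain datacube.Datacube.load (the product name and query values here are illustrative assumptions, not the notebook's exact ones):

import datacube

dc = datacube.Datacube(app='Sentinel_2')
ds = dc.load(
    product='s2_l2a',
    x=(153.3, 153.5),
    y=(-27.5, -27.3),
    time=('2020-01', '2020-03'),
    output_crs='epsg:6933',
    resolution=(-20, 20),
)
print(ds)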

ERR_CONNECTION_REFUSED on CloudFormation PublicDNS

@alexgleith, I'm a USGS research scientist and a member of the Pangeo community. At the US AWS Public Sector Summit last week in Washington DC I heard a lot about opendatacube and was excited to find out that ODC works with COGs on S3.

After watching your awesome ODC in a box video, I tried the CloudFormation magic link following the detailed AWS instructions.

Everything seemed to go well (see the attached screenshot).

But when I click the publicDNS link, I get: ERR_CONNECTION_REFUSED.

I'm guessing this is something simple like adding some extra policy somewhere, right?

Replace port

Hello, how can I change the Jupyter launch port from https://localhost to another, like https://localhost:8010?
Thanks for the answer.

Cannot connect to Postgres

Hello,

I have tried to run the commands individually and get the following error a couple of minutes after running this step:
docker-compose exec jupyter datacube -v system init

The error message:
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection timed out
Is the server running on host "postgres" (192.168.64.2) and accepting
TCP/IP connections on port 5432?

Appreciate any assistance.
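One thing that may help narrow this down: a minimal sketch that tests Postgres reachability from inside the jupyter container (the hostname and credentials below are the docker-compose defaults as I understand them, so treat them as assumptions and adjust to your .env/compose values):

import psycopg2

# 'postgres' is the compose service name for the database container
conn = psycopg2.connect(
    host='postgres',
    port=5432,
    user='opendatacube',
    password='opendatacubepassword',
    dbname='opendatacube',
)
print('connected, status:', conn.status)
conn.close()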

Adjust time range in query of the `Sentinel_2.ipynb` file.

The query variable is

query = {
    'time': ('2020-08', '2020-10'),
    'x': (lon - buffer, lon + buffer),
    'y': (lat + buffer, lat - buffer),
    'output_crs': 'epsg:6933',
    'resolution':(-20,20),
}

but this gave me an empty datacube. I updated the query to

query = {
    'time': ('2020-01', '2020-03'),
    'x': (lon - buffer, lon + buffer),
    'y': (lat + buffer, lat - buffer),
    'output_crs': 'epsg:6933',
    'resolution':(-20,20),
}

and it worked.
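For what it's worth, a minimal sketch to check which acquisition dates are actually indexed for a product before querying, so an empty result can be distinguished from a time range with no indexed data (the product name 's2_l2a' is an assumption):

import datacube

dc = datacube.Datacube(app='time-range-check')
datasets = dc.find_datasets(product='s2_l2a', time=('2020-01', '2020-12'))
print(sorted({d.center_time.date() for d in datasets}))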

Indexing Sentinel S2 Level-2A dataset: KeyError: 'sentinel:latitude_band'

Before the server upgraded the STAC version to 1.0.0-beta.2, I was able to successfully load the following Sentinel Level-2A time-series from the online earth-search.aws.element84.com/v0/ catalogue into my ODC:

$ stac-to-dc \
    --bbox='11,45,12,46' \
    --catalog-href='https://earth-search.aws.element84.com/v0/' \
    --collections='sentinel-s2-l2a-cogs' \
    --datetime='2015-06-01/2023-07-01'

Now I get the following error, which prevents the indexing from succeeding:

pystac_client.warnings.DoesNotConformTo: Server does not conform to ITEM_SEARCH, There is not fallback option available for search.

Using the alternative catalogue from earth-search.aws.element84.com/v1/ with STAC 1.0.0, and switching to the available S2 Level-2A collection "sentinel-2-l2a":

$ stac-to-dc \
    --bbox='11,45,12,46' \
    --catalog-href='https://earth-search.aws.element84.com/v1/' \
    --collections='sentinel-2-l2a' \
    --datetime='2015-06-01/2023-07-01'

Now I get the error in the title of this issue:

08/04/2023 03:30:12: ERROR: Failed to handle item S2A_32TPQ_20161105_0_L2A with exception 'sentinel:latitude_band'
...
  File "/usr/local/lib/python3.10/site-packages/odc/apps/dc_tools/_stac.py", line 96, in _stac_product_lookup
    properties["sentinel:latitude_band"],
KeyError: 'sentinel:latitude_band'

Indeed, it should (in this case) look at the "mgrs:latitude_band" (and "mgrs:grid_square") properties:

            if region_code is None:
                # Let this throw an exception if there's something missing
                region_code = "{}{}{}".format(
                    str(properties["proj:epsg"])[-2:],
-                   properties["sentinel:latitude_band"],
-                   properties["sentinel:grid_square"],
+                   properties["mgrs:latitude_band"],
+                   properties["mgrs:grid_square"],
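A hedged sketch of what a more tolerant lookup could look like, accepting both the old sentinel:* and the newer mgrs:* property names (the function name is mine, not the one in _stac.py):

def region_code_from_properties(properties):
    # Prefer the newer mgrs:* keys, fall back to the legacy sentinel:* keys
    lat_band = properties.get('mgrs:latitude_band', properties.get('sentinel:latitude_band'))
    grid_square = properties.get('mgrs:grid_square', properties.get('sentinel:grid_square'))
    return '{}{}{}'.format(str(properties['proj:epsg'])[-2:], lat_band, grid_square)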

If I correct this part, I then also get other problems due to assumptions about the band names:

odc.apps.dc_tools.utils.IndexingException: Failed to create dataset with error The dataset is not specifying all of the measurements in this product.
Missing fields are;
{'B02', 'B01', 'B8A', 'B07', 'B03', 'B04', 'B09', 'AOT', 'WVP', 'B08', 'B11', 'SCL', 'B05', 'B12', 'B06'}
 The URI was s3://sentinel-cogs/sentinel-s2-l2a-cogs/32/T/QS/2023/6/S2B_32TQS_20230627_0_L2A/S2B_32TQS_20230627_0_L2A.json

Indeed, there are no such fields in the items; e.g. earth-search.aws.element84.com/v1/collections/sentinel-2-l2a/items/S2B_38XNJ_20230804_0_L2A.

Is this a catalogue error or too many assumptions made by dc_tools?
Thanks!

odc-io installation fails

docker-compose up fails during installation of odc-io (as part of odc-apps-dc-tools), probably due to a new release (odc-io 0.2).


Modifying the requirements.txt by adding pystac before odc-apps-dc-tools seems to solve the problem.

Swapped x,y coords in GDAL3 case

I know it's not merged yet, but I saw this code in Slack, and I'm pretty sure this bit is wrong:

y, x, z = t.TransformPoint(p[1], p[0])

Should be

    x, y, z = t.TransformPoint(p[1], p[0]) 

The order of projected coordinates is typically not swapped by the CRS definition.

Also, I would strongly suggest extracting and labelling lon/lat from p rather than using index-based access, with something like lon, lat = p up top, as this clearly communicates the expected order of coordinates on input to the transform function (see the sketch below).
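To illustrate, a minimal sketch of the suggested change under GDAL 3's authority-compliant axis order (the CRS codes here are just examples, and the surrounding function is assumed):

from osgeo import osr

src = osr.SpatialReference()
src.ImportFromEPSG(4326)   # geographic CRS; authority axis order under GDAL 3 is lat, lon
dst = osr.SpatialReference()
dst.ImportFromEPSG(32755)  # an example projected CRS (easting, northing)
t = osr.CoordinateTransformation(src, dst)

def to_projected(p):
    lon, lat = p  # label the expected input order instead of indexing p[0]/p[1]
    x, y, z = t.TransformPoint(lat, lon)
    return x, y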

Failure to start in Windows WSL2

Cube-in-a-box docker-compose up fails on WSL2. See logs below.

Building jupyter
[+] Building 30.7s (11/15)                                                                                                                                                                               
 => [internal] load build definition from Dockerfile                                                                                                                                                0.3s
 => => transferring dockerfile: 38B                                                                                                                                                                 0.3s
 => [internal] load .dockerignore                                                                                                                                                                   0.3s
 => => transferring context: 34B                                                                                                                                                                    0.2s
 => [internal] load metadata for docker.io/opendatacube/geobase:runner-3.0.4                                                                                                                        4.7s
 => [internal] load metadata for docker.io/opendatacube/geobase:wheels-3.0.4                                                                                                                        4.5s
 => [env_builder 1/4] FROM docker.io/opendatacube/geobase:wheels-3.0.4@sha256:4a8479bc427aecfb0a3fb52e95dc0edbf84dde9151347bc693b64a2ad4faa8ef                                                      0.0s
 => [internal] load build context                                                                                                                                                                   0.1s
 => => transferring context: 38B                                                                                                                                                                    0.1s
 => CACHED [stage-1 1/6] FROM docker.io/opendatacube/geobase:runner-3.0.4@sha256:544992c5bf60dcf2c290088c4abdc8ee5c4e4aadb6c10c7aafadebac118e5db1                                                   0.0s
 => CACHED [env_builder 2/4] RUN mkdir -p /conf                                                                                                                                                     0.0s
 => CACHED [env_builder 3/4] COPY requirements.txt /conf/                                                                                                                                           0.0s
 => CACHED [env_builder 4/4] RUN env-build-tool new /conf/requirements.txt /env /wheels                                                                                                             0.0s
 => ERROR [stage-1 2/6] COPY --chown=1000:100 --from=env_builder                                                                                                                                   25.3s
------
 > [stage-1 2/6] COPY --chown=1000:100 --from=env_builder  :
------
cannot copy to non-directory: /var/lib/docker/overlay2/0liqdipk735zo73mh1rca6yqw/merged/usr/local/man
ERROR: Service 'jupyter' failed to build

Presumably due to path issues / symlinks.

Not working on M1 Macs

Hi!
I was trying to test out cube-in-a-box on my computer; however, I failed at the first step. When running 'make setup' I get this error:
'ERROR: could not build wheels for numpy which use PEP 517 and cannot be installed directly'

Error: Got unexpected extra argument (s2_l2a)

After trying to execute
docker-compose exec jupyter bash -c "stac-to-dc --bbox='25,20,35,30' --collections='sentinel-s2-l2a-cogs' --datetime='2020-01-01/2020-03-31' s2_l2a"

the following error is shown:
Error: Got unexpected extra argument (s2_l2a)

After editing as suggested in dcc32a6, the following error is raised:
AttributeError: 'ItemSearch' object has no attribute 'get_all_items'

Failed to resolve driver warnings/errors

Hi,

I'm on Arch Linux with docker, docker-compose and make installed/working.

Docker version 20.10.10, build b485636f4b
Docker Compose version 2.1.1
GNU Make 4.3

Running make setup or make setup-prod results in lots of these errors in the output.

11/11/2021 04:13:25: WARNING: Failed to resolve driver datacube.plugins.index::default
11/11/2021 04:13:25: WARNING: Error was: ContextualVersionConflict(pyparsing 3.0.5 (/usr/local/lib/python3.8/dist-packages), Requirement.parse('pyparsing<3,>=2.0.2'), {'packaging'})

The cube-in-a-box starts anyway and I am able to load up Jupyter at http://localhost. However, I get the following errors in Sentinel_2.ipynb when running it in Jupyter.

Failed to resolve driver datacube.plugins.io.read::netcdf
Error was: ContextualVersionConflict(pyparsing 3.0.5 (/usr/local/lib/python3.8/dist-packages), Requirement.parse('pyparsing<3,>=2.0.2'), {'packaging'})
CPLReleaseMutex: Error = 1 (Operation not permitted)

Thanks
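If it helps, a minimal sketch to confirm which pyparsing version actually ends up in the container (the error above reports that the packaging dependency wants pyparsing<3,>=2.0.2 but found 3.0.5):

import pyparsing
import packaging

print('pyparsing', pyparsing.__version__)  # the conflict reports 3.0.5 installed
print('packaging', packaging.__version__)  # its requirement is pyparsing<3,>=2.0.2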

WARNING: Failed to resolve driver datacube.plugins.index::default

WARNING: Failed to resolve driver datacube.plugins.index::default
WARNING: Error was: ContextualVersionConflict(pyparsing 3.0.5 (/usr/local/lib/python3.8/dist-packages), Requirement.parse('pyparsing<3,>=2.0.2'), {'packaging'})

Is it possible to test your Docker files and solve this? Usually Docker is the first step in using an open source solution, and it is disappointing to have a Docker image that does not work.
Do you know about other Docker containers that work?

Error when I run system init

When I run,

sudo docker-compose exec jupyter datacube -v system init

I'm getting the following error,

ValueError: 'default' must be a list when 'multiple' is true.

Running Kubuntu 20.04

NASADEM code and figure out of sync

Just a minor one: the example notebook for NASADEM shows a part of Tasmania, as if this code had been run:

data = dc.load(product="nasadem", resolution=(-0.004, 0.001), lon=(146.0, 148.0), lat=(-43.5, -41.5))

the code in the notebook is

data = dc.load(product="nasadem", resolution=(-0.001, 0.001), lon=(29.999,31.0), lat=(29.999,31.05))

The notebook is as of this commit: 974e2d4

conflicting docker.list

I'm running on Jammy (Ubuntu 22.04), and this line creates a docker.list in the sources:

https://github.com/opendatacube/cube-in-a-box/blob/main/scripts/install-docker.sh#L12

and with apt update that gave me

E: Conflicting values set for option Signed-By regarding source https://download.docker.com/linux/ubuntu/ jammy: /usr/share/keyrings/docker-archive-keyring.gpg != /usr/share/keyrings/docker-archive-keyring.asc
E: The list of sources could not be read.

I found a similar discussion (linked below), and removing /etc/apt/sources.list.d/docker.list worked for me. I then had to do the install step for docker-compose manually, plus the user-in-docker-group part. And then it worked :)

docker/for-linux#1349

GDAL3 support in ls_public_bucket.py script

GDAL 3 swaps the order of lon/lat, and the generated extents are not correct when running this script in an environment with GDAL 3 installed.

def get_coords(geo_ref_points, spatial_ref):
    t = osr.CoordinateTransformation(spatial_ref, spatial_ref.CloneGeogCS())

    def transform(p):
        lon, lat, z = t.TransformPoint(p['x'], p['y'])
        return {'lon': lon, 'lat': lat}

In GDAL 3 the output of TransformPoint is lat,lon,z instead. But that code is wrong in other ways, since we need the bounding box of the geometry in EPSG:4326 and not a mere transform of the 4 corners to EPSG:4326. This also doesn't work when a dataset's footprint spans the lon=180 line.
opendatacube/datacube-core#886
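For reference, a hedged sketch of one possible workaround under GDAL 3: force the traditional lon/lat axis order on the geographic CRS so the existing lon, lat unpacking keeps working (this only sidesteps the axis swap; it does not address the bounding-box or antimeridian issues noted above):

from osgeo import osr

def get_coords(geo_ref_points, spatial_ref):
    geog = spatial_ref.CloneGeogCS()
    geog.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)  # GDAL >= 3 only
    t = osr.CoordinateTransformation(spatial_ref, geog)

    def transform(p):
        lon, lat, _z = t.TransformPoint(p['x'], p['y'])
        return {'lon': lon, 'lat': lat}

    return {key: transform(p) for key, p in geo_ref_points.items()}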

Also, shouldn't that script be in the datacube-dataset-config repo?
opendatacube/datacube-dataset-config#10

Open data explorer endpoint in addition to Jupyter port

I'm currently in the process of setting up ODC with the OpenEO ODC driver but noticed that ODC cube in a box only exposes port 8888 of jupyter to the host.

I've noticed that there are quite a few python3 processes opening TCP ports on localhost:

!netstat -anp |grep tcp |grep LISTEN
tcp 0 0 127.0.0.1:53257 0.0.0.0:* LISTEN 187/python3
tcp 0 0 127.0.0.1:52043 0.0.0.0:* LISTEN 187/python3
tcp 0 0 127.0.0.1:60323 0.0.0.0:* LISTEN 187/python3
tcp 0 0 127.0.0.1:34277 0.0.0.0:* LISTEN 187/python3
tcp 0 0 127.0.0.1:36557 0.0.0.0:* LISTEN 187/python3
tcp 0 0 127.0.0.11:36147 0.0.0.0:* LISTEN -
tcp 0 0 127.0.0.1:42367 0.0.0.0:* LISTEN 187/python3
tcp 0 0 0.0.0.0:8888 0.0.0.0:* LISTEN 7/python3

!nmap -sS -p- 127.0.0.1 |grep tcp |awk -F"/" '{print $1}' | sort
34277
36557
42367
52043
53257
60323
8888

But connecting to those I realized that they are not HTTP ports:

!for port in $(nmap -sS -p- 127.0.0.1 | grep tcp | awk -F"/" '{print $1}' | sort); do echo 'GET / HTTP1.1\n' | telnet localhost $port; done
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Connection closed by foreign host.
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Connection closed by foreign host.
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Connection closed by foreign host.
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Connection closed by foreign host.
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Connection closed by foreign host.
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Connection closed by foreign host.
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Connection closed by foreign host.

So I'm actually wondering how the following code works at all (need to dig into the source code still)

dc = datacube.Datacube(app='Sentinel_2')

If I knew which port to open, or what is needed to tell ODC to open the data explorer endpoint, I could adjust the compose YAML to make it work.
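On the last point: as far as I understand, datacube.Datacube(app=...) does not talk to any HTTP endpoint at all; it connects straight to the Postgres index (port 5432 on the postgres service), which is why it works with only 8888 exposed, and the explorer UI is a separate application (datacube-explorer) that would need its own service and port in the compose file. A minimal sketch to see which database the index is actually using (connection details come from the datacube config/environment):

import datacube

dc = datacube.Datacube(app='Sentinel_2')
print(dc.index)  # the repr typically includes the Postgres connection details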

Landsat default dataset file not found

In the Cube In a Box magic link setup

  • wget https://landsat.usgs.gov/sites/default/files/documents/WRS2_descending.zip -O /opt/odc/data/wrs2_descending.zip
    /opt/odc/data/wrs2_descending.zip: No such file or directory
    Cloud-init v. 18.5-45-g3554ffe8-0ubuntu118.04.1 running 'modules:final' at Fri, 05 Feb 2021 02:33:52 +0000. Up 57.92 seconds.
    2021-02-05 02:35:32,236 - util.py[WARNING]: Failed running /var/lib/cloud/instance/scripts/part-001 [1]
    2021-02-05 02:35:32,237 - cc_scripts_user.py[WARNING]: Failed to run module scripts-user (scripts in /var/lib/cloud/instance/scripts)
    2021-02-05 02:35:32,238 - util.py[WARNING]: Running module scripts-user (<module 'cloudinit.config.cc_scripts_user' from '/usr/lib/python3/dist-packages/cloudinit/config/cc_scripts_user.py'>) failed
    Cloud-init v. 18.5-45-g3554ffe8-0ubuntu1
    18.04.1 finished at Fri, 05 Feb 2021 02:35:32 +0000. Datasource DataSourceEc2Local. Up 157.24 seconds

Commenting that out, I then get mv: cannot move 'cube-in-a-box-master' to '/opt/odc/cube-in-a-box-master': Directory not empty (presumably because it had already been done?) after rerunning the part-001 script.

Commenting that out as well and creating /opt/odc/data, I get to the following, which is probably my fault from multiple runs:

Digest: sha256:99cdc036afe01255b329c72a9fca201dea00619f22fd50a24c98f2b1dd4f7891
Status: Downloaded newer image for opendatacube/geobase:runner-3.0.4
---> 473834e4aa92
Step 7/16 : ARG py_env_path
---> Running in 293a1ae4f53b
Removing intermediate container 293a1ae4f53b
---> cb13c9ae7b76
Step 8/16 : COPY --chown=1000:100 --from=env_builder $py_env_path $py_env_path
Error processing tar file(exit status 1): write /env/lib/python3.6/site-packages/skimage/feature/_texture.cpython-36m-x86_64-linux-gnu.so: no space left on device
ERROR: Service 'jupyter' failed to build
